Look n Look maintains a zero-tolerance policy toward Child Sexual Abuse and Exploitation (CSAE). We are committed to creating a safe environment for all users, with special emphasis on the protection of minors. Any content, behavior, or activity that sexually exploits or endangers children is strictly prohibited and will result in immediate action.
1. Our Commitment
Look n Look is firmly committed to the safety and protection of children. We recognize that online platforms carry a responsibility to actively prevent, detect, and respond to any form of child sexual abuse and exploitation. Our commitment includes:
- Zero Tolerance: We maintain an absolute zero-tolerance policy toward any content that depicts, promotes, encourages, or facilitates the sexual abuse or exploitation of children.
- Proactive Prevention: We invest in technology, processes, and personnel dedicated to preventing CSAE on our platform.
- Cooperation with Authorities: We actively cooperate with law enforcement agencies, the National Center for Missing & Exploited Children (NCMEC), the Internet Watch Foundation (IWF), and other relevant organizations worldwide.
- Continuous Improvement: We continuously review and improve our safety measures to stay ahead of evolving threats to child safety.
2. Prohibited Content and Conduct
The following content and conduct are strictly prohibited on Look n Look:
2.1 Prohibited Content
- Any visual depiction of a minor engaged in sexually explicit conduct, including photographs, videos, or computer-generated images (Child Sexual Abuse Material / CSAM).
- Content that sexualizes minors in any way, including inappropriate or suggestive portrayals of children.
- Content that depicts, promotes, or glorifies child nudity in a sexual context.
- Content that normalizes or trivializes sexual abuse or exploitation of minors.
- Content that promotes or provides instructions for child grooming or exploitation.
- AI-generated or digitally manipulated imagery that depicts minors in sexually explicit situations.
2.2 Prohibited Conduct
- Grooming or attempting to establish inappropriate relationships with minors through the platform.
- Soliciting, sharing, or distributing CSAM or any exploitative content involving minors.
- Using the platform to contact minors for the purpose of sexual exploitation.
- Sharing personal information of minors without parental or guardian consent.
- Encouraging or coercing minors to create sexual or inappropriate content.
- Sextortion or any form of blackmail or coercion targeting minors.
- Human trafficking of minors in any form.
- Any attempt to circumvent our child safety measures.
3. Detection and Prevention Measures
Look n Look employs a multi-layered approach to detect and prevent CSAE on our platform:
3.1 Technology-Based Detection
- Hash-Matching Technology: We use PhotoDNA and other hash-matching technologies to detect known CSAM by comparing uploaded content against databases of known illegal imagery maintained by NCMEC and other organizations.
- AI and Machine Learning: We deploy artificial intelligence and machine learning models to proactively detect and flag potentially harmful content involving minors, including previously unknown material.
- Automated Content Scanning: All user-uploaded content is automatically scanned before it becomes publicly available on the platform.
- Behavioral Analysis: We monitor usage patterns and behavioral signals that may indicate grooming, exploitation, or other harmful activities targeting minors.
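For illustration only, the hash-matching step described above can be sketched as a simple lookup against a blocklist of known hashes. This is a deliberately simplified exact-match version: production systems such as PhotoDNA use proprietary perceptual hashes that tolerate resizing and re-encoding, the real blocklists are supplied by NCMEC and the IWF rather than hardcoded, and all names and values below are placeholders.

```python
import hashlib

# Placeholder blocklist. In production this would be a perceptual-hash list
# provided by NCMEC or the IWF, not a hardcoded set of SHA-256 digests.
KNOWN_HASHES = {"0" * 64}


def file_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of the uploaded bytes.

    Note: a cryptographic hash only catches byte-for-byte copies;
    perceptual hashing is needed to catch re-encoded variants.
    """
    return hashlib.sha256(data).hexdigest()


def is_known_match(data: bytes) -> bool:
    """Check an upload against the blocklist before it is published."""
    return file_hash(data) in KNOWN_HASHES
```

In a real pipeline this check runs during the automated scanning stage, before content becomes publicly visible, with matches escalated to human review and preserved for law-enforcement reporting.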
3.2 Human Review
- Trained Moderators: Our content moderation team includes members specifically trained in identifying CSAE, grooming behaviors, and other threats to child safety.
- Escalation Procedures: We have clear escalation procedures for content flagged as potentially involving CSAE, ensuring rapid review and action.
- Specialist Review: Complex cases are reviewed by specialized child safety personnel with expertise in online child exploitation.
3.3 Platform Design Safeguards
- Age Restrictions: Look n Look requires users to be at least 13 years of age to create an account. Users under 18 must have parental or guardian consent.
- Limited Contact with Minors: We implement restrictions on direct messaging and interactions between adult users and identified minor users.
- Location Privacy for Minors: Enhanced location privacy protections are applied for users identified as minors, including restrictions on precise location sharing.
- Safe Search and Feed Filtering: Our content recommendation algorithms are designed to prevent inappropriate content from being surfaced to younger users.
4. Reporting Mechanisms
We provide accessible and easy-to-use mechanisms for reporting suspected CSAE:
4.1 In-App Reporting
- Every piece of content on Look n Look has a Report option that allows users to flag content they believe involves the exploitation of minors.
- Reports related to child safety are prioritized and reviewed on an urgent basis.
- Users can report suspicious accounts, messages, and behaviors through the same reporting mechanism.
4.2 Direct Reporting
You can report CSAE concerns directly to our safety team:
Email: indiansparrow222@gmail.com
Subject Line: Use "CSAE Report" or "Child Safety Concern" for priority handling.
All reports are treated as urgent and are reviewed within 24 hours.
4.3 External Reporting
We encourage users to also report suspected child exploitation to relevant authorities:
- NCMEC CyberTipline: www.missingkids.org/gethelpnow/cybertipline (for reporting within the United States)
- Internet Watch Foundation (IWF): www.iwf.org.uk (for reporting globally)
- Local Law Enforcement: Contact your local police or law enforcement agency to report suspected child abuse or exploitation.
- India: Report to the Cyber Crime Portal at cybercrime.gov.in or call the Childline helpline at 1098.
5. Enforcement and Consequences
Look n Look takes swift and decisive action against any violation of our child safety standards:
5.1 Immediate Actions
- Content Removal: Any content identified as CSAM or otherwise harmful to children is immediately removed from the platform.
- Account Termination: Accounts involved in creating, distributing, or possessing CSAM are immediately and permanently terminated.
- Content Preservation: Relevant content is preserved and documented for law enforcement purposes before removal from public view.
- Law Enforcement Reporting: All confirmed or suspected CSAM is reported to NCMEC through the CyberTipline and to relevant local law enforcement agencies, as required by law.
5.2 Permanent Consequences
- Users found to have violated our child safety policies are permanently banned from Look n Look with no possibility of appeal.
- We take steps to prevent banned users from creating new accounts.
- We may share information about banned users with other platforms and organizations to prevent further abuse.
6. Employee and Contractor Standards
We hold our team to the highest standards when it comes to child safety:
- Background Checks: All employees and contractors who may have access to user content or data undergo thorough background checks.
- Mandatory Training: All team members receive mandatory training on recognizing and reporting CSAE, updated annually.
- Wellness Support: Employees involved in content moderation and child safety have access to wellness and mental health support services.
- Code of Conduct: Our internal code of conduct explicitly prohibits any employee from engaging in or tolerating CSAE in any form.
7. Data Handling and Privacy for Minors
We implement enhanced data protections for minor users:
- We collect the minimum amount of personal information necessary from minor users.
- We do not use data from minor users for targeted advertising.
- Location data for minor users is subject to enhanced privacy protections.
- We comply with the Children's Online Privacy Protection Act (COPPA), the General Data Protection Regulation (GDPR) provisions for children, and applicable Indian data protection laws.
- Parental controls and account management features are available for parents and guardians.
8. Partnerships and Collaboration
We actively collaborate with organizations dedicated to child safety:
- NCMEC (National Center for Missing & Exploited Children): We report all identified CSAM through the CyberTipline as required by U.S. federal law.
- IWF (Internet Watch Foundation): We use IWF's hash lists to identify and block known CSAM.
- Tech Coalition: We support and align with the voluntary principles of the Technology Coalition to combat online child sexual exploitation and abuse.
- Law Enforcement: We maintain responsive channels for cooperation with law enforcement agencies worldwide in investigating and prosecuting child exploitation.
9. Transparency and Accountability
We believe in transparency regarding our child safety efforts:
- We publish regular updates on our child safety measures and their effectiveness.
- We track and report key metrics related to CSAE detection, removal, and reporting.
- We engage with external stakeholders, including child safety organizations and policy makers, to continuously improve our standards.
- We participate in industry-wide initiatives and information-sharing efforts to combat CSAE.
10. User Education and Awareness
We are committed to educating our user community about child safety:
- Safety Resources: We provide in-app safety resources and guidance for parents, guardians, and young users.
- Recognizing Grooming: We educate users on the signs of grooming and how to protect themselves and others.
- Reporting Awareness: We regularly promote awareness of our reporting tools and encourage users to report suspicious activity.
- Digital Literacy: We support initiatives that promote digital literacy and online safety for children and young people.
11. Updates to This Policy
We regularly review and update our child safety standards to reflect evolving best practices, legal requirements, and technological capabilities. Any material changes to this policy will be posted on this page with an updated effective date.
We welcome feedback on our child safety measures. If you have suggestions for how we can improve our efforts, please contact us.
12. Contact Us
If you have any questions about our child safety standards, wish to report a concern, or need assistance, please contact us at:
Email: indiansparrow222@gmail.com
App: Look n Look
Child Safety Reports: Use subject line "CSAE Report" for priority handling within 24 hours.
If a child is in immediate danger, please contact your local law enforcement or emergency services immediately. In India, call 112 (emergency) or 1098 (Childline). In the United States, call 911 or the NCMEC hotline at 1-800-843-5678.