Trust & Safety / Moderation Policy

Effective date: April 20, 2026   ·   Last updated: April 20, 2026

1. Purpose

This Trust & Safety / Moderation Policy explains the standards, reporting channels, moderation mechanisms, complaint processes, and enforcement framework that apply to content and behavior on holonet.world.

This Policy complements the Terms & Conditions. In the event of conflict, the Terms govern the contractual relationship, while this Policy describes Holonet's operational safety and moderation framework.

2. Scope

This Policy applies to:

  • User Content;
  • behavior within Worlds and platform spaces;
  • creator activity;
  • reports, notices, complaints, and appeals;
  • moderation decisions, account restrictions, and related enforcement actions.

3. Core Rules

Users must not use the Platform to engage in unlawful or prohibited activity, including:

  • illegal content or conduct;
  • infringement of intellectual property or other rights;
  • harassment, stalking, abuse, threats, hateful conduct, or intimidation;
  • sexual exploitation, child safety violations, or grooming behavior;
  • fraud, impersonation, deception, or scams;
  • malicious technical activity, including malware, exploit abuse, credential theft, or unauthorized access;
  • evasion of enforcement systems, safety controls, or access restrictions;
  • manipulation of trust, reporting, moderation, monetization, or platform integrity systems.

Holonet may publish additional product-specific content rules, creator standards, advertising rules, or child-safety requirements.

4. Reporting Illegal Content or Policy Violations

Users and third parties may report content, conduct, or account behavior that may violate law or platform policy.

Reports may be submitted through platform tools or by email to report@holonet.world.

5. Notice-and-Action Mechanism

Holonet provides a mechanism that allows any individual or entity to notify Holonet of specific information believed to constitute illegal content.

A notice is more likely to be processed efficiently where it includes:

  • a reasoned explanation of why the content is alleged to be illegal;
  • the exact electronic location of the content, such as a URL, asset identifier, account reference, or in-platform location;
  • the name and electronic contact details of the notifier, except where anonymity is justified or legally appropriate;
  • a statement confirming a good-faith belief that the information submitted is accurate and complete.

Holonet may request additional information where reasonably necessary to assess a notice.

6. How Holonet Reviews Reports

Holonet identifies and assesses potential violations using a combination of:

  • user reports;
  • rights-holder submissions;
  • trusted flagger submissions where applicable;
  • automated detection and triage systems;
  • human review.

Holonet aims to assess matters in a timely, diligent, and proportionate manner, taking into account the seriousness of the alleged harm, legal requirements, context, available evidence, and the rights and interests of affected users.

7. Possible Moderation Measures

Where Holonet identifies unlawful content, policy violations, abuse, safety risks, or integrity threats, Holonet may take one or more actions, including:

  • removing or disabling access to content;
  • reducing visibility or discoverability;
  • applying age gates, warning labels, geoblocks, or other access limitations to content;
  • restricting interactions, publishing rights, creator features, API access, or monetization features;
  • temporarily suspending or permanently terminating accounts;
  • preserving evidence where appropriate;
  • making referrals to authorities where legally required or justified.

Holonet may choose measures proportionate to the severity, recurrence, intent, context, and impact of the conduct.

8. Statement of Reasons

Where required by applicable law, or where Holonet determines it is appropriate, Holonet may provide the affected user with a statement of reasons for significant moderation decisions, including where content is removed, disabled, restricted, or demonetized, or where an account is suspended or terminated.

Such a statement may include, where appropriate:

  • the nature of the action taken;
  • the facts or circumstances relied upon;
  • whether the action was based on law, policy, or both; and
  • information about available complaint or review options.

Holonet may withhold or limit details where disclosure would be unlawful, would compromise an investigation, would create security risks, or would facilitate abuse, evasion, fraud, or harm.

9. Internal Complaints and Appeals

Users may challenge eligible moderation decisions by submitting an internal complaint or appeal through designated platform functionality or by contacting report@holonet.world.

Complaints should be submitted within a reasonable time after the user becomes aware of the decision.

Holonet will review eligible complaints in a timely and reasonably objective manner. Where legally required, decisions producing significant effects will not be based solely on automated processing.

Submitting a complaint does not guarantee reversal of a decision.

10. Out-of-Court Dispute Settlement

Where applicable law grants users the right to seek out-of-court dispute settlement in relation to moderation decisions, users may choose a certified dispute settlement body where available.

Holonet will engage with such mechanisms to the extent required by applicable law.

11. Trusted Flaggers

Where Holonet receives notices from entities designated as trusted flaggers under applicable law, Holonet may prioritize the processing of such notices.

Priority treatment does not guarantee any particular result. Holonet will continue to assess notices in accordance with law, evidence, and proportionality.

12. Measures Against Misuse

Holonet may take proportionate action against users or third parties who misuse reporting, complaint, publication, or safety systems, including where they:

  • frequently provide manifestly illegal content;
  • repeatedly submit manifestly unfounded notices or complaints;
  • abuse appeals, reports, or identity systems in bad faith;
  • engage in enforcement evasion or repeated policy circumvention.

Measures may include warnings, rate limits, reporting restrictions, publication restrictions, feature limitations, temporary suspensions, or account termination.

13. Use of Automated Tools

Holonet may use automated tools to assist in identifying, classifying, filtering, prioritizing, or detecting potentially unlawful, harmful, abusive, deceptive, or policy-violating content or behavior.

Automated tools may support moderation, risk scoring, ranking adjustments, spam detection, fraud detection, and abuse prevention.

Where required by law, users may request human review of significant decisions.

14. Transparency

Holonet may publish transparency reports or equivalent disclosures regarding moderation practices, including aggregate information concerning:

  • notices received;
  • categories of alleged illegal content;
  • actions taken;
  • complaint volumes and outcomes;
  • use of automated moderation tools;
  • relevant abuse-prevention metrics.

Holonet may adjust the content, format, and frequency of such reporting based on legal obligations, operational needs, and platform maturity.

15. Intellectual Property Complaints

Holonet may provide specific channels or workflows for intellectual property complaints, including copyright or trademark notices.

A claimant should provide sufficient information to identify the protected right, the allegedly infringing content, the basis of the complaint, and the requested action.

Holonet may remove or disable access to allegedly infringing content, seek counter-information, or take other appropriate action.

Repeat infringement may result in escalating enforcement measures, including suspension or termination.

16. User Safety and Emergency Cooperation

Holonet may act urgently where content or conduct presents a risk of serious harm, including threats to life, child safety risks, exploitation, severe harassment, fraud, violence, or platform security attacks.

In appropriate circumstances, Holonet may preserve evidence or cooperate with authorities consistent with applicable law.

17. Fundamental Rights and Proportionality

In applying this Policy, Holonet seeks to take account of relevant rights and interests, including freedom of expression, access to information, privacy, platform safety, creator autonomy, and the need for proportionate enforcement.

Holonet does not guarantee error-free moderation and may revise moderation outcomes where new information becomes available.

18. Changes to This Policy

Holonet may update this Policy from time to time to reflect legal requirements, operational practices, safety risks, product changes, or improvements to moderation systems.

Material changes may be communicated through the Platform where appropriate.

19. Contact Points