Effective: December 28, 2025

Moderation & Enforcement Policy

How we review content, enforce our guidelines, and protect your rights under the EU Digital Services Act (DSA).

Key Principles
  • All moderation decisions are lawful, proportionate, and transparent
  • You receive clear explanations for any action taken
  • You have the right to appeal any moderation decision
  • Human review is required for significant account restrictions

1. Core Principles

Our moderation operates on the four foundational values summarized above: lawful, proportionate, and transparent decisions; clear explanations for every action; the right to appeal; and human review for significant account restrictions.

2. Reportable Content

2.1 What We Review

  • User-generated content (posts, comments, messages)
  • Profile information and images
  • Feature usage patterns
  • Technical behavior (e.g., automation, scraping)

2.2 How Content is Flagged

  • User Reports: Flags submitted by users through our in-product reporting tools
  • Automated Detection: Systems flagging potential violations
  • Legal Obligations: Government or authority requests
  • Trusted Flaggers: Reports from entities designated as trusted flaggers under the DSA

3. Review Process

3.1 Submitting Reports

Reports can be submitted through in-product reporting tools or by contacting support. Include:

  • Location of the content (URL or description)
  • Reason for the report
  • Any additional context

3.2 Review Methodology

Trained moderation staff and automated systems (with human oversight) assess content against:

  • Applicable law (especially illegal content)
  • User Agreement and Terms of Service
  • Community Guidelines
  • Context and intent
  • Account history

3.3 Timelines

Review timelines vary with the type and severity of a report. Appeals are reviewed within 14 business days (see Section 6.2), and EUnify community notes are typically reviewed within 5-10 business days (see Section 8.3).

4. Enforcement Actions

Actions are proportionate to the severity of the violation and may range from content removal and feature restrictions to significant account restrictions and, for the most serious or repeated violations, account termination.

5. Statements of Reasons (DSA)

When we take action on content or accounts, we provide a statement containing:

  • Action taken: What we did (removal, restriction, etc.)
  • Factual circumstances: What triggered the review
  • Applicable rules: Which policy or law was violated
  • Reasoning: How we reached our decision
  • Appeal procedures: How to challenge the decision
  • Automation disclosure: Whether automated tools were used

6. Appeals & Redress

6.1 Internal Appeals

You may appeal any moderation decision within 14 days of being notified of it:

  • Use the designated appeals tool or contact support
  • Explain why you believe the decision was incorrect
  • Provide any additional context or evidence

6.2 Appeal Review

  • Appeals are reviewed by qualified personnel not involved in the original decision
  • We aim to complete reviews within 14 business days
  • You will receive a reasoned decision

6.3 Out-of-Court Resolution

If you disagree with our appeal decision, you may access:

  • Certified DSA Dispute Bodies: EU-certified out-of-court dispute settlement bodies
  • Netherlands Authority: Autoriteit Consument & Markt (ACM), the Dutch Authority for Consumers and Markets - www.acm.nl

7. Automation Safeguards

7.1 How We Use Automation

Automated tools assist with:

  • Spam and malware detection
  • Fraud prevention
  • Illegal content identification
  • Pattern recognition for policy violations

7.2 Human Oversight

We require human review for:

  • Account terminations
  • Significant account restrictions
  • Complex content decisions
  • Appeals of automated decisions

8. EUnify-Specific Moderation

EUnify Community Notes: our moderation process for user-submitted notes and corrections.

8.1 Review Process

  1. Submission: User submits community note or correction
  2. Queue: Content enters moderation queue
  3. Review: Moderator evaluates submission
  4. Decision: Approve, request changes, or reject
  5. Notification: User informed of outcome
  6. Publication: Approved content goes live

8.2 Review Criteria

  • Factual accuracy (verifiable information)
  • Relevance to the institution/program
  • Compliance with Community Guidelines
  • Appropriate tone and language

8.3 Timeline

Community notes are typically reviewed within 5-10 business days.

Moderation Contact

  • Report content: support@vaicat.com
  • Appeals: use the in-app appeals tool or email support
  • Dutch authority (ACM): www.acm.nl