Moderation Policy
Last updated: December 1, 2025
(European Union Digital Services Act, Regulation (EU) 2022/2065)
1. General Provisions
This Moderation Policy (the "Policy") governs the review, moderation, and removal of content on the platform (the "Platform") and sets out the rights and obligations of users in accordance with Regulation (EU) 2022/2065 (the Digital Services Act, hereinafter the "DSA").
2. Types of Content Subject to Moderation
All user-generated content is subject to moderation, including:
- reviews and ratings;
- posts, photos, videos;
- comments;
- events;
- profiles of establishments and users.
3. Prohibited Content
The following content is prohibited: content that violates EU law, national law, or the Platform's rules, in particular:
- illegal content within the meaning of DSA Art. 3(h), such as hate speech, threats, extremist material, pornography, or copyright-infringing material;
- impersonation of another person;
- violations of intellectual property rights;
- publication of third parties' personal data without their consent;
- advertising of illegal goods or services;
- defamation, insults, or knowingly false claims about establishments or users;
- content that misleads users.
4. Forms of Moderation
The Platform uses:
- automated pre-screening (AI filters and algorithms);
- manual review by moderators;
- reactive measures in response to user notices;
- voluntary proactive monitoring as permitted by the DSA (Art. 7).
5. Illegal Content Notification System (Notice and Action)
In accordance with DSA Art. 16, any user may report suspected illegal content using a dedicated reporting form.
The notification must contain:
- the URL or ID of the content;
- a description of the alleged violation and an explanation of why the content is considered illegal;
- supporting evidence (if available).
The Platform acknowledges receipt of the notification and informs the notifier of its decision.
6. Platform Moderation Actions
The Platform may:
- remove or block content;
- restrict the visibility of content;
- temporarily restrict a user's account functionality;
- delete accounts in the case of systematic violations.
7. Right to Appeal (DSA Art. 20)
Every user has the right to appeal any moderation decision through the Platform's internal complaint-handling system. Appeals are reviewed within 14 days.
The appeal form is available in the user's account.
8. Trusted Flaggers (DSA Art. 22)
The Platform cooperates with designated trusted-flagger organizations, whose notices are reviewed and decided on with priority.
9. Account Suspension (DSA Art. 23)
The Platform may temporarily suspend accounts that frequently post manifestly illegal content or frequently submit manifestly unfounded notices or complaints.
10. Moderation Transparency (DSA Arts. 15 and 42)
The Platform publishes an annual transparency report on moderation, covering:
- the number of notices received;
- the actions taken to remove or restrict content;
- the number of appeals and their outcomes;
- statistics on sanctions applied.
For questions about content moderation, please contact help@plattr.me.
See also: Public Offer | Privacy Policy