Have you ever wondered how your community can stay safe and positive? The answer lies in content moderation. This article explains how the content moderation feature works within Locals.

At Locals, our mission is to foster safe, respectful, and authentic offline social connections. To support this goal, we apply structured moderation practices to detect and address harmful or inappropriate behavior and content across the Locals Services. This Moderation Policy outlines how we monitor content and user behavior, what tools we use, the types of enforcement actions we may take, and how users can respond to moderation decisions.

I. Moderation Overview

Locals employs a multi-layered approach to platform moderation that includes:

- Proactive monitoring and behavioral analysis (Section II);
- Reporting and community flagging by Users (Section III);
- Enforcement actions against violating content or accounts, with a process for users to respond to moderation decisions.

Our moderation applies to all areas of the Locals Services, including profiles, Clubs, Events, comments, messages, and media uploads.
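
To illustrate how such a layered flow could fit together, here is a minimal sketch in Python. The layer functions, signal fields, decision labels, and thresholds are assumptions made for illustration; they are not Locals' actual implementation.

```python
from typing import Callable, Optional

# Minimal sketch of a multi-layered moderation flow. The layers mirror the
# sections below (proactive scanning, behavioral analysis, community reports);
# all names, fields, and thresholds are illustrative assumptions.

Decision = Optional[str]            # None = no action; otherwise an action label
Layer = Callable[[dict], Decision]

def proactive_scan(item: dict) -> Decision:
    # Section II: automated tools catch clear violations up front.
    return "remove" if item.get("flagged_by_classifier") else None

def behavioral_check(item: dict) -> Decision:
    # Section II: behavioral analysis flags abusive activity patterns.
    return "restrict_account" if item.get("author_rate_anomaly") else None

def community_reports(item: dict) -> Decision:
    # Section III: user reports route content to human review
    # (3 reports is an arbitrary threshold chosen for this sketch).
    return "human_review" if item.get("report_count", 0) >= 3 else None

LAYERS: list[Layer] = [proactive_scan, behavioral_check, community_reports]

def moderate(item: dict) -> str:
    """Run an item through each layer in order; the first decisive layer wins."""
    for layer in LAYERS:
        decision = layer(item)
        if decision is not None:
            return decision
    return "allow"

# Example: content reported three times is escalated to human review.
print(moderate({"report_count": 3}))  # -> "human_review"
```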

II. Proactive Monitoring and Behavioral Analysis

To ensure platform integrity, Locals may proactively review content using a combination of automated tools, behavioral analysis, and human moderation. Unlike a purely reactive system, our proactive approach aims to identify and neutralize content that violates our guidelines before it gains wide visibility. Our moderation systems include:

- Automated tools that scan content and flag likely violations of our guidelines;
- Behavioral analysis that detects unusual activity patterns, such as mass user interactions;
- Human moderators who review flagged content and make final enforcement decisions.

Note: Proactive moderation applies in particular to activities with public visibility or monetization features, such as Paid Events, Club Memberships, and mass user interactions.
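
Here is a sketch of how these signals might be combined, including the stricter treatment of public and monetized activity the note describes. The scoring weights, thresholds, and field names are assumptions for illustration, not Locals' actual system.

```python
from dataclasses import dataclass

# Illustrative triage logic only: weights, thresholds, and field names are
# assumptions, not Locals' actual moderation system.

@dataclass
class ContentItem:
    item_id: str
    surface: str        # e.g. "profile", "club", "event", "comment"
    text: str
    is_public: bool
    is_monetized: bool  # e.g. Paid Events, Club Memberships

def automated_score(item: ContentItem) -> float:
    """Stand-in for an automated classifier (0.0 = benign, 1.0 = violating)."""
    banned_terms = {"scam-offer", "spam-link"}  # toy keyword list
    hits = sum(term in item.text.lower() for term in banned_terms)
    return min(1.0, hits * 0.5)

def behavioral_score(posts_per_hour: int) -> float:
    """Stand-in behavioral signal: unusually high posting rates score higher."""
    return min(1.0, posts_per_hour / 50)

def triage(item: ContentItem, posts_per_hour: int) -> str:
    """Blend automated and behavioral signals; route borderline cases to humans."""
    score = 0.6 * automated_score(item) + 0.4 * behavioral_score(posts_per_hour)
    # Public or monetized activity gets a stricter (lower) review threshold.
    review_threshold = 0.3 if (item.is_public or item.is_monetized) else 0.5
    if score >= 0.8:
        return "auto_remove"         # clear violation, removed before wide visibility
    if score >= review_threshold:
        return "human_review_queue"  # borderline: escalate to a human moderator
    return "allow"

# Example: a borderline public comment is queued for human review.
item = ContentItem("c1", "comment", "Great scam-offer here!", True, False)
print(triage(item, posts_per_hour=10))  # -> "human_review_queue"
```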

III. Reporting and Community Flagging

We believe that community members are essential partners in maintaining a healthy environment. We empower all Users – including Creators, Participants, admins, and internal moderators – to report content that violates our Terms of Use or Community Guidelines. Reports can be made via in-app tools or designated contact channels.
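
To show what an in-app report might carry, here is a minimal sketch of a report payload. The field names, reason values, and validation are hypothetical; Locals' actual reporting interface is not documented here.

```python
import json
from datetime import datetime, timezone

# Hypothetical report payload: field names and allowed values are assumptions,
# not a documented Locals API.

ALLOWED_TARGETS = {"profile", "club", "event", "comment", "message", "media"}

def build_report(reporter_id: str, target_id: str, target_type: str,
                 reason: str, details: str = "") -> str:
    """Assemble a report a client could submit via the in-app tools."""
    if target_type not in ALLOWED_TARGETS:
        raise ValueError(f"unknown target type: {target_type}")
    report = {
        "reporter_id": reporter_id,
        "target_id": target_id,
        "target_type": target_type,
        "reason": reason,            # e.g. "harassment", "spam"
        "details": details,
        "submitted_at": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(report)

# Example: a Participant flags a comment as spam.
print(build_report("user-123", "comment-456", "comment", "spam"))
```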

Manual reporting rule