How Moderation Works

Last updated: 2026-04-09

Smara uses a combination of community moderation and platform-level review to keep the app safe and respectful for all devotees.

Community-Level Moderation

Each Sangam group has its own moderation team:

  • Admins and Moderators can remove posts, mute members, and approve join requests for private Sangams.
  • Community moderators are volunteers appointed by the Sangam admin.
  • They follow Smara's Community Guidelines and may set additional rules specific to their Sangam.

Platform-Level Moderation

Smara's trust and safety team handles issues that go beyond individual Sangam moderation:

  • Reported content — When users report posts, comments, or Sangam groups, the reports are reviewed by the Smara team.
  • Account-level actions — For serious or repeated violations, the Smara team may restrict or suspend user accounts. See Account Restriction or Suspension.
  • Content review — Posts, comments, or other content found to violate the Community Guidelines is removed.

What Happens When You Report

  1. You submit a report via the in-app report feature.
  2. The report is logged and queued for review.
  3. A moderator (community or platform level, depending on severity) reviews the content.
  4. Appropriate action is taken — content removal, user warning, mute, or account restriction.
  5. Your identity is kept confidential throughout the process.

Appeals

If you believe a moderation action against you was incorrect, see Appeal a Moderation Action.
