Analytics & Moderation

Monitor your community’s health, moderate content effectively, and manage your social.plus applications with powerful administrative tools. This section covers three essential areas for application management and community safety.

Key Capabilities

Community Moderation

  • Content Moderation: Review, approve, and manage user-generated content
  • User Management: Handle user behavior, roles, and access controls
  • AI-Powered Tools: Automated content filtering and detection
  • Reporting Systems: Comprehensive flagging and review workflows

Analytics & Insights

  • User Activity: Track engagement, participation, and growth metrics
  • Content Analytics: Monitor post performance, interactions, and trends
  • Moderation Reports: Review flagging patterns and moderator performance
  • Custom Dashboards: Create tailored views for your specific needs

Moderation Activity Reports

Comprehensive audit trail system that tracks all moderation activities across your social.plus applications, providing transparency and accountability for content moderation decisions.

Activity Tracking

Monitor all moderation actions including content flags, deletions, user bans, and community removals

Export & Analysis

Download detailed CSV reports for compliance auditing and moderation effectiveness analysis

What’s Tracked

The moderation activity report captures all key moderation events:
  • Content Actions: Post/comment flagging, deletion, and flag clearing
  • User Management: Global bans, community bans, and user removals
  • Message Moderation: Chat message flags, deletions, and AI detection events
  • Community Events: Member bans, removals, and role changes
  • AI Detection: Automated AI moderation, topic detection, and PII identification

Report Data Structure

Each report entry includes comprehensive metadata in a structured format:
  • network_id: Your network identifier
  • user_id: Public ID of the content creator
  • event: Type of moderation event
  • note: Reason or note for the action
  • content_type: Type of content moderated
  • content_public_id: Unique identifier for the content
  • target_type: Context type where the action occurred
  • target_id: Target identifier
  • is_ai: Whether the action was automated
  • created_at: When the event occurred
  • updated_at: When the record was last updated
  • metadata: Additional context and actor information
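The exported report is a CSV whose columns follow the field list above. A minimal sketch of working with such an export, for example to compare automated and human actions: the sample rows and event names (`post.flagged`, `post.deleted`) below are illustrative placeholders, not real export values.

```python
import csv
import io
from collections import Counter

# Illustrative sample export using the documented column names.
# The row values and event names are placeholders, not real data.
sample_csv = """\
network_id,user_id,event,note,content_type,content_public_id,target_type,target_id,is_ai,created_at,updated_at,metadata
net-1,user-42,post.flagged,spam,post,p-100,community,c-7,true,2024-01-01T00:00:00Z,2024-01-01T00:00:00Z,{}
net-1,user-43,post.deleted,off-topic,post,p-101,community,c-7,false,2024-01-02T00:00:00Z,2024-01-02T00:00:00Z,{}
"""

def summarize(report: str) -> Counter:
    """Count moderation events, keyed by (event, ai-or-human actor)."""
    counts = Counter()
    for row in csv.DictReader(io.StringIO(report)):
        actor = "ai" if row["is_ai"] == "true" else "human"
        counts[(row["event"], actor)] += 1
    return counts

print(summarize(sample_csv))
```

The `is_ai` flag makes the AI-vs-human split a one-line grouping key, which is the basis for the comparison use case described later in this page.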

Enabling Reports

Feature Activation Required: Moderation activity reports must be enabled before data collection begins. Historical data before activation is not available.
To enable moderation activity reports for your network, call the Update Moderation Network Settings API with:
{
  "isReportEnabled": true
}
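A minimal sketch of sending that settings payload from a server, using Python's standard library. The base URL, endpoint path, HTTP method, and bearer-token auth shown here are assumptions for illustration; consult the social.plus API reference for the exact endpoint and authentication for your region.

```python
import json
import urllib.request

API_BASE = "https://api.social.plus"   # assumed placeholder base URL
ADMIN_TOKEN = "your-admin-api-token"   # assumed placeholder credential

def build_enable_report_request(base_url: str, token: str) -> urllib.request.Request:
    """Build a request that enables moderation activity reports."""
    body = json.dumps({"isReportEnabled": True}).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/moderation/settings",  # assumed endpoint path
        data=body,
        method="PUT",                           # assumed HTTP method
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
    )

req = build_enable_report_request(API_BASE, ADMIN_TOKEN)
# urllib.request.urlopen(req)  # uncomment to actually send the request
```

Remember that data collection starts only after activation; events from before the flag was enabled are not backfilled.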

Use Cases

Compliance Auditing

Generate audit trails for regulatory compliance and internal policy reviews

Moderator Performance

Analyze moderator activity patterns and decision-making effectiveness

AI vs Human Actions

Compare automated moderation performance against human moderator decisions

Community Health

Track moderation trends to identify community health patterns and issues

Application Management

  • Multi-Application Support: Manage up to 10 applications per organization
  • Regional Configuration: Set up applications in different geographic regions
  • Billing & Subscriptions: Monitor usage, MAUs, and subscription plans
  • Team Management: Collaborate with team members and assign roles

Getting Started

1. Access Your Console

Navigate to your Admin Console through the Admin Portal to begin managing your applications.

2. Configure Moderation

Set up content moderation rules, user roles, and community guidelines for your applications.

3. Monitor Analytics

Review user engagement, content performance, and community health metrics.

4. Integrate APIs

Implement server-to-server APIs for advanced administrative automation.

Common Workflows

Integration with SDK

The analytics and moderation tools work seamlessly with your social.plus SDK implementation:
  • Console Settings automatically apply to SDK behavior
  • API Actions can be triggered by SDK events and webhooks
  • Moderation Rules configured in console affect SDK content validation
  • Analytics Data reflects all SDK-driven user activities

Best Practices

Privacy & Compliance: Always ensure your moderation and analytics practices comply with relevant privacy laws and platform policies. Maintain transparency with users about data collection and moderation actions.
  • Clear Policies: Establish and communicate clear community guidelines
  • Balanced Automation: Combine AI tools with human judgment for best results
  • Regular Review: Continuously evaluate and improve your moderation processes
  • Data-Driven Decisions: Use analytics to inform feature and policy changes
  • User Feedback: Maintain channels for user feedback on moderation decisions

Next Steps

Choose your area of focus to dive deeper into social.plus administration: