
social.plus Console
Administrative dashboard for monitoring and moderating your social.plus applications
APIs & Services
Server-to-server APIs for programmatic administration and moderation
social.plus Portal
Application lifecycle management, billing, and organizational settings
Key Capabilities
Community Moderation
- Content Moderation: Review, approve, and manage user-generated content
- User Management: Handle user behavior, roles, and access controls
- AI-Powered Tools: Automated content filtering and detection
- Reporting Systems: Comprehensive flagging and review workflows
Analytics & Insights
- User Activity: Track engagement, participation, and growth metrics
- Content Analytics: Monitor post performance, interactions, and trends
- Moderation Reports: Review flagging patterns and moderator performance
- Custom Dashboards: Create tailored views for your specific needs
Moderation Activity Reports
Comprehensive audit trail system that tracks all moderation activities across your social.plus applications, providing transparency and accountability for content moderation decisions.
Activity Tracking
Monitor all moderation actions including content flags, deletions, user bans, and community removals
Export & Analysis
Download detailed CSV reports for compliance auditing and moderation effectiveness analysis
What’s Tracked
The moderation activity report captures all key moderation events:
- Content Actions: Post/comment flagging, deletion, and flag clearing
- User Management: Global bans, community bans, and user removals
- Message Moderation: Chat message flags, deletions, and AI detection events
- Community Events: Member bans, removals, and role changes
- AI Detection: Automated AI moderation, topic detection, and PII identification
Report Data Structure
Each report entry includes comprehensive metadata in a structured format:

| Field | Description |
| --- | --- |
| network_id | Your network identifier |
| user_id | Public ID of the content creator |
| event | Type of moderation event |
| note | Reason or note for the action |
| content_type | Type of content moderated |
| content_public_id | Unique identifier for the content |
| target_type | Context type where the action occurred |
| target_id | Target identifier |
| is_ai | Whether the action was automated |
| created_at | When the event occurred |
| updated_at | When the record was last updated |
| metadata | Additional context and actor information |
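Downloaded reports can be processed with standard CSV tooling. A minimal sketch of parsing entries, assuming the export uses the column names from the table above (the exact file layout may differ from your download, so adjust the sample accordingly):

```python
import csv
import io

# Sample rows using the column names from the table above; the exact
# export format is an assumption, not a confirmed contract.
SAMPLE = """network_id,user_id,event,note,content_type,content_public_id,target_type,target_id,is_ai,created_at,updated_at,metadata
net-1,user-42,post.flagged,spam,post,post-123,community,comm-9,false,2024-05-01T10:00:00Z,2024-05-01T10:00:00Z,{}
net-1,user-42,post.deleted,ai-moderation,post,post-123,community,comm-9,true,2024-05-01T10:05:00Z,2024-05-01T10:05:00Z,{}
"""

def parse_report(raw: str) -> list[dict]:
    """Parse a moderation activity CSV into dicts with a boolean is_ai field."""
    rows = []
    for row in csv.DictReader(io.StringIO(raw)):
        row["is_ai"] = row["is_ai"].strip().lower() == "true"
        rows.append(row)
    return rows

entries = parse_report(SAMPLE)
print(entries[0]["event"], entries[1]["is_ai"])  # post.flagged True
```

Normalizing is_ai at parse time makes later filtering (for example, separating automated from human actions) a simple boolean check.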
Key Fields
Event Types
- Posts: post.flagged, post.deleted, post.flagsCleared
- Comments: comment.flagged, comment.deleted, comment.flagsCleared
- Messages: v5.message.flagged, v5.message.deleted, v5.message.flagCleared
- Communities: community.userBanned, community.userRemoved
- Users: user.didGlobalBan, user.flagCleared
Content Information
- content_type: Type of content (post, comment, message, community, user)
- content_public_id: Unique identifier for the content
- target_type & target_id: Context where the action occurred
Actor & Timing
- user_id: Public ID of the content creator
- is_ai: Whether the action was performed by AI automation
- created_at & updated_at: Timestamps for the audit trail
- metadata: Additional context including actor information
Enabling Reports
Feature Activation Required: Moderation activity reports must be enabled before data collection begins. Historical data before activation is not available.
Use the Update Moderation Network Settings API to enable reporting:
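A hedged sketch of building that request from a server-side script. The base URL, path, and isModerationActivityReportEnabled field below are illustrative assumptions, not a confirmed contract; consult the Update Moderation Network Settings API reference for the actual endpoint and payload:

```python
import json
from urllib import request

API_BASE = "https://api.social.plus"  # assumption: substitute your region's endpoint
ADMIN_TOKEN = "YOUR_ADMIN_TOKEN"      # server-to-server admin token

def build_enable_report_request(enabled: bool = True) -> request.Request:
    """Build (but do not send) the settings update; field and path are illustrative."""
    payload = {"isModerationActivityReportEnabled": enabled}
    return request.Request(
        url=f"{API_BASE}/api/v3/moderation/settings",  # hypothetical path
        data=json.dumps(payload).encode(),
        method="PUT",
        headers={
            "Authorization": f"Bearer {ADMIN_TOKEN}",
            "Content-Type": "application/json",
        },
    )

req = build_enable_report_request()
print(req.method, req.get_full_url())
```

Sending the request is a single `request.urlopen(req)` call once the real endpoint and token are in place.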
Use Cases
Compliance Auditing
Generate audit trails for regulatory compliance and internal policy reviews
Moderator Performance
Analyze moderator activity patterns and decision-making effectiveness
AI vs Human Actions
Compare automated moderation performance against human moderator decisions
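The is_ai field makes this comparison a one-line aggregation over report entries. A minimal sketch, assuming rows parsed from the CSV export with is_ai already converted to a boolean:

```python
from collections import Counter

# Each entry mirrors a row of the moderation activity report;
# only the fields used here are shown.
entries = [
    {"event": "post.deleted", "is_ai": True},
    {"event": "post.flagged", "is_ai": False},
    {"event": "comment.deleted", "is_ai": True},
    {"event": "user.didGlobalBan", "is_ai": False},
]

def ai_vs_human(rows):
    """Count moderation actions by actor type using the is_ai flag."""
    return dict(Counter("ai" if r["is_ai"] else "human" for r in rows))

print(ai_vs_human(entries))  # {'ai': 2, 'human': 2}
```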
Community Health
Track moderation trends to identify community health patterns and issues
Application Management
- Multi-Application Support: Manage up to 10 applications per organization
- Regional Configuration: Set up applications in different geographic regions
- Billing & Subscriptions: Monitor usage, MAUs, and subscription plans
- Team Management: Collaborate with team members and assign roles
Getting Started
1
Access Your Console
Navigate to your Admin Console through the Admin Portal to begin managing your applications.
2
Configure Moderation
Set up content moderation rules, user roles, and community guidelines for your applications.
3
Monitor Analytics
Review user engagement, content performance, and community health metrics.
4
Integrate APIs
Implement server-to-server APIs for advanced administrative automation.
Common Workflows
Content Moderation Workflow
- Configure Rules: Set up automated content filtering and AI moderation
- Review Queue: Process flagged content through moderation dashboard
- Take Actions: Approve, remove, or escalate content based on guidelines
- Monitor Results: Track moderation effectiveness and adjust policies
User Management Workflow
- Assign Roles: Set up moderators and assign appropriate permissions
- Monitor Behavior: Track user activity and identify problematic patterns
- Enforce Policies: Apply warnings, temporary restrictions, or bans
- Handle Appeals: Process user appeals and review moderation decisions
Moderation Activity Report Workflow
- Enable Reporting: Activate moderation activity reports via API or support request
- Monitor Activities: Track all moderation actions in real-time as they occur
- Generate Reports: Download CSV reports for analysis and compliance auditing
- Analyze Trends: Review patterns in moderation actions and moderator performance
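The trend-analysis step above can be sketched by bucketing report entries per calendar day. This assumes created_at is an ISO-8601 timestamp as shown in the data-structure table; a spike in daily counts is often the first signal of a community health issue:

```python
from collections import defaultdict

# Rows as exported in the CSV report; created_at is an ISO-8601 timestamp.
rows = [
    {"event": "post.flagged", "created_at": "2024-05-01T10:00:00Z"},
    {"event": "post.deleted", "created_at": "2024-05-01T12:30:00Z"},
    {"event": "comment.flagged", "created_at": "2024-05-02T09:15:00Z"},
]

def actions_per_day(rows):
    """Bucket moderation actions by calendar day for trend review."""
    daily = defaultdict(int)
    for r in rows:
        daily[r["created_at"][:10]] += 1  # ISO dates: first 10 chars are YYYY-MM-DD
    return dict(daily)

print(actions_per_day(rows))  # {'2024-05-01': 2, '2024-05-02': 1}
```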
Analytics Review Workflow
- Daily Monitoring: Check key metrics for user activity and engagement
- Trend Analysis: Identify patterns in content performance and user behavior
- Report Generation: Create regular reports for stakeholders
- Strategy Adjustment: Modify features and policies based on data insights
Integration with SDK
The analytics and moderation tools work seamlessly with your social.plus SDK implementation:
- Console Settings automatically apply to SDK behavior
- API Actions can be triggered by SDK events and webhooks
- Moderation Rules configured in console affect SDK content validation
- Analytics Data reflects all SDK-driven user activities
Best Practices
Privacy & Compliance: Always ensure your moderation and analytics practices comply with relevant privacy laws and platform policies. Maintain transparency with users about data collection and moderation actions.
- Clear Policies: Establish and communicate clear community guidelines
- Balanced Automation: Combine AI tools with human judgment for best results
- Regular Review: Continuously evaluate and improve your moderation processes
- Data-Driven Decisions: Use analytics to inform feature and policy changes
- User Feedback: Maintain channels for user feedback on moderation decisions