Foundation for harassment, illegal content, minor safety, doxxing, and harmful content policy. Also a governance question: is T&S a Creator reserve power, a separate body, or subject to Arbiter review?
Why this matters
Harassment, illegal content, minor safety, doxxing, and harmful content policy all need a framework before users exist. Every UGC platform has learned this the hard way — late T&S design is the most consistently expensive form of platform tech debt, both operationally and reputationally.
It is also a governance question: is T&S a Creator reserve power, a separate body, or subject to Arbiter review? The answer shapes whether T&S decisions are appealable, whether a Franchise Team has any say in network-level T&S, and how T&S coordinates with the cross-Franchise sanction policy (G-021).
Open
- Threat model and content categorization
- Decision authority structure (Creator reserve / dedicated body / Arbiter-reviewable)
- Reporting flows and SLAs
- Regulatory compliance obligations (DSA in the EU, NetzDG-style frameworks elsewhere)
- Minor protection beyond age-gating
- Coordination with vandalism handling (G-017) and cross-Franchise sanction (G-021)
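One way to make the open questions above concrete: content categorization, SLAs, and decision authority could eventually be expressed as a single routing table from report category to deciding authority, response deadline, and appealability. A minimal sketch, assuming entirely hypothetical category names, SLA values, and authority labels (none of these are decided policy):

```python
from dataclasses import dataclass
from datetime import timedelta

@dataclass(frozen=True)
class Route:
    authority: str          # who decides, e.g. "tns_body" vs "creator_reserve" (hypothetical labels)
    review_sla: timedelta   # target time from report to first decision
    appealable: bool        # whether the outcome is Arbiter-reviewable

# Hypothetical routing table; real categories and SLAs are open questions.
ROUTES = {
    "minor_safety":    Route("tns_body", timedelta(hours=1),  appealable=False),
    "illegal_content": Route("tns_body", timedelta(hours=24), appealable=False),
    "doxxing":         Route("tns_body", timedelta(hours=24), appealable=True),
    "harassment":      Route("tns_body", timedelta(hours=72), appealable=True),
}

def route_report(category: str) -> Route:
    # Unknown categories fall back to the slowest, appealable queue.
    return ROUTES.get(category, Route("tns_body", timedelta(days=7), appealable=True))
```

Encoding the table this way would force the governance answer per category: anything routed to `creator_reserve` or marked non-appealable is, by construction, outside Arbiter review.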