Public means playable
Public vibes should stay publicly accessible.
We do not treat public playback as an account-locked surface. Moderation, safety, and capability checks must preserve anonymous public viewing for genuinely public work.
Content Policy
Because people can actually run what gets posted here, moderation has to protect them without turning public creation into a locked-down, joyless experience.
The platform allows experimentation, remixing, and weird web publishing, but it does not allow code, media, or behavior that turns public playback into harassment, malware, exploitation, or abuse.
Runnable code raises the stakes. The policy has to defend viewers and creators while still leaving enough oxygen for strange, playful, unfinished work.
Public means playable
Anonymous viewers can open and run genuinely public work. Moderation, safety, and capability checks must not quietly gate that access behind an account.
Safe by default
User code runs behind sandbox, network, and capability controls so one creator's experiment does not silently become another person's risk.
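A minimal sketch of what "safe by default" can mean in code, assuming hypothetical names (`Capability`, `Sandbox`): every vibe starts with no grants, and anything the creator was not explicitly given is refused.

```python
from enum import Enum, auto

class Capability(Enum):
    NETWORK = auto()
    STORAGE = auto()
    CLIPBOARD = auto()

class Sandbox:
    """Deny-by-default capability gate for a running vibe (hypothetical API)."""

    def __init__(self, granted=frozenset()):
        self.granted = frozenset(granted)

    def request(self, cap: Capability) -> bool:
        # Anything not explicitly granted is refused, so one creator's
        # experiment cannot quietly reach beyond its own box.
        return cap in self.granted

sandbox = Sandbox(granted={Capability.CLIPBOARD})
sandbox.request(Capability.NETWORK)    # → False: no network unless granted
sandbox.request(Capability.CLIPBOARD)  # → True: explicitly allowed
```

The deny-by-default shape matters more than the specific capability list: new capabilities stay closed until someone decides they are safe to open.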
Social by design
Remixing, feedback, and public discovery are core product paths. Policy should target abuse and deception, not ordinary curiosity or experimentation.
Durable enforcement
When we hide, quarantine, or remove something, the outcome should make sense across the feed, player, profile, conversations, and embeds.
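One way to keep outcomes consistent across surfaces is to store a single moderation state per vibe and resolve visibility from it everywhere. A sketch, with hypothetical names and an assumed quarantine rule (visible only on the creator's own profile):

```python
from enum import Enum

class ModerationState(Enum):
    PUBLIC = "public"
    QUARANTINED = "quarantined"
    REMOVED = "removed"

SURFACES = ("feed", "player", "profile", "conversations", "embeds")

def visible_on(state: ModerationState, surface: str) -> bool:
    """One state, one answer on every surface (illustrative model)."""
    if state is ModerationState.REMOVED:
        return False  # gone everywhere
    if state is ModerationState.QUARANTINED:
        # Assumed rule: out of public circulation, but the creator
        # can still see it on their own profile.
        return surface == "profile"
    return True  # PUBLIC: visible everywhere

# A quarantined vibe disappears from public surfaces consistently:
[s for s in SURFACES if visible_on(ModerationState.QUARANTINED, s)]  # → ["profile"]
```

Because every surface calls the same resolver, a quarantine decision cannot hide a vibe from the feed while leaving it live in embeds.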
Clear enough to predict, firm enough to matter, and specific enough that ordinary curiosity does not get treated like abuse.
Not allowed
Malicious code
Creators cannot publish code or supporting assets intended to exfiltrate data, exploit viewers, evade platform controls, or meaningfully increase another user's exposure to harm.
Harassment and hate
Runnable projects, comments, metadata, and remixes cannot be used to target people with harassment, hate speech, intimidation, or coordinated abuse.
Deception and spam
We remove repetitive spam, fraudulent claims, impersonation, deceptive monetization funnels, and experiences designed mainly to manipulate engagement.
Escalation model
Depending on severity and intent, we may block publication, quarantine content from public surfaces, remove it entirely, or suspend the account that published it.
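The escalation model can be read as a mapping from severity and intent to an outcome. A sketch with illustrative thresholds only; the real decision involves human judgment, not a lookup table:

```python
def enforcement_action(severity: str, intentional: bool) -> str:
    """Map severity and intent to one of the four outcomes above
    (illustrative thresholds, not the actual policy matrix)."""
    if severity == "critical":
        return "suspend_account" if intentional else "remove"
    if severity == "high":
        return "remove" if intentional else "quarantine"
    if severity == "medium":
        return "quarantine"
    return "block_publication"  # low severity: caught before it circulates

enforcement_action("high", intentional=True)      # → "remove"
enforcement_action("critical", intentional=True)  # → "suspend_account"
```

The point of the sketch is the shape: the same violation lands differently depending on whether it looks like an accident or a campaign.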
Community rules, terms, privacy, and security each cover a different part of the same promise: public creation without public harm.
Short answers for the places where marketing copy should stop hand-waving and say the plain thing.
Does a viewer need an account to play public vibes?
No. Public means publicly playable unless a specific safety or policy control says otherwise. Account state should not silently turn public vibes into private ones.
Is there anything between fully public and deleted?
Yes. Quarantine is one of the moderation tools available when content should stop circulating publicly but still needs a narrower visibility outcome than permanent deletion.
Is runnable code held to a different standard than other content?
The format changes the enforcement details, but not the policy standard. Runnable code is evaluated as content plus behavior, so both social harm and technical abuse matter.
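"Content plus behavior" can be sketched as two independent flag sources feeding one decision: what static review finds in the text, metadata, and assets, and what the runtime observes when the code actually executes. Field names here are hypothetical.

```python
def evaluate(vibe: dict) -> str:
    """Combine static content review with observed runtime behavior
    (sketch; field names are hypothetical)."""
    flags = []
    flags += vibe.get("content_flags", [])   # e.g. slurs in metadata
    flags += vibe.get("behavior_flags", [])  # e.g. exfiltration attempt at runtime
    return "review" if flags else "ok"

# Clean text but suspicious runtime behavior still gets flagged:
evaluate({"content_flags": [], "behavior_flags": ["unexpected_network"]})  # → "review"
```

Either source alone is enough to escalate, which is the practical meaning of holding runnable work to both the social and the technical bar.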
Build for invitation, clarity, and safe interaction. Public work should be runnable, understandable, and respectful of the people who open it.