When I was younger, I had an encounter with a predator online. I was lucky enough to recognize the warning signs and get away quickly, but the experience stayed with me. Not everyone is that fortunate, and I knew it firsthand.
Years later, when I looked at existing safety initiatives in the Roblox community, I saw something troubling: unprofessionalism, infighting, and petty drama. The very people trying to protect kids were blocking each other over nothing, engaging in immature behavior, and running appeal systems that felt more like public humiliation than justice. I understood why Roblox had such a negative impression of community moderation efforts.
When you're trying to solve a real-world problem that affects millions of children, you can't afford that kind of behavior. I knew things had to be done differently, with professionalism, transparency, and a genuine focus on protecting users rather than building clout or settling scores.
So in late 2024, I started building Rotector. Not to replace Roblox's moderation, but to bridge the gap between when predators become active and when they finally get banned. That gap is where victims are created, and it's exactly what we're here to close.