We know that by creating a platform that can reach across the globe, we have a responsibility to keep our users and our communities safe. Our community of users and the Frenlo team are the most important part of what we're building.
We try to err on the side of community safety; that includes our own team as well as our users and their communities. We are committed to fighting inequity and to building a platform that does not further the inequity we see in the world.
One of the ways we keep our community safe is through content moderation. We know what the stakes are if we get this wrong, we know you have placed your trust in us to get it right, and we value that trust highly. We are continuously reviewing our policies and procedures and will regularly publish our learnings and methodology so that, hopefully, we can make places outside of Frenlo safer too.
The following are the principles we follow:
We rely on user reporting to find and remove content that violates our Terms of Service and jeopardizes community safety. We may use the learnings and information gathered from this process to build machine learning algorithms that can help us do this more effectively, accurately, and at a broad scale, if we are able to do so without inadvertently causing more harm.
A primary function of community safety is content moderation. Community moderation and transformative justice will also be cornerstones of our community safety team. Right now, there are only 3 of us on the Frenlo team. We're educating ourselves, learning from experts, and creating a strategy to hire people who are experienced in community building, safety, and moderation.
As we grow into new regions and communities, we will expand the community safety team to include members of those communities. Our team needs to reflect the diversity we want to see in the Frenlo community, and we have to understand that the folks who are most often targeted online are also the people we must represent and protect. This work is difficult for anyone who must sift through some of the worst content online, and the emotional labor required for community moderation is taxing. A focus on mental health and trauma care is critical to protecting our community safety team, and we will regularly review the way we ask our team to work so that we can prioritize their safety along with yours.
Community Safety is a big investment, and we know how critical it is. Part of your subscription goes towards making sure we can provide the right support for our content moderators and towards expanding the team to better represent the communities of our users.
If you have any questions or suggestions about this community safety plan, please reach out to us at [email protected]