Fediverse Trust and Safety: The Founding and Future of IFTAS

In November of 2022, as millions of people sought alternatives to Twitter, the Fediverse experienced tremendous growth – and with that growth came an increasing number of people asking: how do we scale and support volunteer trust and safety? IFTAS was born out of those conversations, beginning with a working group to identify the issues and propose possible solutions, followed by a community survey to gauge the interest and needs of service administrators and moderators, and finally securing funding to kick-start the work.

As I reflect on a year of activity and plan for what’s next, I’d like to offer my vision for IFTAS, talk about what the broader impact of a federated social web means, examine a few fundamental challenges, and offer ways we can all help safeguard and sustain it.

Over the past year I’ve personally counselled moderators on the front lines of conflict resolution, bigotry, and exposure to traumatic content, and I’ve seen time and again the abuse levelled at people working for free in the under-appreciated role of moderating human interactions online.

On the other hand, I’ve also had the unmitigated joy of working with a growing number of amazing people who are incredibly energised and devoted to making our online world work better – people who are actively reducing harm, increasing safety, developing new technologies for new problems, and authoring guidance for moderators – and of watching them come together for the greater good.

At IFTAS we’ve spent the past year building a community of roughly 200 moderators and administrators who are actively sharing best practices and lessons learned. We’ve also built moderation tooling and services, worked with industry and subject matter experts to provide guidance and educational materials, and hopefully we’ve helped reduce harm for our moderators and their communities. 

We’ve contracted or provided stipends to two dozen Fediverse supporters and moderators, and we’ve provided personal safety support for 20 at-risk moderators. As we head into year two for IFTAS I am hopeful we and others can continue to advance and elevate the conversation around user safety and moderation support.

Fundamental Challenges

Trust and safety at scale in volunteer-operated services is visibly underfunded and understaffed. Corporate social media services have the benefit of lawyers, money, and subject matter experts in the myriad harms perpetrated on the internet, from spam to disinformation to hate to copyright issues; the list is long.

A small Fediverse instance operated by a lone admin/moderator cannot possibly be expected to acquire all the knowledge and experience required to mitigate all these issues. Even the largest teams still don’t have the time or resources necessary to manage every conceivable issue.

Here’s just one case in point. Did you know that in the United Kingdom it is illegal to offer to sell or loan a hard, non-flexible plate with three or more sharp radiating points that is designed to be thrown? What should a moderator do if a user offers such an item for sale or loan? How liable is the instance administrator? Are they even subject to UK jurisdiction? If so, what other items are prohibited in the UK? How about France? Tunisia? India?

IFTAS has a long list of action items that our community has asked for, but I believe a few fundamental challenges are top priorities that offer some easy wins to move the conversation forward.

While IFTAS can help, and is working on each of these, the broader community can make a difference on all of them too, and I will offer some ways we can gather our collective strengths to address these challenges.

In no particular order, let’s explore:

  • Federation Management – federation is on by default and highly permissive. New servers immediately connect with thousands of peers, irrespective of their authenticity and suitability, creating easy vectors for abuse and harassment. How can we reduce the opportunity for harm?
  • Common Vocabulary – for interconnected moderators and administrators to work together, we need to agree on labels and definitions. What work has been done to create standard terminology, and how might we introduce this to the dozens of platforms and apps available?
  • Regulatory Compliance – regulations like the Digital Services Act, age verification requirements, the UK’s Online Safety Act and others are impacting social media providers, and Fediverse instances are not immune. Legal support and protection from illicit content are our moderators’ highest concerns. What can we do to protect ourselves from liability?
  • Shared Signals – we federate content, but we don’t federate information about that content. The current state of the art does not help administrators and moderators communicate with each other. Let’s fix that.

Domain Federation Management

The Fediverse is an interconnected network of services federating content with each other, running on open source software, communicating using open protocols. As I write this, we are approaching 30,000 servers visible on the network, up from just under 8,000 in October of 2022 (FediDB).

A chart showing the number of Fediverse servers since 2022, starting at 8,000 and climbing to 30,000

30,000 servers means at least 30,000 content moderators, with roughly 30,000 definitions of what constitutes inappropriate content. And 30,000 servers means connecting with 29,999 other servers that may or may not be well moderated.

Those 30,000 servers host 14 million accounts. Federating with those servers means giving 14 million people the right to create files on your publicly-accessible hard drive. This is the de facto standard; most instances allow connection unless the administrator blocks a specific domain. It is a reactive proposition, and the more servers that join the network, the more things there are to react to. Moderating one server means moderating the entire network.
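
To make that reactive model concrete, here is a minimal sketch of the unit of work it implies, assuming Mastodon 4.x’s admin API and an access token carrying the admin:write:domain_blocks scope; the instance URL and token below are placeholders:

```python
# A minimal sketch: one reactive domain block via Mastodon's admin API.
import requests

INSTANCE = "https://example.social"  # placeholder: your instance
TOKEN = "YOUR_ADMIN_ACCESS_TOKEN"    # placeholder: needs admin:write:domain_blocks

def block_domain(domain: str, severity: str = "suspend", comment: str = "") -> None:
    """Add one domain block: the step an admin repeats for every new
    problem server they happen to learn about."""
    resp = requests.post(
        f"{INSTANCE}/api/v1/admin/domain_blocks",
        headers={"Authorization": f"Bearer {TOKEN}"},
        data={
            "domain": domain,
            "severity": severity,       # "silence", "suspend", or "noop"
            "public_comment": comment,  # shown on the instance's about page
        },
        timeout=30,
    )
    resp.raise_for_status()

block_domain("badserver.example", comment="Brigading and harassment")
```

Nothing in the protocol tells a new administrator which domains the rest of the network has already learned to avoid; every block like this is an after-the-fact, manual decision.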

The web is used by billions of people, and not all of them have your best interests at heart. The low barrier to entry – download some free software and install it on a cheap web host with a free domain name – means that some servers operate with the explicit intent to harbour hateful or illegal content, to disrupt the network, or simply to serve as a home for troll behaviour.

Consider this statement from one server operator:

REDACTED is an explicit free-speech instance established for, and maintained by, free-speech absolutists. That means profane language, racial pejoratives, NSFW images & videos, insensitivity and contempt toward differences in sexual orientation and gender identification, and so-called “cyberbullying” are all commonplace on this instance and the onus is on the individual user to know in advance what they are signing up for. Free speech, here, as anywhere else, means you can express any opinion you wish. We do not block instances, and I do not consider Lolicon to be CP. I pay for REDACTED out of pocket, and while I am warmed by the spirit of generosity, I refuse to accept donations for the upkeep of this instance. For me, it is enough that people have a place, somewhere remaining on the internet, to say exactly what they want without the fear of censorship.

Many well-meaning community leaders have started services only to be drowned in a torrent of hate and abuse from users of servers such as the one above, leading many to walk away, scarred, likely never to return. Those who stay to work through the problem are incredulous that the alternative to corporate social media boils down to “connect to anyone and everyone, good luck with that”. I believe this to be unacceptable, unscalable, and easily mitigated. I’d like to see far fewer messages like the one below.

Over the past few hours, my timeline was descended upon by very ugly users who have been spewing profanity and vulgar language. They obviously have a network and descended en mass. I just blocked 5 domains and didn't lose a single follower. Later I'll try to figure out how to remove their posts. Hopefully I got them all. Adding: Blocked three more domains. Didn't lose a single follower. (One of my posts was attracting them. It wasn't important so I deleted it. That way you won't see the ugliness)

One server administrator at the beginning of a brigading onslaught

Domain Federation: How IFTAS Is Helping

  1. We maintain a “Do Not Interact” list, a list of domains we believe expose service providers to harm by hosting and federating illegal content.
  2. We monitor the domain blocks put in place by a large proportion of servers, and we hand-review the domains that are blocked by at least 51% of those servers. We then create a list of domains we recommend for action, using our CARIAD policy specification (the consensus idea is sketched after this list).
  3. We built and operate FediCheck, a Mastodon-facing web service that can synchronise your server with our lists, saving admins from having to learn about each new problem server, research it, risk exposure to its content, and manually add the domain to a denylist.
  4. For those who have been exposed to traumatic content, we publish wellbeing and post-trauma resources in our Moderator Library.
  5. We are working with managed hosting providers to explore how they can offer safety tools to their customers.
A screenshot of the FediCheck web service, showing domains that can be blocked automatically
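
The consensus review described in item 2 can be sketched in a few lines. This is an illustration of the idea only, not IFTAS’s actual CARIAD implementation: the peer list is made up, and the data source is Mastodon’s public GET /api/v1/instance/domain_blocks endpoint, which returns results only where an admin has chosen to publish their blocks.

```python
# Illustrative only: poll peers' published domain blocks and surface any
# domain blocked by at least 51% of them for hand review.
from collections import Counter
import requests

PEERS = ["mastodon.example", "social.example", "fedi.example"]  # made-up peers
THRESHOLD = 0.51

def fetch_blocks(server: str) -> set[str]:
    resp = requests.get(f"https://{server}/api/v1/instance/domain_blocks", timeout=30)
    resp.raise_for_status()
    # Note: some servers publish obfuscated domains (e.g. "examp***");
    # a real implementation would handle those via the digest field.
    return {entry["domain"] for entry in resp.json()}

votes = Counter()
for peer in PEERS:
    votes.update(fetch_blocks(peer))

candidates = sorted(d for d, n in votes.items() if n / len(PEERS) >= THRESHOLD)
print("Domains to hand-review:", candidates)
```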

Separately, I advocate for changes to the ecosystem:

  1. I believe all federating software should explicitly inform administrators during install that the installed service will federate with a range of servers, some of which host illegal or undesirable content.
  2. I believe all federating software should offer the option to start with the denylist currently in place by the software maintainer. For example, Mastodon installers should be able to start their service with the current mastodon.social denylist. This allows software maintainers to offer a list that reflects their own foundational principles.
  3. I believe all federating software should have a federation allowlist option, whereby the administrator can opt to only federate with servers they allow. A large number of small communities, family servers, schools and more can benefit from curating the domains they connect with.
  4. I believe all federating software that allows importation of domain blocks should allow importation of retractions.
  5. I believe all federating software maintainers have a duty of care to the people who download and install their software to educate them about resources they will likely need. One-click installers or managed host providers can be used by non-technical community managers to create an online presence in minutes, unaware of the implications and ramifications of joining a global network of federating content.
  6. I believe that a messaging protocol that federates content should also federate metadata about that content. I can federate disinformation, but I can’t federate a trusted flag that a server is operated by a persistent threat actor. I can federate an illegal media object, but I can’t federate a signal that a server is knowingly and willingly hosting illegal content. I can federate a thousand spam messages, but I can’t federate my finding that a given server is being used to send spam. A sketch of what such a signal might look like follows this list.
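
What might such a signal look like on the wire? ActivityStreams already defines a Flag activity for reporting objects; nothing standardised exists today for the server-level advisories described in item 6, so the extension fields in this sketch are hypothetical:

```python
# Purely illustrative: a server-level moderation advisory built on the
# real ActivityStreams Flag type. The "iftas:" fields are invented for
# this sketch and are not part of any current specification.
import json
from datetime import datetime, timezone

advisory = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Flag",                            # real ActivityStreams activity type
    "actor": "https://trusted.example/actor",  # the server issuing the signal
    "object": "https://badserver.example",     # the server being flagged
    "content": "Persistent source of spam",    # human-readable finding
    "published": datetime.now(timezone.utc).isoformat(),
    "iftas:label": "spam",                     # hypothetical shared-vocabulary label
    "iftas:confidence": 0.95,                  # hypothetical reviewer confidence
}
print(json.dumps(advisory, indent=2))
```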

Domain Federation: What We’re Doing Next

  1. IFTAS will continue to support work on shared denylists and labelling services, and we’ll explore open-sourcing FediCheck so anyone can run the service and use any upstream provider to help manage their domain federation. My hope is that rings of trust will form, allowing Servers A, B and C to automate the mutual sharing of their denylists, or Servers A and B to choose to emulate Server C, or all three to decide that some other list or service is the best one for them and plug that into FediCheck.
  2. FediCheck will be extended to accommodate email domain and toxic IP blocking, using highly trusted sources to further protect administrators, hosts and moderators from interacting with known sources of harm (see the sketch after this list).
  3. Over the coming year, I hope to extend CARIAD into an API that can be consumed by other software products, making it a robust source of signal intelligence.
  4. I will continue to advocate for safety features that inform and support community managers and service providers so we can continue building a decentralised social web that offers options for everyone to build the community that’s right for them.
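
For item 2, the building blocks already exist on the receiving side: Mastodon’s admin API exposes endpoints for both block types. A hedged sketch, with the same placeholder instance and token as before (the token would need the admin:write:email_domain_blocks and admin:write:ip_blocks scopes):

```python
# Sketch: the two Mastodon 4.x admin endpoints FediCheck could drive.
import requests

INSTANCE = "https://example.social"  # placeholder
TOKEN = "YOUR_ADMIN_ACCESS_TOKEN"    # placeholder
AUTH = {"Authorization": f"Bearer {TOKEN}"}

def block_email_domain(domain: str) -> None:
    """Refuse new sign-ups that use a throwaway or abusive email domain."""
    requests.post(f"{INSTANCE}/api/v1/admin/email_domain_blocks",
                  headers=AUTH, data={"domain": domain},
                  timeout=30).raise_for_status()

def block_ip(ip: str, comment: str = "") -> None:
    """Block all access from a known-toxic IP address or CIDR range."""
    requests.post(f"{INSTANCE}/api/v1/admin/ip_blocks", headers=AUTH,
                  data={"ip": ip, "severity": "no_access", "comment": comment},
                  timeout=30).raise_for_status()
```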

Domain Federation: How You Can Help

  1. If you are a software developer creating federation software or a supporting app, use prosocial and evidence-based approaches to your development process. Learn from those who have walked the path, talk with safety experts, review extant moderation tooling, and consider Safety by Design. If you are building networking features that connect users of your software to 14 million people, take the potential harms into account.
  2. If you use federated software, talk to the maintainers of your software of choice. Use their preferred feedback channel, and clearly describe the problem and the proposed solutions. Talk to the administrator of your home instance. Ask them if they monitor inauthentic domains.
  3. If you operate a server, read our Denylist Resources page. If you have additional resources to add, tell us so we can share your knowledge and experience with others. IFTAS Connect members can share resources to add in the Library discussion forum, or send me a note at @jaz
  4. If you can afford to support us, we need to pay people for their time reviewing domains, fixing the bugs and maintaining FediCheck, providing technical support, and creating the next generation of support tools. You can find ways to make a charitable donation at the bottom of this article.

Common Vocabulary

Federating platforms are proliferating, and apps and clients to connect to them are being developed by energetic teams with limited resources, each one needing to reinvent moderation tooling. Reporting workflows, labels, definitions… each platform and each app needs to consider what words to use, how to present them to users, what options to offer content moderators and service administrators.

I believe we can leverage the work performed by the larger trust and safety ecosystem to find a common vocabulary – creating an easy path for developers to ensure they have considered not only what harms their software may expose their users to, but how to robustly codify and act on those harms. 

Shared Vocabulary: How IFTAS is Helping

  1. IFTAS has adopted the Digital Trust & Safety Partnership Glossary of Terms and uses those terms as the core labels in everything we do. A great entry point into this vocabulary is our Shared Vocabulary page, where we have created a page for every label with links to guidance and resources.
  2. Borrowing from the misinformation research landscape, we use the Actor, Behaviour, Content taxonomy to classify each of our 34 labels (illustrated after this list).
  3. We worked with DTSP to republish the Glossary as a Creative Commons document, meaning anyone is free to use it in any form. The Glossary is the backbone of our moderator community (IFTAS Connect) and is how we classify content and collaboration in the library and the discussion forums.
  4. Our domain denylists use these labels to classify domains, and we use the same labels on our information sharing advisories.
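
To show how the labels and the taxonomy fit together in practice, here is an illustrative sketch; the label names are examples of the kinds of categories the Glossary covers, not the actual 34-label IFTAS list:

```python
# Illustrative mapping of shared-vocabulary labels onto the Actor,
# Behaviour, Content taxonomy. Label names here are examples only.
from enum import Enum

class Taxonomy(Enum):
    ACTOR = "actor"          # who is causing harm (e.g. inauthentic accounts)
    BEHAVIOUR = "behaviour"  # how harm is perpetrated (e.g. harassment)
    CONTENT = "content"      # what material is harmful (e.g. illegal media)

LABELS: dict[str, Taxonomy] = {
    "impersonation": Taxonomy.ACTOR,
    "harassment": Taxonomy.BEHAVIOUR,
    "spam": Taxonomy.BEHAVIOUR,
    "hate_speech": Taxonomy.CONTENT,
    "csam": Taxonomy.CONTENT,
}

def classify(label: str) -> Taxonomy:
    """Look up where a label sits in the ABC taxonomy."""
    return LABELS[label]
```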

Shared Vocabulary: What We’re Doing Next

  1. Now that we’ve obtained the Creative Commons version of the document, we will begin translating the Glossary into as many languages as we can, and we will maintain those translations if and when the Glossary is updated.
  2. We will begin labelling content on Bluesky using our shared label list (see the sketch below), and we will work with other developers to consider adopting a standard list.
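
For a sense of what labelling on Bluesky involves, here is a minimal sketch of a label record following the shape of AT Protocol’s com.atproto.label.defs#label. The DID and subject URI are placeholders, and a production labeler would sign and distribute labels through a labeler service rather than building records by hand:

```python
# Sketch of an AT Protocol moderation label; values are placeholders.
from datetime import datetime, timezone

label = {
    "src": "did:plc:exampleiftaslabeler",                  # labeler's DID (placeholder)
    "uri": "at://did:plc:someuser/app.bsky.feed.post/3k",  # labelled record (placeholder)
    "val": "spam",                                         # label value from a shared list
    "cts": datetime.now(timezone.utc).isoformat(),         # creation timestamp
}
print(label)
```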

Shared Vocabulary: How You Can Help

  1. If you’re a moderator or service administrator, consider rewriting your server rules to reflect the DTSP common labels and definitions. We have example template rules in our Library for each label.
  2. If you’re a developer, consider using the Glossary as a starting point for your reporting workflows; we have some guidance on implementation.
  3. If you’re a subject matter expert in any of the labels in our list, reach out if you know of tools and resources that can help volunteer moderators with their decision making and interventions.

Regulatory Compliance

My concern here is to protect administrators and moderators from liability for their accidental non-compliance, and to offer guidance on basic risk assessment. More and more countries are creating complex requirements for social media providers, and Fediverse servers are liable for the content they carry.

I have seen malicious use of illegal content, both to cause trauma and to enable the takedown of an unsuspecting server. The Fediverse has seen concerted attacks using computer-generated child sexual abuse material, mass DMCA takedown requests, frivolous GDPR reports, and law enforcement hold requests.

The two resources most asked for by moderators are legal support and CSAM detection. We are working on both.

Regulatory Compliance: How IFTAS is Helping

  1. We partnered with industry experts Tremau to co-author the Digital Services Act Guide for Decentralized Services. The DSA is a broad set of rules that apply to any service with users in the European Union.
  2. We have a dedicated area for legal and regulatory resources in our community library.
  3. We have met with child safety organisations and subject matter experts, and acquired licensed access to hash matching databases to identify illegal media.
  4. We have built a working prototype of an opt-in CSAM detection and reporting service for use by Fediverse administrators (a simplified sketch follows below).
Screenshot of the prototype CSAM detection service, demonstrating two pertinent matches of illegal content
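
To show the shape of the pipeline (and only the shape: real deployments match against vetted, licensed hash databases, not a perceptual hash of our choosing), here is a toy sketch using the open-source imagehash library as a stand-in:

```python
# Toy hash-and-match sketch; imagehash's pHash stands in for licensed
# industry hash sets, and the known-hash value below is a placeholder.
from PIL import Image
import imagehash

KNOWN_HASHES = {imagehash.hex_to_hash("d1d1d1d1d1d1d1d1")}  # placeholder entry
MAX_DISTANCE = 4  # Hamming-distance threshold for a "pertinent match"

def check_upload(path: str) -> bool:
    """Return True if an uploaded image is close enough to a known hash
    to be queued for human review and reporting."""
    h = imagehash.phash(Image.open(path))
    return any(h - known <= MAX_DISTANCE for known in KNOWN_HASHES)
```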

Regulatory Compliance: What We’re Doing Next

  1. We are seeking funding to move the CSAM detection and reporting service into production, and to hire additional subject matter experts to evaluate our service and the adjacent resources.
  2. We will extend the service to offer hash-and-match detection for non-consensual intimate images (NCII) and terrorist and violent extremist content (TVEC).
  3. We are exploring opportunities to work with legal counsel to provide basic legal advice to Fediverse administrators.
  4. We will co-author additional guidance similar to our DSA Guide to cover the UK’s Online Safety Act and other national online safety regulations. We are also monitoring age verification requirements and will provide guidance there as well.
  5. We are talking with law enforcement agencies and legal notice portal providers to create guidance for service providers on how to recognise and respond to a valid legal request.
  6. We will create a web service using the Lumen API to enable structured receipt of takedown notices for any Fediverse service provider (a sketch of the receiving side follows below).
Screenshot of the Lumen API widget, allowing end users to submit a takedown request of various types including DMCA, Trademark, and Law Enforcement Request
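
A minimal sketch of the receiving side of such a service: a small Flask app that accepts a structured takedown notice and hands it to a stub where the Lumen integration would live. All field names here are illustrative, not Lumen’s schema:

```python
# Sketch only: structured receipt of takedown notices. The Lumen
# submission itself is left as a stub.
from flask import Flask, request, jsonify

app = Flask(__name__)

NOTICE_TYPES = {"DMCA", "Trademark", "Law Enforcement Request"}

@app.post("/takedown")
def receive_notice():
    notice = request.get_json(force=True)
    if notice.get("type") not in NOTICE_TYPES:
        return jsonify(error="unsupported notice type"), 400
    store_and_forward(notice)  # persist locally, notify moderators
    return jsonify(status="received"), 202

def store_and_forward(notice: dict) -> None:
    # Hypothetical stub: persist the notice and submit it to Lumen via
    # their notice API; consult Lumen's documentation for the real schema.
    ...
```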

Regulatory Compliance: How You Can Help

This is the most expensive piece of our puzzle, and the most requested by our membership and the broader community. The CSAM detection service will cost us over US$160,000 this year; the hosting bill for the platform is $1,500 a month and will increase as we ramp up the service.

We are chasing a grant opportunity but we need matching funds.

If you’d like to join IFTAS as a corporate sponsor, please contact us for information on our sponsorship program. If you’d like to contribute personally, see the bottom of this article.

Separately, if you are a service provider or instance admin and you would like to register your interest in using our service, we need to show support for our work to obtain grant funding. Please fill out this short form to let us know you are interested in using our service once it’s available so we can demonstrate to potential funders that we are building something that will be used.

Shared Signals

I have a deep background in healthcare and cybersecurity. Both of these industries have a rich culture of collaboration and sharing, and I want to help accelerate the adoption of similar approaches in the Fediverse.

The Fediverse has 30,000 moderators. Twitter has fewer than 2,000. Meta has 15,000.

We have twice as many content reviewers as Meta.

I’ll say that again.

We have twice as many content reviewers as Meta.

Now, almost none of our 30,000 are full time, very few are paid, and there is little in the way of central guidance or tooling to help any of them with their work. Some are working on their individual instance and see only a fraction of the network, but what if we could find a way to join forces, share the load, and spread the love?

I believe we can harness the power of the Fediverse for the greater good, reducing workload for all moderators, and demonstrating to the world what it looks like when mission-driven communities come together to create the next generation of social media.

Shared Signals: How IFTAS is Helping

  1. We have created a structured community for admins, moderators, and subject matter experts to convene and collaborate at our web portal IFTAS Connect. We also operate a set of Matrix chat channels to allow for real-time conversation, and we are adding live conferencing options for one-on-ones and community meetups.
  2. We have created the SW-ISAC, an Information Sharing and Analysis Centre (ISAC) for the social web. This channel allows service administrators to share information with trusted partners, and release advisories that everyone can benefit from.
  3. IFTAS joined the NGO-ISAC, a cybersecurity community for nonprofits.

Shared Signals: What We’re Doing Next

  1. We are monitoring a lot of independent work in the space, most notably ThisIsMissEm’s FIRES project. Where possible, we will directly support the work.
  2. We are tracking information sharing activities in the industry, and will continue working with several groups that are exploring broader sharing activities in the social web.

Shared Signals: How You Can Help

  1. If you are a large service provider or web host, consider joining the SW-ISAC sharing channel. Contact us for details.
  2. If you are a content moderator or run your own Fediverse server, consider joining IFTAS Connect. Our community portal offers groups, direct messages, and discussion forums for sharing and learning from your peers.

Big Picture

Since starting IFTAS, I’ve been yelled at, received obscene and threatening waves of abuse, been accused of working for Big Social, people have taken issue with me being too political or not political enough, and I’ve had to navigate the complexity of building a central resource in a decentralised world. 

But I believe in the web, and I am certain that an open social web is worth our collective time and energy. Access to information, personal connections, news, sports, education, comics and games… we as a society owe it to ourselves to build the web we want, the web we need, the web that lives up to its name. 

There are no walls on a spider’s web. 

We want a web where everyone can participate, where everyone is safe from abuse and harm, where civil society can flourish, where people can meet, learn, share, live, love, laugh.

Our founding grants of $400,000 have delivered everything you see above and a whole lot more, and will keep the lights on throughout 2024, but we need to keep building, keep paying the bills, and keep paying the people doing the work. At some point, I’d like to be paid, too. I’ve been working unpaid full time since this started, and that can’t last much longer. I am supported by my amazing wife, who is 100% committed to what we’re trying to do; without her support none of this would be possible.

We have never asked for money; my goal was to take our founding grants and demonstrate our capability, build trust, and put some wins on the board. I believe we’ve done that. Next we will build toward self-sustainability, and my plan is to get there in three years.

So far we’ve helped hundreds of instances and moderators in 30 countries. Our community is asking for legal support, CSAM reporting, spam detection, access to expensive APIs, vicarious trauma counselling and much more. We want to continue working to meet those needs, but we need your help. If you’d like to support IFTAS on our mission to support the people making this happen, please consider making a contribution.

Supporting IFTAS

First and foremost, if you run or moderate an instance, or you can contribute your expertise to our projects, join us. Join IFTAS Connect: share, learn, teach, translate… Your peers need you. Your participation is the most important way to support the mission. The next most important thing you can do is to support your instance; the vast majority cannot cover their bills, and we need a strong, vibrant community of servers to make this all work.

Nonetheless, we need to pay the bills.

We are opening up to organisational sponsorship. If you’re interested in partnering with IFTAS, please contact me; we have some great options. If you know an organisation that might be interested in sponsoring us, introduce us.

If you run an instance, make a small one-time donation and let us list you as a supporter. We’d love to show a list of servers that believe in what we’re doing. And keep an eye out for our next Community Needs Assessment; your feedback is how we prioritise what to work on for you.

We will also be soliciting donations from the community. To celebrate our founding support, we will memorialise the first 500 donors on a special page we will keep forever, the IFTAS First 500. 

Send a dollar, send ten. We’ll list every supporter. Add your name, your blog, your profile pic, whatever you like. We want to demonstrate broad support for our work; it keeps us all energised and lets our sponsors know we have the support of the community.

If you are able to, make a sustaining contribution to help our monthly expenses. We pay moderators, developers, community managers, and we operate web services that require beefy hosting.

If you have employer matching funds at your job, see if IFTAS is on the list. We are registered with Benevity, tell your co-workers.

Because IFTAS is a 501(c)(3) charitable organisation, your support is tax deductible in the United States. We accept money, securities, donor-advised funds, cryptocurrency and more.

If you can, donate today

To everyone who has supported us this far, thank you. Your time, wisdom, energy, feedback and words of encouragement have sustained us all.

Join me and the incredible team on this journey, support an open social web, and help us as we keep pushing forward.

Diolch yn fawr iawn i chi gyd (thank you all very much),

Jaz

Originally posted on IFTAS Updates...
