How a German Social-Media Company Tamed the Trolls

Jodel founder Alessio Avellan Borgmeyer

When German social-media app Jodel was five months old, its founder Alessio Avellan Borgmeyer noticed a potential problem: more people were logging on, but a small number of users weren’t the community he had hoped to attract. They were posting offensive comments and pressuring other users to submit naked photos.

Apple Inc. also saw a problem. The naked photos violated its rule on sexual content, and in early 2015 it kicked Jodel Venture GmbH out of its App Store for a week.

The ban prompted Mr. Borgmeyer to take steps that many social-media firms have been reluctant to adopt. He banned nudity, prominently displayed the app’s rules and prohibited racism, sexism, bullying and insults of any kind. He encouraged positivity in the community and empowered users to police each other’s behavior, giving unpaid moderators the authority to ban troublemakers.

Now Jodel is populated with local chatter, earnest job advice and parenting tips. Bad actors are quickly driven away.

Jodel is also back in Apple’s App Store, where it is the fifth most-popular social-media app among iPhone users in Germany ages 16 to 24, according to App Annie, behind Facebook’s WhatsApp, Facebook, Facebook’s Messenger and Pinterest.

“One person can spoil the vibe,” Mr. Borgmeyer says. “They can destroy the community for everyone else.”

A screen capture of the Jodel app. Jodel reminds its users to respond with respect.

Jodel is a small startup with about 30 employees, but it is grappling with the same dilemma of how to moderate user behavior that dogs social-media behemoths Facebook Inc. and Twitter Inc. They are constantly pulled between those who criticize these platforms for allowing hateful posts and those who complain about the sites curtailing free speech and making inconsistent decisions about what should be allowed.

Mr. Borgmeyer said he chose to prioritize inclusion over free speech, and embraced the app’s role as the arbiter of what is acceptable.

“We don’t want to get into an endless debate of is this an opinion or is this insulting,” he said. “When someone doesn’t feel included, that is where the border is.”

Organized around news feeds that display content only from users within 10 kilometers (about 6 miles), Jodel serves as an anonymous neighborhood messaging board for its users. Mr. Borgmeyer wanted the anonymity to help anyone feel welcome and make it easier to talk, like a “slightly tipsy bar.”
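Jodel hasn’t published how its feed is built, but the geofencing idea the article describes can be sketched with a standard great-circle distance check. The sketch below is purely illustrative: only the 10-kilometer radius comes from the article, and the function names, post structure and `local_feed` helper are assumptions.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0
FEED_RADIUS_KM = 10.0  # the cutoff reported in the article

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def local_feed(posts, user_lat, user_lon, radius_km=FEED_RADIUS_KM):
    """Hypothetical helper: keep only posts within radius_km of the user."""
    return [p for p in posts
            if haversine_km(user_lat, user_lon, p["lat"], p["lon"]) <= radius_km]
```

A production system would index posts spatially rather than scanning them one by one, but the filtering principle is the same: content outside the radius simply never appears in a user’s feed.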

After Apple banned the app in early 2015, Jodel started selecting users to serve as moderators and built software to detect abuse. It introduced sweeping rules that prohibit insults based on race, ethnicity, gender, religion, sexual orientation, disability—or “anything else.” The app had restrictions before the ban, but they were buried in the settings section.
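The article doesn’t detail how Jodel’s detection software works. As a minimal sketch of the general approach such systems often start with, a rule-based filter might flag posts that match blocked terms and route them to the volunteer moderators; the term list and the `needs_moderator_review` helper below are entirely hypothetical.

```python
import re

# Hypothetical blocklist; a real system would be far larger and language-aware.
BLOCKED_TERMS = {"insult_example_1", "insult_example_2"}

_pattern = re.compile(
    r"\b(" + "|".join(map(re.escape, sorted(BLOCKED_TERMS))) + r")\b",
    re.IGNORECASE,
)

def needs_moderator_review(post_text: str) -> bool:
    """Flag a post for human review if it matches any blocked term."""
    return bool(_pattern.search(post_text))
```

In practice the machine check is only a first pass; as the article notes, final calls on borderline posts rest with the human moderators.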

Facebook and Twitter have also banned harassment and abuse from their platforms, but neither company has gone as far as Jodel’s blanket ban on insults or its decision to allow jokes only “as far as they do not hurt any other community members and have a positive vibe.”

Community-led moderation on the internet dates back to the 1990s. In the first years after Facebook’s launch in 2004, the social network used a small team of college-aged employees in Silicon Valley to remove offensive content.

The issue is scale, as the model often breaks down when it expands beyond small, mostly homogeneous cultures, said Kate Klonick, an assistant professor at St. John’s University who studies online speech.

The open nature of Facebook and Twitter means people from all over the world—with vastly different cultural backgrounds—now bump into each other and disagree on what they consider offensive. In India, for example, mainstream movies often censor kissing, and in France, nudity is more common on television.

It would be almost impossible for Facebook or Twitter to implement such a system now that they are global businesses, Ms. Klonick said.

“When Facebook started opening up, it became unfeasible,” she said.

Facebook now employs thousands of contractors who are tasked with deciding what posts violate the site’s legalistic rules about what is acceptable. Twitter hired contract moderators to enforce its rules.

Mr. Borgmeyer believes his experience at Jodel shows that toxic content and social media aren’t inevitably intertwined.

“You hear a lot about the negative sides of connecting so many people,” he says, “but the vast majority of people want it to be a positive, inspiring, helpful and fun place.”

Jodel’s measures are expensive. The company spends about one-third of its budget on moderating content, including paying its engineers to improve the algorithm’s ability to detect objectionable content. It now has more than 30,000 users serving as moderators, and the algorithm frequently prompts moderators to explain their decisions.

It is still possible to find negative content on Jodel, and not all of its posts are G-rated.

Mr. Borgmeyer says the company will continue to update its system in hopes of building the community he first imagined.
