
Building a Trust and Safety Program in the Digital Age


The people you interact with online are (mostly) human beings. That may seem like an absurd statement, but too often we forget that the nameless, faceless avatar represents a real human. Irritation, anger, frustration — these are reasonable, human reactions to adversity. As a technologist dedicated to protecting digital communities, I see how these very human reactions sometimes translate into ugly, and often harmful, behavior online.

Building trust and safety in digital environments is essential, but it’s a new and rapidly evolving profession. We’ve had all of human history to establish norms for face-to-face interaction. With the advent of the telephone, humanity had decades to devise new etiquette — answering within three rings and bookending conversations with “hello” and “goodbye.”


But the internet has thrust us into new dimensions of complexity far more quickly. Online communication norms first began to take shape through the decisions of early moderators on Usenet newsgroups and bulletin board systems. These trust and safety pioneers were almost immediately forced to grapple with early iterations of now-ubiquitous challenges like spam, doxxing, and harassment.[1] Then, as now, the shields of anonymity and pseudonymity, combined with the dehumanizing nature of physical distance, emboldened misbehavior of a kind comparatively rare in offline spaces.

Building on the work of these self-moderating internet pioneers, communications platforms work tirelessly to moderate content and head off threats of violence, distribution of illegal goods, harassment, hate, disinformation, and other harmful conduct. But it’s a herculean effort, and more than 40% of Americans say they have been harassed online. The human cost of online harm is immense.

The business cost

For most people, these issues bring social media to mind, and trust in major social media platforms declined between 2.8% and 5.9% last year, depending on the brand, according to an eMarketer report. But security, safety, and data privacy concerns are in no way limited to social media. We shop online, bank online, and work online — expanding the virtual attack surface to nearly every aspect of our lives. Customer trust in any site or platform is not a given; it must be earned with every interaction.

When trust is eroded, employees, customers, and partners are more likely to limit their use of digital tools and platforms, resulting in inconvenience and lower productivity. Diminished trust is also an existential threat to any platform built around user engagement, interaction, and data storage or sharing. But when companies actively earn trust, they’re able to deliver more value for their customers while simultaneously generating goodwill, improving operations, and increasing sales. A recent McKinsey study, for example, found that organizations well-positioned to build digital trust are also more likely to see annual growth of at least 10% on their top and bottom lines.


Every industry will face different digital threats, so the work necessary to earn trust looks a little different everywhere. On payment platforms, for example, the trust team might prioritize identifying fraud — fighting carders and phishers masquerading as legitimate brands. On a content platform, the trust team may spend more time fighting coordinated inauthentic-activity campaigns or misinformation. For a youth education platform, automated, privacy-protective detection of adult content would be a priority.

At Slack, we manage more than 2.5 billion actions daily, and as the leader of our Integrity team, it’s my job to help ensure those actions stay safe for our employees, customers, and partners around the world. Working in the integrity space has given me a few critical insights that might help companies understand the scope of the challenge and take tangible steps to build a more respectful and inclusive community.

You need a trust and safety program

Trust isn’t just a social media problem, and trust and safety should have a home in every HQ. Building out a dedicated trust function, staffed with the necessary industry expertise, is the first step toward protecting your business. It needs to be a separate department charged with crafting, advancing, sharing, and enforcing your terms of service and acceptable use policy. It should also be empowered to advise on safety, privacy, inclusion, and security considerations across the company.

A trust or integrity team’s mission will vary with the company’s products and services. Everywhere, though, the function should focus on keeping the company’s customers safe. It should build features, both user-facing and internal, that prevent bad actors from abusing the product or services. A policy function within the team should be responsible for setting and consistently enforcing clear, objective policies around how the services may be used.

To set integrity by design up for success, companies can have the function report into the product or engineering organization. Trust is an adversarial discipline, so ongoing engineering development is required to keep online services safe. It’s also critical to continually adapt trust measures to match innovation in the company’s core products and services.

The make-up of a good trust and safety team is diverse and interdisciplinary. Foundational knowledge and experience in fields like law, policy, and cybersecurity are crucial, but so is ensuring employees represent a diversity of life experiences and have deep empathy for the users they’ll work to protect. The nature of trust and safety attracts mission-driven professionals who want the substance of their work to align with their values. A well-staffed trust and safety team will be business-critical and value-accretive.

Empower your customers — they care about safety, too!

Trust and safety teams can only do so much — we can’t scale to supervise every digital space. By empowering your users to take more control of their digital experience, you can make your customer your ally. A well-designed platform gives users fine-grained permissions to control whom they engage with and enables them to take safety precautions like verifying email domains. Empowering users can also take the form of admin security controls, like the ability to easily deactivate accounts that no longer need access or to use guest accounts to limit unnecessary data access. Bringing your customers along as partners in their own safety builds trust and enhances overall platform security.
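
To make that concrete, here is a minimal sketch, in Python, of what two of these controls might look like under the hood: an admin-managed allowlist of approved email domains and guest accounts scoped to specific channels. The Workspace and Member classes and their methods are hypothetical illustrations for this article, not Slack’s actual implementation or API.

from dataclasses import dataclass, field


@dataclass
class Member:
    email: str
    is_guest: bool = False
    # Guests can only see channels they are explicitly invited to.
    allowed_channels: set = field(default_factory=set)


@dataclass
class Workspace:
    # Admin-configured allowlist of email domains permitted to join as full members.
    allowed_domains: set
    members: dict = field(default_factory=dict)

    def can_join(self, email: str) -> bool:
        # Admission control: only approved domains may join as full members.
        return email.rsplit("@", 1)[-1].lower() in self.allowed_domains

    def add_member(self, email: str, is_guest: bool = False) -> Member:
        if not is_guest and not self.can_join(email):
            raise PermissionError(f"{email} is not on an approved domain")
        member = Member(email=email, is_guest=is_guest)
        self.members[email] = member
        return member

    def can_access_channel(self, email: str, channel: str) -> bool:
        # Guests are limited to explicitly granted channels; full members are not.
        member = self.members.get(email)
        if member is None:
            return False
        return channel in member.allowed_channels if member.is_guest else True

    def deactivate(self, email: str) -> None:
        # Admins can quickly revoke access that is no longer needed.
        self.members.pop(email, None)


ws = Workspace(allowed_domains={"example.com"})
ws.add_member("ana@example.com")
vendor = ws.add_member("vendor@partner.io", is_guest=True)
vendor.allowed_channels.add("#project-x")

print(ws.can_join("mallory@spam.biz"))                           # False: unapproved domain
print(ws.can_access_channel("vendor@partner.io", "#general"))    # False: outside guest scope
print(ws.can_access_channel("vendor@partner.io", "#project-x"))  # True
ws.deactivate("vendor@partner.io")  # offboard when access is no longer needed

The point isn’t this particular data model; it’s that controls like these put meaningful safety decisions (who can join, what guests can see, when access ends) directly in the hands of your customers’ own admins.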


There’s only excellence — never perfection

The trust and safety challenges digital platforms face — impersonation, harassment, misinformation, scams — are unsolved problems. There is no such thing as perfection in integrity because, by nature, it’s adversarial work. Attackers will eventually find a way to evade every new detection or protection. The job is never done — but by investing properly and striving for excellence, you can continually earn your customers’ trust.


[1] For a look into how this history shaped the modern internet, pick up the classic Spam Kings by Brian McWilliams from your local independent bookseller.

Risa Stein is a Director of Product, Integrity at Slack, leading teams focused on protecting customer safety, privacy, and security. Prior to joining Slack, she led Transparency and Safety Experience products at LinkedIn and worked in Trust & Safety and product at Twitter. Risa received her JD from Stanford Law School, an MBA from the Stanford Graduate School of Business, and her honors BA from Brown University. 
