Addressing abuse vectors
For the remaining exercises, we're going to work on proposing solutions to the abuse vectors we just identified. Some common ways to address abuse vectors are:
- Take out the feature (you can't abuse what doesn't exist! Many news websites have removed their comments sections after realizing how much abuse and harassment takes place in them.)
- Reduce interaction (users could still interact with this feature, but their speech is filtered or limited in some other way. The online game "Hearthstone" only allows players to communicate with pre-set phrases like "Greetings" and "Threaten.")
- Reduce visibility (this feature can only be used by certain people or only under certain circumstances. Twitch allows users to limit chat on their video streams to people who have subscribed to their channel, usually for a fee.)
- Don't keep data you don't need (this data could be used to hurt people; do we really need to hang on to it? ProtonVPN has a detailed privacy policy describing what kinds of data they keep and why, as well as the kinds of data they don't keep.)
- Intervene before, during, and after harassment (software that either gently discourages abusive behavior or punishes it swiftly when it happens. On Reddit, if you post something other people don't think is appropriate, they might downvote the post to encourage you to edit it. If it gets enough downvotes, it will eventually be hidden from view automatically, or even deleted by a moderator. If you keep posting controversial things, you might get banned.)
- Make it opt-in (some people are ok with the possibility that they might be harassed, but others aren't. Most social media websites allow you to filter out certain kinds of content that you don't want to see in your feed.)
- Add moderation (it's too complicated to write a program to prevent abuse of this feature, so let's get some humans to figure it out for us on a case-by-case basis. And there are a lot more possibilities for moderation tools than just banning users!)
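A couple of these strategies boil down to simple checks in code. As a rough sketch, here's what a downvote-threshold auto-hide (the Reddit-style intervention above) and a preset-phrase allowlist (the Hearthstone-style "reduce interaction" approach) might look like. All names and thresholds here are illustrative, not any real platform's implementation:

```python
# Hypothetical sketch of two abuse-mitigation strategies.
# The threshold and phrase list are made up for illustration.

HIDE_THRESHOLD = -5  # net score below which a post is hidden

# "Reduce interaction": users may only send pre-set phrases.
ALLOWED_PHRASES = {"Greetings", "Well played", "Thanks", "Threaten"}

def is_visible(upvotes: int, downvotes: int) -> bool:
    """Intervene after the fact: auto-hide heavily downvoted posts."""
    return (upvotes - downvotes) > HIDE_THRESHOLD

def filter_message(message: str):
    """Only allow messages from the pre-set phrase list; drop the rest."""
    return message if message in ALLOWED_PHRASES else None

print(is_visible(upvotes=2, downvotes=10))  # hidden
print(filter_message("Greetings"))          # allowed through
print(filter_message("you're terrible"))    # dropped
```

The point isn't the specific numbers; it's that "reduce interaction" and "intervene after harassment" can start as very small pieces of logic, with moderation handling the cases the code can't.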
There are pros and cons to every approach, and the right choice usually comes down to what makes sense for the product. A feature might be a major abuse vector but also a major source of revenue for your company, so it can't simply be removed or hidden from some people, even if making it safer takes a huge technical effort.