Content Filters as Censorship

As long as there is content on the web, there will be arguments about what is “offensive.” Sure, we might want to think that some information or media would not be objectionable to anyone, but the climate of outrage has reached such a fever pitch that this is unrealistic to assume.

When it comes to content that is even remotely sexual, there are constant arguments about what is acceptable, objectionable, pornographic, and so on. Parents fight with content platforms about age gates – something that makes people feel good but doesn’t really stop much. Then there are the piles of filters out there, offered as free-standing software, by ISPs, and by individual platforms like Flickr, Facebook, and now Twitter.

The problem inherent in all of this is that, as a general rule, the programmers and developers tasked with creating these filters don’t consult the creators of the content they want to allow users to block. Many people who provide sexual content that others might find objectionable honestly don’t want to force their “stuff” on the world. That means they would probably help those developers build filters, as long as the goal was to leave the decision of blocking to the end user. What annoys us is when the filters are platform-wide and no one can choose to opt out of them.
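To make that distinction concrete, here is a minimal sketch in Python of what “leave the decision to the end user” means in practice: content is labeled, and each user’s own preferences decide what gets hidden, with nothing hidden by default. Every name here (Post, UserPrefs, visible_to, the tags) is hypothetical, not any real platform’s API.

```python
# A minimal sketch of opt-in, user-controlled filtering.
# All names here (Post, UserPrefs, visible_to) are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Post:
    body: str
    tags: set[str] = field(default_factory=set)  # e.g. {"sex-ed", "pornography"}

@dataclass
class UserPrefs:
    # Empty by default: nothing is hidden unless this user asks for it.
    blocked_tags: set[str] = field(default_factory=set)

def visible_to(post: Post, prefs: UserPrefs) -> bool:
    """The user's own preferences decide, not a platform-wide switch."""
    return not (post.tags & prefs.blocked_tags)

# Usage: two users, same post, different choices.
post = Post("An educational article on safer sex.", tags={"sex-ed"})
parent = UserPrefs(blocked_tags={"pornography"})
strict = UserPrefs(blocked_tags={"pornography", "sex-ed"})

assert visible_to(post, parent)      # this user still sees sex-ed content
assert not visible_to(post, strict)  # this user chose to opt out of it
```

The design point is the empty default: the platform ships no global block list, so the filter is a tool the user reaches for rather than a decision made over everyone’s head.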

So, what needs to happen to “fix” this issue?

A good start would be for companies to stop acting solely on what they find in their complaint boxes. The people who are offended might have an honest issue that needs to be addressed, but lately the complaints are just as likely to be fueled by the faux-outrage of the moment. You can’t please all of the people all of the time, and if your business plan includes catering to people who actively look for reasons to be offended, you will drive your employees insane trying to keep up.

Next, at least when it comes to the murky area of sexual content (from educational material on sex, like this site, to pornography and everything in between), start talking to the content creators. Reach out to organizations that represent them or that are politically active for sexual freedom, like the NCSF (National Coalition for Sexual Freedom). Instead of choosing keywords at random, ask us what is most likely to work for users who want to keep their kids away from as much purely adult content as possible.
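As one small, hypothetical illustration of why keyword choice matters: naive substring matching on a developer’s guessed word list over-blocks harmless text, while matching against labels the creators themselves apply does not. The word lists and function names below are mine, purely for illustration.

```python
# Hypothetical comparison: guessed keywords vs. creator-applied labels.

GUESSED_KEYWORDS = {"sex", "adult"}  # a developer's blind guess

def naive_block(text: str) -> bool:
    """Substring matching over-blocks: 'Sussex', 'adulthood', sex education..."""
    lowered = text.lower()
    return any(word in lowered for word in GUESSED_KEYWORDS)

def label_block(creator_labels: set[str], blocked: set[str]) -> bool:
    """Blocking on labels the creators apply themselves is far more precise."""
    return bool(creator_labels & blocked)

print(naive_block("A history of Sussex in early adulthood"))  # True: over-blocked
print(label_block({"history"}, blocked={"pornography"}))      # False: left alone
```

This is exactly where consulting creators pays off: they already know which labels distinguish sex education from pornography, and they have every incentive to label honestly if the result is a filter users choose rather than a ban.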

Finally, be highly suspicious of people who demand content blocking because it “triggers” them. Yes, you might end up ignoring someone with legitimate problems, but the fact is that no one gets over being “triggered” by anything without facing it, period. When it comes to overcoming past sexual trauma, “safe spaces” originally described environments supervised by mental health professionals, where people could be exposed to known “triggers” and deal with them with the help of those professionals. No one knows all of their “triggers,” so it is impossible to help anyone avoid all of them. Bluntly, this isn’t in the job description of content providers or social media platform developers, no matter how much anyone screams otherwise. It is OK for tech professionals to tell these people that they need mental help, not more content filters.

The bottom line is that tech companies providing social media platforms for the masses need to stop worrying so much about catering to the whims of people who leap from one outrage to the next on a nearly daily basis. Don’t be afraid to reach out to the providers of the content that people honestly want to filter for real reasons, outside the outrage circus. We don’t bite – well, not unless you want us to!