Facebook Without Violence Guide

THE BASICS

What youth on Facebook need to know


How does Facebook address cyberviolence?

When Facebook establishes that a user has violated its Community Standards, the content in question is deleted and the user is given a warning, temporarily blocked or, in the most serious cases, has their account deleted permanently.


Terms of Service

As a Facebook user, you. . .

  • Are only allowed one account per person.
  • Must be 13 years of age or older.
  • Must keep your contact information up-to-date.
  • Must not be a “convicted sex offender.”
  • Must not use your FB account “for commercial gain.”
  • Must not solicit login information from another user or access another user’s account.
  • Must not share spam.*
  • Must not bully, intimidate or harass other users.
  • Must not post content that is hateful, threatening, “pornographic” or that incites violence.
  • Must not engage in illegal activity on Facebook or violate Facebook’s user policies.
  • Must not collect users’ content or information, or otherwise access Facebook using automated tools, such as harvesting bots or robots.**
  • Must not upload viruses or other malicious code to Facebook.

*Spam is any unsolicited, and usually irrelevant or inappropriate, message sent on the internet to a large number of recipients.

**A robot (also known as an Internet bot, web robot, WWW robot or simply bot) is a software application that runs automated tasks (scripts) over the Internet. Typically, bots perform tasks that are both simple and structurally repetitive, at a much higher rate than would be possible for a human alone.


Facebook. . . 

  • Does not own your content. However, becoming a Facebook user gives the platform permission to use your intellectual property (photos and videos, for example) for free.*

Learn more about intellectual property rights.

  • Is allowed to delete content it does not approve of.
  • Does not check all posts for content violating its standards.
  • Is not responsible for charges to your data plan or phone bill.
  • Is allowed to change its user agreement without telling you.
  • Has access to your camera, microphone and location unless you change your settings to indicate otherwise.

*Note: This means Facebook does not have to pay you royalties or any compensation to use the content you upload to the platform. Even if you delete content, Facebook may have backup copies or access to content that has been re-shared by other users and not deleted yet.


Oh, and if you want to revoke Facebook’s access to your microphone, here’s what you do:
On iPhone (iOS 9)
  1. Go to the Settings app
  2. Scroll down to Facebook, tap it
  3. Tap “Settings”
  4. Turn off the slider for Microphone (slider should be grey instead of green)
On Android (Marshmallow)
  1. Go to Settings
  2. Swipe over to “Personal”
  3. Tap “Privacy and safety”
  4. Tap “App permissions”
  5. Tap “Microphone”
  6. Find Facebook, and turn the slider to OFF

Learn more about Facebook’s Community Standards.

LEARN MORE: Read Hollaback!’s Facebook Safety Guide

LEARN MORE: Read National Network to End Domestic Violence’s Privacy & Safety Facebook Guide for Survivors


Some things we think Facebook is doing well to tackle cyberviolence!

This list was created in collaboration with the Purple Sisters Youth Advisory Committee of Ottawa.

  • Facebook has made great efforts to offer accessible guides for users to better understand its terms of service and data use policy.

  • Facebook’s definition of hate speech is intersectional: it covers content that directly discriminates against people based on their race, ethnicity, national origin, religion, sexual orientation, sex, gender or gender identity, or serious disabilities or diseases.

  • Facebook expects users to clearly indicate their purpose and intentions so it can better understand the context in which they are sharing certain content. This allows Facebook to differentiate humour, satire and social commentary from harmful content.

  • Facebook may also ask Page owners to associate their name and Profile with any content that is “insensitive,” even if the content does not violate its Community Standards.

  • Facebook now allows users to define a custom gender, providing unlimited options to self-identify and moving beyond the binary of “F or M.” This was an innovative move towards making the platform more LGBTQ+ inclusive.

  • Facebook has enhanced its filtering options, which help users filter content differently from friends, strangers and trusted people.

  • Facebook uses secure browsing (HTTPS) by default for all users.

  • Facebook has some wonderful tools and resources, like its Family Safety Center, Help Centre and Bullying Center.


Key recommendations for Facebook on tackling cyberviolence

These recommendations were created in collaboration with the Purple Sisters Youth Advisory Committee of Ottawa.

Facebook should. . .

  • Diversify its leadership—including expanding opportunities for women and LGBTQ+ people.

  • Hold online abusers more accountable for their actions.

  • Expand the definition of “hate speech” to include online gender-based violence and digital dating abuse.

  • Allow and encourage users to report abuse and harassment on behalf of others. Bystander intervention is key to preventing and ending cyberviolence.

  • Take proactive steps to monitor and filter content for abuse or harassment, and allow users to set up their own customizable filters for content they do not wish to see.

  • Provide users the ability to block other users regardless of whether or not the user they wish to block has already blocked them.

  • Allow users to filter people on their “blocked list” so those people do not show up in their newsfeed or in tagged photos.

  • Allow users to receive a notification when they are attending the same events as users on their blocked list.

  • Allow users to block other users from appearing in the “people you may know” section of their Facebook feed.

  • Ensure the teams responding to reports of cyberviolence have training on gender-based violence issues informed by experts in the field—including frontline workers and survivors of cyberviolence.

  • Prioritize reviewing reports related to harassment or abuse.

  • Make terms of service—particularly policies on data use—easier to find and easier to read.

  • Make it easier to delete a Facebook account permanently.

  • Change evidence requirements to reflect the reality of cyberviolence—including allowing screenshots to be used as evidence.

  • Ensure location services are “off” as the default and make it clearer to users when location settings are on.

  • Allow users to remain anonymous on an event guest list.

  • Restrict or completely shut down third-party applications.

  • Make safety and support services accessible to users—including providing easy-to-find, easy-to-read information on local support services.

  • Build long-term partnerships with anti-violence experts and frontline workers.


Read our general recommendations for social media platforms.
