Instagram Without Violence Guide


What youth on Instagram need to know

How does Instagram address Cyberviolence?

When Instagram establishes that a user has violated its Community Guidelines, that an account has been created with the intent of bullying or harassing another person, or that a photo or comment is intended to bully or harass someone, the content in question is deleted and the user is given a warning, temporarily blocked, or, in the most serious cases, has their account deleted permanently.

Terms of Service

As an Instagram user, you:

  • Must be 13 years of age or older.
  • Must own or have the right to post the content on your account.
  • Must not post hateful or “not safe for work” (NSFW) content.
  • Must not harass or abuse other users.


Instagram:

  • Does not own your content. However, by becoming an Instagram user, you give the platform permission to use your intellectual property (your photos and videos) for free.*
  • Is allowed to delete your posts without informing you.
  • Is allowed to change its user agreement without telling you.
  • Does not check all posts for content that violates its standards.
  • Can ban you for not following its rules.

*Note: This means Instagram does not have to pay you royalties or any other compensation to use the content you upload to the platform. Even if you delete content, Instagram may retain backup copies, and copies re-shared by other users may remain until those users delete them.

Learn more about Instagram’s Community Guidelines.

Some things we think Instagram is doing well to tackle cyberviolence!

This list was created in collaboration with the Purple Sisters Youth Advisory Committee of Ottawa.

  • In its updated Community Guidelines, Instagram prohibits defamation, stalking, bullying, abuse, harassment, threats, impersonation and intimidation, as well as posting private or confidential information.
  • Users can hide a photo from their profile. Only the person who posts a photo can tag people in it, but anyone who is tagged can hide the photo from Photos of You by tapping their username in the post and then selecting Hide from My Profile.
  • Instagram introduced fast account switching, which lets users toggle between multiple accounts.
  • Instagram recently launched comment filtering and block lists for all of its users, which means users can moderate the comments on their photos or turn off commenting altogether.
  • Users can also unsend private messages and shared photos.
  • Instagram does not notify users when they are blocked.
  • Anyone can report abuse on Instagram to Instagram.
  • Instagram has some wonderful tools and resources, like its Help Centre, Privacy & Safety Centre and Eating Disorder Awareness Resource.
  • Instagram collaborated with ConnectSafely and MediaSmarts to write its Tips and Advice for Parents guide, and partnered with safety organizations around the world to localize it.

Key recommendations for Instagram on tackling cyberviolence

These recommendations were created in collaboration with the Purple Sisters Youth Advisory Committee of Ottawa.

Instagram should...

  • Diversify its leadership—including expanding opportunities for women and LGBTQ+ people.
  • Expand the definition of “harassment” to include explicit reference to harassment based on transphobia, misogyny, ableism, sizeism and classism.
  • Hold online abusers more accountable for their actions.
  • Allow and encourage users to report abuse and harassment on behalf of others. Bystander intervention is key to preventing and ending cyberviolence.
  • Take proactive steps to monitor and filter content for abuse or harassment.

  • Provide users the ability to block other users regardless of whether or not the user they wish to block has already blocked them.
  • Ensure the teams responding to reports of cyberviolence have training on gender-based violence issues informed by experts in the field—including frontline workers and survivors of cyberviolence.
  • Prioritize reviewing reports related to harassment or abuse.
  • Make terms of service—particularly policies on data use—easier to find and easier to read.
  • Ensure location services are “off” as the default and make it clearer to users when location settings are on.
  • Adopt and advocate for open Certificate Transparency monitoring, which makes it possible to know which certificates a CT-enforcing browser will trust. This gives social media platforms more capacity to monitor for and verify malicious certificates, access and code from third parties.
  • Restrict or completely shut down third-party applications.
  • Make safety and support services accessible to users—including providing easy to find and easy to read information on local support services.
  • Build long-term partnerships with anti-violence experts and frontline workers.

Read our general recommendations for social media platforms.
