Facebook’s Proactive Approach to Addressing Nonconsensual Distribution of Intimate Images

It’s well-known that technology has made sharing sexually intimate content easier. While many people share intimate images without any problems, there’s a growing issue with nonconsensual distribution of intimate images (NCII[1]), often referred to as “revenge porn.” Perpetrators often share, or threaten to share, intimate images in an effort to control, intimidate, coerce, shame, or humiliate others. A survivor who has been threatened or victimized by someone sharing their intimate images not only deserves the opportunity to hold the perpetrator accountable, but should also have better options for removing content or keeping it from being posted in the first place.

Recently, Facebook announced a new pilot project aimed at stopping NCII before it can be uploaded to their platforms. People who wish to participate can submit intimate images or videos they’re concerned someone will share without their permission to a small, select group of specially trained professionals within Facebook. Once submitted, each image is given what’s called a “hash value,” and the actual image is deleted. “Hashing” means the image is turned into a digital code that acts as a unique identifier, similar to a fingerprint. Once the image has been hashed, Facebook deletes it, and all that remains is the code. Facebook then uses that code to detect when someone attempts to upload the same image and to keep it from being posted on Facebook, Messenger, and Instagram.
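
To make the “fingerprint” idea more concrete, here is a minimal sketch in Python using an ordinary cryptographic hash. It is only an illustration of the general concept: Facebook has not published the details of its matching technology (which is built to recognize photos rather than exact files), and the filename below is hypothetical.

    import hashlib

    def fingerprint(path):
        # Read the image file and turn its contents into a fixed-length
        # code (a hash). The code identifies the image but cannot be
        # turned back into the picture itself.
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    # The code can be stored on its own; the image file itself can then be deleted.
    code = fingerprint("submitted_image.jpg")  # hypothetical filename
    print(code)  # a 64-character string of letters and digits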

Facebook’s new pilot project may not be something everyone feels comfortable using, but for some it may bring much peace of mind. For those who believe it may help in their situation, we’ve outlined detailed information about how the process works:

  1. Victims work with a trusted partner. Individuals who believe they’re at risk of NCII and wish to have their images hashed should first contact one of Facebook’s trusted partners: the Cyber Civil Rights Initiative, YWCA Canada, UK Revenge Porn Hotline, and the eSafety Commissioner in Australia. These partners will help them through the process and identify other assistance that may be useful to them.
  2. Partner organizations help ensure appropriate use. The partner organization will carefully discuss the individual’s situation with them before helping them start the hashing process. This helps ensure that individuals are seeking to protect their own image and not trying to misuse the feature against another person. It’s important to note that the feature is meant for adults and not for images of people under 18. If the images are of someone under 18, they will be reported to the National Center for Missing and Exploited Children. Partner organizations will help to explain the reporting process so that individuals can make appropriate decisions for their own case.
  3. The image will be reviewed by trained staff at Facebook. If the images meet Facebook’s definitions of NCII, a one-time link is sent to the individual’s email. The link takes the individual to a portal where they can directly upload the images. All submissions are then added to a secure review queue, where they will be reviewed by a small team specifically trained in reviewing content related to NCII abuse.
  4. NCII will be hashed and deleted. All images that are reviewed and found to meet Facebook’s definition of NCII will be translated into a set of numerical values to create a code called a “hash.” The actual image will then be deleted. If Facebook reviews an image and determines it does not match their definition of NCII, the individual will receive an email letting them know (so it’s critical to use an email account that no one else can access). In that case, the individual may still have other options; for example, they may be able to report an image for a violation of Facebook’s Community Standards.
  5. Hashed images will be blocked. If someone tries to upload a copy of an image that was hashed, Facebook will block the upload and show a pop-up message notifying the person that their attempted upload violates Facebook’s policies. (A simplified sketch of how this matching step fits with the hashing step follows this list.)
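
Below is a simplified, self-contained sketch of how steps 4 and 5 fit together: hash the reported image, keep only the code, and compare any new upload against the stored codes. Again, this assumes a plain file hash purely for illustration; Facebook’s real system uses photo-matching technology it has not published, and the filenames are hypothetical.

    import hashlib

    def fingerprint(path):
        # Turn an image file into a unique code (hash), as in the sketch above.
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    # Step 4: hash the submitted image and keep only the code; the image itself is deleted.
    blocked_hashes = {fingerprint("submitted_image.jpg")}

    # Step 5: when someone attempts an upload, compare its code against the stored ones.
    def allow_upload(path):
        if fingerprint(path) in blocked_hashes:
            return False  # block the upload and show the policy-violation message
        return True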

This proactive approach has been requested by many victims, and may be appropriate on a case-by-case basis. People who believe they’re at risk of exposure and are considering this process as an option should carefully discuss their situation with one of Facebook’s partner organizations. This will help them make sure they’re fully informed about the process so that they can feel empowered to decide if this is something that’s appropriate for their unique circumstances.  

For more information about how survivors can increase their privacy and safety on Facebook, check out our Facebook Privacy & Safety Guide for Survivors of Abuse.


 

[1] NCII refers to private, sexual content that a perpetrator shares publicly or sends to other individuals without the consent of the victim. How we discuss an issue is essential to resolving it. The term “revenge porn” is misleading, because it suggests that a person shared the intimate images as a reaction to a victim’s behavior.

Cambridge Analytica and Why Privacy Matters to Survivors

Recent news that the personal information of tens of millions of people was used by Cambridge Analytica “to create algorithms aimed at ‘breaking’ American democracy,” as the New Yorker phrases it, has led to a call to #DeleteFacebook. For those unfamiliar with the story, our friends at AccessNow wrote a great summary.

This kind of invasion of privacy is not new, nor is it limited to this case. The old expression, “No free lunch,” applies to any service that we don’t pay for, whether it is social media or a discount card at the grocery store or entering a raffle to win a new car. The true cost is allowing those companies to access our personal information for their own profit.

Safety is the primary concern. For survivors who face threats of harm, who live daily in fear of their abusers, the security of personal information can be a life-and-death issue. For survivors fleeing an abuser, information about location, work, kids’ schools, and social connections can lead an abuser to the doorstep. For survivors living with abuse, information about friends, thoughts, feelings, opinions, and interests can be misused by an abuser to control, isolate, or humiliate.

For survivors, privacy is not an abstract issue, or a theoretical right to be debated on C-SPAN. Privacy is essential to safety, to dignity, to independence. Yet we live in a time when personal information = profit.

The Cambridge Analytica story surfaces the underlying reality that our personal information is not under our control. It feels like we are seldom asked for consent to share our personal data. When we are, it is in legalese, in tiny letters that we might have to scroll through to be able to check that box, and get on with using whatever website we’re trying to use. Even if we do take the time to read through those privacy terms, we know that data is routinely stolen, or accidentally published on the Internet, or used against us to affect access to loans, insurance, employment, and services.

We are social animals. We crave connection. Research shows that we suffer without it. Isolation is a classic tactic of abuse. But the price we too often pay for connection online is our privacy.

At times like these, we may think about deleting Facebook, going offline, or throwing away our phones. We may think that survivors should give up their tech at the door of our shelters, or that they have to go off the grid in order to be safe.

Digital exile is not the answer. Technology and the Internet are public spaces where everyone, including survivors, should have the right to share their voices, to make connections, and to access information without fear of their personal information being collected and used without their consent. April Glaser writes in Slate that “[d]eleting Facebook is a privilege,” pointing to the huge number of people who rely on it to connect with friends, to learn about events, to promote a business, or, in parts of the world with limited Internet access, just to be online at all.

Survivors, just like every other consumer, should be given the opportunity to give truly informed consent. That consent must be based on clear, simple, meaningful, understandable privacy policies and practices – not just a check box that no one pays attention to.

A guide to the process of changing your Facebook settings to control apps’ access to your data is available from the Electronic Frontier Foundation. Also check out our own guides to Online Privacy and Facebook Privacy and Safety.

Facebook Removes Search By Name Option

 

Last week, Facebook announced that they were removing the “Who Can Look Up My Timeline By Name” option for their users. Since then, we have been contacted by many concerned advocates asking what removing this feature means for survivors, many of whom use Facebook to stay connected with friends and family but for whom privacy from abusers and stalkers is equally important.

When Facebook first told us they were planning to make this change, we explained that this feature is one method some survivors use to control their privacy. Opting out of being searchable by name was one way survivors could keep an abuser or stalker from finding their timeline or account.

However, Facebook explained, and we agree (because we’ve known this for a while too), that this feature gave a false sense of privacy: even with that setting turned on, people could still be found in other ways. Some of those ways include:

  • Mutual friends. If you have mutual friends with someone, that person can often find you through them unless you choose not to let mutual friends see your activity. Even then, we have heard from many survivors whose mutual friends simply shared the information with their abuser or with other people.

  • Username/user ID. If someone knows your exact username or user ID, they can find you that way.

  • Graph Search. Graph Search is a new search option that Facebook has been slowly rolling out, and it makes anyone searchable, even people who have chosen not to be found by name. Rather than searching by name, Graph Search surfaces users based on things they like, things their friends like, and other information about them that is public. So, for example, if you like a particular restaurant and live in Albuquerque, NM, someone can search for “People who like [restaurant] in [city]” and find everyone who has liked it.

Although we are disappointed that the option to opt out of being searched by name has been removed, the safest course for survivors and advocates is to educate themselves about how people can be found on Facebook regardless of privacy settings. Users should know what kinds of information will always be public, understand how widely information can be shared online, and decide what to share based on their own privacy risks. The reality is that social media has always moved, and always will move, toward a model of sharing and openness; even if something is private now, it may not always be so.

In light of that, it is important to know that these activities and pieces of information will always be public on Facebook:

  • Your name, profile picture, your cover photo, your username and user ID, and any networks you belong to.

  • Any public pictures or posts you like or comment on. For example, if you like or comment on a picture or post that the original author set to public, the fact that you liked it, or your comment, will be public.

There are a few things that survivors can do to maximize their privacy.

  • Check out the “view as” option to see what someone can see when they look at your page, whether as a friend, a friend of a friend, or the public.

  • Review your timeline by going back to previous posts and changing who can see them. You can even delete old posts.

  • Going forward, limit what you share by choosing friends only. You can go even further and create lists that limit exactly who sees the specific information you are sharing.

  • Take a look at Safety Net’s handout on Facebook Privacy for more privacy tips. 

As Facebook continues to change their privacy settings and introduce new features, it is critical that survivors and advocates understand those changes and how they affect the personal information shared on Facebook. Facebook allows users to delete old posts or pictures, so it might be time to do your own Facebook audit and clean up your timeline.