Data Privacy Day: Honoring A Survivor’s Right To Safely Access Technology


When a survivor reaches out to a domestic violence program for help, it’s often as a last resort and with much trepidation. Social connection, access to financial resources, and a safe home have often been systematically stripped away from them by their abuser. Smartphones, email, and social media accounts are often the last remnants of their connection to support, and can serve as an important lifeline when they’re in danger.

Yet we often hear from survivors that when they’ve reached out for help about the harassment, stalking, and abuse they’ve experienced through technology and social media, the only advice they get is to completely disconnect from technology and delete their accounts. But this places the blame in the wrong place. The technology isn’t the issue; the abuser’s behavior is. And worse yet, this response punishes the victim for the abuse they’ve suffered, forcing them to become more isolated because their only option is to disconnect. It also impacts their safety; if a survivor is in need of help but can no longer access their support systems, the risk of danger can increase dramatically.

This Data Privacy Day, we celebrate a survivor’s right to safely access technology, and encourage programs to proactively safety plan with survivors to help them feel empowered and safe with their technology use. We need to view safe access to technology, the internet, and social media as a fundamental right of survivors. Technology is a necessity in our everyday lives, and removing it is not a feasible option. Instead, domestic violence programs can help survivors not only find temporary refuge, but also build new skills that will empower them to stay connected, feel less isolated, and have communication tools that can help them in emergency situations.

The Safety Net Project develops tools and resources that help both survivors and victim service agencies become more informed about how to safely use technology, and about how abusers might misuse technology to stalk and harass. On Data Privacy Day, we encourage you to explore these tools listed below, and to reach out to us with any questions you may have about the safe use of technology.

  • The TechSafety App - This app was created for anyone who thinks they might be experiencing harassment or abuse through technology or who wants to learn more about how to increase their privacy and security while using technology.

  • Technology Safety & Privacy Toolkit For Survivors - Survivors of domestic violence, sexual assault, stalking, and trafficking often need information on how to be safe while using technology. This toolkit provides safety tips, information, and privacy strategies to help survivors respond to potential technology misuse and to increase their safety and privacy.

  • The App Safety Center - There’s an app for everything, right? An increasing number of apps for smartphones and tablets are attempting to address the issues of domestic violence, sexual assault, and/or stalking. With so many apps, knowing which ones to use can be difficult. The App Safety Center will highlight some of these apps by providing information on what survivors and professionals need to know to use them safely.

  • Agency’s Use of Technology: Best Practices & Policies Toolkit - The way domestic violence, sexual assault, and other victim service agencies use technology can impact the security, privacy, and safety of the survivors who access their services. This toolkit contains recommended best practices, policy suggestions, and handouts on the use of common technologies. 

Protecting Victim Privacy While Increasing Law Enforcement Transparency: Finding the Balance with Police Data Initiatives

One of the hallmark efforts of the outgoing Obama administration has been the Police Data Initiative, launched to improve the relationship between law enforcement agencies and the communities they serve. The Police Data Initiative encourages local law enforcement agencies to publicly share information about 911 calls, stops, arrests, and other police activities so that community members can look both at individual cases, as in some high-profile events covered by the media, and at trends that might reveal disproportionate response over time.

It has been more than two decades since the Violence Against Women Act was first passed, and we have seen significant improvements in the criminal justice system’s response to domestic violence, sexual assault, and stalking. This success is due in great part to the efforts of victim advocates and law enforcement officials working together to improve systems. But as we celebrate these successes, we know this work is far from finished, and that there is still much work to be done to improve police response - particularly within marginalized communities.

As we work with law enforcement to improve responses to victims and communities, we must ensure that the privacy and safety of victims who interact with law enforcement is a fundamental cornerstone of those efforts. Police data released to the public has the potential to reveal victims’ identities and consequently put them at risk of further harm, harassment, or damage to their reputation. These concerns can also significantly influence a survivor’s decision about whether to contact law enforcement at all for help in an emergency.

For more than a year, Safety Net has explored the issue of how to maintain victim privacy and safety while simultaneously supporting the overall intention behind the Police Data Initiative. These efforts have been made possible by the support of the Office on Violence Against Women (U.S. Department of Justice) and Harvard University’s Berkman Center for Internet & Society, and in partnership with the White House, the Police Foundation, the International Association of Chiefs of Police, the Sunlight Foundation, the National Institute of Standards and Technology, the Vera Institute of Justice, and others.

Today, we are pleased to announce the release of a guide outlining the results of these efforts, “How Law Enforcement Agencies Releasing Open Data Can Protect Victim Privacy & Safety,” authored collaboratively with the Police Foundation. This guide describes the need for victim privacy to be a central consideration in efforts to share data with the public, and provides specific recommendations that will assist local law enforcement agencies in their efforts to ensure victim privacy while increasing transparency.

In the coming weeks, we will be releasing a similar guide written for advocates, as well as an issue summary that describes how the Police Data Initiative intersects with our work to ensure the safety and privacy of survivors.


YouTube’s New Tools Attempt to Address Online Harassment

Online harassment and abuse can take many forms. Threatening and hateful comments turn up across online communities, from newspapers to blogs to social media. Anyone posting online can be the target of these comments, which cross the line from honest disagreement to vengeful and violent attacks. This behavior is more than someone saying something you don’t like or saying something “mean” – it often includes ongoing harassment that can be nasty, personal, or threatening in nature. For survivors of abuse, threatening comments can be traumatizing and frightening, and can lead some people to stop participating in online spaces.

YouTube recently created new tools to combat online abuse occurring within comments. These tools let users who post on their site choose words or phrases to “blacklist” as well as the option to use a beta (or test) version of a filter that will flag potentially inappropriate comments. With both tools, the comments are held for the user’s approval before going public. Users can also select other people to help moderate the comments.

Here’s a summary of the tools, pulled from YouTube:

  • Choose Moderators: This was launched earlier in the year and allows users to give select people they trust the ability to remove public comments.

  • Blacklist Words and Phrases: Users can have comments with select words or phrases held back from being posted until they are approved.

  • Hold Potentially Inappropriate Comments for Review: Currently available in beta, this feature offers an automated system that will flag and hold, according to YouTube’s algorithm, any potentially inappropriate comments for approval before they are published. The algorithm may, of course, pull content that the user thinks is fine, but it will improve its detection based on the user’s choices.

Survivors who post online know that abusive comments can come in by the hundreds or even thousands. While many sites have offered a way to report or block comments, these steps have only been available after a comment is already public, and each comment may have to be reported one by one. This new approach helps to catch abusive comments before they go live, and takes the pressure off having to monitor the comment feed around the clock.

These tools also offer survivors a means to be proactive in protecting their information and safety. Since much online harassment includes tactics such as doxing (posting someone’s personal information online with the goal of causing them harm), a YouTube user can add their own personal information to the list of words and phrases that are not allowed to be posted. This can include part or all of phone numbers, addresses, email addresses, or usernames of other accounts. Being able to proactively block others from posting this personal information is a powerful protection.
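To make the mechanism concrete, here is a minimal sketch of how a “blacklist” comment filter of this kind can work. This is an illustrative assumption, not YouTube’s actual implementation: the phrase list, function names, and matching rule (case-insensitive substring match) are all hypothetical.

```python
# Hypothetical sketch of a blocked-phrase comment filter.
# A user adds fragments of their personal information (or abusive terms)
# to a blocklist; any comment containing a blocked phrase is held for
# the user's approval instead of being published immediately.

BLOCKED_PHRASES = {
    "555-0123",        # part of a phone number (example value)
    "@example.com",    # part of an email address (example value)
    "123 main st",     # part of a street address (example value)
}

def should_hold(comment: str, blocked=BLOCKED_PHRASES) -> bool:
    """Return True if the comment contains any blocked phrase
    (case-insensitive) and should be held for review."""
    text = comment.lower()
    return any(phrase.lower() in text for phrase in blocked)

def moderate(comments):
    """Split comments into those published immediately and those held."""
    published = [c for c in comments if not should_hold(c)]
    held = [c for c in comments if should_hold(c)]
    return published, held
```

In this sketch, a comment like “call me at 555-0123” would be held back, while “great video!” would post normally; the key design point is that filtering happens before publication, so the survivor never has to see or report the comment after the fact.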

Everyone has the right to express themselves safely online, and survivors should be able to fully participate in online spaces. Connecting with family and friends online helps protect against the isolation that many survivors experience. These new tools can help to protect survivors’ voices online.