Privacy as a human right

{***Pause/Music***}
{***Noah***}

Coming up on Harvard Chan: This Week in Health…Privacy as a human right.

{***Dan Scarnecchia Soundbite***}
(Even on a day-to-day basis most people don’t think about privacy. When we’re in a crisis, we’re going to think about privacy differently, especially in an acute situation where we’re going to try to find, say, our lost brother or a lost family member. We will have a different notion of what constitutes personal information, so we might be more willing to share that.)

In this week’s episode: Amid the growing scandal over Facebook’s use of personal information, we’ll examine how the humanitarian field is grappling with ever-changing technology and increasing reliance on data and personal information.

{***Pause/Music***}
{***Noah***}

Hello and welcome to Harvard Chan: This Week in Health, it’s Thursday, April 19, 2018. I’m Noah Leavitt.

{***Amie***}

And I’m Amie Montemurro.

{***Noah***}

Last week Mark Zuckerberg went to Capitol Hill to testify about Facebook’s use of personal information amid the revelation that the private consulting firm Cambridge Analytica obtained the personal data of tens of millions of users.

{***Amie***}

The scandal drew fresh attention to one of the most pressing issues of the 21st century: How can we protect our privacy when we are willingly—or unwillingly—giving vast amounts of data to companies like Facebook, Google, or Amazon?

{***Noah***}

But those tech companies aren’t the only ones using personal information.

This kind of data is also at the core of the work of international agencies delivering humanitarian aid—whether it’s after a natural disaster or in a refugee camp.

And that’s the focus of today’s episode.

{***Dan Scarnecchia Soundbite***}
(My name is Dan Scarnecchia and I’m a researcher with the Signal Program on Human Security and Technology here at HHI)

{***Noah***}

Scarnecchia is part of a team of researchers at the Harvard Humanitarian Initiative devoted to advancing the safe, ethical, and effective use of information technologies during humanitarian and human rights emergencies.

{***Amie***}

The roots of the Signal Program date back to 2010 and the crisis in Darfur.

{***Noah***}

At the time, actor George Clooney gave a million dollars to Nathaniel Raymond—now the director of the Signal Program—and Isaac Baker to track atrocities being committed against South Sudanese families.

{***Amie***}

Raymond and Baker used satellites to track the movements of Sudanese militias—which in turn could be used to build evidence of war crimes.

At the time, it was the first use of civilian satellite imagery to monitor such atrocities.

{***Noah***}

Raymond and Baker ran that program for 18 months, but eventually shut it down out of concern over how the imagery might be used.

{***Dan Scarnecchia Soundbite***}
(They felt they couldn’t contain all the externalities. The potential for causing harm to civilians on the ground, they felt, was too great. Of course, as they were publicly releasing imagery, they were anonymizing it and doing their best to anonymize where it was coming from. But they couldn’t guarantee that they wouldn’t be showing imagery that had something identifiable to somebody who is local, and so they felt that the risk to people on the ground, both the aid workers and the local population, was just too great.)

{***Noah***}

That eventually led to the creation of the Signal Program, which focuses on three key areas:

Tools and methods, which focuses on designing and scientifically testing tools and methods that remotely collect and analyze data about humanitarian emergencies.

Standards and ethics, which aims to develop standards and professional ethics for the responsible use of technology to assist disaster-affected populations.

And Mass Atrocity Remote Sensing, which involves analysis of satellite imagery and other related data to identify remotely observable forensic evidence of alleged mass atrocities, as well as issues like famine.

{***Amie***}

And an emerging area is populations and mobile technology, which specifically looks at how mobile technology is changing outcomes for refugees worldwide.

Recent research in this area looked at Syrian refugees in Greece and found links between access to mobile phones and a reduced risk of depression.

{***Dan Scarnecchia Soundbite***}
(Crises in general are happening in places where people had preexisting mobile access. I mean, Syria is one of the first real, clear examples of a population that had not just cellular access prior to the crisis, but mobile 3G internet. Understanding what that means is going to have a large impact on how we respond to a crisis in the future, and of course how we think about the information agency of these populations. They are using these tools, and understanding the impacts that interventions related to those tools have, and that those tools themselves are having, on both the population and the crisis is critical to meeting their needs and ensuring that their rights are being protected.)

{***Amie***}

And data is now being widely used in a variety of humanitarian settings—from natural disasters to violent conflicts.

{***Pause/Nats***}

{***Noah***}

Dan touched on the use of remote sensing earlier to collect information.

And humanitarian workers on the ground are now using tablets to conduct so-called needs assessments during crises.

{***Amie***}

But there are also broader uses of information.

For example, there is a growing movement toward cash as aid—where people are given digital cash, either linked to a credit card or stored on their phone.

And in some cases the aid is directly linked to a person’s identity using biometrics—so when they go to receive their money they’ll have an iris scan to verify their identity.

{***Noah***}

While these developments are promising—they are more cost-effective and efficient—Scarnecchia and his colleagues are concerned that they open the door to violations of privacy and data rights.

For example, as more personally identifiable information is collected, the risk of identity theft increases.

Their solution: Something called the Signal Code.

{***Dan Scarnecchia Soundbite***}
(The code represents the first step in really articulating some rights – and from rights extend obligations – related to the population. And then from there, we should start talking about minimal training, ethical, and technical standards as they relate to the use of different technologies with these populations.)

{***Noah***}

The Signal Code represents a human rights approach to information during a crisis, and it identifies five human rights related to information.

Those rights are: The Right to Information; The Right to Protection; The Right to Privacy and Security; The Right to Data Agency; The Right to Rectification and Redress.

{***Amie***}

The goal of the Code is to provide a foundation for the future development of ethical obligations for humanitarian actors and minimum technical standards for the safe, ethical, and responsible handling of information before, during, and after disasters strike.

Scarnecchia says that because technologies are always emerging, the Code aims to outline a basic set of standards regardless of the changing digital landscape.

{***Dan Scarnecchia Soundbite***}
(So, my background before I came to the Signal Program was in the life sciences, and obviously that’s a different space, where there’s a very strong regulator. But one of the things that I sort of always like to bring up as an example is that until about 10 years ago, maybe even more recently, the most common type of data breach in the life sciences was somebody walking out of an office with paper files. So, in a way, we designed this with changing technology and digital technology in mind, of course, but it is meant to be technology agnostic. We see the need now because new information technologies are really just increasing the volume and the speed by which you can collect and process data. You know, that being said, when you read through the five rights that we’ve articulated, a lot of those rights are applicable across the board. And in terms of technology, your data collection could be on paper and it would still apply.)

{***Amie***}

Scarnecchia says the Signal Code now forms the basis for much of the work he and others in the Signal Program are doing.

{***Noah***}

For example, they’re currently consulting with the International Organization for Migration to develop a document spelling out the organization’s obligations to the people it serves.

And he says that at the end of the day the Signal Code is an important reminder that they have a duty to protect and care for those affected by crises—whether they are refugees or survivors of a natural disaster.

{***Amie***}

Scarnecchia says it’s also important to think about how notions of privacy change in these situations—especially among people who are in a particularly vulnerable state.

{***Dan Scarnecchia Soundbite***}
(Even on a day-to-day basis, most people don’t necessarily think about privacy. When you’re in a crisis or in a conflict zone, one of the things that we as humanitarians have to really take into consideration, and any of us, really, have to take into consideration when we think about privacy, is that we’re going to think about privacy differently. Especially in an acute situation where we might be trying to find, say, our lost brother or a lost family member, we will have a different notion of what constitutes personal information in that minute, and so we might be more willing to share that. Now that in and of itself is not necessarily a bad thing. If there is a clear need to be collecting that information and you are consenting to offering that information, in a crisis it might make sense to use that information. But then we don’t want to treat that as blanket consent to use that information going forward for other purposes. So, it’s very, very important to think about, A, consent, and, B, ensuring that that’s a time-delimited consent, because we have to recognize that this population is particularly vulnerable. They’re also thinking about personal information differently than they would be in normal circumstances, so they might be willing to share information with us because they are in extremis. And when they do that, there is both a duty to them and, I think, a sort of trust-based relationship there that we have to consider and remember. And remember that when they are no longer in this situation, they are going to think about that data and that privacy differently.)
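
One way to picture the time-delimited, purpose-bound consent Scarnecchia describes is as a simple data record that expires and is tied to a single use. The sketch below is purely illustrative; the field names, registration number, and check logic are our own assumptions, not any agency's actual system.

```python
# Illustrative only: a consent record bound to one purpose and one time
# window, so it cannot be treated as blanket consent later.
from datetime import datetime, timedelta, timezone

consent = {
    "subject_id": "REG-1001",            # hypothetical registration number
    "purpose": "family_reunification",   # the one use that was agreed to
    "expires_at": datetime.now(timezone.utc) + timedelta(days=90),
}

def use_permitted(record, purpose, now=None):
    """Allow a use only for the consented purpose and only before expiry."""
    now = now or datetime.now(timezone.utc)
    return record["purpose"] == purpose and now < record["expires_at"]

print(use_permitted(consent, "family_reunification"))  # True: original purpose
print(use_permitted(consent, "donor_analytics"))       # False: not blanket consent
```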

{***Amie***}

Just as our notions of privacy may change depending on our circumstances—the types of harm that data can cause are also changing.

{***Noah***}

Earlier in the podcast, Scarnecchia talked about the concerns surrounding the use of remote sensing imagery—or the risks when biometric information is tied to humanitarian aid.

But new threats are also emerging.

{***Pause/Nats***}

{***Noah***}

That’s a clip from a New York Times video on a story that made international headlines a few months ago.

It was revealed that a “heat map” posted online by the popular fitness-tracking company Strava could be used to track the location and reveal the identities of military personnel and others working in war zones or other sensitive locations.

{***Amie***}

That includes those working in humanitarian settings.

Researchers at the Signal Program were actually able to identify the names and daily routines of foreigners working for aid agencies and the UN in Somalia.

Scarnecchia says the Strava story highlights the danger of so-called “demographically identifiable information.”

{***Noah***}

It’s something he and others in the humanitarian field have been looking at for a while—but what happened with Strava put new attention on the issue.

{***Dan Scarnecchia Soundbite***}
(It’s one of those privacy things that a lot of people don’t necessarily think about because it doesn’t really affect you on a day-to-day basis. So, we normally think about privacy in the context of personally identifiable information, so we don’t like to share things that can easily re-identify us. What DII says is that there are ways to identify a demographic based on aggregated information, or information that you might not consider personal. So, in the case of the bases that we saw on Strava, you weren’t looking for specific individuals. All you had was these aggregated data points in the middle of the desert, some satellite imagery from Google, and a little bit of knowledge that, well, there’s really no reason for anybody who’s using this tool to be there. It is most likely, then, US service members or a military outpost of some sort, at the very least. And so that is enough information then to start identifying that community.)

{***Noah***}

In the case of Strava it would also have been possible to take that demographically identifiable information and use it to personally identify people.

{***Amie***}

But Scarnecchia says that’s not even necessary to cause real harm—especially when talking about vulnerable groups.

{***Dan Scarnecchia Soundbite***}
(If using, again, non-personal data that’s out there publicly, you can re-identify, say, the location of a camp where all the unaccompanied children are, or where all the young, unmarried women are, then you have an entire vulnerable community that is put at risk. And you may not necessarily be targeting individuals if you’re somebody who’s looking to cause harm, say if you’re looking to recruit child soldiers or traffic young women. All that really matters to you is the demographic and where they’re clustered. And so if you can do something like that, you’ve already potentially found a vector by which to cause harm, and all of that information was publicly available, because that is not necessarily something that has been, until now, really understood to be a privacy risk.)
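
To make the mechanics concrete, here is a minimal, purely illustrative Python sketch of how aggregated activity points with no names attached can still flag a sensitive site. The coordinates, grid size, and threshold are invented; this is not a tool the Signal Program uses.

```python
# Illustrative only: aggregate "anonymous" GPS points into coarse grid cells,
# the way a public heat map does. No names or user IDs are involved.
from collections import Counter

# Hypothetical anonymized activity points (lat, lon) from a fitness-style app.
points = [
    (4.512, 45.301), (4.513, 45.302), (4.512, 45.303),  # a dense cluster
    (4.514, 45.301), (4.513, 45.300),
    (9.180, 38.760),                                     # a lone point elsewhere
]

def grid_cell(lat, lon, size=0.01):
    """Snap a point to a coarse grid cell (roughly 1 km at this scale)."""
    return (round(lat / size) * size, round(lon / size) * size)

heat = Counter(grid_cell(lat, lon) for lat, lon in points)

# A dense cell in a place with no expected civilian activity is itself
# identifying: it suggests a base, camp, or compound, and context fills in who.
for cell, count in heat.items():
    if count >= 5:
        print(f"dense activity at {cell}: {count} points -> likely a facility")
```

Nothing in the input names anyone, yet the aggregate alone points to a specific place, and a little outside context is enough to say which community is there.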

{***Amie***}

And that is what makes the work that Scarnecchia and others are doing so difficult.

There’s no catalog of all the harm that can be caused by improper use of data—simply because new technologies and methods of data collection and aggregation are always emerging.

{***Noah***}

There’s also the issue of missing data.

We mentioned earlier that one of the tenets of the Signal Code is the right to rectification and redress.

What if you’re a refugee and you go to collect aid, but you’re not in a particular database?

What happens then?

Scarnecchia says the humanitarian sector is still grappling with the best ways to handle data—especially when it comes to displaced people on the move.

{***Dan Scarnecchia Soundbite***}
(We are adopting technology very quickly, but we don’t necessarily have the skills in the sector to really implement it properly. So, you’re either relying on outside vendors or you’re implementing things yourself, and there might not be clear standards for sharing data across organizations. And in the humanitarian sector in particular, budgets are tight, and so a database might be an Excel spreadsheet on a server underneath someone’s desk. You know, A, there are security implications to that, especially if you’re just using a password to protect the spreadsheet. But, B, that data might be siloed, and so it might not readily be merged with data that says, oh yes, this individual in such and such a place is also entitled to this. Even if they share a common registration number in both databases, if the databases aren’t linked and talking, you might not necessarily know. Or this person might not be in this database even though they’re supposed to be. You might not be providing them with everything that they’re entitled to, and it produces inefficiencies on the responders’ side and clear harm on the recipient side.)
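
As a purely illustrative sketch of that siloing problem (the records, fields, and registration numbers below are invented, not any real agency's system), consider two unlinked datasets that happen to share a registration number:

```python
# Illustrative only: two siloed "databases" kept by different organizations.
# Each organization sees only its own slice.

food_db = {  # registrations entitled to food assistance
    "REG-1001": {"food_ration": True},
    "REG-1002": {"food_ration": True},
}

cash_db = {  # registrations entitled to cash assistance, held by another org
    "REG-1002": {"cash_entitlement": 120},
    "REG-1003": {"cash_entitlement": 120},  # missing from food_db entirely
}

# A join on the shared registration number makes the gaps visible. When the
# databases never talk, these gaps simply go unnoticed.
for reg in sorted(food_db.keys() | cash_db.keys()):
    missing = [name for name, db in (("food", food_db), ("cash", cash_db))
               if reg not in db]
    if missing:
        print(f"{reg}: absent from {missing} -> entitlements may never reach them")
```

Joined, the mismatches surface immediately; kept in separate spreadsheets, the same person can quietly fall through.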

{***Noah***}

And that idea of agency over your own data is important whether you’re a refugee seeking aid or a Facebook user concerned over use of personal information.

{***Amie***}

The challenge, says Scarnecchia, is that companies like Facebook and Google have such a wide reach that they’ve essentially become utilities—it’s hard for any one individual to really take control of their data.

{***Noah***}

And that’s compounded in a crisis when people may be more willing to share information to help themselves or family members.

That’s why groups like the Signal Program have to speak openly about the potential harms of misuse of data and private information.

It’s not about changing individual habits—it’s about changing how large organizations think about our personal information.

{***Dan Scarnecchia Soundbite***}
(Information is power, but I think it’s also agency. We’re talking about populations that in a lot of cases have lost quite a bit. And ensuring that they have some access to the information that gives them an identity is, I think, profoundly important in terms of their psychological well-being – and this, of course, has not been scientifically validated, but in my personal opinion it is really important. My colleague Dani’s work is starting to bear that out when it comes to depression. But also, it’s simply quite important in terms of meeting their needs. What that does eventually, as it trickles out of the academic literature and into mass understanding, is provide a clear set of milestones for these companies to live up to if they want to be trusted. And so when you’re an individual in a crisis setting, it’s really hard to say what agency you have related to, say, one of these particular companies. We, I think, have to be very cautious, as large organizations coming into sensitive political climates or simply disaster areas where people are very vulnerable, to remember that people have rights, and that with their use of information and technology, we should be thinking about it just the same way we would be thinking about it with populations in a non-disaster environment, with all of the rights that the rest of us have in that non-extremis situation. Well, just because they are a vulnerable population doesn’t mean they waived those rights.)

{***Noah***}

Scarnecchia says a key goal of the Signal Program going forward is to make sure that the research they do is connected to humanitarian workers on the ground—so that they can implement best practices in standards and ethics when it comes to privacy and data.

{***Amie***}

If you want to read the Signal Code, or learn more about the work of the Signal Program, visit our website, hsph.me/thisweekinhealth.

{***Noah***}

That’s all for this week’s episode. A reminder that you can always find us on iTunes, Soundcloud, Stitcher, and Spotify.

April 19, 2018 — The recent scandal over Facebook’s use of personal information has shone fresh light on one of the most pressing issues of the 21st century: How can we protect our privacy when we are willingly—or unwillingly—giving vast amounts of data to companies like Facebook, Google, or Amazon? But those technology companies aren’t the only ones using personal information. This kind of data is also at the core of the work of international agencies delivering humanitarian aid. In this week’s episode we speak with Dan Scarnecchia, a researcher with the Signal Program on Human Security and Technology based at the Harvard Humanitarian Initiative. Scarnecchia and his colleagues recently wrote the Signal Code, which represents a human rights approach to privacy and data during crises. We’ll examine how the humanitarian field is now grappling with ever-changing technology and increasing reliance on data and personal information.

You can subscribe to this podcast by visiting iTunes, listen to it by following us on Soundcloud, and stream it on the Stitcher app or on Spotify.

Learn more

Refugee Connectivity: A Survey of Mobile Phones, Mental Health, and Privacy at a Syrian Refugee Camp in Greece
