Northwestern Buffett Institute for Global Affairs

Big Data and the Human Rights of Migrants with Jacqueline Stevens

The third season of the podcast has focused on the role of technology in global affairs. In this episode, Jacqueline Stevens, Professor of Political Science at Northwestern University and Founding Faculty Director of the Deportation Research Clinic at the Northwestern Buffett Institute for Global Affairs, shares her expertise on the use of big data technology at the border and its impact on migration, deportation and human rights.

“There's a whole bunch of different ways that big data are being used at the U.S. border. And I would say that the biggest problem right now in terms of data collection isn't really AI, but more big data and algorithms that have these kinds of broad bands for capturing information that is putatively about individuals, but may get it wrong.”

—Jacqueline Stevens

  • Professor of Political Science, Northwestern University
  • Founding Faculty Director, Deportation Research Clinic at the Northwestern Buffett Institute for Global Affairs

Background reading

  • Find out how to get involved with the Deportation Research Clinic
  • Follow Stevens on Twitter


Subscribe to Breaking Boundaries wherever you listen to podcasts so you never miss an episode:

Read the transcript of this show

[00:00:00] Annelise Riles: Welcome to the Breaking Boundaries podcast. I'm Annelise Riles, Executive Director of Northwestern University's Roberta Buffett Institute for Global Affairs. The Northwestern Buffett Institute is dedicated to breaking through traditional silos of expertise, geography, culture, and language to surface novel solutions to pressing global challenges. This season on the podcast, as listeners know, we're focusing on the role of technology in global affairs. Many countries, including the United States, are embracing technologies such as artificial intelligence and big data to process people at borders, including those seeking asylum and refugee protection. But how can it be applied humanely and fairly in such high risk situations? To discuss all this and more, we're delighted to have with us today Professor Jackie Stevens, founding director of the Deportation Research Clinic right here at Northwestern Roberta Buffett Institute for Global Affairs. She's also a professor of political science at Northwestern University. Jackie, welcome. And thank you so much for joining us today.

[00:01:12] Jacqueline Stevens, PhD: Oh, thank you for having me.

[00:01:13] Annelise Riles: So Jackie, let's start with you. Can you share a little bit about your own background? What initially got you interested in the human rights of migrants?

[00:01:22] Jacqueline Stevens, PhD: My training is actually in political theory, and I had a special interest in political theories of membership. So my question was, why is some competition between groups violent and other competition nonviolent? And it seemed to me that the variable was whether one was born into a group that an individual experienced an attachment to, as opposed to an individual joining a group as a choice. And so that maps onto the fact that nationality, ethnicity, race, and so forth seem to be the dividing lines for the groups that compete using violence. The idea is that political society and nation are two sides of the same coin, and ethnicity is the imprint of that coin somewhere else. So, I wrote a book called Reproducing the State during the Balkan War. And the idea was to think about the ways that these very formal rules of membership created by the state were being experienced as part of affinities that were understood as natural, leading people to be disposed to either commit violence or risk their lives on behalf of these kinds of groups. And so I kind of moved from this very theoretical orientation of thinking about political theories of membership to looking at much more granular-level operations of the law. I decided to focus on deportation. Deportation is part of that same system that leads to war. It's about ways of mobilizing the state to enforce boundaries based on ideas about intergenerational differences. So in 2012, I founded the Deportation Research Clinic through what was then the Buffett Institute for Global Studies.

[00:03:08] Annelise Riles: We here at Buffett care a lot about intergenerational justice. And I had never thought before about the way that our concept of citizenship is founded on certain ideas about intergenerational justice or injustice. That is extremely interesting. So let's talk about big data and maybe AI at the border. First of all, can you just give us a little background, Jackie? How is big data currently being used at the U.S. border?

[00:03:39] Jacqueline Stevens, PhD: There's a whole bunch of different ways that big data are being used at the U.S. border. And I would say that the biggest problem right now in terms of data collection isn't really AI, but more big data and algorithms that have these kinds of broad bands for capturing information that is putatively about individuals, but may get it wrong. And there's two different kinds of problems. So one problem is there's a program that's called Bitmap, and it's an interoperable database that allows people at the border in the United States, Customs and Border Protection, ICE, or any other law enforcement agency, to track migrant data that actually is generated by encounters in other parts of the world, and in particular South America. So if somebody is entering from Qatar, and they are originally from India, and then they arrive in Ecuador, all of those encounters will be part of their biographical information in this Bitmap database. However, when people are escaping oppressive governments, or if people are traveling and they're under 18 and they don't have legal authority to cross borders based on their actual passports, they may be traveling with false documents. And so, when these documents are tracked to individuals through this Bitmap database and there are inconsistencies when people are entering the United States, those inconsistencies will be flagged as evidence of fraud. And so, when people then try to use their actual documents in order to make asylum claims, ICE will say, oh, well, we've got all this data collected, and it shows that when you entered in Ecuador, you used a different name, and so therefore the documents that you're sharing with us now must be fraudulent. And so that's one way that it harms, you know, migrants. And the problem with this is that it's not legal the way that they're using this evidence, like it's not really evidence itself. It's just data in a database.
But the narratives that I'm describing are from the very small number of cases that actually make their way into the federal courts, when, let's say, juveniles who are traveling under false documents are able to challenge the denial of their asylum claims. And in those cases, the federal courts that are reviewing the immigration courts that deny asylum are overturning the denials. But we know from the Deportation Research Clinic, having done these FOIA requests and, you know, litigated to get the data, that these age assessments being used to claim that people are older than they say they are happen on a pretty large scale. It's in the thousands. And so, if you get just a few cases that are litigating this, and the outcomes are overturning what the government is saying, they're overturning the Homeland Security claims, then that tells us that there's a more systemic problem occurring through the use of these kinds of databases, a problem that wouldn't occur had the government not been trying to track this. So that's one example.

[00:06:53] Annelise Riles: So how is the Deportation Research Clinic addressing this? Are you looking at some of these cases? How are you working on it?

[00:07:01] Jacqueline Stevens, PhD: So the way I got engaged in this was about five years ago, an attorney in San Francisco who had some of these cases of children who were being moved out of the shelters and put into ICE custody got in touch with me, because they had been seeing that the basis for the children being moved out of the shelters and into the detention facilities was dental radiographs, x-rays. And so the shelters, many of which were privately operated, would send these x-rays to some guy at the University of Texas named David Senn, and he would evaluate their third molars and decide, on the basis of the size of their third molars, whether or not they were more or less likely to be 18 years old. And so when I heard about this, I filed a FOIA request to get copies of all of what are called the age reassessment memorandums that ICE created in order to move people from the shelters into ICE detention. And so we received about 800 of these, and the students coded them. And the names are all redacted, right, but you could see certain kinds of patterns within these. And This American Life did a story that was based on some of this, and we're writing up a paper on this. I also co-authored a piece that was published in the American Journal of Public Health with some scholars who have public health credentials, you know, to call attention to the misuse of this kind of data for purposes of not just moving these children into detention facilities, but then also casting doubt on the credibility of their asylum applications, right? Because the idea is that, oh, if you're lying about being under 18, then you must be lying about everything else. And one of the patterns that I noticed in these records was that Bangladeshi teenagers in particular were being targeted.
And one other thing that was weird is that even though the age assessments were supposed to be at a 75 percent cutoff for purposes of moving somebody out of the shelter into the detention facility, like, you know, oh, there's a 75 percent likelihood that this person's older than 18 based on this ridiculously assessed probability index, even when people were under the 75 percent cutoff, say a 28 percent likelihood of being over 18, they still would be moved out. And so, you know, it just showed that the whole thing was bogus. They would just say, oh, there's some probability; they wouldn't actually state the probability in the memo. I could see from the documents I had that it was a 28 percent probability, but they didn't really care about that. So I guess that's, again, what we're focused on: government misconduct. And the idea is that, you know, it's really hard to implement the country's deportation laws in a way that's consistent with the rule of law. And by highlighting these discrepancies, the object of the game isn't to say, “Oh, well, we can deport people as long as we're doing it more efficiently,” but rather to, like, throw some monkey wrenches into the machine and get people to think a little bit about how it is that we're so motivated to violate the rule of law when it comes to implementing our deportation and immigration systems.

[00:10:22] Annelise Riles: So is this a problem with big data per se, or is it a problem with people's willingness to interpret the data in a way that is unjust?

[00:10:31] Jacqueline Stevens, PhD: The bigger picture, if we kind of pull out a little bit from the age assessment question, is this effort of governments to try to track different migrant populations. And I'm sure you're familiar with the UN Compact on Global Migration, which, you know, is a 2018 document that has as its number one objective creating data for tracking what they call migrant flows of ethnic stock. And so there's this effort to blanket refugee camps and all sorts of areas where people are interacting with government or nonprofit officials in order to collect individual-level data. And so there's a huge number of different kinds of problems that occur when you have these large private organizations, as well as government organizations, collecting massive amounts of data at the level of the individual and also for purposes of larger policies. So, I think it's both. And the problem is false positives, and it's also false negatives. There's just, like, a myriad of ways that this can go wrong. So, I think, you know, one of the main problems is just having these associations of biographical characteristics expressed in terms of ideas about ethnicity or nationality that affirm the importance of those kinds of features. And, you know, this is a concept with which I'm sure you're familiar, which is methodological nationalism, right? That the most important characteristic of who we are comes from these kinds of political assignments. And so I would say that, beyond any of the individual-level abuses that can occur, it's the larger problem of the taxonomies themselves that are creating the environment in which these policies are being implemented and naturalized.

[00:12:17] Annelise Riles: As you think about the work of the deportation clinic, can you see ways in which big data and maybe AI could be actually useful to you and your team going forward?

[00:12:28] Jacqueline Stevens, PhD: Yeah, I mean, there's a lot of different ways that it could be useful. In fact, a student who works with the Deportation Research Clinic, Kendall McKay, who's been working with me for a while, over the summer put together something called a FOIA dashboard, where we can now track the expenditures of each agency of the government since 2010 in responding to FOIA requests. And so she had to hand code all of this, right, in order to create this FOIA dashboard. But I think there's a lot of potential uses of AI or other kinds of algorithms and so forth to try to organize this information, and so that would be one example. And I'm so proud of what Kendall did, and this is going to be something that is useful for a whole bunch of different kinds of nonprofits, because they'll be able to use it for their FOIA litigation as well.

[00:13:19] Annelise Riles: I want to finish by asking the question that I ask all my guests, which is, as you think about all these problems moving forward, what keeps you up at night? What are you most worried about? And also, where do you find your hope?

[00:13:33] Jacqueline Stevens, PhD: What I'm most worried about is cynicism and the corrosion of faith in the rule of law in this country. I'm very worried that if people are cynical, both on the left and the right, that will erode the legitimacy of courts, so I guess that's the main thing that I'm worried about. What gives me most hope are our students. I have to say that the students that I'm encountering in the last few years, they get it, they see that the world is not such a great place right now, and they're going to have to be the ones who fix it. And I think that they take that responsibility very seriously.

[00:14:09] Annelise Riles: Well, Jackie Stevens, thank you.

[00:14:12] Jacqueline Stevens, PhD: And thank you, and thank you for the support of the work that we do. It means a lot.

[00:14:16] Annelise Riles: For more information on this episode and on the Northwestern Buffett Institute for Global Affairs, visit us at