Freedom Unfinished

E4: When Technology Meets Law

October 24, 2022 · ACLU of Massachusetts · Season 1, Episode 4

In our final episode, we'll shift our focus and go back in time to the mid-20th century to learn about a worst-case scenario with the story of Sidney Gottlieb and MKUltra. Afterwards, we'll talk to people who are using law and policy reform together to enable data and technology to shine a light on the powerful and fight for a better world.

Listen to ACLUM executive director Carol Rose and Technology for Liberty program director Kade Crockford explore big data and artificial intelligence through the lens of power, democracy and the broken systems that will determine the future of our rights.

Join us this season wherever you get your podcasts and follow the ACLU of Massachusetts on social media @ACLU_Mass for the latest updates on Freedom Unfinished, Season 1: Decoding Oppression.

Thank you to Trent Toner and the Rian/Hunter Production team for sound mixing this episode.

Carol Rose (00:02):

Welcome back to Freedom Unfinished. So far on this podcast, we've discussed the threat to civil liberties when technologies outpace the law. Today, in our final episode, we'll shift our focus a bit. First, we'll go backward in time to the mid-20th century to learn about a worst-case scenario, a story of what can go wrong when scientific innovation and exploration are divorced from democratic norms like transparency, accountability, and basic justice. Then we'll switch gears and talk about the present and the future, and about how technology can be used to expand civil rights and civil liberties. We know technology isn't neutral and that it is neither inherently good nor evil, so we'll talk to people who are using law and policy reform together to enable data and technologies to shine a light on the powerful and to fight for a better world. I'm here with my colleague, Kade Crockford, who leads the Technology for Liberty program at the ACLU of Massachusetts.

Kade Crockford (01:03):

Thanks, Carol. Let's begin by talking about something that is evil. Most people alive today have probably never heard the story of the US government's super-secret scientific mind control project, codenamed MKUltra. For those of you who have heard of it, I hope you'll keep listening anyway, because the details are important for us to consider if we're to deal frankly and with urgency with the fundamental questions about democracy and technology we've been raising in prior episodes. The CIA's MKUltra program ran from 1953 through 1973, and it had one simple goal: to determine whether the government could use drugs and psychological techniques to control a person's mind. The CIA was interested in the question of mind control because it feared that the United States' Cold War enemies would discover the key to controlling the human mind and the US would be left in the technological dust.

Kade Crockford (02:01):

Powerful people like then-CIA director Allen Dulles enthusiastically supported MKUltra and justified the program by invoking, quote unquote, national security. The program involved the systematic and brutal torture of hundreds of people across the world, including unsuspecting US citizens. Many people died. Countless others were driven to insanity. The CIA never discovered the key to controlling the human mind, but the history of MKUltra nonetheless offers us vital lessons, if we're willing to engage with it.

Kade Crockford (02:35):

MKUltra is an important case study in what can go wrong when scientific inquiry is totally divorced from transparency and justice, when a small group of people makes decisions they view as utilitarian but that are actually deeply harmful. MKUltra can also teach us about human beings. Stephen Kinzer is a distinguished veteran journalist and most recently the author of Poisoner in Chief, the definitive biography of the man at the center of the CIA's mind control experiments, Sidney Gottlieb. Sidney Gottlieb was not a simple man.

Stephen Kinzer (03:23):

What made Gottlieb so fascinating to me was that while he was this horrific torturer and working with the full permission of the Central Intelligence Agency, he also had another side to him.

Kade Crockford (03:38):

That's author and journalist Stephen Kinzer.

Stephen Kinzer (03:41):

He didn't live in a little development and go to work every day in his car like a normal person. He was like a proto-hippie.

Stephen Kinzer (03:51):

He lived in a cabin out in the Virginia woods that had no running water. He got up before dawn to milk his goats. He grew his own vegetables. He meditated. He wrote poetry. He was a community activist. He was thought of by local people as a wonderful, very compassionate person. He and his wife traveled the world. They worked in a leprosy hospital in India. He really was the kindhearted torturer. Sometimes I wondered, did he get out of work at night and then drive home toward Virginia and kind of leave behind his entire torture persona and become the happy dad and the vegan, live-off-the-land sort of old-time, clog-wearing fatherly figure? How could a person who believed in what seemed to be very humanistic values have a job which was the very opposite of humanism? How did he fit these together?

Kade Crockford (04:52):

The story of MK Ultra shows how people who are involved in developing new systems or scientific programs that hurt people often do not view themselves as evil, and they may even think they're doing good in the world.

Stephen Kinzer (05:05):

Gottlieb could have thought to himself: I'm a real individualist. I live differently from everyone else. I shaped my own life. I can do that in America because we're a free country and people are allowed to be and do what they want. There's a force out there in the world, communism, that wants to destroy the possibility of any individuality, that wants to turn every human being on earth into an automaton, and it's urgent to stop that so that people like me can continue to live, and live freely and openly as we want. Commitment to a great cause is always held up as the most reasonable basis to commit evil or immoral acts. There's no higher cause in many people's minds than patriotism. Maybe Gottlieb felt that he was doing something for his country.

Kade Crockford (06:00):

Maybe Gottlieb and the CIA really did believe they were doing something noble, and that's instructive for us today in the Age of Information, when a handful of secretive companies and people are collecting and manipulating so much information about billions of people for their own ends and profit. I'm certain that people in positions of power at the major tech companies do not go to work every day with the intent to cause harm, but the data they create, which produces billions of dollars in profit for them every year, can nonetheless be used to target undocumented immigrant workers, assist the government with putting innocent people on terrorist watch lists, and help police prosecute women for obtaining abortions. Facebook says it aims to connect the world, but it has also been credibly accused of standing by while users of its platform instigate genocide. Ultimately, intent doesn't really matter. Only impact does.

Stephen Kinzer (06:52):

I can't believe that the CIA decided we're through with mind control and that they never went back to it. Can we say that although the CIA was conducting a bizarre, murderous mind control program in the 1950s, nothing like that could happen today? That would be quite a stretch. And frankly, I'd be surprised if secret services, including the CIA, are not now trying to figure out ways to use new technologies for mind control. I'd wonder, why not? It would seem to me it would be a logical thing to be doing.

Kade Crockford (07:28):

And this is an extreme example, obviously, but we should nonetheless keep it in mind as we consider the powerful technological forces shaping our world today. Reflecting on this grisly history should prompt all of us to be more engaged in our democratic processes to ensure we the people, not shadowy government agencies or unaccountable tech companies, remain in charge of our own destinies.

Carol Rose (07:56):

We need the law to protect us so that we can make use of digital technologies and their promise without getting hurt. Because ultimately, if we're not careful and intentional, history may repeat itself. The MKUltra story shows us that scientific inquiry and technological innovation, when divorced from democratic accountability, transparency, and legal constraints, can threaten great harm.

Kade Crockford (08:22):

Democracy is hard work, and it requires much more than voting every few years, but it's worth it, not only to prevent the kinds of grotesque harms we've just discussed, but to bend the will of the powerful to ensure innovation benefits all of us without leaving anyone behind.

Carol Rose (08:45):

Okay. We've spent most of this podcast talking about the challenges that society faces at the hands of technology and the risks to liberty and civil rights, particularly for marginalized communities, but it's important to recognize that the impact of technology isn't all doom and gloom.

Kade Crockford (09:02):

As much as technology can and does worsen existing inequalities and create new problems for privacy and democracy, it can also be used by people fighting for justice and liberation to make our world more free and more equitable for all of us.

Carol Rose (09:17):

Are we anti-technology? Not remotely. And not just because it makes our lives more convenient some of the time.

Kade Crockford (09:24):

Right. The driving force behind what we do is built into the name of the program, Technology for Liberty. That means we don't only fight to reform the law to ensure new technologies don't negatively impact our civil rights and civil liberties. It also means we use technology to win law reforms that matter for people.

Carol Rose (09:43):

Of course, it wasn't always like that. And in fact, it wasn't until the last 10 years or so that we've seen how something like big data, which at this point feels ubiquitous and self-evident, could have a meaningful impact in the service of liberty and justice.

Kade Crockford (09:59):

Especially for those communities most negatively impacted by technology at large.

Carol Rose (10:05):

The example that I always think of came out of the drug lab scandal in 2012, when a chemist named Annie Dookhan fabricated drug tests on behalf of Massachusetts state prosecutors, sending countless innocent people to prison.

Matthew Segal (10:20):

You don't have to be a scientist to know that that's bad. That's not what you're supposed to do. And on top of that, she'd been doing it in an untold number of cases, presumed to be in the thousands.

Carol Rose (10:30):

To tell the story and help us understand what impact it had on how we used data in the service of justice, I went down the hall to talk with a colleague of mine.

Matthew Segal (10:39):

I'm Matthew Segal. I'm legal director of the ACLU of Massachusetts.

Carol Rose (10:43):

So Matt, where should we begin with this case?

Matthew Segal (10:46):

Well, I guess to understand the scandal, it's helpful to take a step back and think about what the war on drugs is. What that means is that people are prosecuted for drug crimes, for possessing or selling or distributing substances. And typically, to do this, the state that's prosecuting them has to prove that what they possessed or distributed or sold was illegal, what that illegal substance was, and how much of it they possessed or distributed. And so the scandal, at least when it first began, was the revelation that a chemist whose job it was to help the state establish those facts had been committing enormous amounts of misconduct, including by just inventing the test results, a practice called dry labbing. It's just looking at the substance and deciding, hey, that looks like cocaine, hey, that looks like heroin, but without actually doing the testing. And on top of that, she was even alleged, on occasion, to have put drugs into substances so that, if tested properly, they would test positive for that substance.

Carol Rose (11:59):

And who was hardest hit by the scandal?

Matthew Segal (12:01):

Well, it was thousands of people, mostly poor people, who had been charged and convicted and in many cases imprisoned based on evidence that Annie Dookhan fabricated. Essentially, she had been framing people for drug offenses, and those are the folks that were harmed. And one of the worst aspects of that harm was that, precisely because she'd been doing it for years, by the time she was found out, by the time the news started to break in August 2012, not only had she harmed thousands of people, but many of those people had already served their time, served their sentences, by the time the misconduct was revealed. But of course, in addition to serving those official, formal prison sentences, there are so many additional consequences of having a criminal conviction, like it being harder to get a job or a house or a student loan, that people were still living with even if they had come out of prison.

Carol Rose (13:01):

But how does data science factor into what on the surface seems like more of an issue for the courts?

Matthew Segal (13:07):

Well, so we at the ACLU joined with public defenders and private lawyers to try to right the injustices that had been perpetrated against people who had been convicted of these crimes. And in doing that work, which included filing lawsuits, we sort of saw a tale of two kinds of data issues. One was that the data we were given to work with was so terrible, the record keeping so poor, that no one was able to easily generate even a list of the cases that Annie Dookhan had worked on. So figuring out how to help people, who to notify, all of that was very difficult because the data was so poor.

Matthew Segal (13:52):

The second tale of the data was that, to overcome that problem, the problem of not even being able to generate a case list, the courts delivered to us and to prosecutors a list of every drug case that the commonwealth of Massachusetts had prosecuted during Annie Dookhan's tenure. And the purpose of doing that was primarily to help us and help prosecutors identify the affected individuals. And to that point, in doing our work and trying to help these people, what we had been told was that the people convicted based on Annie Dookhan's say-so were kind of terrible criminals, that they were very serious drug defendants. And because of this production of data by the courts, we were able for the first time to test whether that was true, and we were able to prove that it wasn't, that it was false.

Carol Rose (14:46):

But not everyone who uses drugs is a criminal, and certainly not everyone convicted of a drug crime is terrible, or even guilty for that matter. But how do you prove that to a court that may be biased against drug users?

Matthew Segal (14:59):

Well, we worked with a data scientist named Paola Villarreal, and she was able to take the data that had been provided by the court and make some assessments about it. She was able to look at the proportion of those drug cases that were for possession offenses as opposed to distribution offenses, so people who were likely using drugs but not necessarily dealing drugs. And she was also able to look at where the cases were prosecuted. And when she did that, she was able to reveal two incredibly important pieces of information. The first was that the majority of the drug convictions connected to Annie Dookhan were for possession offenses, something like 60 to 63% of them. And the second was that the vast majority, around 90% of the cases, were prosecuted in district court. And so those two facts together meant we were dealing largely with drug possession cases and overwhelmingly with cases that a prosecutor had already said were not the most serious cases in the commonwealth.
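For readers who want to make those numbers concrete, here is a minimal sketch in Python of the kind of aggregate tally described above. The file name and column names are hypothetical, invented purely for illustration; the actual court data and Villarreal's analysis were far more involved.

```python
import csv

# Hypothetical case list. The file name and the columns "offense_type" and
# "court" are assumptions for illustration, not the real court data schema.
with open("dookhan_case_list.csv", newline="") as f:
    cases = list(csv.DictReader(f))

total = len(cases)
possession = sum(1 for c in cases if c["offense_type"] == "possession")
district = sum(1 for c in cases if c["court"] == "district")

# The two headline figures described above: the share of possession offenses
# and the share of cases prosecuted in district court.
print(f"Possession offenses: {possession / total:.0%} of {total} cases")
print(f"District court cases: {district / total:.0%} of {total} cases")
```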

Carol Rose (16:10):

Which clearly they were not the most serious cases. But still, overturning that many cases was a pretty serious win for data science in the service of justice. How many convictions did the ACLU of Massachusetts overturn here?

Matthew Segal (16:23):

Well, it was a lot. In the case that [inaudible 00:16:28] worked on, we think ultimately the court dismissed around 37,000 drug charges across more than 21,000 cases. But building on that work, we later went on to bring litigation in connection with another drug lab scandal, involving a chemist named Sonja Farak who had been using drugs at the laboratory. And that litigation overturned an additional 24,000 charges across about 16,000 cases. So we're talking, all told, about 60,000 dismissed drug charges.

Carol Rose (17:02):

If our audience could see me now, they'd see me shaking my head, especially because the word of a self-proclaimed scientist sent those people to prison in the first place.

Matthew Segal (17:11):

Yeah, I mean, there's a funny thing about the role that science plays in law. We build this whole system that sends people, human beings, to prison based on the work of scientists. And when something goes wrong in that system, everyone is quick to blame the science. And sometimes that's true. Sometimes, as happened here, the chemists were misbehaving, or there's a forensic trick that people are using to say that someone is guilty, and it turns out that it's just not reliable, and people are going to prison based on bad science. But a lot of times it's equally true that it's bad law that's causing the injustice. That we create legal systems that allow people to be sent to prison in bulk based on the work of a chemist, or that allow people to be sent to prison in bulk based on a forensic theory that doesn't hold up. And all too often, when something goes wrong with the science, we require the wrongfully convicted people to fix it. And that's not really the fault of the science. That's the fault of the legal system.

Carol Rose (18:17):

It's frustrating, but that's why we do the work we do, and it all culminates in the ways that we've been able to deploy data science to tell a story that actually protects people. That's a good feeling, and a rare one in the grand scheme of litigation.

Matthew Segal (18:32):

Well, one of the funny things about being a lawyer is that you usually never really know whether what you're doing is making a difference. Or even if you win a case, it can be hard to know which thing you did or didn't do that helped to tip the scales, or whether it was something wholly apart from anything you did. I mean, that's true of a lot of things, right?

Matthew Segal (18:51):

It's hard to know where success comes from. But there was less mystery in this case, because when we won the case, it was a decision issued at the beginning of 2017 that ultimately led to what we believe is the single largest dismissal of wrongful convictions in the history of the United States. When we got that decision, the court, the highest court of Massachusetts, cited [inaudible 00:19:15] work, cited our data scientist's work, in the course of issuing this opinion that had this remarkable result. And you don't see that every day. And it's a real credit to [inaudible 00:19:29] and to the importance of data science in this litigation, and in helping everyone understand that overturning these convictions was the right thing to do.

Carol Rose (19:42):

So in this podcast, we've heard from a range of experts and activists and ordinary people about the peril and the promise of big data and surveillance technologies when it comes to civil rights and civil liberties. We heard from Harvard Law School professor Martha Minow that right now three quarters of Americans get their information from the internet, where there's no editor to ensure they get accurate information. And the money that's being put into the pockets of big tech is not being reinvested in news gathering, but rather it undermines our democracy by making it harder for local news media outlets to actually tell us what's going on. And we heard from investigative journalist Julia [inaudible 00:20:21] about how we're all guinea pigs when these systems are being rolled out, and how big tech companies operate with impunity, deciding among other things who gets to speak and who doesn't.

Kade Crockford (20:33):

We also learned about how tech companies are exaggerating their ability to sway the public in pretty significant ways, like in Brexit, but then hiding the ways in which they collect private information that can be used to monitor us and to put us in harm's way. We learned how data is being collected on all of us based simply on our use of their interfaces, and is now in the hands of companies and governments both, and they're extracting and using that kind of personal, private information, either biometric or biographic data, for profit or control over our communities. It could be our DNA, our faces, our fingerprints, location data, professional roles, religious affiliations, interests, banking information, friendships, romantic partnerships, political views, where we travel, who we know. All of that information is now going into the hands of very shadowy corporations and the government. Some technologies, notably facial recognition, are so dangerous that we should consider banning them outright. Big tech cannot be expected to regulate itself.

Carol Rose (21:36):

But it doesn't impact everyone equally. People living in communities of color too often are tracked and targeted for arrest and deportation, not because of anything they've done, but simply because of who they are. We need more people of color involved in the design and deployment of new technologies if we're going to prevent these kinds of harms going forward.

Kade Crockford (21:58):

The story of MKUltra, a chilling CIA mind control project led by a man who thought he was doing something good when in fact what he did harmed thousands of people, made us really think about whether the thousands of people involved in big tech today are aware of the danger of rolling out new technologies without democratic controls. And it serves as a very important reminder that utilitarianism can cause tremendous harm.

Carol Rose (22:26):

Finally, we heard about ways that we can use technology, law, and people power to limit the bad impacts of technology, while also using data and technology to advance liberty. Notably, by giving power to ordinary people to do things like using our cell phones to video record the police when they violate people's rights or to highlight racial disparities in the criminal system or in access to COVID testing and vaccines.

Carol Rose (23:00):

It's been a heck of a journey, and it's made me realize that the best way to ensure that technology advances liberty rather than undermines it is to ensure that people, all of us, understand the threats to our democracy and liberty posed by unfettered data capitalism. Information is power. So if the people understand the promise and the peril of new technology, then the people, working together, can both embrace the promise and lessen the danger of technology to our democracy and liberty in the digital age.

Kade Crockford (23:33):

Technology is neither inherently good nor inherently evil. When and how it's deployed, and by whom, makes a huge difference. The key is to change the law to ensure we can shape the digital economy in ways that enhance rather than limit liberty, justice, and equality. When we as a society fail to regulate how technology is built and deployed, we leave it to the unfettered forces of capitalism, which shape technological developments in ways that exacerbate rather than undermine existing inequalities in our society. Racial and gender disparities are magnified. Discrimination becomes encoded into our systems in new and newly dangerous ways. To build a future in which we can believe, we have to design for equity and fairness. Law also plays a central role in ensuring that technologies are not deployed in secret, or in ways that discriminate or hurt certain people for the benefit of others. Far too often, decisions about government deployment of new technology are made without consulting the people who will use it and the people who will be most affected by it.

Carol Rose (24:37):

We've learned that protecting privacy requires more than just changing the privacy settings on our phones, although that's obviously a good thing to do. So every person has a role to play in fixing this problem. First of all, obviously, voting really matters, because who we elect makes a huge difference in whether our elected officials are paying attention to these fundamental threats to our democracy. More people need to run for office. Issues of the digital divide and of privacy and discrimination are coming before school boards and town meetings, city halls, state legislatures, and so on. And if running for office isn't your thing, then support a candidate who will run and will lead on these issues. And finally, think globally, act locally, and act digitally.

Carol Rose (25:22):

Technology gives us such great tools for engaging in our political system, and it's one of the reasons that the ACLU has been pushing to have local town and city meetings remain open to the public virtually. So phone, text, email: reach out to your elected officials and tell them that you want them to defend our democracy and to pass laws to regulate big tech in the digital age.

Kade Crockford (25:43):

And perhaps most crucially, we need to ensure that the public interest rather than private profit dictates our digital future.

Carol Rose (25:55):

We hope you enjoyed this first season of Freedom Unfinished. If you like what you heard, don't forget to rate, review, and share this series with your family and friends. And you never know, maybe we'll be back for season two, discussing a new topic that you and the ACLU of Massachusetts want to unpack. Let us know what that topic might be for you by following us on social media @ACLU_Mass. Until we meet again, I'm Carol Rose.

Kade Crockford (26:24):

And I'm Kade Crockford.

Carol Rose (26:25):

And this is season one of the ACLU of Massachusetts' Freedom Unfinished: Decoding Oppression. Thanks for listening.

Carol Rose (26:39):

Freedom Unfinished is a joint production of the ACLU of Massachusetts and Gusto, a Matter company, hosted by me, Carol Rose, and my colleague at the ACLU of Massachusetts' Technology for Liberty program, Kade Crockford. Our producer is Jeanette Harris-Courts, with support from David Riemer and Beth York. Shaw Flick helped us develop and write the podcast, while Mandy Lawson and Jeanette Harris-Courts put it all together. Art and audiograms by Kyle Faneuff. And our theme music was composed by Ivanna Cuesta Gonzalez, who came to us from the Institute for Jazz and Gender Justice at Berklee College of Music.

Carol Rose (27:17):

We couldn't have done this without the support of John Ward, Rose Aleman, Tim Bradley, Larry Carpman, Sam Spencer, and the board of directors here at the ACLU of Massachusetts, as well as our national and state ACLU affiliates. Find and follow all of season one of Freedom Unfinished: Decoding Oppression, wherever you get your podcasts, and keep the conversation going with us on social. Thanks to all of our guests and contributors, and thanks to you for taking the time to listen. It's not too late to mobilize our collective willingness to act and to ensure that technology is used to enhance rather than diminish freedom. See the show notes to discover ways to get involved. And always remember to vote, and not just nationally, but locally too. Together, we can do this.