Episode 94: Transcript

Episode 94: Big biotech is watching you

Transcription by Keffy


Annalee: [00:00:00] Welcome to Our Opinions Are Correct, a podcast about science fiction, science and possibly what coffee we were just drinking. I'm Annalee Newitz. I'm a science journalist and I'm the author of Four Lost Cities: A Secret History of the Urban Age.

Charlie Jane: [00:00:16] I'm Charlie Jane Anders and I've published a few books recently, including the space fantasy Victories Greater Than Death, the writing advice book Never Say You Can't Survive, and the upcoming short story collection Even Greater Mistakes.

Annalee: [00:00:31] So today, we're talking about healthcare surveillance. You know, when Big Pharma and Big Biotech are watching you, or maybe your neighbors are just turning you in for helping someone get an abortion. Basically, we're talking about two forms of paranoia that go together. One of them is the fear that doctors are doing things to you without your knowledge. And the other is the fear that some oppressive entity is watching your every move.

[00:01:00] We're going to talk about how real life science plays into these science fictional fears and we'll be discussing this with the journalist Maia Szalavitz. She's the author of a new book about harm reduction called Undoing Drugs, and just wrote an incredible article for Wired about a secret algorithm that's tracking people's drug prescriptions and telling pharmacists who is likely to develop addiction. 

[00:01:24] Also, on our audio extra next week, we'll be talking about how Charlie Jane and I deal with the future of the biomedical state in our own fiction. And, by the way, Charlie Jane, did you know that our patrons get audio extras with every episode? 

Charlie Jane: [00:01:37] What? That’s amazing.

Annalee: [00:01:37] I know, that’s amazing. So what else do people get if they become patrons of the show?

Charlie Jane: [00:01:45] Oh my god, they get so much. They get discussion starter topics every Monday, which could be a review, they could be a recommendation, they could be, you know, just our thoughts about the nature of existence. That's usually what it is. It's just like—

Annalee: [00:01:59] Yeah, and you get access to our Discord channel.

Charlie Jane: [00:02:01] Right.

Annalee: [00:02:01] You get audio extras, and it's all amazing, and it can be yours for just a couple bucks a month. This podcast is entirely supported by you, the listeners. So anything you give goes right back into making us have more and more and more opinions, and just filling your ears with them every other week. So you can find us at patreon.com/ouropinionsarecorrect. And we appreciate it very much. And now, on to our episode.

[00:02:28] OOAC theme music plays: Drums with a bass drop and more science fictional bells and percussion.

Charlie Jane: [00:02:55] So Annalee, when I think about like a medical surveillance state, it kind of conjures really dark dystopian images from THX 1138 or Brave New World or whatever. But, you know, where does this idea actually come from, and how did we end up creating a medical surveillance state in real life? What's the root of this idea?

Annalee: [00:03:16] Well, so in order to really get to the root of this idea, we have to jump into our TARDIS and go back in time. 

Charlie Jane: [00:03:22] [Makes TARDIS noise]

Annalee: [00:03:23] Yeah. That was actually a pretty good TARDIS noise.

Charlie Jane: [00:03:27] I know, yeah.

Annalee: [00:03:29] Yeah, thank you very much. So really, I think the first time you see modern medicine getting involved with surveillance is back in the 19th century. And it's all thanks to Charles Darwin's creepy cousin, Francis Galton, who was also a scientist and a mathematician. And he was a contemporary of Darwin's, and they spent a lot of time talking together. And Galton is probably best remembered as the guy who coined the term eugenics. He was really influenced by Darwin, his cousin, and the idea of evolution. But he was also influenced by British colonialist ideas about racial hierarchy.

[00:04:09] And so Galton starts to put those two things together. And what he wants to do is show that different racial groups are basically on different rungs of the evolutionary ladder. So he thinks that whites inherit really meritorious traits from their parents, whereas Black and brown people are inheriting undesirable traits from theirs. So he starts this scholarly inquiry by trying to track inherited traits in humans using something he calls anthropometrics, which today we would recognize as biometrics. It's the effort to scientifically measure people's physical traits, like the size of their heads or the color of their skin. And Galton believed that these physical traits were linked to mental traits.

Charlie Jane: [00:04:59] Everything about this is incredibly creepy and just upsetting and horrible. And it really is the dark side of, I mean, there's a lot of darkness in Darwinism, but it's the darkest side of Darwinism. It's the worst side of Darwinism. But obviously, we didn't have DNA analysis back then, we didn't have any of the actual scientific information that we have access to now. So how did he go about collecting this information?

Annalee: [00:05:23] That's a really good question. So like you said, people didn't know about DNA back then. But they were already speculating in the scientific community that something was out there that was managing heredity. And Galton believed that one day we would discover something called the gene, which would be responsible for people inheriting traits from their parents. But since he can't find the genes, he tries to measure the physically observable traits that those genes are linked to. 

[00:05:55] So in 1904, University College London gives Galton a lab to do his eugenics work, and he begins collecting devices for doing his work on biometrics. He gets special rulers for measuring people's skulls and facial features. He's fingerprinting people, because yeah, he also popularized fingerprinting. He's using early photography to measure facial features and compare them.

[00:06:16] And then there were other devices. Here's Carol Reeves, who teaches Science, Technology and Society at University College London, talking about one of those devices. It's called a hair scale, and it looks like an elegant silver cigarette holder from like the early 20th century. It's rectangular and flat. But when you open it up, instead of cigarettes being inside, there are a whole bunch of strands of hair lined up. 

Carol Reeves: [00:06:42] The hair scale supposedly represented all the races of the world in a hierarchical manner from flaxen blonde to deep black. It was designed to be a scientific measuring instrument, a standard hair scale, and as such, all race scientists could invest in its truth. And the truth as far as hair was concerned was that black, wavy, and so called wooly hair belonged to the inferior races of the world. People of the southern Mediterranean, Middle East, Asia, and above all Africa. 

[00:07:12] Wool is the hair of animals. And in humans, it was supposed to denote a status way down the evolutionary scale, up from the ape, but only just.

Charlie Jane: [00:07:22] So already, we see Galton kind of using some of the methods from what became the criminal justice system, like mug shots and fingerprints. But how does this actually become surveillance as such?

Annalee: [00:07:35] So as you might have guessed already, that list of tools, all this research was about measuring people's outward appearances and categorizing them based on how they look. So Galton wants to make inferences about people's mental and moral capabilities based on these looks, these facial features, these body features. So this is really the dawn of medical surveillance, quite simply because Galton is creating some of the first databases of identifiers that can be used to track or profile people. 

[00:08:06] These are tools, like you said, that get used by law enforcement, like fingerprints and mug shots. And it's important to remember that they are explicitly linked, at their very origin, to white colonialism and the effort to justify the oppression of Black and brown people using science.

[00:08:23] And I should add that it's not just about race. Galton and his buddies were also interested in quantifying the inferiority of women and GLBT people, as well as anyone who is non-neurotypical in all kinds of ways. So he's basically gathering all of this data based on physical appearance to characterize people's interior lives and their moral worth.

Charlie Jane: [00:08:46] Right, and of course, unfortunately, we know that Galton was hugely influential on science fiction, like a lot of early science fiction writers were very influenced by his work. But where do we start to see this theme of healthcare surveillance and tracking people to assess their quality? Where do we see that starting to turn up in science fiction?

Annalee: [00:09:04] So there's really like a whole subgenre of science fiction that you could call a genetic discrimination subgenre, or even genetic panic. And it really does go back to Galton's time. In the early 20th century, there's the incredibly famous book that many of us are forced to read in high school, Aldous Huxley's Brave New World. That comes out right at the time that Galton's ideas about eugenics are very popular in Britain. It's explicitly a novel that's reacting against that. It's very much a criticism of the idea that your genes are your destiny.

[00:09:39] And I think the best modern example of this kind of subgenre is the movie Gattaca, where poor old Ethan Hawke isn't going to be allowed to get a nice middle-class job or become an astronaut, because he lives in a future where everyone's performance is measured through genetic tests.

Gattaca Clip: [00:09:55] Now only seconds old, the exact time and cause of my death was already known.

[00:10:01] Neurological condition, 60% probability. Manic depression, 42% probability. Attention deficit disorder, 89% probability. Heart disorder, 99% probability. Early fatal potential. Life expectancy: 30.2 years.

Annalee: [00:10:21] So this movie has become incredibly influential, both because it's very stylish and because of the idea of a genetic apartheid state. It's easy to signal that by saying, oh, well, this could become kind of a Gattaca type situation. And you often hear people talking about that.

Charlie Jane: [00:10:37] Mm-hmm. It’s become a catchphrase. 

Annalee: [00:10:40] Yeah, it's become a catchphrase. But I think that what people might forget about this movie, which I just rewatched as I was preparing for this episode, is that part of its intensity, the thing that makes it a really white-knuckle thriller, is all of this surveillance. Because Hawke is posing as a person who has been genetically engineered. He's stolen the identity of a guy who was genetically engineered, who has good genes. And so a bunch of the action is just Ethan Hawke's character constantly having to, like, vacuum his keyboard at work to prevent people from finding hair or skin flakes from his body. And at the same time, he takes little samples from the guy whose identity he's stolen, played by Jude Law, who looks very much like a genetically engineered specimen. So Jude Law provides him with little vials of skin flakes and hair that he places all over his desk very carefully, because in his future, in Gattaca, people are constantly having to prove their genetic viability. He actually badges into work by giving a little sample of blood, which we see him preparing every day: a tiny little fake fingerprint with a little bit of blood in it that he can provide.

Charlie Jane: [00:11:54] Oh, man.

Annalee: [00:11:55] So it's funny that we remember Gattaca, of course, because it is about this DNA apartheid state. But it is also very much about how this leads to incredible surveillance.

Charlie Jane: [00:12:06] And this is really the point, is that it's not just about like, when you think about medical surveillance, you could think well, people might face medical discrimination, which is something that we'll talk about later in the episode. But it could affect every aspect of your life. You could be turned down for a job because your employer is like, well, you're at risk of breast cancer, and we don't want to have to pay the insurance costs five years from now if you develop breast cancer, so, sorry, you can't have a job now.

Annalee: [00:12:30] If they have access to like records about your mental health, they might say, oh, well, you've been seeing a psychologist. Hmm. I don't know.

Charlie Jane: [00:12:39] And you know, there's no transparency about it. That's part of the thing, is that you're observed, but you're not able to actually have any understanding of how you're being observed or how it's affecting your life, which is terrifying. It's very dystopian, and it feels like we're a lot closer to it now than we were when Gattaca came out.

Annalee: [00:12:55] Yeah, that's right. And I wanted to mention that there's another kind of medical surveillance story, which also is really popular throughout the same period in the 20th century. And I like to call it the tinfoil hat subgenre, because it's all about people who are getting surveillance implants put into their bodies without their knowledge.

Charlie Jane: [00:13:17] Oh, man.

Annalee: [00:13:17] And there's a ton of stories in this genre, including alien abduction stories.

Charlie Jane: [00:13:22] Right.

Annalee: [00:13:22] Classic X-Files stuff.

Charlie Jane: [00:13:25] And this is really behind a lot of the anti-vaccine paranoia. The idea that Bill Gates is putting a microchip into your body through these vaccine treatments. 

Annalee: [00:13:31] That's right. And I think one of the very best current examples of this kind of story is the HBO series Made for Love. This is a near future story about a woman who marries a tech billionaire. And the billionaire invents a brain implant called Made for Love that's supposed to allow complete telepathy between people in a couple. And so they'll never have any strife because they'll be in each other's heads all the time.

Made for Love Clip: [00:14:00] We will become a singular living God.

I want it out and I want a divorce.

Annalee: [00:14:09] So as the series begins, our protagonist has been drugged by her husband and given the brain implant without her knowledge. And when she finds out, she runs away from her ex. But of course, he can stalk her anywhere because now he has this biomedical implant in her brain that connects him to her, and he can see everything that she can see.

Charlie Jane: [00:14:30] And this obviously cuts close to home and close to reality, because in real life, we had women being sterilized throughout the 20th century without their knowledge because their doctors deemed them unfit to be mothers, or for various other reasons. And now we've got the situation where people with uteruses are going to be under surveillance from their neighbors in Texas, and probably other states soon, if they seek an abortion. Anybody who helps them can be punished. And this feels like it's playing on that, and it goes all the way back to Galton and his eugenics project. It's all connected to that.

Annalee: [00:15:03] It really is, because a lot of Galton's scientific work is what doctors used to justify sterilizing women in the early 20th century. And now I think a lot of this stuff is being used again to justify why it is that we should force people to give birth. Basically, there's this modern day twist in Made for Love, which is that the unwanted medical procedure turns our protagonist into a surveillance device. Her husband can basically spy on her and everyone around her.

Charlie Jane: [00:15:34] But as somebody who's written about health care a lot, I feel like there are valid reasons to want to track people. Especially like part of how we've dealt with COVID is by having good tracking and testing and tracing and trying to track the course of the infection. And the more data you have about people's exposure and people's risks, the better you're able to address it. So you know, where's the line between sort of good and bad healthcare surveillance?

Annalee: [00:16:00] Yeah, I really want to emphasize that there's a whole area of healthcare surveillance that's actually good for us. It's not about spying on us. It's not about taking away our ability to control our reproduction. It's not about colonial projects. It's just plain old epidemiology. It's people who are tracking infectious disease outbreaks. And they're using anonymous healthcare data all the time to find disease hotspots.

[00:16:26] So it's not about spying, it's just about trying to figure out where people are getting sick, and how many of them there are. And this is completely crucial for public health. It's crucial for developing vaccines and stopping outbreaks before they become pandemics. But that's the problem: it's really hard for people to see the difference between the kind of bad medical surveillance that comes out of Galton and eugenics versus epidemiology, the healthcare surveillance that we need to stay alive. So oftentimes, you'll read about healthcare surveillance, and it's actually something that's beneficial.

Charlie Jane: [00:17:03] Yeah. And coming back to the thing about how there's no transparency even as there's total surveillance. I think the flip side of that is you get situations like during the AIDS crisis, for example, where people were under surveillance, people were being watched and controlled. But at the same time, we weren't really supposed to talk about it. We weren't supposed to acknowledge the problem. These were people who were supposed to be out of sight, out of mind, for mainstream culture. It's this insidious thing where certain people are almost non-people, who are to be surveilled but are not to actually have any subjectivity about their own situation, or to be able to be visible in terms of their problems. And I think that it's really two sides of the same coin.

Annalee: [00:17:43] It really is. And I think that's such an interesting point: the history of data gathering and biometrics coming out of Galton's tradition is to take people who are marginalized and oppressed, people who've been colonized, Black people, women, and turn them into data that gets harvested, and often used to further marginalize those groups.

[00:18:07] Medical surveillance is the process of gathering data, and also somehow using it to render people invisible or to take away their agency. Because once you're a data point, you don't have agency anymore. And that's the terrifying thing. And I think that's what Made for Love really deals with so well is how she's at the center of this surveillance regime. But it's all designed to turn her into a non-person, to make her a possession of this man who she was once married to. 

[00:18:36] OOAC theme music plays: Drums with a bass drop and more science fictional bells and percussion.

Annalee: [00:18:37] So on that note, when we come back, we'll be talking with Maia Szalavitz about what happens when algorithms decide who deserves health care.

[00:18:54] If you're enjoying Our Opinions Are Correct, there's another podcast I think you'll want to check out.

Charlie Jane: [00:19:00] News and culture shows aren't made with trans folks like us in mind and trans voices are routinely left out of the conversation even though stories about us are everywhere.

Annalee: [00:19:11] The TransLash podcast with Imara Jones is changing that and giving our communities a voice.

Charlie Jane: [00:19:16] Imara Jones is a Peabody and Emmy Award-winning writer. She's also a Black trans woman and a journalist. And Imara understands that telling our own stories and having a voice in the conversations that affect us will save trans lives.

Annalee: [00:19:31] So if you're trans and want a show made for you, or you're an ally who wants to learn more, you should definitely tune in to the TransLash podcast. You can hear a new episode every other Thursday.

Charlie Jane: [00:19:41] Subscribe to the TransLash podcast, wherever you're listening to this right now.

Annalee: [00:19:51] So thanks for joining us, Maia.

Maia: [00:19:52] Oh, thanks so much for having me.

Annalee: [00:19:55] So I was completely fascinated by this article that you did for Wired a few weeks ago, which ties into your new book. And it was about this thing that I'd never heard of, the NarxCare database. Can you tell us a little bit about that and how you found out about it? 

Maia: [00:20:12] Sure. So over the last 10 or so years, the states have been building up these things called prescription drug monitoring databases. And they only work for a specific state. So this company called Appriss created an interface that means that they can check your prescriptions across states. But these things are an intense violation of privacy because on these databases, it's everything. Your name, address, age, and all the controlled substances you take. And so this could be everything from testosterone for trans people, to opioids, to anti-anxiety medications, to stimulants for ADHD. 

[00:20:55] And this woman named Catherine had endometriosis. And she went to the hospital because she was in extreme pain. And for a couple of days, they treated her with opioids, and it was all fine. It was the same as it had been when she had flare-ups of this in the past. And then one day, the nurses and the other staff come in and say, the kind of pain you have, that isn't what you should be here for. And they just discharged her. And she was just baffled. She couldn't figure out what had happened. She left the hospital in extreme pain. And her doctor then sent her a note saying, due to your extremely high NarxCare score, we cannot see you. We don't treat patients like that, you need to see someone who deals with that sort of condition.

[00:21:42] So she searches online, reaches out to these other pain patient advocates, and discovers that part of what must have gotten her into trouble was the fact that she rescues extremely sick dogs, and two of her dogs required Valium or some kind of benzodiazepine, and opioids. And apparently dogs metabolize Valium and other benzodiazepines differently than humans, so they actually need larger doses, even though they're small. And so it looked like she was getting these massive prescriptions of these anti-anxiety medications from many different doctors, some of whom, in fact, were veterinarians who were actually treating her dog, who could not go and pick up the prescription for him or herself.

[00:22:28] These NarxCare scores, we all have them, and they are like a hidden credit score that is unappealable, that you don't even know you have, and that is being used to determine your medical care, and especially care around pain, which is, like, obviously the most painful thing.

Annalee: [00:22:49] Yeah. And it's really bizarre. She hasn't been able to get her NarxCare score tinkered with in any way; it sounds like there's no recourse.

Maia: [00:22:58] Yeah, I mean, some states apparently do provide some method of recourse, but that requires you to actually know that this is what is causing you to be denied care.

Annalee: [00:23:08] Yeah.

Charlie Jane: [00:23:09] That's just insane.

Annalee: [00:23:10] It really is. It's like, it does sound like something out of like a paranoid thriller. You know, it's like, there’s a secret score that's following you around and it's because of your dogs.

Maia: [00:23:23] Right.

Charlie Jane: [00:23:22] Crazy.

Annalee: [00:23:25] So it's just controlled substances, it's not looking at, like, whether I'm taking penicillin or anti-inflammatories?

Maia: [00:23:31] Well, some states will actually look at that, too. 

Annalee: [00:23:34] And I was curious, does HIPAA have any jurisdiction here?

Maia: [00:23:38] HIPAA has absolutely no use whatsoever, because it doesn't even protect you from police without a warrant. HIPAA basically protects people from journalists who want to find out or relatives who want to find out about what's going on with a patient. It has absolutely zero teeth. 

Charlie Jane: [00:23:54] Wow.

Maia: [00:23:54] So basically, if a police person anywhere wants to check up on you, as long as you are a suspect or a victim of a crime, they can get your entire medical record without a warrant.

Annalee: [00:24:06] And so now thanks to NarxCare, they can get it from any state that you've been in, too.

Maia: [00:24:12] Yes.

Annalee: [00:24:12] They can, if I'm arrested in California, they can say, oh, I see that you got this prescription in Maryland.

Maia: [00:24:17] Exactly. And you don't even have to be arrested. You can just be going to a doctor to get a checkup or to get some kind of prescription for pain.

Annalee: [00:24:26] Yeah, so that was the thing that was so interesting: it seems like there's been kind of a usage creep with NarxCare, because you talk about pharmacists using it and doctors using it. What's your sense of who's using the NarxCare database at this point?

Maia: [00:24:42] So basically, you're legally required to check it under certain circumstances. So for example, if somebody’s starting a new opioid prescription in most states, the doctor’s legally required to check it, and the pharmacist who fills it is legally required to check it.

Annalee: [00:25:00] And then what do they see? So I come in and I have a prescription for opioids, and they call up my name, what are they going to see when the NarxCare database looks at me? Do they get like a little, I don't know, like a visualization? Or what are they getting?

Maia: [00:25:13] Sure, so they'll get a visualization. And they'll get a number of different scores related to, say, that class of drugs. So stimulants, depressants, opioids, anti-anxiety meds, some of them have different scores. And then there's a combined thing in some states called an overdose risk score.

Annalee: [00:25:32] Yeah. And so that's the thing that's been tripping people up, right? This overdose risk score? That's what a lot of doctors and pharmacists are looking at?

Maia: [00:25:40] Well, the thing about that is, and the reason that Wired was interested in it, is it's one of these machine learning AI things. And supposedly, this score can look at how risky it is to prescribe to you. So it sucks in the data from the prescription drug monitoring databases across the country. But it also may include criminal records, it may include your own doctor’s healthcare records, because electronic health record systems also include this score. So it can include the most intimate information about you.
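[Editor's note: To make the mechanics Maia is describing a little more concrete, here is a minimal sketch of how a score like this could be assembled. Appriss has not published NarxCare's actual model, so every field name, feature, and weight below is a hypothetical illustration of the general pattern she describes: pool prescription records across state databases, derive features like the number of prescribers and pharmacies, and collapse them into a single opaque number.]

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record shape; real PDMP schemas vary by state.
@dataclass
class Prescription:
    state: str
    drug_class: str       # e.g. "opioid", "benzodiazepine"
    prescriber_id: str
    pharmacy_id: str
    filled_on: date
    days_supply: int

def overdose_risk_score(rxs: list[Prescription]) -> int:
    """Toy stand-in for a proprietary score; NOT the real NarxCare model.

    It pools records across states and rewards exactly the patterns the
    episode says trip people up, such as many prescribers, which is why
    prescriptions a vet writes for a sick dog can inflate the owner's score.
    """
    prescribers = {rx.prescriber_id for rx in rxs}
    pharmacies = {rx.pharmacy_id for rx in rxs}
    opioid_days = sum(rx.days_supply for rx in rxs if rx.drug_class == "opioid")
    benzo_days = sum(rx.days_supply for rx in rxs if rx.drug_class == "benzodiazepine")

    score = (
        40 * len(prescribers)                 # "doctor shopping" proxy
        + 30 * len(pharmacies)                # "pharmacy shopping" proxy
        + opioid_days + benzo_days            # total exposure
        + (200 if opioid_days and benzo_days else 0)  # flagged combination
    )
    return min(score, 999)  # reported on a bounded scale; higher reads as riskier

```

[The point of the sketch is the failure mode, not the formula: the patient never sees these features, and nothing in this kind of pipeline distinguishes a prescription filled on behalf of a pet from one the patient takes herself.]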

Charlie Jane: [00:26:22] So basically, if this thing gives you like a bad score, you're just cut off from drugs, right? And also, there could be other consequences, right? Like, what's the worst that can happen if this system that doesn't have any human intervention kind of decides to assign you like a negative score?

Maia: [00:26:37] Well, the doctor could decide to call the police, but that is somewhat unusual. Typically, what happens is a doctor will look at the score. Doctors are rated by their own algorithm. So for example, if they are the number one prescriber of opioids in the state, they will probably get looked at by the DEA. And since math, you're going to always have a number one prescriber.
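[Editor's note: Maia's "since math" aside is worth spelling out: ranking doctors by volume always produces a number one prescriber, no matter how conservatively everyone prescribes. A tiny sketch, with invented names and numbers:]

```python
# Invented data: even if every doctor prescribes very little,
# sorting by volume still yields a "number one prescriber" to scrutinize.
prescription_counts = {"Dr. A": 12, "Dr. B": 9, "Dr. C": 11}

top_prescriber = max(prescription_counts, key=prescription_counts.get)
print(top_prescriber)  # -> Dr. A, regardless of how low the absolute numbers are
```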

Charlie Jane: [00:27:01] So the doctors could actually face legal consequences of their own?

Maia: [00:27:04] Yeah, absolutely. And right now, for example, I've just heard reports about a doctor getting prosecuted for prescribing the anti-overdose medication buprenorphine. The DEA basically thinks that doctors who prescribe this medication without seriously scrutinizing their patients, making them show up every day, making them get urine tests every day, kind of making them jump through hoops, aren't treating the patient, they're just running a pill mill. When in fact, you actually want people to be on these medications, because they reduce mortality risk by 50%. Buprenorphine and methadone are the only two medications known to do that. So you should be able to get them as easily from a doctor as you would get them from a drug dealer. But no, the DEA doesn't like that.

[00:27:58] So if you don't make the patients jump through the number of hoops that the cops have decided they should be jumping through, which are arbitrary and have no actual standard behind them, these guys can get arrested too. And then they wonder why no one wants to prescribe medication for addiction.

Charlie Jane: [00:28:14] Wow. So what this feels like is that we've decided as a society that we're not willing or able to actually address the opioid addiction crisis. And instead, we're just going to come out with these blunt-force punitive methods to try to sort of punish people who we feel are overprescribing or being overprescribed to. Is that basically accurate?

Maia: [00:28:32] Yeah, I mean, what's astonishing about this whole thing is that since 2011, we have cut the medical opioid supply by about 60%. During that same time, the overdose death rate doubled. And actually, the overdose death rate has increased more while prescribing has been declining than it did when it was rising. And this was 100% utterly predictable. It's something known as the iron law of prohibition, which basically says that if you control something tightly and you crack down on it, you will get purer and stronger and more dangerous forms of drugs, simply because smaller things are easier to smuggle. 

Annalee: [00:29:17] Right. 

Maia: [00:29:19] So this is why we have fentanyl now, and I mean, from the drug dealer’s perspective, fentanyl is a great thing, because if you want to sell heroin, you need to grow poppies. You need to have farmers, you need to have processing mills, you need to have smugglers, you need to have guards, you need to have this whole giant supply chain. But with fentanyl, you can order it from a lab in China. And it's much smaller and it doesn't smell. So why would you sell heroin? 

Annalee: [00:29:48] Yeah, so basically, this is the result of prohibition. And it sounds like what we're seeing in the opioid crisis is half doctors overprescribing and half doctors underprescribing. Or maybe not even half and half; that probably isn't an accurate dividing up of the blame here.

Maia: [00:30:10] This is what I would say. What astonishes me to this day is that when you had pill mills, and when you had people who may actually have been overprescribing, we had a list of all the people who were getting those prescriptions, because you need a real ID in order to get the prescription. We could have given them all buprenorphine or methadone. We could have treated their pain, if pain was what they had, and their addiction, if addiction was what they had. But what did we do? We said, okay, we'll just cut the supply, and who cares what happens? Because of course, no one would actually go to the black market, and no one will kill themselves from the kind of chronic intractable pain that people get opioids for.

Annalee: [00:30:47] I mean, you talk in this Wired article, but also in your book, about people who legitimately need opioids for the exact things that we prescribe them for: pain. And now, because we have this abstinence-only approach to dealing with controlled substances, they're basically just left to suffer. Like, what kind of recourse do they really have?

Maia: [00:31:10] They have none whatsoever. And what's been so frustrating to me as a journalist, is that throughout this whole opioid crisis, we've had this very simple narrative. Oh, evil pharma made greedy doctors turn patients into addicts. In reality, 80% of the people who misuse prescription opioids were not prescribed them in the first place. They got them from their friends or family or dealers or someone's medicine cabinet. They did not legitimately get them for pain.

[00:31:38] So basically, most people who have opioids for pain don't get addicted. And most people who get addicted are not pain patients. But the media doesn't like that, because white people have to be innocent. So.

Charlie Jane: [00:31:51] Oh, God.

Maia: [00:31:51] And I mean, literally, that's—I mean, somebody was emailing me this morning about this story from the early 2000s that, I think it was the Orlando Sentinel, did. And it was entitled “The Accidental Addict.” And it had to be retracted because it turned out that the guy whose story was being told had a previous cocaine conviction. But he was portraying himself as a guy who, you know, got back pain, and the doctor turned him into an addict. And that is a far more common story than grandma getting addicted by a doctor. Because most people who become addicted are seeking drugs.

[00:32:32] And this sounds very silly, but if you just think about it, like how are you going to stop a teenager who's trying to experiment? By taking away grandma's morphine. 

Annalee: [00:32:41] Yeah.

Maia: [00:32:42] Now maybe the teenager would have gotten into grandma's morphine. But if he doesn't, he'll just buy heroin on the street. And so you're making the situation worse, in many ways. Now, I'm not saying that people shouldn't lock up their opioids. Because if you can deter somebody who's just curious, but not already addicted or already in trouble, that's a good thing. But locking up your opioids is very different from withholding them from people who are dying, people who have cancer, people who have sickle cell, people who have pain that would kill you if you were experiencing it, but that's going to last 30 years rather than 30 days. It just is so insane. And I mean, when you look at the actual numbers, but you're not allowed to say these numbers because that means you're bought off by Purdue Pharma. The reality is, can you see I'm exasperated? The reality is that fewer than 8% of people who take opioids long term for chronic pain become addicted to them.

Annalee: [00:33:44] Yeah, I mean, I think this is part of what's so fascinating about your book, Undoing Drugs, because you kind of tell two histories of drug management in this country. One is sort of the abstinence-only, cut-them-off approach that things like the NarxCare database are part of, right? Like it's part of this idea that we're going to catch you, we're going to nip it in the bud. And we're not going to let you have any opioids again, because your score shows that you're a flawed person.

[00:34:12] And then there's the harm reduction model, which has been becoming more and more popular. And I mean, partly because of your writing, actually, and your advocacy. And that's a very different approach. I wonder if you could talk about how harm reduction would change the story that we're telling here about NarxCare and about taking away people's meds when they need them.

Maia: [00:34:33] The thing that you have to keep in mind and that harm reduction recognizes is people take drugs for reasons. If you take the drug away, you're not doing anything about the reason the person was seeking oblivion or pain relief or whatever they happen to be seeking through the drug. 

[00:34:52] So harm reduction also recognizes that risks have to be considered in context. So for example, if you have a drug like fentanyl that is many, many times stronger than heroin, and you have heroin, then if you can somehow preferentially get people to move towards the heroin, you are actually saving lives. Now that sounds terrible, because everybody's like, they're not stopping. But if you're dead, you can't stop. And the fundamental principle of harm reduction is that all people deserve to live, and that people who use drugs should not be made to die as an example to others. Because A: that is cruel and wrong. And B: it doesn't even work, because it's not like showing kids people dying from drugs actually prevents addiction. We've tried that. It fails.

Annalee: [00:35:43] Yeah. So do you think if we were to, as a nation, like really embrace a harm reduction model, that we wouldn't need something like NarxCare, we wouldn't have these kinds of scores? Would we just have doctors prescribing painkillers and other kinds of controlled substances for people who need them? And then that would just be completely separate from how we deal with addiction? Does that make sense?

Maia: [00:36:08] Well, I think there's nothing wrong with seeing who is at the highest risk of becoming addicted if you prescribe to them. For example, if you're a doctor, and an 18 year old guy comes in with an injury from a motorcycle accident, and he's had a history of child trauma, and he's already using cocaine. Super, super high risk, you would like to know that. That's a good thing. But that doesn't mean that he doesn't have extreme pain if he lost his leg. So instead of not prescribing, if you can prescribe opioid alternatives that work, that's great. Oftentimes, there aren't any. If there aren't any, talk to his mom, talk to his girlfriend, get him a lockbox, do all of the things you can to prevent those opioids from being misused. 

[00:36:53] But what you have to keep in mind is people with addiction can still have pain. And the stress of being in extreme agony is just as likely to make you relapse as the euphoria of getting opioids. 

Annalee: [00:37:06] Yeah. 

Maia: [00:37:07] So, it's just, the whole thing is just so stupid. We are so obsessed with the idea that drugs turn you into these zombies that are subhuman and have no soul, that we would prefer to leave people in agony. And it makes no sense when we can do things like, you know, for example, if you're in a 12 step program, if you're working the program, you probably have a sponsor, and your sponsor has more experience. And if you need opioids, let the sponsor hold the opioids. If the sponsor doesn't feel up to that, you know, get another sponsor. But it's just like, there are practical things you can do. We are so afraid of opioids that we are afraid to give them to people in the hospital, where it's not like they're suddenly going to grow a drug dealer when they walk out the door.

Annalee: [00:37:53] Oh, man, that's not one of the things you get when you leave in the wheelchair.

Maia: [00:37:57] I know, right?

Charlie Jane: [00:37:57] You get rehab, you get skilled nursing, you get home care, you get a drug dealer, you know, it's just the standard package. Medicare pays for it, Medicaid pays for it.

Annalee: [00:38:06] Yeah.

Charlie Jane: [00:38:06] So when I was a baby journalist, I covered health care, and HIPAA was brand new. And we were all really excited about electronic medical records. And you know, the idea that like we were going to have a world in which, on the one hand, your privacy would be protected, your records would be secure. And there would be like steps to keep people from knowing your private medical information. But on the other hand, your doctor would have access to all of this information about you that's holistic, that comes from all these different sites of care. 

[00:38:32] And it sounds like what you're saying is that neither of those promises was kept. The promise of privacy, but also the promise that your doctor would be able to be like, okay, well, you've been using cocaine, and also you had this other issue. And, you know, I can see that these other doctors found this thing. Doctors often don't seem to have access to that stuff. So why did we fail so badly at both of those tasks?

Maia: [00:38:52] Criminalization.

Charlie Jane: [00:38:53] Right. 

Maia: [00:38:53] You cannot have people tell the truth in a medical system of surveillance that's going to punish them, if they tell the truth. If you criminalize taking drugs, or just taking certain drugs, I should say, because there's plenty of drugs we don't criminalize. But really, if you just think about it, we don't use criminal penalties anywhere else in healthcare. We don't have the doughnut police for diabetics. I kind of wish we did, because then maybe the police would recognize how messed up this is. 

Charlie Jane: [00:39:23] They do love donuts, you know.

Maia: [00:39:26] I mean, I'm sorry, that is a stereotype and I apologize. You know, it's just, this is not the way to do health care. And the thing that's so beautiful, and why I wanted to share the history of harm reduction, is that when you treat people with dignity and respect and love, and like a human being that matters, they tend to treat themselves that way. And you can see this amazing transformation when, you know, there's a homeless person for whom anything they're going to get access to, they have to do something for. You've got to promise to be abstinent. You've got to say a prayer. You've got to do this or that. And when someone just says, I'm going to give you a clean needle because I want you to live, you don't have to do anything for me, I'm just doing this because I think you deserve to live. That is hugely powerful. And you see people actually get into recovery as a result of that, because they realize, like, wait a minute, I'm worth something. Maybe I can do something more than just have oblivion all the time.

[00:40:26] So what's so crazy about this is that harm reduction just leads us back to the fundamental virtues of humanity. The things that every spiritual tradition says we should be doing.

Annalee: [00:40:36] Yeah, that's really great. I think that's a good place to wind up, actually. That's a really powerful way of thinking about, rethinking really, how we handle medical surveillance and how we handle just being nice to each other. I mean, it does really get back to the fundamentals. 

[00:40:53] So where can people find out more information about you online?

Maia: [00:40:57] They could go to MaiaSz.com, which I will spell for you. So it's M-A-I-A, S like Sam, Z like zebra dot com. And you'll find information about my books and articles and appearances and stuff like that there.

Annalee: [00:41:11] Awesome. And the latest book is Undoing Drugs. So look for that in bookstores or online or wherever you get books.

Charlie Jane: [00:41:19] From your book dealer.

Annalee: [00:41:19] From your book dealer.

Charlie Jane: [00:41:22] You know.

Annalee: [00:41:23] Yeah, it's a controlled substance. 

[00:41:24] OOAC theme music plays: Drums with a bass drop and more science fictional bells and percussion.

Charlie Jane: [00:41:26] It is.

Annalee: [00:41:26] Yeah. Thanks again for joining us.

Charlie Jane: [00:41:27] Thank you so much.

Annalee: [00:41:30] Thank you so much for listening to Our Opinions Are Correct. Remember, you can find us on Twitter at @OOACpod, or you can support us on Patreon at patreon.com/ouropinionsarecorrect.

[00:41:42] Thank you so much to our incredible producer Veronica Simonetti, thanks to Women's Audio Mission, where we record and thanks to Chris Palmer for the music. Talk to you later. And if you're a patron, we'll see you on Discord.

[00:41:53] Bye!

Charlie Jane: [00:41:53] Bye!

