
Israeli-Palestinian Conflict

While the first phase in a deal to end the two-year Hamas-Israel war is a major achievement, a sustainable truce remains a challenge and a lasting solution to Palestinian governance appears remote.

Israeli-Palestinian Conflict

Elliott Abrams, senior fellow for Middle East studies at the Council, and Ed Husain, senior fellow at the Council, sit down with James M. Lindsay to discuss the state of the Israeli-Palestinian conflict on the second anniversary of the October 7 attacks and whether President Donald Trump's twenty-point peace plan will produce a lasting ceasefire.

 

Venezuela

United States

The White House may believe that international drug trafficking threatens Americans, but it hasn’t made the legal case that cartels are at war with the United States.

United States

The Trump administration has ramped up pressure on Venezuela through an unprecedented military deployment and use of force in international waters off its coast. Concerns of further escalation have mounted as U.S. officials signal more strikes are likely to come.

 

Foreign Policy

Foreign Policy

As a CFR fellow for more than twenty years, Max Boot has traced the ups and downs of American foreign policy. He chatted with CFR about how his career grew out of a love of history and the benefits of starting out in journalism.

Foreign Policy

Ebenezer Obadare has explored the intricacies of African politics over the course of a career in journalism and academia. He sat down with CFR to talk about how he continues to challenge himself and what young people intent on studying the continent should focus on.

Foreign Policy

Farah Pandith has spearheaded efforts to counter violent extremism in both Republican and Democratic administrations. CFR sat down with her to understand where such efforts stand today and what it’s like to pioneer new roles in the government.

Economics

United States

A flurry of White House economic and trade policies intended to put “America First” has yielded unexpected returns in developing economies around the world. It’s a timely reminder of how policy decisions can feed through to global financial markets in surprising ways.

United States

Four CFR fellows examine the knock-on effects of Trump’s tariff drama on five of the United States’ closest allies—Canada, the European Union, Japan, Australia, and New Zealand.

Economics

The “core” U.S. trade deficit is still expanding, thanks to strong electronics imports. 

Russia

Moldova

A majority of voters in Moldova reaffirmed the country’s desire to break free of Russian interference and continue its path to join the European Union. The victory for reformist leader Maia Sandu, who was reelected president last year, represents one of Europe’s signal successes as Moscow seeks to intimidate and divide the continent.

Russia

NATO chased away Russian jets a week after shooting down Russian drones in Poland. These incidents add to a troubling trend.

Europe

Liana Fix, senior fellow for Europe at the Council, sits down with James M. Lindsay to discuss Russia’s recent drone incursions into Polish airspace, and whether the move signals an expansion of the war in Ukraine.

CFR examines how it advanced its mission over the past year through analysis from the Studies Program and Foreign Affairs, high-impact programming nationwide, and innovative digital and media engagement.

Events

Canada

Prime Minister Mark Carney discusses Canada's foreign policy priorities and the new global economy. Inaugurated in 1969, the Russell C. Leffingwell Lecture was named for Russell C. Leffingwell, a charter member of the Council who served as its president from 1944 to 1946 and as its chairman from 1946 to 1953. The lecture is given by distinguished foreign officials, who are invited to address Council members on a topic of major international significance. If you wish to attend virtually, log-in information and instructions on how to participate during the question-and-answer portion will be provided the evening before the event to those who register. Please note the audio, video, and transcript of this hybrid event will be posted on the CFR website.  

United States

In partnership with Columbia University's School of International and Public Affairs, panelists discuss what effective crisis decision-making looks like in practice, how to understand America’s adversaries, and lessons for future leaders navigating crises in national security. Secretary Hillary Rodham Clinton and Columbia SIPA Dean Keren Yarhi-Milo's new book, Inside the Situation Room, offers a window into how presidents and policymakers weigh risks, build consensus, and communicate their decisions to the wider public. Blending fresh case studies with insights from political science, and inspired by their popular class at Columbia, the book offers a framework for understanding leadership under pressure and the art of managing crises in real time. Copies of Inside the Situation Room will be available for purchase during the event. The David A. Morse Lecture was inaugurated in 1994 and supports an annual meeting with distinguished speakers. It honors the memory of David A. Morse, an active Council on Foreign Relations member for nearly thirty years.

Digital and Cyberspace Policy Program

Renée DiResta, associate research professor at Georgetown University, discusses how disinformation and digital manipulation are undermining public trust and reshaping the media landscape. The host of the webinar is Carla Anne Robbins, senior fellow at CFR and former deputy editorial page editor at the New York Times. TRANSCRIPT FASKIANOS: Thank you. Welcome to the Council on Foreign Relations Local Journalists Webinar. I am Irina Faskianos, vice president for the National Program and Outreach here at CFR. CFR is an independent, nonpartisan national membership organization, think tank, educator, and publisher focused on U.S. foreign policy. CFR generates policy-relevant ideas and analysis, convenes experts and policymakers, and is the publisher of Foreign Affairs magazine. As always, CFR takes no institutional positions on matters of policy. This webinar is part of CFR’s Local Journalists Initiative, created to help you draw connections between the local issues you cover and national and international dynamics. Our programming puts you in touch with CFR resources and expertise on international issues and provides a forum for sharing best practices. We are delighted to have confirmed over seventy participants from thirty-two states and U.S. territories. We appreciate your taking the time to be with us. I want to remind everyone again this webinar is on the record, and the video and transcript will be posted on our website after the fact at CFR.org/localjournalists. We are pleased to have Renee DiResta and host Carla Anne Robbins with us today to lead the conversation. Renee DiResta is an associate research professor at Georgetown University’s McCourt School of Public Policy. She was previously the technical research manager at the Stanford Internet Observatory, a cross-disciplinary program of research, teaching, and policy engagement for the study of adversarial abuse in current internet technologies. Her work focuses primarily on rumors and propaganda, and in understanding how narratives spread across social and media networks. Carla Anne Robbins is a senior fellow at CFR and host of the Local Journalists Webinar Series. She also serves as faculty director of the Master of International Affairs Program and clinical professor of national security studies at Baruch College’s Marxe School of Public and International Affairs. And previously she was deputy editorial page editor at the New York Times and chief diplomatic correspondent at the Wall Street Journal. I’m going to turn it over to Carla to have a conversation with Renee, and then we’ll go to all of you for your oral and written questions. We prefer you to speak them, so raise your hand and say your affiliation. And again, love to have this forum be one where we’re sharing best practices so we can inform each other’s work. OK. With that, Carla, over to you. ROBBINS: Thanks. Thanks, Irina, so much. And thanks, Renee, for joining us. This is—this is great. The work you do is extraordinary, if frightening. So, as journalists, we have been battling a hostile information environment for a long time and a steady erosion of public trust. Gallup—I don’t know if you saw this—they just released their annual measurement of trust in media, and it is once again at a new low. Twenty-eight percent of those polled said they had either a great deal or fair amount of trust in newspapers, television, and radio to report the news fully, accurately, and fairly. That’s just 28 percent. And that’s down from 31 percent last year and 40 percent five years ago. 
And Republicans’ confidence has dropped to single digits, to 8 percent, for the first time in the trend. And the numbers are even worse for young people. So, for someone who’s been in the news business for a very long time, that is a very frightening—I mean, frightening polling. And Gallup’s been doing this, and just if you look at the numbers and you see this drop, very—so what’s driving this, of course, is what I want to talk about to begin with. We talk about this challenge and the threats that are out there. You have written about the weaponization of rumor and elite amplification. Is this new? Is it different in recent years? And can you talk about where you see this starting? Because for the longest time we saw this as externally driven—as something the Russians were doing, something the Iranians were doing. This is domestically driven, I assume. And where do you date the beginning of this? DIRESTA: Well, so, on that front, I mean, I think you can look to things like Fox News, right. The partisan dynamics around that are not new. The marketing one outlet as trustworthy and all of the other outlets as non-trustworthy has been something that has been a marketing differentiator in the conservative media sphere for some time now. Yochai Benkler and Rob Faris wrote a book called Network Propaganda back in 2016 that looked exactly at this topic in the context of emerging media ecosystems on the internet, so not so much the influencer ecosystem—which we can talk about after—but more the what we might call, you know, new media, quaintly, a decade ago, but this idea of the small, emerging media ecosystem that really marketed itself very explicitly as being in opposition to mainstream media. That was its differentiator. That was how it framed itself, how it positioned itself. And it—and it leaned quite heavily into that for a particular partisan niche. So it’s not entirely surprising—I wish I had that up in front of me. (Laughs.) I didn’t realize that that had come out. But it’s not entirely surprising to me to hear that that has continued, because it is quite lucrative to position yourself as the one true voice, the one true outlet, and then to continue to lean into that with a particular partisan niche. We see influencers now doing this as well. That has—that began maybe five, six years ago or so, where they don’t even position themselves as media but as opposition in some ways to media, where they’re just talking to you as themselves—so a voice that is entirely not media, somebody speaking to you as somebody who is just like you, a member of a fellow shared identity, and relying on that notion of being highly trustworthy—they’re trustworthy because they are just like you, because they’re not professional media—when, of course, in reality they have the audience size of media, they monetize, they earn money from advertising, and they are potentially subject to some of the same audience capture and other dynamics that maybe lead them to shape their content in certain ways in response to financial incentives as well. ROBBINS: So can you talk more about influencers? Are you talking about people on TikTok? Are you talking about people on Spotify? Are you talking about Joe Rogan? Are you talking about more than—more than that? When I originally thought about influencers, I thought about, you know, girls teaching me how to put on makeup. DIRESTA: Well, so “influencers,” a term that refers to content creators who monetize their content usually, right? 
So they are earning money from the content that they make, because anybody can be a content creator now, you know. I think most people on the internet maybe are familiar with the idea of small content creators who are just regularly posting videos of themselves—as you mentioned, you know, the women who teach you to put on makeup. But some subset of them are also being sponsored by brands, maybe the makeup brands. (Laughs.) Some of them are being—are earning revenue share. So, for example, when you watch a YouTube video there is usually an ad that will play in front of a video that is monetized, and then the creator will earn some percentage of the revenue—the advertising revenue that YouTube earns. So revenue sharing is another monetization structure. It’s on all social platforms now. You can—influencers are oftentimes investing most heavily in one or two platforms. Some will be video-based. Some might be written-content-based. On even X, or Twitter, you can monetize your content now by writing good tweets—or, writing sensational tweets, maybe I should say. So there is a process that allows just sort of ordinary people who want to create content and earn money from that creation process to do so. And where it tends to intersect with media or to get out of the realm of just teaching you to put on makeup is that you do see people who are monetizing in, for example, the wellness space, you know, giving out health information; or in a political space you have political influencers who seem a little bit like pundits or commentators. So it really runs the gamut. There’s a very broad spectrum of topics, anything under the sun actually, that people can create content about and have the option to monetize that content to earn a living and to potentially earn quite a good living from it up at the upper echelons. And so that is a form of media that is sometimes not thought of as being media because it feels so personalized. ROBBINS: So has there been research—and I apologize if this has been done or you’ve done it—has there been research that looks at do these influencer(s) focus more on particular topics? Are they focused on health more than anything else? I mean, is it—did it begin with, you know, vax deniers? Or are they focused on don’t—you know, don’t support Ukraine? Or are they focused— DIRESTA: So it runs the gamut. There’s people across the—across all topics and spectrums. It didn’t start as a—I mean, I just want to be really clear that it’s not a negative thing; it’s just a way that people have engaged on the internet for a very long time. You can think about it, maybe, as, like, an outgrowth of the—you know, what’s sometimes called the mommy blogger culture in Web 1.0, where it was often women but people who were raising their children at home would write lifestyle content speaking, again, just as a person kind of mom to mom as content moved to being very video-focused—you can just kind of take out your phone, sit at your kitchen table, and produce this very highly-relatable content that, again, doesn’t feel necessarily like media. It’s not—it doesn’t have a wrapper of being the something Herald or it’s abstracted away. The person is speaking to you as themselves. It is very individual-centric. You know, very much you’re hearing from them as a person as opposed to a news-specific brand. 
ROBBINS: And how does that position—how does that undercut the credibility of—I hate the term “mainstream media,” but how does that undercut the credibility of those of us who do more traditional, fact-based, professional reporting? DIRESTA: Well, it doesn’t have to undercut the credibility necessarily, though, again, some of those—in the realm of the political some of those creators, some of those—you know, the influencers do position themselves as being, you know, truth tellers, or more direct, or more authentic, whereas mainstream media is positioned as being more corporate, more beholden to special interests, right? So this is something of a—again, I would argue a marketing tactic, personally. (Laughs.) That’s how I see it. That’s how I write about it, in part because the monetization is maybe a little bit different. There’s an argument that media—mainstream media might, for example, not cover pharma so directly because it might receive advertising dollars from pharmaceutical companies, whereas an influencer—and you’ll see—wellness influencers might say this quite directly—is being very hard on pharma because mainstream media won’t do that, right? And this is an argument that you’ll see presented, that influencers are more independent. On the flipside, oftentimes what influencers are producing or that type of content creator—sometimes they don’t like the term “influencer,” but we’ll just go with it for this conversation. What you’ll hear this—what you’ll hear from them oftentimes is more commentary as opposed to fact-finding. So if you want to know what is happening somewhere in the world, while there might be an influencer serving as a citizen journalist—meaning taking their phone out and actually physically showing you something that is happening in their neighborhood—oftentimes they are a degree removed and they are commentating on something that mainstream media has covered. So mainstream media and journalism is still doing that work of going out there and finding facts and doing that verification process, whereas influencer media I would say is a little bit more competitive with the commentary or opinion side of traditional media sources. ROBBINS: And that sounds organic. It sounds as if it’s—it may be good, it may be bad, it may be competing with the work that we do and driving further the mistrust in the mainstream media, intentionally or not. Do you see more of an astroturf phenomenon going on there, that it’s something that looks like it’s organic but is actually organized, that it’s being sponsored either by partisan groups, or? You know, it took us a while to figure out that the Tea Party was actually not organic, that there were—you know, that there were big money groups behind them. Do you see an astroturf phenomenon like that out there? DIRESTA: I mean, there certainly is in some capacities, but there’s also a lot of real people who are part of the influencer community. So it’s, you know— ROBBINS: I’m asking the question of are there—are there groups—are there money groups, are there partisan groups that are using this—we know about Fox News and Murdochs and whatever their political or economic interests are there. Are there, you know, particular political groups or money groups that are using these platforms or particular individuals to basically run against the mainstream press? DIRESTA: I mean, sure, there’s times when they’re—when they are, you know, doing things like undisclosed affiliate links, and things like that. 
There have certainly been—you know, I’m thinking of Ben Wofford wrote an article in Wired a bit ago, maybe a couple of years ago now, covering the extent to which political influencers weren’t disclosing that some of their posts were paid, that they were receiving affiliate links. Meaning—an affiliate link is when you share a URL and anytime people click on the URL or perform an action after clicking on the URL, maybe they signed a petition, maybe they purchase something, the person who shared the URL receives a percentage of the revenue, right? So that’s what an affiliate link is. For example, if I were to share an Amazon link and somebody—you know, I share that Amazon link, somebody clicks on my Amazon link, and then they go to Amazon and they buy that thing, I receive some percentage of revenue from that Amazon transaction. Is what an affiliate link is. I can either disclose that I have shared an affiliate link, or I can not. Different platforms have different rules. But the person who is doing the purchasing either knows or does not know, based on my disclosure. So that is how an affiliate link works. And there were—there are cases where political influencers will share content with affiliate links and not disclose that that is—that they are making money off of that. So there are dynamics where you do see political influencers who are not necessarily being transparent with their audiences about why they are sharing a particular petition, why they are sharing a particular website, a particular article, a webinar, for example. And so there is a monetization component that the audience might not necessarily be aware of. And that is an element that is—you know, it—I don’t know if—it’s not necessarily astroturfing in quite the same way, though, that has been a component of it at times. There’s also, of course, you know, opportunities where influencers will coordinate amongst themselves, political influencers will, who are ideologically aligned will coordinate, you know, to all share a message at the same time. Is that astroturfing? I mean, the line between astroturfing and activism is really blurry at times. And so that question of how do you ensure that disclosures are clear and required is something that we are sort of at the mercy of regulators and platforms to create ethical guidelines around. And what we have seen is that there are more ethical guidelines around the sale of material—of products, meaning the Federal Trade Commission has guidelines saying that an influencer must disclose to you if they are receiving compensation for promoting a product, but not compensation—the FEC, the Federal Election Commission, does not have similar disclosure rules necessarily for making sure that it is disclosed to you if an influencer is receiving compensation for promoting a political affiliate link. ROBBINS: So to look at, you know, the changes, what’s new about all this, it would seem to me that there are maybe two or 2.5 things in all this. One is this—what, this synthetic enhancement of rumors, AI. And we’ve all read about Sora. I don’t have a Sora link. I’m dying for one. (Laughs.) And which, of course, what—you know, who are you going to believe, me or your lying eyes? And there’s that. There’s the disappearance of platform governance and fact checking, which is a reasonably new phenomenon. Just the platform governance issue. And then there’s the politicization of—you know, that fact checking is increasingly portrayed as censorship. 
So those are reasonably new phenomenon which makes our jobs infinitely harder. So I want to talk about all three of those before I throw it open. And I do really want to throw it open. So which one do you want to talk about first? DIRESTA: (Laughs.) We can go in order. Do you want to do one by one? ROBBINS: Sure. So, quote, “synthetic enhancement” of rumors. DIRESTA: So, I mean, this is something that, you know, a lot of us have written about and talked about for half a decade now. I know I have. Look, there’s two components to it. One, it makes it easier to deny the real, which I think is something that people don’t pay as much attention to, and they should. And then two, it makes it easier to believe the fake, right, to create unreality. And we’ve known this was coming for a very long time. The public is much more aware of it now. There have been a lot of educational campaigns trying to help people understand what is coming. The problem is there’s no good solution for it, because oftentimes it’s actually the somewhat mundane stuff that is what slides through more easily. There was a lot of focus on would it be deepfakes of politicians that would be the thing that would manipulate the public? And you saw this reflected even in legislation introduced by members of Congress. Senator Klobuchar introduced a few of these bills around, you know, trying to create laws about manipulative content related to politicians. We saw that in California as well. Interestingly, you know, politicians are the ones who are perhaps best suited to defend themselves—(laughs)— to say, that’s not me. This is me. This is where I was at this time. You know, there are usually more angles of videos. And I think there’s obviously the potential for faked audio. You can obviously create manipulation. It is a very real risk. But interestingly, I think that one of the points that several of us have made over the years is that it’ll actually be ordinary people who are going to be much more negatively impacted by some of these things. The likenesses of ordinary people who are going to be misused in certain ways, and placed potentially into videos that they don’t want to be in. And sometimes that is sexually exploitive content. You know, there is now at least things like the TAKE IT DOWN Act, which I know a lot of people have issues with some of the actual take it down provisions that go along with that, concerns that it will be misapplied and used to try to get platforms to censor content that it shouldn’t be applied to, but it is at least a recognition, I think, that AI-generated exploitative content in this form is something that we need to be taking very seriously. Particularly because it does impact minors as well. And so that, I think, is something that is now part of public awareness. But otherwise, you are seeing just a lot of areas in which it is manifesting in the form of spam and scams. And that is going to continue to happen. And so this question of how existing laws catch up to and are adapted to the continued prevalence of—the growing prevalence of inauthentic content is something that I think state legislatures and the federal government need to be much more proactive about thinking through. 
I think scams and—you know, scams in particular are really rising, and identity-based scams in particular are critical for people to understand how to think about, you know, you should have—particularly if you’re a prominent person, like a journalist or somebody where there’s a lot of audio content of you out there, even if you’re just an ordinary person with a TikTok account making a lot of audio content, there’s been a growing rise in things like identity theft and people trying to use your voice to authenticate with your bank and things like that. So just being very, very cognizant of how this is going to—actually beyond political manipulation—how this is going to have significant impacts on fraud. ROBBINS: So is there any good legislation out there? I mean, the EU tends to be ahead of us on this. The argument has been made that they’ve been, on AI at least, been ahead of us, but the technology has been so far ahead of the EU that there’s no point in codifying something because the technology is moving so quickly. Biden administration tried to do things with executive orders and at least the way the government behaved itself. The Trump administration has come in and said, not going to do anything, not to my bros. You see anything that has any value? There was talk about watermarking. Other people said it’s not possible. DIRESTA: I mean, it doesn’t work as well as people want it to. That’s the honest answer. I mean, look, I was a big supporter of watermarking. I think that platforms that produce generative content—you know, Sora and things—it makes sense for them to have it. There are also open-source models that just never will, right? And there are also—one of the real challenges is what do you want the watermark to convey? This is where there’s a thing that tends to happen, you know, like a—we call it sometimes, like, label fatigue. I don’t know if you ever been to California. You walk into basically any establishment, and I think it’s called, like, Prop 68, if I—I lived in California for ten years, I’m trying to remember the name— ROBBINS: Fetal damage, cancer causing chemicals. DIRESTA: Exactly, cancer-causing chemical. And you’re, like, whatever, you know, there’s, like, the cleaning fluid on the floor. You don’t even pay attention to it because you just, OK, yeah, whatever. Literally, every establishment in California throws this label on the wall that says there’s a cancer-causing chemical somewhere in here. It’s like the CYA approach to labeling your business, right? And one of the things that even people like me, who early on, you know, I completely admit, thought that labeling would be a beneficial thing, when we saw the platforms actually begin to roll it out, it turns out that if you use Adobe to edit one of your photos, there was the photographer who took a picture of Mount Fuji, which is real. You know, took it with his camera. And he used an Adobe product to edit the photo, you know, remove some dust, some lens flares, the sort of things that you would do in post-production. And I believe it was Threads or Instagram, one of the Meta products that did rapidly roll out labeling, labeled it “imagined with AI,” or, you know, labeled it “AI edited,” or “AI created.” Because this is a problem, right? Where is edited versus created? When you have things like in painting, where you can change a section of an image, that threshold between edited and created, you know, maybe it starts on a device and then you change nine tenths of it, right? So there’s some questions around that. 
And then with this particular instance, it is labeled as “AI edited,” even though it’s not edited in a deceptive way. It’s just a lens flare, right, that’s been removed. And so you have this question of, if we start putting these labels on everything, what people want them for is something that is deceptive, or something that is different, or something that has materially changed the tone or tenor of the image. Not something that has glossed it in some way. And so the labeling-like regimes, if you will, or, you know, rubrics are not quite there yet. It’s just not—we haven’t quite figured out what we want the labels to communicate. And so communicating that something is real or that something is true, these are not the same thing. Trying to highlight what is deceptive is, you know, something else entirely. And so it’s not quite clear yet what kind of—the term Google uses is “assertive provenance.” How do you assert provenance, wow do you say this came from this type of device, when you can create that label but somebody can take a screenshot of the thing, and then all of a sudden the metadata has been stripped, right? So there’s just a lot of different ways in which bad actors or offenders are going to be able to strip that out, whereas good actors will keep it in. And so this question of what is the labeling actually going to accomplish is one of the challenges that people who are in the field of thinking about provenance and labeling are really trying to work through right now. What is the best way to do this that informs the viewer of, you know, in the best way possible, without creating this kind of like Prop 68 fatigue, if you will. ROBBINS: So I want to throw it open. And I’m sure other people want to ask you the other questions that I asked. But I do want to point out that there’s the individual issue here, which is can you trust the individual photograph. And then there are the two broader questions, one which is the photograph or the story or whatever, is that could potentially, you know, set off a really bad reaction that’s, you know, the provocative that really is a screaming, you know, fire in a crowded theater. And then there’s the larger flooding the zone. And we saw the Russians do this in Ukraine with the shoot down of a plane and all that, that you get to the point in which you just don’t know what truth is. You get so overwhelmed with it. And that is, to a certain extent, what those other influencers are doing. You know, who could you possibly trust here? Everybody has their own version of reality. You have your reality. I have my reality. Why would I trust the mainstream media? That’s their reality. I will have my own reality. And that is, to me, that one of the most frightening things, which poses this question for journalists. Which is, those of us who spent our lifetime developing this craft have—in other people’s mind, have no more credibility than, you know, than a Sora video. DIRESTA: I think in the—this is one of the—one of the theories of the future, is that you will see increased trust return to mainstream media in these moments of trying to determine if a video or image is real, right? 
If you are an outlet that has a photographer in a war zone with a camera that has provenance tools built into it, and the tools—those sort of provenance—you know, the metadata and things are posted transparently so that users can go and look at them, that is potentially a differentiator, versus some random account—you know, some random clout-chasing influencer on X tossing something up there and saying, no, this really is. You know, I remember—just per the point about real and true, right—there’s a lot of real photos of, like wildfires that are, you know, a fire that was in—I remember wildfires in Brazil. And there’s these images of India that get attached to the—you know, to the to the tweets about Brazil, right? And so these are the things where creating that context, where you do have accounts that are seen as being more credible, like, that is an opportunity to differentiate and to sort of pull people back. And being in that—in that reliable realm, I think, is something that journalism can fill that gap. Even if, on the commentary front, people choose to look to the TikTokers and the Reels creators for their commentary, for their relatable commentary. ROBBINS: So I’m going to start calling on people if you guys don’t ask questions. I mean, I’m still asking—my other job as I’m a professor. Oh, we have a hand up. Yay. So Rob Ferrett. Oh, yay. Can you—Rob, can you identify yourself and ask your question? Q: Yeah. I’m Rob Ferrett from Wisconsin Public Radio. My question is, given some of your past work on Russian disinformation social media campaigns targeting U.S. culture and trying to heighten divisions, do you still see that happening now? Whether from Russia or other foreign actors, very deliberate disinformation campaigns? DIRESTA: Oh, of course. Yeah. I mean, there’s no reason for them to give up, right? We’ve made it easier. (Laughs.) Look, there’s a few ways in which we’ve made it easier. First and foremost is that the U.S. government under this administration, for political reasons—purely political reasons—has chosen to largely exit the space of doing detection and mitigation. I wrote about this for Lawfare in February of this year, which for some reason feels like it was two years ago, writing about the dismantling of state capacity. And it only increased after that. So and, I mean, it was for purely political reasons. And transparently BS political reasons too, right? So there was the dismantling of Foreign Influence Task Force, certain aspects of, you know, work that the FBI did, the Foreign Malign Influence Center at CIA—or, at ODNI, I should say, perhaps, where Tulsi Gabbard decided that it had something to do with, you know, Hunter Biden’s laptop. Which was complete BS. It didn’t even exist in 2020. But that’s OK, Tulsi said it did, right? You know, and we have these little propaganda campaigns where they just say that this center, or that center contributed to the theft of the 2020 election by censoring something. And that is how the justification goes through. And then we dismantle the state capacity. And ultimately, what we do is we open the door for Russia, China, and Iran to continue to run influence campaigns without anybody in the government being tasked to look at them. Meanwhile, similarly, the academic institutions that used to study these things, many of them have been defunded and dismantled, again, because entities that worked on that were similarly tagged as being somehow, you know, censorious plotters who sought to steal the 2020 election, right? 
So utter nonsense, but again, this is how the politicization of that work went. And so the combination of the increasing sophistication of state actors recognizing that this is table stakes and they may as well do it, combined with then the platforms themselves deciding that they may as well walk it back because it’s expensive, time consuming. And there too, they are trying to please the administration, which has made it clear that it, you know, it has just decided to launch an entire reinvestigation into the very idea of Russian interference in 2016. You know, we’re relitigating things that happened a decade ago, that transparently absolutely did happen. And yet, the politicization of that has meant that nearly every sector that was responsible for looking at it has either walked it back or dismantled capacity. ROBBINS: Or been indicted. DIRESTA: Or been indicted. Yes. (Laughs.) ROBBINS: You missed that one. DIRESTA: Right. Yes, no, I think that’s actually happening—it’s tomorrow morning, right, is the Comey indictment, yeah? So yes. Or, sorry, the sort of courthouse appearance, yeah? So here we are. ROBBINS: Jordan Coll, you have a question from PantherNOW? Q: Yeah. Funny enough, PantherNOW back in my editor-in-chief days for my undergrad. But, yeah. I always try to change it, but at the current capacity I’m at New Jersey Urban News. I cover the state of New Jersey. But yeah, thank you guys so much for— ROBBINS: You look young enough to be an undergraduate in that picture. Q: Yeah. Yeah, no, totally. Totally. I’m also a professor, you know, at the moment. So, yeah, pretty young. But, Renee, I actually do know—well, I’m sure our names have crossed, because I know Emily Bell. I took a course with her at Columbia Journalism School. And she mentioned a lot of your work. And, again, pretty neat stuff with everything with the campaigns and definitely, you know, an inspiration. And I know Emily Bell speaks highly of you. So she was a good mentor of mine. But, yeah. I wanted to—I don’t know if this is working—yeah? Is it, because I’m seeing my face? ROBBINS: Yes. Q: OK, awesome. So, yeah. So I had two questions. One, again, it deals a lot with, you know, like the sense of AI, right, and these generated robocalls, essentially, where you have, you know, Joe Biden’s voice ahead of, like, New Hampshire’s, you know, like pretty much having political consultants coming in over these fake robocalls. I wanted to ask, how do you see, you know, like, in that realm of AI sophistication when it comes to intersecting with political figures, and then the recent image—I’m forgetting the grifter. It was like a grim reaper aspect of the Washington—I believe he was, like, the economy management director. I’m totally forgetting his name. But my question is, do you see—in your respective field, do you see there’s more of this sophistication, to the point that we won’t as reporters, you know, ultimately not catch this? And, again, this is something as a professor and I deal with, but it’s will the sophistication of these robo entities, these AI entities, be more, you know? You mentioned that, you know, we’ve essentially made it easier for these folks to come in. But, yeah, I wanted to hear your insights on that. DIRESTA: So the robocall, if I’m not mistaken the gentleman who did that got charged with—I don’t want to misspeak. I want to say impersonation or something. There are certain—you know, there are certain laws that that this is going to violate. 
But—so there was—if I’m not mistaken, there was actually a charging in that particular case. But, yes. I mean, one of the—one of the areas where you see this happen is, like, ransom phone calls where they—you know, they pretend to have a family member, or something like that, and then they have the audio of the family member and they demand compensation. You know, they demand money. Basically, they call you up. They’ll spoof a number. It’ll look like it’s coming from a family member. They’ll pretend that they are holding that family member hostage. They have that family member say something. And that’s how the scam works. The FBI has been putting out advisories on this. It’s been going on for a little over maybe two years now. This is also manifesting in the realm of the political, where they can make plausible audio of a political candidate saying something. There are still ways to do detection. One of the challenges is the gap between when this goes out and the detection happening. In the ransom moment, the reason that they use a ransom phone call is to create urgency, right? The sense that something bad is going to happen if you don’t respond immediately. And that’s where you see a lot of the time the sort of—the way that the scams work that involve voice cloning will often involve this sense of urgency. In the realm of the political, you’ll see these things drop maybe twenty-four, forty-eight hours before an election, something very time—you know, very closely timed. There was a couple of—I’m trying to remember where it was—but there was one where there was, you know, audio of a candidate saying something compromising. And I want to—maybe Slovakia. Where it took—it happened maybe twenty-four hours before people went to polls. And then they couldn’t authenticate it until afterwards, right? So, again, they’re trying to create that urgency or to create that that sense of shifting judgment before the authentication can happen. So this is a challenge for media and for authenticators. And it is going to—you know, it is going to continue to increase. You are seeing people who are trying to come up with lightweight ways to do some authentication on device, some ways to improve that so that people are standing by ready to do it ahead of key, important elections. But that is the—that is the challenge. I mean, as you mentioned, Carla, we didn’t talk about in that list of three things, fact checkers, you know? (Laughs.) Can you trust the fact checker, right? The reframing of fact checkers as somehow, like, being censorious or being in the tank for the left or whatever, that’s another, you know, political narrative that was done and undertaken quite deliberately, quite intentionally to create distrust so that the—you know, so that if this is—you know, so that that model of distrust can be potentially leveraged, you know, in service to other things as well. ROBBINS: Steven Kramer, fifty-six, of New Orleans, was acquitted. DIRESTA: Oh, was he? He wasn’t convicted of anything? ROBBINS: He was acquitted, but he still faces a fine from the FCC. What do you think the chances he’s actually going to be fined? DIRESTA: (Laughs.) What did they charge him with? Do you have it in front of you? ROBBINS: I’ve got the AP story on my phone. Would have faced a decade in prison if convicted. Let’s see. It said he was—he was about disrupting an election. DIRESTA: OK. OK. ROBBINS: And he and his defense argued that it was just a straw poll and it wasn’t recognized by the DNC. 
But I think it got caught up in that whole is this, you know, first in the nation, you know, primary—that whole thing. Anyway, he was— DIRESTA: There’ll be more of these. I’m sure there’ll be— ROBBINS: He was acquitted in June, so—but I don’t see something about whether the FCC has subsequently moved on this. Diego Lopez, can you identify yourself? Hi, Diego. It’s nice to see you again. And can you ask your question, or should I read it for you? Q: Hi. Thank you very much for hosting this. Good afternoon to you guys. It’s good morning for me. This is Diego Lopez with the Cibola Citizen newspaper here in New Mexico. I’m just wondering if you could explain a little bit about what the best way to make a layperson understand the danger of this disinformation is. I think we’ve all seen these videos with Sora 2. And it is just mind blowing how realistic some of these things are. So how can we help our people to understand the danger of this disinformation? Should we be writing articles in our local newspaper? Or are there other ways that we can do this? I struggle, because we are a print-based publication. Thank you. DIRESTA: Yeah. It’s hard to illustrate how good it is when people can’t see it. I mean, you can show some of the sophistication of even the still images, maybe? Just, like, this is a still screen from a video generated with whatever the tool is. I think that it is important to help people see it. I remember doing some of the PSAs. I remember NPR doing a PSA with us when I was at Stanford Internet Observatory trying to help people understand that people were using profiles with fake faces on LinkedIn to try to make you connect, to try to scam you. I think that scams is really the thing that the average person needs to be aware of, right? The potential, you know, for understanding that. You can actually do that with still images too. These are products that don’t exist. These are people that don’t exist. They’re asking for money for charitable donations for moments that never happened. You know, I’ve seen on Facebook the AI slop of—we wrote a paper on this—of, you know, wounded children or sick pets that don’t exist, right? So, helping people understand it. Elderly people saying things like, nobody ever supported me. Nobody said happy birthday to me. These sort of sympathy ploys. Helping people understand—oh, yeah. Helping people understand the most common scams, right? The most common ways that this manifests. So the combination of, look, this is where the technology is. This is how convincing it is. This is what it can do, right? So that’s important, sort of like almost tech articles. This is what it can do. This is where it is. This is how cheap and fast it is, right? And then, these are the common ways in which these scams are being deployed. And that’s almost like the PSA-type of reporting. Here are the common scams. Here are the ways it’s being used. And then, if you want to focus on the state actor and the way it’s being deployed in the realm of the political, you know, there’s always an election somewhere, right? And you’ll usually see something. There’s usually some story from—because these things are—you know, it is human nature, I think, to try to incorporate the latest and greatest manipulation technology and service to your political movement or candidate. 
You will usually have some example from a new election where you can, again, relate that story and say, in this election, in fill-in-the-blank, the Czech Republic, there is this—you know, this is what just happened over there, if there is an example that is relevant. And so you can illustrate it with stories. I think really telling the story is what makes it resonate with people and stick in their mind, as opposed to something that is communicated in the abstract. If you say, like, this creates a possible risk of that, then it feels very hysterical. People don’t know what to—what to believe. You don’t want to create moral panic or create a sense of foreboding and constant fear. But talking about what it is, what’s possible, and what people are actually doing, or you’re actually seeing happen that I think is, you know, using storytelling to help inform the public. ROBBINS: Clare McGrane, can you identify yourself and ask your question? Q: Yes. Hi. I’m a podcast producer and reporter here at KUOW Public Radio in Seattle. And I have a question that’s a little kind of tangential, but is relative related to what you were discussing earlier, about, you know, content creator—like, journalists who are content creators online, or content creators kind of masquerading as journalists. I’m working on a project right now that is aimed at kind of creating a conversational show for KUOW, for the station, that fits kind of the tone of content creator journalism that you might hear on other podcasts, but is backed with data reporting and comes from our newsroom. So this is our first attempt to try and kind of meet that younger audience where they’re at, like, with the kind of content that they’re used to hearing, but being very clear about the fact that it comes from our reporting and, like, really clearly drawing the line of, this is how we found this story, this is how we investigated this story. And, Renee, I’m just curious if you have any kind of advice on how media can bridge that gap and recognize that people’s, you know, preferences and patterns for taking in information has changed, and how we can try and meet our listeners and readers in the spaces and in the ways that they would like to be spoken to these days? DIRESTA: I mean, I think that that approach of storytelling and incorporating in the multimedia or the podcast style really does resonate with people. I know NBC, when Bandy Zadrozny was over there, did some interesting work on telling the story. There’s one that sticks out in my head of one of the nurses who fainted during COVID. She was very famous case during COVID. She fainted when she was getting her shot. So it was super early on. And then the conspiracy theorists on the internet decided she had really died and been replaced by a body double. And the—and the hospital didn’t do itself any favors by literally having her at a press conference where they didn’t let her talk. They had a mask on her and they didn’t let her talk, right? So it became a bit of a nightmare. And so this was a really great way to use storytelling and a multi-part podcast to tell the story of Tiffany Dover, was her name. And so I think it was—you know, many people wrote articles about this incident. But this was a way to really tell the story. And I think eventually, in the end, she managed to kind of track her down and get her to agree to an interview to kind of tell the story in her own words about, like, what the impact had—you know, had been for her, right, for Tiffany. 
And so there’s a lot of ways to do this. I think people really enjoy that model. I’m thinking there was also that one of—oh, boy. What was the name of it? With Adnan Syed. What was his name? This was the one with, like, the—was he guilty of a crime? Or there’s a lot of these— ROBBINS: Serial, the original one? DIRESTA: Thank you. Yes. There we go. Like, I remember listening to— ROBBINS: Podcast of all podcasts, the mother podcast, yes. DIRESTA: Yeah. (Laughs.) Yeah, Serial, that was the name of it. OK. So there were a few of these—like these models of where they become these, like, big cultural moments, and then everybody talks about them, and there’s, like—and I think that one—gosh, I mean, there were, like, several parts to that story that I personally don’t remember because I didn’t follow it very closely personally. But I know that, you know, I remember that there were multiple parts of it. So I think that that—and then I’ve also seen—there are a lot of journalists. I’m not really a big TikTok person myself, but you do see them going and creating content. You do see reporters kind of creating content and, like, relating the contents of their story, almost in video form. I’ve started doing that every now and then on Instagram Reels with, like, essays that I write. And I’m not a journalist. I just—I write for Lawfare. And so sometimes, when I write an analysis, I’ll sit there and I’ll give a three-minute summary of it, in hopes of just reaching people who only want to get it through video. Just because I think that, like, why not? Why not try? So just different ways of exploring the medium and experimenting with it is really important. ROBBINS: Well, the New York Times is doing that. I mean, they’ve got verticals on the front—on the front every day. I mean, it’s—people seem to—it’s the summary. And it personalizes it. So— DIRESTA: Some people literally just read their Substack articles on video. Like, literally will just be reading the first couple paragraphs and then say, hey, if you want to read the rest, you know it’s over here. ROBBINS: But there’s also—there’s also people who haven’t written the stories who are reading other people’s stories, with sort of snarky commentary. So there’s that as well. Carly Winchell has a question. Which is—Carly, do you want to read your question? Q: Hi. Sure. I’m Carly Winchell. I’m with the Ark Valley Voice from Colorado. And my question was just, what kind of advice can we give to the general public about identifying, like, AI generated disinformation? DIRESTA: Oh, it’s so hard now. (Laughs.) Q: I know, right? DIRESTA: I wish I had a better answer for you, because I remember, like, we did these—that handy PSA thing I referenced with NPR was, like, here’s how you can look at the ears, and the teeth, and the hair blends into the color. And can’t really do that anymore. Now it’s more of, you know, check and see does this exist somewhere else. Did this account come out of nowhere and share this one video and, like, that’s the only thing it’s ever posted? So sort of, like, almost like source provenance, you know, where what is the account that is sharing it? The account really does matter. There’s a mnemonic that we use in disinformation research when we do our own work, which is actors, behaviors, content, right? Where when we’re looking at our own stuff content is almost, like, third. And we’re looking at the actors—you know, who is sharing this, where did it come from, are these accounts all new? 
It’s hard for the average person to look at that, but you can usually tell if it’s, like, one video and, like, one account that’s come out of nowhere. That kind of helps. Similarly is, you know, what is the—when you’re looking at the content, is it incredibly incendiary and inflammatory? Sometimes that is indicative of something that is more out there to get a rise, as opposed to something that is real. I think that that doesn’t mean that there aren’t, you know, sensational moments of, you know, mean people doing mean things. That happens in the world. But oftentimes taking that extra beat is—you know, to try to verify it or to see what other people are saying about it, or to see if there’s another angle of the interaction. I think a lot about that moment Covington Catholic, that happened a few years back, of the kid in the MAGA hat and the Native American elder, at that—which was not disinformation, or false—you know, or generated video. But you did see over time multiple different angles of the video, multiple different shots, multiple different cuts that kind of, when put together, told a more complete story. When you have video that’s AI generated, you’re usually just going to see it kind of one angle, one sensational moment. You know, the investment is going to be made in generating one thing. And you’re not going to have a whole lot of different perspectives or other people talking about it. So right now, it’s going to be much more of a flat sort of single-shot thing with a—sometimes, you know, different accounts will be sharing it with inflammatory commentary in the tweet, but we haven’t yet seen, to the best of my knowledge, things that appear to have been shot from multiple angles of the same fake moment. So it’s hard, though. It’s really hard. That’s why I think more saying, like, take a beat and try to see if there’s other content out there or if somebody has verified it is better than trying to say, like, count the fingers, at this point. ROBBINS: Robert Chaney from the Missoulian, I think our last question. Q: Hi, there. Thanks. The Washington Post just had a story this morning showing multiple perspective fake videos of getting arrested for a DUI and getting in an argument— DIRESTA: (Laughs.) Q: —at a restaurant, where back and forth between, like, two camera presentations. So it was pretty scary. DIRESTA: Yeah, I mean, that was—I remember when the stills came out, one of the things that I was very curious about was how long it would take for you to be able to generate multiple shots of the same person in that way. And it was—you know, the answer was, like, a couple months. So it’s not—it’s not surprising that I guess we’ve got the ability to do that now. So, oh, there you go. Q: Anyway, I am with the Mountain Journal in Montana, which is also part of the Montana Free Press. We’re both digital-only, digital-native productions. In the last election cycle, we saw both the Republican and Democratic party’s operations setting up news sites that had, you know, local news in local communities, plus stories that were generated to, you know, be on the issues of their favorite candidates showing them in their preferred light, and whatnot. And then they just sort of dried up and blew away as soon as the campaign was done. But they pop these things up, like, you know, traveling circus tents, and tried to be a legitimate part of the news world. 
What I’m wondering is, have you seen any seal of Good Housekeeping watermarking, type of things that say: I am a human with a newsroom with editors and checks and balances and credibility of the old institutional kind, that is worth—is getting any traction on the positive side of saying, go back to the old institutional mainstream media model? DIRESTA: That’s what—I actually think that is what is going to happen. That is my personal feeling, that you are going to see that model of, you know, verified, credible, you know, somebody who actually attests proactively to: I am human. This is who I am. This is where I am. This is what I do. This is my device. These are the photographs I’ve taken with this device. You know, really just trying to establish provenance and credibility, and to—and to proactively offer that as a differentiator. I think that that is something that is going to become more valued and people are going to look for it as this sense of unease and not being able to tell what’s real becomes more pervasive for people. I was writing this maybe in 2020—(laughs)—saying that the killer app that AI is going to unlock is really going to be this notion of, like, how do you have privacy protecting credentialing? Or how do you have credentialing where people proactively and voluntarily indicate: I am real. This is who I am. And this is my sort of, like, vetted, validated content that I am putting out as this real person in the world. And I think that that is going to be something that you’re going to start to see very soon. ROBBINS: Maybe that’s our role. So this is—Renee, I just really want to thank you. And I’m going to turn it back to Irina. And thank you, everybody, for great questions. Been an extraordinary conversation. FASKIANOS: Yes. I second that. Thanks to all of you. And thank you, Renee DiResta, and Carla Anne Robbins. We really appreciate it. We will send out a link to this webinar recording and transcript. And, as always, we encourage you to visit CFR.org, ForeignAffairs.com, and ThinkGlobalHealth.org for the latest developments and analysis on international trends and how they’re affecting the U.S. And, of course, we welcome your suggestions of how we can be a resource for you. You can email us at [email protected]. So, again, thank you both. And we hope you enjoy the rest of your day. ROBBINS: Thanks, Irina. Thank you, Renee. It was great.

United States

At the midpoint of CFR’s Global Board of Advisors’ annual two-day summit, we invite you to a seated lunch with members of CFR’s Global Board. The lunch will be followed by a discussion featuring Global Board members on the shifting international system and the future of global governance in an era of geopolitical and economic uncertainty. Please note there is no virtual component to the meeting. The audio, video, and transcript of this meeting will be posted on the CFR website.

Explainers

Expert Spotlight

Featured Publications

Sub-Saharan Africa

An approachable guide to the political, social, and demographic changes happening in Africa and why they matter for the rest of the world.

United Nations

David J. Scheffer and Mark S. Ellis provide an introduction to the UN Charter and make the case that it is the most important secular document in the world.

International Law

Few Americans have done more than Jerome A. Cohen to advance the rule of law in East Asia. The founder of the study of Chinese law in the United States and a tireless advocate for human rights, Cohen has been a scholar, teacher, lawyer, and activist for more than sixty years. Moving among the United States, China, and Taiwan, he has encouraged legal reforms, promoted economic cooperation, mentored law students—including a future president of Taiwan—and brokered international crises. In this compelling, conversational memoir, Cohen recounts a dramatic life of striving for a better world from Washington, DC, to Beijing, offering vital first-hand insights from the study and practice of Sino-American relations. In the early 1960s, when Americans were not permitted to enter China, he met with émigrés in Hong Kong and interviewed them on Chinese criminal procedure. After economic reform under Deng Xiaoping, Cohen’s knowledge of Chinese law took on a new importance as foreign companies began to pursue business opportunities. Helping China develop and reconstruct its legal system, he made an influential case for the roles of Western law and lawyers. Cohen helped break political barriers in both China and Taiwan, and he was instrumental in securing the release of political prisoners in several countries. Sharing these experiences and many others, this book tells the full story of an unparalleled career bridging East and West.