
Scaling Laws
221 episodes

The Supreme Court Blocks the Texas Social Media Law
On May 31, by a five-four vote, the Supreme Court blocked a Texas law from going into effect that would have sharply limited how social media companies could moderate their platforms and required companies to abide by various transparency requirements. We’ve covered the law on this show before—we recorded an episode right after the U.S. Court of Appeals for the Fifth Circuit allowed Texas to implement the law, in the same ruling that the Supreme Court just vacated. But there’s enough interesting stuff in the Supreme Court’s order—and in Justice Samuel Alito’s dissent—that we thought it was worth another bite at the apple. So this week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic invited Genevieve Lakier, professor of law at the University of Chicago and Evelyn’s colleague at the Knight First Amendment Institute, to walk us through just what happened. What exactly did the Supreme Court do? Why does Justice Alito seem to think that the Texas law has a decent chance of surviving a First Amendment challenge? And what does this suggest about the possible futures of the extremely unsettled landscape of First Amendment law? Hosted on Acast. See acast.com/privacy for more information.

Bringing in the Content Moderation Auditors
As transparency reporting about content moderation enforcement has become standard across the platform industry, there have been growing questions about the reliability and accuracy of the reports the platforms are producing. With all reporting being entirely voluntary and the content moderation industry in general being very opaque, it’s hard to know how much to trust the figures that companies report in their quarterly or biannual enforcement reports. As a result, there have been growing calls for independent audits of these figures, and last month, Meta released its first ever independent audit of its content moderation reporting systems. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek sat down with someone who actually knows something about auditing: Colleen Honigsberg, an associate professor of law at Stanford Law School, whose research is focused on the empirical study of corporate and securities law. They talked about how auditors work, the promises and pitfalls of auditing in other contexts and what that might teach us about auditing in the content moderation context, and whether this is going to be a useful regulatory tool.

Social Media Platforms and the Buffalo Shooting
On May 14, a shooter attacked a supermarket in a historically Black neighborhood of Buffalo, New York, killing ten people and wounding three. The streaming platform Twitch quickly disabled the livestream the shooter had published of the attack—but video of the violence, and copies of the white supremacist manifesto released by the attacker online, continue to circulate on the internet. How should we evaluate the response of social media platforms to the tragedy in Buffalo? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Brian Fishman, who formerly worked at Facebook, now Meta, as the policy director for counterterrorism and dangerous organizations. Brian helped lead Facebook’s response to the 2019 Christchurch shooting, another act of far-right violence livestreamed online. He walked us through how platforms respond to crises like these, why it’s so difficult to remove material like the Buffalo video and manifesto from the internet, and what it would look like for platforms to do better.

The Platforms versus Texas in the Supreme Court
On May 12, the U.S. Court of Appeals for the Fifth Circuit allowed an aggressive new Texas law regulating social media to go into effect. The law, known as HB20, seeks to restrict large social media platforms from taking down content on the basis of viewpoint—effectively restricting companies from engaging in a great deal of the content moderation that they currently perform. It also imposes a range of transparency and due process requirements on platforms with respect to their content moderation. A group of technology companies challenging the law have filed an emergency application to the Supreme Court seeking to put HB20 back on hold while they continue to litigate the law’s constitutionality under the First Amendment. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Alex Abdo, litigation director at the Knight First Amendment Institute, and Scott Wilkens, senior staff attorney at Knight. The Institute, where Evelyn is a senior research fellow, filed an amicus brief in the Fifth Circuit, taking a middle ground between Texas—which argues that the First Amendment poses no bar to HB20—and the plaintiffs—who argue that the First Amendment prohibits this regulation and many other types of social media regulation besides. So what does the Texas law actually do? Where does the litigation stand—and what will the impact of the Fifth Circuit’s ruling be? And how does the Knight First Amendment Institute interpret, well, the First Amendment?

When Governments Turn Off the Internet
Internet blackouts are on the rise. Since 2016, governments around the world have fully or partially shut down access to the internet almost 1,000 times, according to a tally by the human rights organization Access Now. As the power of the internet grows, this tactic has only become more common as a means of political repression. Why is this, and how, exactly, does a government go about turning off the internet? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke on this topic with Peter Guest, the enterprise editor for the publication Rest of World, which covers technology outside the regions usually described as the West. He’s just published a new project with Rest of World diving deep into internet shutdowns—and the three dug into the mechanics of internet blackouts, why they’re increasing and their wide-reaching effects.

Pay Attention to Europe’s Digital Services Act
While the U.S. Congress has been holding hearing after hearing with tech executives, featuring a lot of yelling and not much progress, Europe has been quietly working away on some major tech regulations. Last month, it reached agreement on the content moderation piece of this package: the Digital Services Act. It's sweeping in scope and likely to have effects far beyond Europe. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek sat down with Daphne Keller, the director of the Program on Platform Regulation at the Stanford Cyber Policy Center, to get the rundown. What exactly is in the act? What does she like about it, and what doesn't she? And how will the internet look different once it comes into force?

The Professionalization of Content Moderation
This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek spoke to Charlotte Willner, who has been working in content moderation longer than just about anyone. Charlotte is now the executive director of the Trust and Safety Professionals Association, an organization that brings together the professionals that write and enforce the rules for what’s fair game and what’s not on online platforms. Before that, she worked in Trust and Safety at Pinterest and before that she built the very first safety operations team at Facebook. Evelyn asked Charlotte what it was like trying to build a content moderation system from the ground up, what has changed since those early days (spoilers: it’s a lot!) and—of course—if she had any advice for Twitter’s new owner given all her experience helping keep platforms safe.

Taylor Lorenz on Taking Internet Culture Seriously
This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with a reporter who has carved out a unique beat writing about not just technology but the creativity and peculiarities of the people who use it—Taylor Lorenz, a columnist at the Washington Post covering technology and online culture. Her recent writing includes reporting on “algospeak”—that is, how algorithmic amplification changes how people talk online—and coverage of the viral Twitter account Libs of TikTok, which promotes social media posts of LGBTQ people for right-wing mockery. They talked about the quirks of a culture shaped in conversation with algorithms, the porous border between internet culture and political life in the United States, and what it means to take the influence of social media seriously, for good and for ill.

Bringing Evidence of War Crimes From Twitter to the Hague
The internet is increasingly emerging as a source for identification and documentation of war crimes, as the Russian invasion of Ukraine has devastatingly proven yet again. But how does an image of a possible war crime go from social media to before a tribunal in a potential war crimes prosecution? On a recent episode of Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Nick Waters, the lead on Justice and Accountability at Bellingcat, about how open-source investigators go about documenting evidence of atrocity. This week on the show, Evelyn and Quinta interviewed Alexa Koenig, the executive director of the Human Rights Center at the University of California, Berkeley, and an expert on using digital evidence for justice and accountability. They talked about how international tribunals have adapted to using new forms of evidence derived from the internet, how social media platforms have helped—and hindered—collection of this kind of evidence, and the work Alexa has done to create a playbook for investigators downloading and collecting material documenting atrocities. Because of the nature of the conversation, this discussion contains some descriptions of violence that might be upsetting for some listeners.

How the Press and the Platforms Handled the Hunter Biden Laptop
We’re taking a look back at one of the stranger stories about social media platforms and the role of the press in the last presidential election. In the weeks before the 2020 election, the New York Post published an “October Surprise”: a set of stories on the business and personal life of Hunter Biden, the son of Democratic presidential candidate Joe Biden, based on emails contained on a mysterious laptop. A great deal was questionable about the Post’s reporting, including to what extent the emails in question were real and how the tabloid had obtained them in the first place. The mainstream press was far more circumspect in reporting out the story—and meanwhile, Twitter and Facebook sharply restricted circulation of the Post’s stories on their platforms. It’s a year and a half later. And the Washington Post just published a lengthy report verifying the authenticity of some of the emails on the mysterious laptop—though a lot still remains unclear about the incident. In light of this news, how should we understand Facebook and Twitter’s actions in 2020? Washington Post technology reporter Will Oremus weighed in on this question in his own reflection for the paper. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic invited him on the show to discuss the story. Did the social media platforms go too far in limiting access to the New York Post’s reporting? How did the mainstream press deal with the incident? What have we learned from the failures of how the press and social media responded to information operations around the 2016 election, and what can we learn from how they behaved differently in 2020?

What’s in the U.K. Online Safety Bill?
This week on Arbiters of Truth, our series on the online information environment, we’re turning our attention to the United Kingdom, where the government has just introduced into Parliament a broad proposal for regulating the internet: the Online Safety Bill. The U.K. government has proclaimed that the Bill represents new “world-first online safety laws” and includes “tougher and quicker criminal sanctions for tech bosses.” So … what would it actually do? To answer this question, Evelyn Douek and Quinta Jurecic spoke with Ellen Judson, a senior researcher at the Centre for the Analysis of Social Media at Demos, a U.K. think tank. Ellen has been closely tracking the legislation as it has developed. And she helped walk us through the tangled system of regulations created by the bill. What new obligations does the Online Safety Bill create, and what companies would those obligations apply to? Why is the answer to so many questions “yet to be defined”—a phrase we kept saying again and again throughout the show—and how much of the legislation is just punting the really difficult questions for another day? What happens now that the bill has been formally introduced in Parliament?

Getting Information Into Russia
Over the last few weeks, we’ve talked a lot about the war in Ukraine on this series—how the Russian, Ukrainian and American governments are leveraging information as part of the conflict; how tech platforms are navigating the flood of information coming out of Ukraine and the crackdown from the Kremlin; and how open-source investigators are documenting the war. This week on Arbiters of Truth, our series on the online information environment, we’re going to talk about getting information into Russia during a period of rapidly increasing repression by the Russian government. Evelyn Douek and Quinta Jurecic spoke with Thomas Kent, a former president of the U.S. government-funded media organization Radio Free Europe/Radio Liberty, who now teaches at Columbia University. He recently wrote an essay published by the Center for European Policy Analysis on “How to Reach Russian Ears,” suggesting creative ways that reporters, civil society and even the U.S. government might approach communicating the truth about the war in Ukraine to Russians. This was a thoughtful and nuanced conversation about a tricky topic—whether, and how, democracies should think about leveraging information as a tool against repressive governments, and how to distinguish journalism from such strategic efforts.

How Open-Source Investigators are Documenting the War in Ukraine
Open-source investigations—sometimes referred to as OSINT, or open-source intelligence—have been crucial to public understanding of the Russian invasion of Ukraine. An enormous number of researchers have devoted their time to sifting through social media posts, satellite images, and even Google Maps to track what’s happening in Ukraine and debunk false claims about the conflict. This week on Arbiters of Truth, our series on the online information ecosystem, we devoted the show to understanding how open-source investigations work and why they’re important. Evelyn Douek and Quinta Jurecic spoke to Nick Waters, the lead on Justice and Accountability at Bellingcat, one of the most prominent groups devoted to conducting these types of investigations. They talked about the crucial role played by open-source investigators in documenting the conflict in Syria—well before the war in Ukraine—and how the field has developed since its origins in the Arab Spring and the start of the Syrian Civil War. And Nick walked us through the mechanics of how open-source investigations actually happen, and how social media platforms have helped—and hindered—that work.

How Tech Platforms are Navigating the War in Ukraine
As Russia’s brutal war in Ukraine continues, tech platforms like Facebook and Twitter have been key geopolitical players in the conflict. The Kremlin has banned those platforms and others as part of a sharp clampdown on freedoms within Russia. Meanwhile, these companies must decide what to do with state-funded Russian propaganda outlets like RT and Sputnik that have accounts on their platforms—and how best to moderate the flood of information, some of it gruesome or untrue, that’s appearing as users share material about the war. This week on Arbiters of Truth, our podcast series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Alex Stamos, director of the Stanford Internet Observatory. They discussed how various platforms, from Twitter to TikTok and Telegram, are moderating the content coming out of Russia and Ukraine right now; the costs and benefits of Western companies pulling operations out of Russia during a period of increasing crackdown; and how the events of the last few weeks might shape our thinking about the nature and power of information operations.

You Can’t Handle the Truth (Social)
Almost immediately after he was banned from Twitter and Facebook in January 2021, Donald Trump began promising the launch of a new, Trump-run platform to share his thoughts with the world. In February 2022, that network—Truth Social—finally launched. But it’s been a debacle from start to finish, with a lengthy waitlist and a glitchy website that awaits users who finally make it online. Drew Harwell, who covers technology at the Washington Post, has been reporting on the less-than-smooth launch of Truth Social. This week on Arbiters of Truth, our podcast series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with him about who, exactly, this platform is for and who is running it. What explains the glitchy rollout? What’s the business plan … if there is one? And how does the platform fit into the ever-expanding universe of alternative social media sites for right-wing users?

The Information War in Ukraine
Over the last several weeks, Russian aggression toward Ukraine has escalated dramatically. Russian President Vladimir Putin announced on Feb. 21 that Russia would recognize the sovereignty of two breakaway regions in Ukraine’s east, Donetsk and Luhansk, whose years-long effort to secede from Ukraine has been engineered by Russia. Russian troops have entered eastern Ukraine as supposed “peacekeepers,” and the Russian military has taken up positions along a broad stretch of Ukraine’s border. Along with the military dimensions of the crisis, there’s also the question of how various actors are using information to provoke or defuse violence. Russia has been spreading disinformation about supposed violence against ethnic Russians in Ukraine. The United States and its Western partners, meanwhile, have been releasing intelligence about Russia’s plans—and about Russian disinformation—at a rapid and maybe even unprecedented clip. So today on Arbiters of Truth, our series on the online information ecosystem, we’re bringing you an episode about the role of truth and falsehoods in the Russian attack on Ukraine. Evelyn Douek and Quinta Jurecic spoke with Olga Lautman, a non-resident senior fellow at the Center for European Policy Analysis—who has been tracking Russian disinformation in Ukraine—and Shane Harris, a reporter at the Washington Post—who has been reporting on the crisis.

The Nuts and Bolts of Social Media Transparency
Brandon Silverman is a former Facebook executive and founder of the data analytics tool CrowdTangle. Brandon joined Facebook in 2016 after the company acquired CrowdTangle, a startup designed to provide insight into what content is performing well on Facebook and Instagram, and he left in October 2021, in the midst of a debate over how much information the company should make public about its platform. As the New York Times described it, CrowdTangle “had increasingly become an irritant” to Facebook’s leadership “as it revealed the extent to which Facebook users engaged with hyperpartisan right-wing politics and misleading health information.” This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Brandon about what we mean when we talk about transparency from social media platforms and why that transparency matters. They also discussed his work advising Congress and other regulators on what legislation ensuring more openness from platforms would look like—and why it’s so hard to draft regulation that works.

Spotify Faces the Content Moderation Music
The Joe Rogan Experience is perhaps the most popular podcast in the world—and it’s been at the center of a weeks-long controversy over COVID misinformation and content moderation. After Rogan invited on a guest who told falsehoods about the safety of COVID vaccines, outrage mounted toward Spotify, the podcasting and music streaming company that recently signed an exclusive deal with Rogan to distribute his show. Spotify came under pressure to intervene, as nearly 300 experts sent the company a letter demanding it take action, and musicians Neil Young and Joni Mitchell pulled their music from Spotify’s streaming service. And the controversy only seems to be growing. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Ashley Carman, a senior reporter at The Verge who writes the newsletter Hot Pod, covering the podcast and audio industry. She’s broken news on Spotify’s content guidelines and Spotify CEO Daniel Ek’s comments to the company’s staff, and we couldn’t think of a better person to talk to about this slow-moving disaster. How has Spotify responded to the complaints over Rogan, and what does that tell us about how the company is thinking about its responsibilities in curating content? What’s Ashley’s read on the state of content moderation in the podcast industry more broadly? And … is this debate even about content moderation at all?

Is Block Party the Future of Content Moderation?
We talk a lot on this show about the responsibility of major tech platforms when it comes to content moderation. But what about problems the platforms can’t—or won’t—fix? Tracy Chou’s solution involves going around platforms entirely and creating tools that give power back to users to control their own experience. She’s the engineer behind Block Party, an app that allows Twitter users to protect themselves against online harassment and abuse. It’s a fine-tuned solution to a problem that a lot of Twitter users struggle with, especially women and particularly women of color. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Tracy about her work developing Block Party and how the persistent lack of diversity in Silicon Valley contributes to an environment where users have little protection against harassment. They also talked about what it’s like working with the platforms that Block Party and other apps like it are seeking to improve. And they discussed what content moderation problems these kinds of user-driven tools might help solve—and which they won’t.

Defunding the Insurrectionists
As we’ve discussed on the show, online advertisements are the shifting, unstable sand on which the contemporary internet is built. And one of the many, many ways in which the online ad ecosystem is confusing and opaque involves how advertisers can find their ads popping up alongside content they’d rather not be associated with—and, all too often, not having any idea how that happened. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke to Nandini Jammi and Claire Atkin of the Check My Ads Institute. Their goal is to serve as a watchdog for the ad industry, and they’ve just started a campaign to let companies know—and call them out—when their ads are showing up next to content published by far-right figures like Steve Bannon who supported the Jan. 6 insurrection. So what is it about the ads industry that makes things so opaque, even for the companies paying to have their ads appear online? What techniques do Claire and Nandini use to trace ad distribution? And how do advertisers usually respond when Check My Ads alerts them that they’re funding “brand unsafe” content?

Why the Online Advertising Market is Broken
In December 2020, ten state attorneys general sued Google, alleging that the tech giant had created an illegal monopoly over online advertising. The lawsuit is ongoing, and just this January, new allegations in the states’ complaint were freshly unsealed: the states have accused Google of tinkering with its ad auctions to mislead publishers and advertisers and expand its own power in the marketplace. (Google told the Wall Street Journal that the complaint was “full of inaccuracies and lacks legal merit.”) The complaint touches on a crucial debate about the online advertising industry: does it, well, work? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Tim Hwang, Substack’s general counsel and the author of the book “Subprime Attention Crisis: Advertising and the Time Bomb at the Heart of the Internet.” Tim argues that online advertising, which underpins the structure of the internet as we know it today, is a house of cards—that advertisers aren’t nearly as good as they claim at monetizing our attention, even as they keep marketing it anyway. So how worried should we be about this structure collapsing? If ads can’t convince us to buy things, what does that mean about our understanding of the internet? And what other possibilities are there for designing a better online space?

Podcasts Are the Laboratories of Misinformation
Valerie Wirtschafter and Chris Meserole, our friends at the Brookings Institution, recently published an analysis of how popular podcasters on the American right used their shows to spread the “big lie” that the 2020 election was stolen from Donald Trump. These are the same issues that led tech platforms to crack down on misinformation in the runup to the election—and yet, the question of whether podcast apps have a responsibility to moderate audio content on their platforms has largely flown under the radar. Why is that? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic talked through this puzzle with Valerie and Chris. They discussed their findings about podcasts and the “big lie,” why it’s so hard to detect misinformation in podcasting, and what we should expect when it comes to content moderation in podcasts going forward.

Content Moderation After January 6
One year ago, a violent mob broke into the U.S. Capitol during the certification of the electoral vote, aiming to overturn Joe Biden’s victory and keep Donald Trump in power as the president of the United States. The internet played a central role in the insurrection: Trump used Twitter to broadcast his falsehoods about the integrity of the election and gin up excitement over January 6, and rioters coordinated ahead of time on social media and posted pictures afterwards of the violence. In the wake of the riot, a crackdown by major social media platforms ended with Trump suspended or banned from Facebook, Twitter and other outlets. So how have platforms been dealing with content moderation issues in the shadow of the insurrection? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic sat down for a discussion with Lawfare managing editor Jacob Schulz. To frame their conversation, they looked to the recent Twitter ban and Facebook suspension of Representative Marjorie Taylor Greene—which took place almost exactly a year after Trump’s ban.

Working Toward Transparency and Accountability in Content Moderation
In 2018, a group of academics and free expression advocates convened in Santa Clara, California, for a workshop. They emerged with the Santa Clara Principles on Transparency and Accountability in Content Moderation—a high-level list of procedural steps that social media companies should take when making decisions about the content on their services. The principles quickly became influential, earning the endorsement of a number of major technology companies like Facebook. Three years later, a second, more detailed edition of the principles has just been released—the product of a broader consultation process. So what’s changed? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with David Greene, senior staff attorney and civil liberties director at the Electronic Frontier Foundation. At EFF, he’s been centrally involved in the creation of version 2.0 of the principles. They talked about what motivated the effort to put together a new edition and what role he sees the principles playing in the conversation around content moderation. And they discussed amicus briefs that EFF has filed in the ongoing litigation over social media regulation laws passed by Texas and Florida.

Free the Data!
On this show, we’ve discussed no end of proposals for how to regulate online platforms. But there’s something many of those proposals are missing: data about how the platforms actually work. Now, there’s legislation in Congress that aims to change that. The Platform Accountability and Transparency Act, sponsored by Senators Chris Coons, Rob Portman and Amy Klobuchar, would create a process through which academic researchers could gain access to information about the operation of these platforms—peering under the hood to see what’s actually happening in our online ecosystems, and perhaps how they could be improved. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with the man who drafted the original version of this legislation—Nate Persily, the James B. McClatchy Professor of Law at Stanford Law School. He’s been hard at work on the draft bill, which he finally published this October. And he collaborated with Coons, Portman and Klobuchar to work his ideas into the Platform Accountability and Transparency Act. They talked about how Nate’s proposal would work, why researcher access to data is so important and what the prospects are for lasting reforms like this out of Congress.

Content Moderation’s Original ‘Decider’
We talk a lot about how content moderation involves a lot of hard decisions and trade-offs—but at the end of the day, someone has to make a decision about what stays on a platform and what comes down. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with “The Decider”—Nicole Wong, who earned that tongue-in-cheek nickname during her time at Google in the 2000s. As the company’s deputy general counsel, Nicole was in charge of decisionmaking over what content Google should remove or keep up in response to complaints from users and governments alike. Since then, she moved on to roles as Twitter’s legal director of products and the deputy chief technology officer of the United States under the Obama administration. In that time, the role of social media platforms in shaping society has grown enormously, but how much have content moderation debates really changed? Quinta and Evelyn spoke with Nicole about her time as the Decider, what’s new and what’s stayed the same since the early days of content moderation, and how her thinking about the danger and promise of the internet has changed over the years.

How Zoom Thinks About Content Moderation
This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with some of the people behind the app that, by this point in the pandemic, you’re probably sick of: Zoom. Quinta and Evelyn sat down with Josh Kallmer, Zoom’s head of global public policy and government relations, and Josh Parecki, Zoom’s associate general counsel and head of trust and safety. Most of us have used Zoom regularly over the last few years thanks to COVID-19, but while you’re likely familiar with the platform as a mechanism for work meetings and virtual happy hours, you may not have thought about it in the context of content moderation. Josh and Josh explained the kinds of content moderation issues they grapple with in their roles at Zoom, how their moderation and user appeals process works, and why Zoom doesn’t think of itself like a phone line or a mail carrier, services that are almost entirely hands-off when it comes to the content they carry.

Rational Security's The 'Nothing To Be Thankful For' Edition
For Thanksgiving, we’re bringing you something a little different—an episode of Rational Security, our light, conversational show about national security and related topics. This week, Alan, Quinta and Scott were joined by a special guest: Evelyn Douek, Quinta's co-host of the Arbiters of Truth series on the Lawfare podcast feed. They sat down to discuss:

—“Getting Rittenhoused”: A jury recently acquitted 17-year-old Kyle Rittenhouse of murder charges for shooting two men in what he claimed was self-defense during last summer’s unrest. What do his trial and its aftermath tell us about the intersection of politics with our criminal justice system?

—“Now That’s a Power Serve”: A global pressure campaign by professional tennis players has forced Chinese officials to disclose the location of Chinese tennis player Peng Shuai, who disappeared after publicly accusing a former senior official of sexual assault. Is this a new model for dealing with Chinese human rights abuses?

—“Duck Say Quack and Fish Go Blub—But What Did Fox Say?”: Two prominent conservative commentators have resigned from Fox News over its release of a Tucker Carlson film that they say spreads misinformation and promotes violence. Will this be enough to force the network to curb its behavior?

For object lessons, Quinta endorsed her favorite pie dough recipe. Alan in turn made an unorthodox recommendation of what to put in that dough: sweet potato pie. Scott encouraged listeners to follow up that big meal with a cup of coffee, made on his beloved Aeropress with a Prismo filter attachment. And if that doesn't work, Evelyn suggested folks tuck in for a nap with her favorite weighted blanket from Bearaby.

The Facebook Oversight Board, One Year On
It’s been roughly a year since the Facebook Oversight Board opened its doors for business—and while you may mostly remember the Board from its decision on Donald Trump’s suspension from Facebook, there’s been a lot going on since then. So we thought it was a good time to check in on how this experiment in platform governance is faring. In October, the Board released its first transparency report, and Facebook—now Meta—has published its own update on how it’s been responding to the Board’s decisions and recommendations. Meanwhile, Lawfare is keeping track of developments on our Facebook Oversight Board Blog, run by the inimitable Tia Sewell. On this episode of Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic talked about what the data shows about which cases the Board is taking, how the Board’s role seems to be evolving, and, of course, whether we’re going to have to start calling this the Meta Oversight Board, thanks to Facebook’s name change.

Video Games Cannot Escape the Content Moderation Reckoning
Content moderation in video games turns out to be just as much of a bummer as content moderation everywhere else, perhaps even more so. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Daniel Kelley, the director of strategy and operations for the Anti-Defamation League’s Center for Technology and Society. He studies how companies deal with the many moderation issues that pop up in gaming, from harassment to digital recreations of violent hate crimes and white nationalist propaganda. And his team at the Anti-Defamation League has a new report out on how players experience abuse—but also joy and connection—while gaming. Quinta and Evelyn asked Daniel to make the case for why everyone, gamers and non-gamers alike, should care about games, why harassment in gaming seems particularly bad compared to non-gaming platforms, and where the gaming industry stands when it comes to investing in content moderation.

What Is Integrity in Social Media?
There’s been a lot of news recently about Facebook, and a lot of that news has focused on the frustration of employees assigned to the platform’s civic integrity team or other corners of the company focused on ensuring user trust and safety. If you read reporting on the documents leaked by Facebook whistleblower Frances Haugen, you’ll see again and again how these Facebook employees raised concerns about the platform and proposed solutions only to be shot down by executives. That’s why it’s an interesting time to talk to two former Facebook employees who both worked on the platform’s civic integrity team. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Sahar Massachi and Jeff Allen, who recently unveiled a new project, the Integrity Institute, aimed at building better social media. The goal is to bring the expertise of current and former tech employees to inform the ongoing discussion around whether and how to regulate big social media platforms. They dug into the details of what they feel the Institute can add to the conversation, the nitty-gritty of some of the proposals around transparency and algorithms that the Institute has already set out, and what the mood is among people who work in platform integrity right now.

The SEC and the Facebook Papers
This week on Arbiters of Truth, our series on the online information ecosystem, we’re talking about a subject that doesn’t come up much on the Lawfare Podcast: the Securities and Exchange Commission. Facebook whistleblower Frances Haugen has made waves with her congressional testimony and the many damaging news stories being reported about Facebook based on the documents she released. But before these documents became the Facebook Papers, Haugen also handed them to the SEC as part of a whistleblower complaint against the company. So, we thought we should dig into what that actually means. What is the likelihood that Haugen’s SEC filings turn into an investigation into the company? Should Facebook be worried? Evelyn Douek and Quinta Jurecic discussed these questions with Jacob Frenkel, who spent years at the SEC and is now the chair of government investigations and securities enforcement at the law firm Dickinson Wright. He explained how to understand the SEC’s role in cases like these, why whistleblowers like Haugen file complaints with the SEC, and why he thinks it’s unlikely that the agency will investigate Facebook based on Haugen’s disclosures.

Twitter’s Head of Public Policy Explains the Company’s Advice to Regulators
On this week’s episode of Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Nick Pickles, Twitter's senior director for global public policy strategy, development and partnerships. They discussed a new paper just released by Twitter, “Protecting the Open Internet: Regulatory Principles for Policy Makers”—which sketches out, in broad strokes, the company’s vision for what global technology policy should look like. The paper discusses a range of issues, from transparency to everyone’s favorite new topic, algorithms. As a platform that’s often mentioned in the same breath as Google and Facebook, but is far smaller—with hundreds of millions of users rather than billions—Twitter stands at an interesting place in the social media landscape. How does Twitter define the “open internet,” exactly? How much guidance is the company actually giving to policymakers? And, what does the director of global public policy strategy do all day?

Finstas, Falsehoods and the First Amendment
Facebook whistleblower Frances Haugen’s recent testimony before Congress has set in motion a renewed cycle of outrage over the company’s practices—and a renewed round of discussion around what, if anything, Congress should do to rein Facebook in. But how workable are these proposals, really? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Jeff Kosseff, an associate professor of cybersecurity law at the United States Naval Academy, and the guy who has literally written not just the book on this, but two of them. He is the author of “The Twenty-Six Words That Created the Internet,” a book about Section 230, and he has another book coming out next year about First Amendment protections for anonymous speech, titled “The United States of Anonymous.” So Jeff is very well positioned to evaluate recent suggestions that Facebook should, for example, limit the ability of young people to create what users call Finstas—a second, secret Instagram account for a close circle of friends—or Haugen’s suggestion that the government should regulate how Facebook amplifies certain content through its algorithms. Jeff discussed the importance of online anonymity, the danger of skipping past the First Amendment when proposing tech reforms, and why he thinks that Section 230 reform has become unavoidable … even if that reform might not make any legal or policy sense.

Russia Cracks Down on Social Media
In the last few weeks, the Russian government has been turning up the heat on tech platforms in an escalation of its long-standing efforts to bring the internet under its control. First, Russia forced Apple and Google to remove an app from their app stores that would have helped voters select non-Kremlin-backed candidates in the country’s recent parliamentary elections. Then, the government threatened to block YouTube within Russia if the platform refused to reinstate two German-language channels run by the state-backed outlet RT. And after we recorded this podcast, the Russian government announced that it would fine Facebook for not being quick enough in removing content that Russia identified as illegal. What’s driving this latest offensive, and what does it mean for the future of the Russian internet? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Alina Polyakova, the president and CEO of the Center for European Policy Analysis, and Anastasiia Zlobina, the coordinator for Europe and Central Asia at Human Rights Watch. They explained what this crackdown means for social media platforms whose Russian employees might soon be at risk, the legal structures behind the Russian government’s actions and what’s motivating the Kremlin to extend its control over the internet.

Defamation Down Under
Just two days ago, on September 28, CNN announced that it was turning off access to its Facebook pages in Australia. Why would the network cut off Facebook users Down Under? It’s not a protest of Facebook or… Australians. CNN’s move was prompted by a recent ruling by the High Court of Australia in Fairfax Media v Voller, which held that media companies can be held liable for defamatory statements made by third parties in the comments on their public pages, even if they didn’t know about them. This is a pretty extraordinary expansion of potential liability for organizations that run public pages with a lot of engagement. On this week’s episode of Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with David Rolph, a professor at the University of Sydney, about the ruling and its implications.

Inside the Facebook Files
Today, we’re bringing you another episode of Arbiters of Truth, our series on the online information ecosystem. We’ll be talking about “The Facebook Files”—a series of stories by the Wall Street Journal about Facebook’s failures to mitigate harms on its platform. There’s a lot of critical reporting about Facebook out there, but what makes the Journal’s series different is that it’s based on documents from within the company itself—memos from Facebook researchers, identifying problems based on hard data, proposing solutions that Facebook leadership then fails or refuses to implement and contradicts in public statements. One memo literally says, “We are not actually doing what we say we do publicly.” To discuss the Journal’s reporting, Evelyn Douek and Quinta Jurecic spoke with Jeff Horwitz, a technology reporter at the paper who obtained the leaked documents and led the team reporting the Facebook Files. What was it like working on the series? What's his response to Facebook's pushback? And why is there so much discontent within the company?

The Broken Rube Goldberg Machine of Online Advertising
Today, we’re bringing you another episode of Arbiters of Truth, our series on the online information ecosystem. In a 2018 Senate hearing, Facebook CEO Mark Zuckerberg responded to a question about how his company makes money with a line that quickly became famous: “Senator, we sell ads.” And indeed, when you open up your Facebook page—or most other pages on the internet—you’ll find advertisements of all sorts following you around. Sometimes they’re things you might really be interested in buying, even if you’ve never heard of them before—tailored to your interests with spooky accuracy. Other times, they’re redundant or just … weird. Like the ad for a pair of strange plaid pajamas with a onesie-style flap on the bottom that briefly took over the internet in December 2020. Shoshana Wodinsky, a staff reporter at Gizmodo, wrote a great piece explaining how exactly those onesie pajamas made their way to so many people’s screens. She’s one of very few reporters covering the business of online advertisements outside industry publications—so Evelyn Douek and Quinta Jurecic spoke to her this week about what it’s like reporting on ads. How exactly does ad technology work? Why is it that the ad ecosystem gets so little public attention, even as it undergirds the internet as we know it? And what’s the connection between online ads and content moderation?

Content Moderation Comes for Parler and Gettr
Let’s say you’re a freedom-loving American fed up with Big Tech’s effort to censor your posts. Where can you take your business? One option is Parler—the social media platform that became notorious for its use by the Capitol rioters. Another is Gettr—a new site started by former Trump aide Jason Miller. Unfortunately, both platforms have problems. They don’t work very well. They might leak your personal data. They’re full of spam. And they seem less than concerned about hosting some of the internet’s worst illegal content. Can it be that some content moderation is necessary after all? Today, we’re bringing you another episode of our Arbiters of Truth series on the online information ecosystem. Evelyn Douek and Quinta Jurecic spoke with David Thiel, the big data architect and chief technical officer of the Stanford Internet Observatory. With his colleagues at Stanford, David has put together reports on the inner workings of both Parler and Gettr. They talked about how these websites work (and don’t), the strange contours of what both platforms are and aren’t willing to moderate, and what we should expect from the odd world of “alt-tech.”

The Disinformation Industrial Complex
This week on our Arbiters of Truth series on our online information ecosystem, we’re going to be talking about … disinformation! What else? It’s everywhere. It’s ruining society. It’s the subject of endless academic articles, news reports, opinion columns, and, well, podcasts. Welcome to what BuzzFeed News reporter Joe Bernstein has termed “Big Disinformation.” In a provocative essay in the September issue of Harper’s Magazine, he argues that anxiety over bad information has become a cultural juggernaut that draws in far more attention and funding than the problem really merits—and that the intellectual foundations of that juggernaut are, to a large extent, built on sand. Joe joined Evelyn Douek and Quinta Jurecic to discuss his article and the response to it among researchers and reporters who work in the field. Joe explained his argument and described what it feels like to be unexpectedly cited by Facebook PR. What led him to essentially drop a bomb into an entire discipline? What does his critique mean for how we think about the role of platforms in American society right now? And … is he right?

Why the Taliban Can’t Use Facebook
When the Taliban seized power following the U.S. withdrawal from Afghanistan this month, major platforms like Facebook and Twitter faced a quandary. What should they do with accounts and content belonging to the fundamentalist insurgency that was suddenly running a country? Should they treat the Taliban as the Afghan government and let them post, or should they remove Taliban content under U.S. sanctions law? If you’re coming at this from the tech sphere, you may have seen conversation in recent weeks about how this has raised new and difficult issues for platforms thrust into the center of geopolitics by questions of what to do about Taliban accounts. But how new are these problems, really? On this week’s episode of our Arbiters of Truth series on our online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Scott R. Anderson, a senior editor at Lawfare and a fellow at the Brookings Institution, whom you might have heard on some other Lawfare podcasts about Afghanistan in recent weeks. They talked about the problems of recognition and sanctions law that platforms are now running into—and they debated whether the platforms are navigating uncharted territory or dealing with the same problems that other institutions, like banks, have long grappled with.

Facebook Shuts Down Research On Itself
In October 2020, Facebook sent a cease and desist letter to two New York University researchers collecting data on the ads Facebook hosts on its platform, arguing that the researchers were breaching the company’s terms of service. The researchers disagreed and continued their work. On August 3, after months of failed negotiations, Facebook shut off access to their accounts—an aggressive move that journalists and scholars denounced as an effort by the company to shield itself from transparency. For this week’s episode of our Arbiters of Truth series on our online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Alex Abdo, the litigation director at the Knight First Amendment Institute at Columbia University (where, full disclosure, Evelyn will soon join as a senior research fellow). The Knight Institute is providing legal representation to the two NYU researchers, Laura Edelson and Damon McCoy—and Alex walked us through what exactly is happening here. Why did Facebook ban Edelson and McCoy’s accounts, and what does their research tool, Ad Observer, do? What’s the state of the law, and is there any merit to Facebook’s claims that its hands are tied? And what does this mean for the future of research and journalism on Facebook?

With Disinformation, The Past Isn’t Past
We live in the Disinformation Age. The internet has revolutionized our information ecosystem and caused disruption totally unprecedented in human history, and democracy may not survive. ... Just like it didn’t survive the television, radio, telegraph and printing press before it. Right? When it comes to talking about the internet, all too often history is either completely ignored with bold claims about how nothing like this has ever happened before—or it’s invoked with simple analogies to historical events without acknowledging their very different contexts. As usual, the real answer is more complicated: talking about history can inform our understanding of the dilemmas we face today, but it rarely provides a clear answer one way or another to contemporary problems. This week on our Arbiters of Truth series on our online information ecosystem, Quinta Jurecic spoke with Heidi Tworek, an associate professor at the School of Public Policy and Global Affairs and the Department of History at the University of British Columbia. In a recent essay, she made the case for how a nuanced view of history can better inform ongoing conversations around how to approach disinformation and misinformation. So how do current discussions around disinformation leave out or misinterpret history? What’s the difference between a useful historical comparison and a bad one? And why should policymakers care?

Facebook’s Thoughts on Its Oversight Board
There have been a thousand hot takes about the Facebook Oversight Board, the Supreme Court-like thing Facebook set up to oversee its content moderation. The Board generated so much press coverage when it handed down its decision on Donald Trump’s account that Kaitlyn Tiffany at The Atlantic called the whole circus “like Shark Week, but less scenic.” Everyone weighed in, from Board members to lawmakers, academics, critics and even Lawfare podcast hosts. But there’s a group we haven’t heard much from: the people at Facebook who are actually responsible for sending cases to the Board and responding to the Board’s policy recommendations. Everyone focuses on the Board members, but the people at Facebook are the ones who can make the Board experiment actually translate into change—or not. So this week for our Arbiters of Truth series on our online information environment, in light of Facebook’s first quarterly update on the Board, Evelyn Douek talked with Jennifer Broxmeyer and Rachel Lambert, both of whom work at Facebook on the company’s side of the Oversight Board experiment. What do they think of the first six or so months of the Oversight Board’s work? How do they grade their own efforts? Why is their mark different from Evelyn’s? And, will the Oversight Board get jurisdiction over the metaverse?

The FBI, Social Media and Jan. 6
The attempted insurrection on January 6 is back in the headlines. This week, the House select committee investigating the Capitol riot began its work with its very first hearing. So for our Arbiters of Truth series on our online information environment, Evelyn Douek interviewed Quinta Jurecic about social media’s role in warning of the riot. Specifically, they talked about an essay Quinta wrote in Lawfare on the FBI’s failure to examine social media posts announcing plans to storm the Capitol—and how FBI Director Christopher Wray’s explanations don’t hold water. So why does Quinta think Wray has been misleading in his answers to Congress on why the FBI didn’t review those posts from soon-to-be-rioters? What about the First Amendment issues raised by the U.S. government refreshing your Twitter feed? What role is social media playing in the Jan. 6 prosecutions—and what does that say about how tech companies should preserve online evidence of wrongdoing, rather than just taking it down?

Facebook v. the White House: Renee DiResta and Brendan Nyhan Weigh In
This week we're bringing you the breakdown of the heavyweight bout of the century—a battle over vaccine misinformation. In the left corner we have the White House. Known for its impressive arsenal and bully pulpit, this week it asked for the fight and came out swinging with claims that Facebook is a killer—and not in a good way. In the right corner we have Facebook, known for its ability to just keep taking punches while continuing to grace our screens and rake in the cash. The company has hit back with gusto, saying that Facebook has actually helped people learn the facts on vaccines. Period. Will either of them land a knockout blow? Is this just the first round of many matchups? On this episode of our Arbiters of Truth series on our online information ecosystem, we devote the conversation to the latest slugfest between Facebook and the White House. Evelyn Douek and Quinta Jurecic spoke with Renee DiResta, the research manager at the Stanford Internet Observatory, and Brendan Nyhan, professor of government at Dartmouth College, both of whom have been working on questions of online health misinformation. Let’s get ready to rumble.

Florida Man Regulates Social Media
On May 24, Florida Governor Ron DeSantis signed into law a bill designed to limit how social media platforms can moderate content. Technology companies, predictably, sued—and on June 30, Judge Robert Hinkle of the U.S. District Court for the Northern District of Florida granted a preliminary injunction against the law. The legislation, which purported to end “censorship” online by “big tech,” received a lot of commentary and a great deal of mockery from academics and journalists. Among other things, it included an exemption for companies that operate theme parks. But Alan Rozenshtein argues in a piece for Lawfare that though the law may be poorly written, the issues raised by the litigation are worth taking seriously. This week on our Arbiters of Truth miniseries on our online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Alan—an associate professor of law at the University of Minnesota Law School and a senior editor at Lawfare—about the Florida legislation. What exactly would the law have done, anyway? Why does Alan think the judge underplays the potential First Amendment considerations raised by private companies exerting control over huge swaths of the online public sphere? And what’s with the theme park stuff?

Can America Save the News?
The news business in America is in crisis. Between 2008 and 2019, newspapers in the U.S. lost half of their newsroom employees. Journalism jobs cut during the pandemic number in the tens of thousands. Local news is suffering the most, with cutbacks across the country and many communities left without a reliable source of information for what’s going on in their area. Why is this a crisis not just for journalists, but also for democracy? In today’s episode of our Arbiters of Truth series on the online information ecosystem, Evelyn Douek and Quinta Jurecic turn to that question with Martha Minow, the 300th Anniversary University Professor at Harvard Law School. She’s written a new book, titled “Saving the News: Why the Constitution Calls for Government Action to Protect Freedom of Speech.” How should we understand the crisis facing American newsrooms? How has the U.S. government historically used its power to create a hospitable environment for news—and how should that history shape our understanding of what interventions are possible today? And what role does the First Amendment play in all this?

Coordinating Inauthentic Behavior With Facebook’s Head of Security Policy
This week on Arbiters of Truth, our podcast on our online information ecosystem, Evelyn Douek and Quinta Jurecic bring you an episode they’ve wanted to record for a while: a conversation with Nathaniel Gleicher, the head of security policy at Facebook. He runs the corner of Facebook that focuses on identifying and tackling threats aimed at the platform, including information operations. They discussed a new report released by Nathaniel’s team on “The State of Influence Operations 2017-2020.” What kinds of trends is Facebook seeing? What is Nathaniel’s response to reports that Facebook is slower to act in taking down dangerous content outside the U.S.? What about the argument that Facebook is designed to encourage circulation of exactly the kind of incendiary content that Nathaniel is trying to get rid of? And, of course, they argued over Facebook’s use of the term “coordinated inauthentic behavior” to describe what Nathaniel argues is a particularly troubling type of influence operation. How does Facebook define it? Does it mean what you think it means?

Information Operations, Then and Now
This week on Arbiters of Truth, our podcast on our online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Camille François, the chief innovation officer at Graphika, about a new report released by her team earlier this month on an apparent Russian influence operation aimed at so-called “alt-tech” platforms, like Gab and Parler. A group linked to the Russian Internet Research Agency “troll farm” has been posting far-right memes and content on these platforms over the last year. But how effective has their effort really been? What does the relatively small scale of the operation tell us about how foreign interference has changed in the last four years? Has the media’s—and the public’s—understanding of information operations caught up to that changing picture? One note: Camille references the “ABC framework” for understanding information operations. That’s referring to a framework she developed where operations can be understood along three vectors: manipulative actors, deceptive behavior and harmful content.