In February 2021, I moderated a conversation about the issues and solutions surrounding the elevated levels of mistrust in news media.
Since then, things have gotten worse. Mistrust has turned to distrust, and this conversation is as relevant now as it was then.
The experts on the panel are still addressing the problem from different perspectives. We talked about the economics of the news, how fake news is spread, and the realities of producing the news.
I have included the video, so that you can reference the images, tweets, and metrics that we discussed. If you want to zoom in on a specific part of the conversation, just skip to the timestamp referenced in the transcript.
I used Otter.ai, an amazing tool, to create the transcript. Because AI is not perfect, I tried to correct its mistakes, but I may have missed a few. Please forgive me.
Enjoy!!!
[TRANSCRIPT BELOW]
Christine Alemany 0:17
Good evening, everyone, and welcome to today's event brought to you by Columbia Business School's Women's Circle. The Women's Circle cultivates an engaged community that provides a platform for alumnae to support and learn from each other, strengthen partnerships between business leaders and Columbia Business School, and celebrate our collective impact on keeping Columbia Business School at the forefront of management education.
Part of our mission is to encourage open dialog among alumni, faculty, and students around key issues like today's "Theory, Applied" panel.
My name is Christine Alemany, and I will be your moderator this evening. I currently serve as the CEO of TBGA, a marketing and branding firm. I'm an engineer by training and a marketer by heart, and I received my MBA from Columbia Business School.
I am also a founding member of the Women's Circle, which brought together today's panel of esteemed leaders to help us better understand our media problem. I'll begin with a brief introduction of our panelists. Each panelist will walk you through their area of expertise, and then we will open up a live Q&A session where you will have a chance to ask us questions directly. To wrap it all up, our panelists will discuss potential solutions to give us all some light at the end of the tunnel. So please submit your questions using the Q&A button, and use the raise-your-hand feature during the live portion.
Today, I am so grateful for the three esteemed leaders who have joined us to address the issues and solutions surrounding our current news situation.
Professor James T. Hamilton is the Hearst Professor of Communication, Chair of the Department of Communication, and Director of the Journalism Program at Stanford. He has authored multiple books on media markets, and his most recent book, "Democracy's Detectives: The Economics of Investigative Journalism," focuses on the market for investigative reporting. He's currently exploring how the cost of story discovery can be lowered through better use of data and algorithms. James is also the co-founder of the Stanford Computational Journalism Lab, a Senior Fellow at the Stanford Institute for Economic Policy Research, affiliated faculty at the Brown Institute for Media Innovation, and a member of the JSK Journalism Fellowships Board of Advisors. Welcome, Jay.
Next, we have Professor Gita Johar. She is the Meyer Feldberg Professor of Business at Columbia Business School and the school's inaugural Vice Dean of Diversity, Equity, and Inclusion. Gita studies consumer identity, beliefs, and persuasion as they relate to branding, advertising, and media. Her work has been published in top marketing and psychology journals, as well as PNAS and the Stanford Social Innovation Review. Her current research studies why people believe and share fake news and how to develop interventions based on this understanding to help clean up the media ecosystem. She received her PhD from the NYU Stern School of Business and her MBA from the Indian Institute of Management, Calcutta, which gave her its Distinguished Alumnus Award in 2019.
And last, but not least, is Ariana Pekary, an award-winning producer with two decades of experience across public radio and primetime news. Using her unique perspective, Ariana highlights stories found below the fold that help explain those above it. As a producer at MSNBC, she oversaw an Emmy-nominated episode on public housing in addition to helping plan a live daily program. Currently, Ariana is working to rethink financial incentives in broadcast news while serving as a public editor for the Columbia Journalism Review.
Gita, Ariana, James: Thank you so much for joining us this afternoon.
So let me set the stage...
Our media problem has been developing for over a decade, but it recently came to the forefront during the run-up to the 2020 election. In 2019, the Pew Research Center found that many Americans were beginning to see fake news as a critical problem that needed to be fixed. Those surveyed viewed politicians as the major creators of fake news, and they saw journalists as the ones who should fix it. Pointedly, those surveyed also considered this trust problem more dangerous than violent crime, climate change, and racism.
In November 2020, the Pew Research Center saw that the problem had grown. Now two-thirds of adults believed that their news sources presented facts in a way that favored one side during the 2020 election coverage. More than half said that news sources published breaking news before it was fully verified. And 37% said their sources had reported made-up news intended to mislead. At this point, Americans were blaming unfair news coverage on media outlets, and not the journalists who work for them.
Which brings us to today, where we're entering an era of distrust. Edelman's latest Trust Barometer found that people do not trust our society's leaders to do what's right, and that now includes journalists. While we still trust our neighbors and our scientists, our trust in society's leaders is degrading even faster.
And now let's talk about how we got here. Jay, how have you seen market forces come into play?
James T. Hamilton 6:21
Thanks, Christine. What I'd like to talk about today is the market economics for unbiased reporting from three perspectives: audience information demands, the cost structure of stories, and supply incentives. So, if you think about audience information demands, Anthony Downs wrote a great book called "An Economic Theory of Democracy" in which he said everybody on this Zoom call has four basic information demands: consumer, producer, entertainment audience member, and voter.
So Consumer: How do I get this quick?
Producer: What information is going to help me do my day job?
Entertainment audience member: That's what we all watch on Netflix or Hulu or HBO Max.
And voter: That's information that helps you make a better voting decision.
The first three markets work pretty well because if you don't seek out the information, you don't get the benefit. But what Downs pointed out was that, in terms of the voter demand, even if you care a lot about which candidate might win an election, and even if more information might help you make a better decision from the perspective of your own preferences, the statistical probability that you're going to determine the outcome of the election is so small that the net benefits to becoming informed are negative.
He called that rational ignorance. And what he meant was that if you look at becoming informed about the details of policy from an investment perspective, the opportunity cost of your time and what you paid in subscription would be greater than the benefits from your vote because your vote has such a small probability of mattering. That sets up a gap between what you need to know as a citizen and what you want to know as an audience member.
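To make Downs's point concrete, here is a toy version of that calculation; the probability, benefit, and cost figures below are invented purely for illustration.

```python
# Toy illustration of rational ignorance: the expected net benefit of
# becoming an informed voter. All numbers are hypothetical.

p_decisive = 1e-7              # chance your single vote decides the election
benefit_if_decisive = 10_000   # dollar value to you of the better outcome winning
cost_of_informing = 50         # subscriptions plus the opportunity cost of your time

expected_benefit = p_decisive * benefit_if_decisive    # $0.001
net_benefit = expected_benefit - cost_of_informing     # about -$50: negative

print(f"Expected benefit: ${expected_benefit:.3f}; net: ${net_benefit:.2f}")
# The net is negative, so an investment-minded voter rationally stays uninformed.
```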
Now, there are three things that are hopeful about that conundrum. I call them the three Ds: duty, diversion, and drama. Some people feel they have a duty to become informed about politics, so they turn up, they watch, and they become informed. For some people, C-SPAN is as interesting as ESPN. Don't try to build a business around that; that's about the 1% of people who turn up for the NewsHour on PBS. But for them, the details of politics are inherently interesting. And maybe I can't tell you about the details of politics, but I can tell you who's ahead or who's behind in the horse race of an election, or who's involved in a scandal. That's essentially covering politics through the lens of entertainment demand.
But when you step back and ask, "How does the market for fact-based public affairs reporting work?", one glaring problem is this gap between what people need to know as citizens and what they want to know as audience members. And Anthony Downs gave that gap a name: he called it rational ignorance.
Let's next think about the cost structure of public affairs reporting. My particular definition of investigative reporting is original reporting about issues of substantive interest to the community that someone is trying to keep secret. The economic concepts there are the costs of original reporting: there's a fixed cost to figuring out what's going on of substantive interest to people in the community. That's a story that might change laws or lives. And the benefits of that story might spill over onto people who weren't watching it or paying for it.
Those are positive spillovers; economists call them externalities. And they raise the problem that it's hard to monetize that story, because the benefits spill over onto so many other people. And then there are transaction costs, or hassle costs: the government may be trying to keep a story secret, and that makes it more difficult for you to figure out what's going on.
In my book, "Democracy's Detectives," one of the things I tried to show was that investigative reporting can have great benefits to society, but it's hard to produce. One of the case studies I focused on was The News & Observer in Raleigh in 2008. The paper did a series on the North Carolina probation system. What they found was that between 2000 and 2008, 580 people had murdered someone while they were out on probation in North Carolina. That was a statistic that had not existed before they devoted six months and $200,000 to the investigation. Nobody in the penal system, nobody in the court system, nobody in state government had an incentive to do the math to tell you that. Once that story was told, people were fired in the probation system. New people were hired. The state legislature appropriated more than $10 million a year in additional funding for probation. And in my research, I estimate that there were eight fewer murders because of those policy changes in the first year of the policy's implementation.
So, how would you evaluate that? If you have eight fewer murders, those are eight statistical lives saved. The Office of Management and Budget uses a value of statistical life of $9.2 million, so you have about $72 million in benefits, as the government would calculate it. You have $11 million in additional expenditures in the probation system. So, on net, because of what The News & Observer did, you have more than $60 million in policy benefits from that investigation. It cost them about $200,000 to do.
So, for every dollar the paper invested in that story, society got almost $300 in net policy benefits.
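For anyone who wants to check the arithmetic, here's the back-of-the-envelope version using the figures as cited above (a sketch; the book's full accounting is more careful):

```python
# Back-of-the-envelope policy math for The News & Observer probation
# investigation, using the figures cited in this talk.

lives_saved = 8
value_of_statistical_life = 9.2e6   # OMB value of a statistical life, dollars
added_probation_spending = 11e6     # additional probation expenditures
story_cost = 200_000                # what the paper spent on the investigation

gross_benefit = lives_saved * value_of_statistical_life    # $73.6M (~"$72M" as rounded in the talk)
net_benefit = gross_benefit - added_probation_spending     # ~$62.6M, i.e., "more than $60 million"
return_per_dollar = net_benefit / story_cost               # roughly $300 in net benefits per $1

print(f"Gross: ${gross_benefit/1e6:.1f}M  Net: ${net_benefit/1e6:.1f}M  Per $1: ${return_per_dollar:.0f}")
```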
But there are a couple of things to note there. First, the paper couldn't capture most of those benefits, which spilled over onto many different counties; there are 100 counties in North Carolina. And people didn't really recognize it. Nobody rolled out of bed in the morning and said, "It's a great day not to be murdered by somebody else on probation because The News & Observer changed the probation system." Second, it cost $200,000. It's much cheaper, as we know, for somebody to repeat what somebody else has done in a story, or to make something up, than to spend $200,000 to establish the facts of what's really going on.
Finally, let's think a little bit about supply incentives. There are five incentives for the creation of information. One is "Pay me." That's the subscription model. One is "I want to sell your attention." That's the advertising model. One is "I want your vote." That's the partisan model. One is "I want to change how you think about the world." That's the non-profit model. And one is "I just like to talk." That's self-expression, which is essentially the basis of social media.
And what we're seeing now in the news ecosystem is a shift away from advertising, because advertising has gone to Google and Facebook, toward subscription and toward the non-profit incentive. And we also see people increasingly consuming information generated by self-expression on Twitter and on Facebook.
And a problem with that type of supply incentive is that relative to the reputation incentives that a reporter faces, the person on social media who's circulating misinformation may face less reputational damage from doing that.
And thinking about the world of social media is the province of Professor Johar, so I'm going to turn over the screen to her.
Gita V. Johar 14:39
Thank you very much. That was really interesting. And I'm going to share my screen, and I have some slides to share with you.
Okay, so, as Jay put it, these days there's also the need for social expression, which drives a lot of news sharing. It's no longer all curated, all flowing from the producer to the consumer; consumers themselves are producing news as well as sharing it. So what I want to talk about is: why do consumers do this? Why do they share information with each other? And why do they tend to share false information?
Jay talked about this misinformation problem, so that's what I'd like to focus on.
Right up front, Christine talked about the fact that there's lots of misleading content out there. That's a problem people have been talking about definitely for the last four or five years, and it's not just evident but increasing over time as well. So that's something that most people acknowledge.
In fact, in this Pew Research Center poll, a lot of Americans said that made-up news is a bigger problem than violent crime or climate change, for example. So clearly, people see this as a big problem. And a really influential paper published in Science a couple of years ago showed that once lies are created, they tend to diffuse 10 times faster than the truth: lies on social media spread to more people and spread faster than the truth. They showed this by looking at Twitter data and at how different types of information were shared.
So this is the background to what I want to talk about. I've been interested in understanding: even though we all acknowledge it's a problem, why do we tend to believe this false news? Why don't we fact-check it instead of simply buying into anything that's shared with us or anything that we see? And why do we tend to share this false news? These questions map onto two sets of papers I'll talk about: the first on why people don't fact-check, and the second on the motivations for sharing false news.
So let me start with this idea of why people don't fact-check. We all know there are organizations out there, like Snopes and PolitiFact, that do fact-check some news items, but people don't really take advantage of them. Most people tend not to fact-check news, and this is particularly the case on social media.
In this paper, what we study empirically is why it is that on social media, people tend not to fact-check. One intuition here is that when you are in the presence of other people, you feel a diffusion of responsibility. You don't feel it's your responsibility to fact-check, and therefore you don't. So this was the genesis of the study, where we ran a number of experiments trying to understand why people don't fact-check on social media.
And what we found is that on social media, people tend to feel safe; they feel they're in the company of others, and therefore they lower their vigilance. This is similar to animals in a herd, who feel safety in numbers. That's an analogy to how we feel on social media. So what we find in our research is that when you are alone, or when you feel you're alone, you tend to fact-check a lot more than when you're in a group. And in fact, on social media it doesn't matter whether you're deliberately thinking of other people or not, because social media by its very purpose makes you think of being in the presence of other people. And therefore on social media, such as Facebook, you tend to fact-check a lot less than you do at other times.
So we tried to think about what we can do to increase fact-checking behavior, and we found two kinds of interventions, if you will. One is to make people think that their behavior online is going to be known to other people, like their friends and family, so that they're accountable to them in some way. What we find is that if we make people feel accountable to others, as you can see in the chart, their fact-checking actually goes up to the same level as if they were alone. In a different kind of intervention, we made people feel vigilant, in this case in a very stylized way: we told people to think about their duties and responsibilities in life. Simply by doing this, we were able to increase the levels of fact-checking in the condition where they still felt they were in a group. So what you can see is that simply inducing this mindset of vigilance made people fact-check more. While in our paper these are psychological manipulations, if you will, the broader takeaway is that you can create an ecosystem, a platform environment, that actually encourages people to be vigilant. And that might improve the beliefs that we all hold, so that we don't get taken in by conspiracy theories, for example. So this was a paper on fact-checking.
I want to turn now to why people share fake news, or share news in general. I've been looking at this for a couple of years, and I want to talk very quickly about two papers and two findings. In one paper, what we found, specifically with COVID-19 news, is that people who are feeling the most uncertain, who are looking for meaning in life and trying to make sense of all of the news that's out there (and we know about the COVID infodemic; there's a lot of news out there, and it's hard to tell what's true and what's false), are the most likely to share news on social media. And they don't care if it's true or false, or if it's surprising or unsurprising; they're sharing it all in order to try to make sense of the world. Ironically, the people hit hardest by the COVID pandemic, people who are vulnerable, people who are feeling socially marginalized, are in fact the most likely to contribute to this COVID infodemic by sharing a lot of information.
So this research suggests that if you can provide people with some meaning in their lives, then they're going to be less likely to do this. And again, we tried different interventions, and one intervention was actually trying to boost people's sense of meaning in the moment. We did this in a slightly stylized way, but there are ways you could do it that are more ecologically valid, built into the platform. And we do find that people are less likely to share information, whether false or true, if they have a sense of meaning and their sense of uncertainty is diminished.
The last thing I'll talk about is a paper where we look at Twitter data and try to profile those who share fake news on Twitter. So here you see an example of fake news. In fact, talking about media ecosystems, Alex Jones was actually taken off a lot of these platforms because Infowars was spreading misinformation.
So what you see here is misinformation. And then you see that there are people who fact-check this information, which in this case is factcheck.org. Our question is: what types of people are sharing the fact-check, and what types of people are sharing the fake news?
And why do we care about this? The big point I want to make is that if platforms can figure out what kinds of people share this news, they can actually intervene in real-time and prioritize for screening the posts of these people. So that's kind of where we're headed.
And I'll give you a small flavor of some of the findings. So what we're trying to look at is demographics, political ideology, social media user behavior, and from the text of all of the tweets of people who are sharing fake news and fact checks, we're trying to infer their personality and their emotions and build a predictive model that will tell us who is actually likely to share fake news.
So essentially, you can think that each one of us will have a number associated with us, which tells people what is our propensity to share fake news. And a social media platform could therefore prioritize for screening our posts. We're not saying that it should be censored, but we're saying that because there's a limited number of people that can do this fact-checking, we can prioritize based on our model.
So very quickly, I'll give you some findings. What we find is fake news sharers tend to be more male; they tend to be more conservative. Regardless of whether the fake news is political or about sports, we find that people who have a more conservative ideology are much more likely to share fake news. The people who share fake news are very active on social media platforms.
We also find that they share with fact-check sharers this pattern of heightened negative emotions: they tend to be more angry and more anxious than the average social media user. We also find, interestingly, and there are theoretical reasons for this that I won't get into, that fake news sharers tend to talk a lot more about existential things like death, and they talk more about religion. And in terms of personality, what we find is that fake news sharers and fact-check sharers are less conscientious, less agreeable, and more neurotic than others. So all of this is interesting. Again, it's based on theory that I don't have time to get into.
But I think what is most interesting is that, by putting all of this into a predictive model, we can actually find out who is going to be a fake news sharer and who isn't. So as I said, this can help platforms, and this is just to show you that by throwing in emotions and personality inferred from the tweets, we can improve the prediction of who's likely to be a fake news sharer.
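To make the pipeline concrete, here is a minimal sketch of this kind of predictive model. Everything below is a hypothetical stand-in, not the paper's actual data, features, or specification; the synthetic labels only loosely mirror the directions of the findings described above.

```python
# Hypothetical sketch: score users' propensity to share fake news from
# profile and text-inferred features, then rank them for review.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Invented features: [conservatism, posts_per_day, anger, anxiety, conscientiousness]
X = rng.normal(size=(n, 5))
# Synthetic labels whose signs loosely echo the reported directions
logits = 1.2 * X[:, 0] + 0.8 * X[:, 1] + 0.6 * X[:, 2] + 0.4 * X[:, 3] - 0.7 * X[:, 4]
y = (logits + rng.normal(size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
propensity = model.predict_proba(X_te)[:, 1]   # each user's fake-news-sharing score
print("AUC:", round(roc_auc_score(y_te, propensity), 3))
# A platform could sort users by `propensity` and prioritize their posts for fact-checking.
```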
So I will stop there in the interest of time, but I'm happy to take any questions later on. The big takeaway from this last paper is that the emotions and personality of fake news sharers and fact-check sharers differ from those of others in predictable ways, which makes it possible to classify who these people are likely to be.
So just to end with a provocative comment here: when you ask people who is responsible for stopping fake news online, almost everybody agrees that the internet companies are responsible. Beyond that, predictably, Republicans believe that consumers are responsible, whereas Democrats believe the government is responsible. So this brings me to the question of who gets to decide what kinds of stories are shared. Online, we can all be citizen journalists. But when it comes to the media and television and so on, who gets to decide, and what stories get told? And how is that playing into this news media ecosystem that we're seeing needs a fix? On that note, I'll turn it over to Ariana.
Ariana Pekary 26:31
Thank you so much, everyone. I'm honored to be here. And it's hard to follow Gita and Jay, with their expertise and academic backgrounds. But I would like to make the point that all of these issues we're seeing are not just academic. That's the point behind my presentation: I thought I would share some real-life experiences from the newsroom and my time at MSNBC.
Now, I will start off by telling you something that seems obvious, but it was one of the first things I learned when I started at NPR and in public radio. And this applies to all broadcast, whether it is public media (NPR, PBS) or cable news (MSNBC).
The difference between us and online or print outlets is that we have a clock we have to meet. You can't add any more minutes into an hour, and you certainly can't add more hours into a day, the way they might be able to add pages, whether online or in print. So choices really have to be made. And that's why I argue there is misinformation and disinformation by omission: it comes from those choices that have to be made.
So I will tell you an example that came up recently. I quit MSNBC in July, so I can give you some of the background as we go forward.
This was a tweet by Josh Marshall, who is a reputable journalist who works with an online outlet. He tweeted, "Remarkable watching people awake to this issue. So many of us have been writing about for years." He tweeted that on January 12, responding to an NBC News story about right-wing extremists. He's talking about how we've been writing about this issue for years.
And that is true. When I saw this tweet, it really triggered something in me, because I have been reading these stories, valid news stories, for years. Starting probably five years ago, the FBI was talking about the growing threat of right-wing extremists, and it continued on through, obviously, January 6.
When I worked at MSNBC, I would pitch these stories on a regular basis. For whatever reason, they never made it on air, so I felt the same sense of frustration that Josh Marshall did in his experience as a print reporter.
So what I did for this presentation was try to find the stories that were out there. I specifically remember trying to pitch certain stories, especially in June, when there were the Black Lives Matter protests. There were stories and reports like "DHS memo warns domestic terrorist actors could exploit Floyd protests," "Federal arrests show no sign that Antifa plotted protests" (that one was in The New York Times), and "White supremacists and racist terrorists pose the greatest threat of violence." You get the picture. But the story down here in blue is actually an NBC News story: three men connected to the Boogaloo movement tried to promote violence at protests.
So there's all this evidence of right-wing extremist threats, but we didn't do any of those stories when I was a producer at MSNBC for a primetime show. And I wanted to fact-check myself in some ways; I wanted to make sure that I wasn't making this up.
So I went online to see what my host may have said on this subject, and I actually found this story, which is a bit of an extreme example: "Lawrence O'Donnell explains why Donald Trump's warning of violence is gibberish." I think the story was actually from March of last year.
He made the same points on Twitter, in response to fellow MSNBC anchor Chris Hayes asking if Trump's comments constituted a threat of fascist violence by the President. Lawrence O'Donnell said, "I think it's more of a hope than a threat." He said, "Trump's supporters aren't as bad and violent and criminal as he hopes they are." So, again, normally our host doesn't directly contradict the actual reporting that's out there, but here he did.
But this is part of the problem as I see it, and it's frustrating, or was frustrating, for me, because obviously there are lots of reliable, quantifiable stories that don't make it on air, for all the wrong reasons. And I say this is the classic cable cycle: they often ignore warning signs such as these reports. If an FBI report comes out explaining what some threat might be, they will ignore it because, in general, they think a report itself is boring, and there's usually something else more exciting for them to cover that day.
So they focus on the current disaster, which in my mind probably was preventable based on an earlier report, or they focus on the current scandal of the day. Then, once the earlier report comes true, they act shocked and displeased, and the cycle starts all over. Rinse, repeat.
Now, why does this happen? Again, in my opinion this happens every day: there is disinformation and misinformation by omission. I think you can call it disinformation because, for certain stories, at this point they know what the effect is. Perhaps a lot of it is misinformation, where they don't really understand the effects of their choices, but I make the argument that they do know the effects of their choices.
So there are some structural issues, in my mind, in commercial broadcast and cable news. One of them, from my experience at MSNBC, is a top-down management structure. They have the morning meetings, and the president of the network tells everyone what they want to cover that day and kind of how they want it covered.
And they do the same thing at CNN. I don't know for certain that that's how it happens at Fox, but I'm pretty confident it's a similar structure.
There is risk-averse management, so there is fear of scrutiny. You don't necessarily want to report something that somebody else is not reporting; they all kind of copy each other. I know I've been held at MSNBC until the wee hours of the morning waiting to see what CNN does on some breaking news story: Are they actually going to cover it? Are they going to go live or not?
And some of it is also that they all do the same types of stories because their buddies who work on the other shows are doing them too. They want to keep up, and they feel like if they do something different, then they're not cool or they aren't doing the hottest story of the day.
There's a fear of the unknown. I remember pitching a story, probably a year and a half ago. I heard it on the NPR station; it was a story that came out of Boston about medical deportations, and it was pretty horrific: the administration was going to stop the deferrals that allowed immigrants who were here for life-saving medical treatment to stay. I remember pitching the story to my team. They didn't do it. They didn't want to do it. I pitched it several times. There was no reaction, no response.
And then a couple of nights later, Rachel Maddow picked up the story. All of a sudden, they were like, "Oh, we have to do the story. We have to do the story." Because, all of a sudden, it was out there and somebody else was talking about it, they no longer had that fear of the unknown.
And overall, their fear comes down to not wanting to lose the audience. They feel like if you try to introduce a story that is not on the audience's radar already, they will tune out. And the bottom line is that what they care about every day is the ratings, how big the audience is. That's how they're paid.
The advertisers are charged based on the size of the audience. So that is their job, and that's how they get evaluated. If the show is not rating, obviously, then eventually it will get cut.
And so every day at four o'clock, the ratings report comes out. It's broken down by quarter-hour, and they can tell what rated well and what didn't rate well the day before. They will often make changes based on that: they'll either add something back in if it did well or take something out if it didn't. They also look to see what's trending on Facebook and Twitter, which I think is a questionable way to make editorial decisions.
But anyhow, ratings are always front of mind. I will also make the point that ratings supersede ideology. For example, in October 2019, Senator Harris approached our show, offering herself as a guest. Everybody there admired Senator Harris, and we had in fact been trying to get her as a guest pretty much on a weekly basis since she entered the campaign that January. I think she'd been on the show maybe once or twice, but she hadn't made herself available very often. At this point, though, she made herself available; I think things were probably tightening in the Democratic primary.
She also sat on the Intelligence and Judiciary committees, which were relevant to the news of the day. At that point, it was the Ukraine phone call. So in my mind, this was a no-brainer: we're going to want to have her on. But I poked my head into the executive producer's office and said, "You know, Kamala Harris, do you want to commit to her tonight?" And he was digging through his email, looking for the ratings from the last time she was on. He said, "I don't think so. I think she tanked the ratings the last time she was on."
This is an example I sometimes tell journalists who work in public radio, and they tend to be kind of horrified by it, because the idea that we're going to say no to Senator Harris is just a little shocking. But anyway, here's another one. This was the host who decided one night, and this is in March 2018...
I'll wrap this up quickly. There was a shooting of a young man, Stephon Clark, in Sacramento by police. The host is an expert in police use of force. And he made the choice by asking the young tape producer on the show, "Who would you rather watch: Stormy Daniels or Stephon Clark?" And she kind of sheepishly said, "I'd rather watch Stormy Daniels." So, instead of doing the Stephon Clark shooting, they did some non-news story about Stormy Daniels that night.
So I'm cherry-picking a few examples, but I can tell you that this happened pretty much on a daily basis. I could go on, but that pressure also tends to push extremist ideas and hate. The hosts and on-air guests are also evaluated based on how they rate; it's not just the subject. So there are guests that they always felt rated a bit better than others.
And, you know, there's a Problem Solvers Caucus. You might not have heard of them, but you've probably heard of The Squad. I'm not calling them extremists, but the point is that moderate voices and nuanced voices tend to lose time on air. And so we're losing that for the audience.
And I think that is my last slide, so I will stop sharing my screen, and we can get to the Q&A. I don't see where to stop sharing... You're all set, Ariana. Christine, you're on mute.
Christine Alemany 40:50
We're going to open up the Q&A, and we already have quite a few questions. The first question is my own, and I'm going to ask you, Ariana. The breaking news from today was the passing of Rush Limbaugh, and one of the things that has been trending lately is talk about the Fairness Doctrine. I wanted to get your thoughts on how that would affect newsrooms.
Ariana Pekary 41:23
Well, I've spent a lot of time recently looking into this. The Fairness Doctrine was the FCC rule that said that for controversial news topics, you were supposed to present each side equally and fairly.
And the repeal of that doctrine in the '80s gave way to people like Rush Limbaugh. It's pretty clear: if the Fairness Doctrine had still been on the books, it would have been hard for someone like Rush Limbaugh to come to prominence.
There are other factors at play. Media consolidation is also a big factor that happened at the same time. That affected the economics, and I'm sure Jay could talk about that more than I can.
The other thing about the Fairness Doctrine is that it was also misinterpreted quite a bit. When you talk about having to give equal time or equal credence to both sides, there isn't always a moral equivalency. So, from my perspective and what I understand of it at this point, the Fairness Doctrine is a complicated thing to try to bring back, as a lot of people want to do.
From my perspective, people have just kind of thrown up their hands and said, "Okay, we can't infringe on the media outlets' right to free speech, so we're not going to do anything at all." But I think you can probably find some happy medium in there, some other solution to help oversee news outlets.
Christine Alemany 43:15
And this touches on one of the questions that we had in our Q&A. It asks about capitalism, and really the pressure to monetize and grow social media and other media platforms. Is that really at the root of fake news? How much has it helped give rise to fake news?
Ariana Pekary 43:50
Who do you want? Are you asking me, or...?
Christine Alemany 43:53
I'm opening it up, but I'd love to get Jay's thoughts on it, given the market forces he discussed.
James T. Hamilton 44:04
I think that if you look at Google and Facebook, in my mind, they are media companies: they wrap content around advertising. But they don't want to call themselves that, because historically we've always felt that media companies have a social responsibility. It's always been the case that news can generate positive externalities, or spillovers, and we've often relied on means other than the market to generate that, such as owners who take some of their utility from contributing to the public interest.
Google and Facebook clearly resist being called media companies. And yet, I think there is a hopeful thing here: both of them are dual-class companies, so in the case of Facebook, one person, and in the case of Google, three people, have voting control. Both of their annual reports warn investors that those individuals can decide to do something other than maximize profits, that those folks can take action against profit maximization.
So if those four people chose to do something about news, they could do it. The companies are also starting to care about employee morale; engineers went to some of those places "not to be evil." And so if you care about productivity, you care about that.
Also, advertisers. There have been groups, like Sleeping Giants and others, targeting advertisers whose spending supports misinformation. The platforms care about advertising because that's where their money comes from. So that's a long way of saying that the social media platforms are driven by engagement, and engagement is advertising. But I'm seeing some hopeful signs, especially in terms of advertisers, or employees, pressuring the social media companies.
Christine Alemany 46:24
Thank you, Jay. Gita, Katherine had a question about your research: could you expand on how you manipulated a user's sense of meaning, and what the implications are for social platforms, as well as broader interventions that we could use as a society?
Gita V. Johar 46:48
Thank you, Christine. That's a good question. In our experiment, we did it in an artificial way, where we made people imagine that they were the boss of a big company versus the employee of the company. So it was a way of giving power and status, which has been shown to restore a feeling of meaning.
So that's what we did, just to show the effect in that experiment. But I think, in reality, what you could do is use nudges or environmental cues that make people feel more in control. You could use imagery, or textual cues, or anything that makes people feel more of a sense of control and less of a sense of uncertainty. Anything that could accomplish that feeling, I think, would lead to people not sharing so indiscriminately.
Christine Alemany 47:42
Thank you. Ariana, we have a question from Valerie. Is the performance, or ratings, of cable news based solely on the number of viewers and advertising revenue? Or are there any agencies that track quality or the number of false stories?
Ariana Pekary 48:11
They're based on the Nielsen ratings numbers that come in every day, and there are probably some people who question those numbers. Obviously, they're based on a sampling, so the demographics could very well be off, and some people have questioned that from time to time. But what they're basing their decisions on, on a daily basis, are the Nielsen numbers.
I think there are some other numbers that get into more analytics. I don't think those come out on a daily basis, and I never actually saw them; I think they keep those for a smaller group of people. But the decisions we were making every day were based on the Nielsen numbers.
Christine Alemany 49:01
Thank you. We have a very interesting question that I'm going to open to the entire panel. Dana wanted to hear some comments about Australia's proposed news media bargaining code. Are any of you familiar with that? Jay, do you want to give a little bit of your thoughts on the topic?
James T. Hamilton 49:33
Sure, just quickly: all across the world, countries have been trying for years to put pressure on Google to pay a link tax, or to make some transfer to news companies. In Spain, when they did a link tax, Google shut down Google News. And research showed that what that did was give established media players more traffic, because the function of Google News was in part to surface stories and smaller outlets that you might not otherwise see. So it actually benefited incumbents.
Now, more recently, you've seen policies where countries are saying, "Pay a tax or go into forced arbitration." And Google's response has been to create something called the Google News Showcase, which is basically $1 billion over three years that Google will pay news companies for content to feature on Google. So it's an indirect way to pay money to the news industry without getting into a world of actually paying per link, because that would be anathema to Google.
Christine Alemany 50:48
Thank you. Gita, we also had a question that I thought maybe you could give us some insights on. Much of the news sharing is not happening on traditional social media platforms like Twitter and Facebook; it's also happening on dark social, such as WhatsApp, text, or email. Do you have any research on how to combat misinformation when it's behind closed doors?
Gita V. Johar 51:29
That's a great question. Actually, the BBC did a big study, in India, on WhatsApp, because WhatsApp is used to a great extent all over the world, and news is being shared on it all the time. There was recently a big flap because people thought it was encrypted and private, but then Facebook said something about the terms of use, and WhatsApp users got into a flurry about it, because people do share a lot of private information that they don't necessarily want the government, or anyone else, to know.
That's a little bit of an aside, but it goes to the question: how do you study the sharing of fake news on media like WhatsApp?
I feel like the approach we use with Twitter data is an approach that can be used anytime text is being shared, right? Anytime people are writing text, you can mine that text to understand who the people sharing fake stories are. So a similar approach could be implemented with WhatsApp, but it is all private and encrypted, so I don't know to what extent one could have access to those data. But Facebook and the other companies could perhaps think about it if they have access to these data in any case.
I think, even though we're looking at Twitter data, some of the findings converge with what you would expect theoretically. So I do feel like we have a sense of who these people who are likely to share fake news are. And I think it's important to then understand, from a policy perspective, what you do with this information and how you use it. Christine, you're on mute.
Christine Alemany 53:09
I love mute. And this brings us to a point where we've talked about all the problems with the media, and it would be great to hear some of the things you're each thinking about that give us some light at the end of the tunnel. Jay, do you want to start?
James T. Hamilton 53:30
Sure. First, I'd like to thank you all for listening and for organizing this. We really appreciate that.
I'm hopeful because of the clunky term called computational journalism, which is stories by, through, and about algorithms. Case in point: the Associated Press now does about 4,400 quarterly earnings reports, because they do it by software, partnering with a company called Automated Insights. In the old days, when you had to do it by hand, they covered 300 companies. So writing stories by software can expand what you do.
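As a flavor of what "stories by software" means, here is a minimal template-driven sketch; the function, template, and figures below are invented for illustration, and the AP's actual Automated Insights system is of course far more sophisticated.

```python
# Hypothetical template-driven earnings brief, in the spirit of automated
# earnings coverage. Data and wording are invented for illustration.

def earnings_story(company: str, eps: float, eps_expected: float, revenue_m: float) -> str:
    """Render a one-paragraph earnings brief from structured data."""
    verdict = "beat" if eps > eps_expected else "fell short of"
    return (
        f"{company} reported quarterly earnings of ${eps:.2f} per share, "
        f"which {verdict} analyst expectations of ${eps_expected:.2f}. "
        f"The company posted revenue of ${revenue_m:,.0f} million for the quarter."
    )

print(earnings_story("Acme Corp", eps=1.42, eps_expected=1.35, revenue_m=812))
```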
Also, if you look at stories through algorithms: the Atlanta Journal-Constitution used machine learning to do an amazing story about sexual abuse by doctors across all 50 states. They scraped the web, found 100,000 possible cases, and then built a machine learning model to reduce that to 6,000. Then they read all 6,000 cases. So that makes me hopeful.
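And here is a hedged sketch of that triage step: train a small text classifier on labeled examples, then score the scraped documents so reporters read only the likeliest matches. The documents, labels, and model below are invented; the AJC's actual pipeline isn't detailed in this talk.

```python
# Hypothetical document triage: score scraped disciplinary orders so humans
# review only the likeliest sexual-abuse cases (the 100,000 -> 6,000 step).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Invented labeled examples: 1 = order describes sexual abuse, 0 = other
train_docs = [
    "board order describes sexual misconduct with a patient",
    "license suspended for improper sexual contact",
    "license renewed after continuing education lapse",
    "fine issued for late paperwork filing",
]
train_labels = [1, 1, 0, 0]

vectorizer = TfidfVectorizer()
clf = LogisticRegression().fit(vectorizer.fit_transform(train_docs), train_labels)

scraped = [
    "revoked after sexual contact with patients",
    "reprimand for record-keeping violations",
]
scores = clf.predict_proba(vectorizer.transform(scraped))[:, 1]
for doc, p in zip(scraped, scores):
    print(f"{p:.2f}  {doc}")  # keep high scorers for human review
```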
Christine Alemany 54:29
Gita, what are you thinking about?
Gita V. Johar 54:32
So, as we all know, in the last few days some of the people with the biggest megaphones who were also spreading fake news have been taken off platforms. And I think that's definitely cause for hope, because it is true that with COVID, for example, very few websites published most of the fake news. So I think it's a question of getting to those few websites that are sending out these massive amounts of fake news, which gets shared among all their followers and really creates this huge fake news ecosystem. Pinpointing and identifying those and removing them from the platforms is possible.
Christine Alemany 55:11
Ariana, what are you working on right now?
Ariana Pekary 55:21
I have to say, I was in a very dark place last summer before I resigned from MSNBC. I felt like I was in the middle of all of these crises, from the pandemic to the George Floyd crisis to the presidential election campaign. I felt like things were going in the wrong direction on all three fronts, and "Even now, they can't do the right thing" was kind of how I felt.
And so I resigned, and I posted a statement on my personal website, which until then I think had maybe two visitors. I was just explaining why I was leaving, and a lot of it has to do with ratings driving these editorial decisions and how damaging that is. I wrote it thinking it was going to get lost on the internet, because it certainly wasn't going to compete with everything else out there. I figured maybe a handful of former colleagues would reach out and comment on what I had written, but I got thousands and thousands of notes.
My piece, as I say, went viral and got a lot of attention. A lot of people wrote to me saying, "Thank you so much. I might not agree with everything that you say or believe, but thank you for coming out and saying something. I sensed that there was a problem, but I didn't really understand what it was." So a lot of the hatred that I feel is out there is being driven by this, by these financial incentives, and people really responded to that.
And so that's what gives me hope. From average people, moms and dads, to professors, to people who used to work in the industry, to people who work in the industry currently, the response I got was overwhelmingly positive. Columbia Journalism Review then reached out and asked me to be a public editor for them, covering the industry. So I sense that people do want something better, and that makes me think that we will be able to get something better. That's where I am now.
Christine Alemany 57:39
Thank you so much, everyone. I just wanted to share ways that everybody here can follow our experts. I'm so grateful you joined us to address this issue and to help us think through some solutions to our current news situation. Jay, Gita, Ariana: thank you so much for joining us this afternoon.