Fake News and the Future of Journalism

From friend of the section Pablo J. Boczkowski of Northwestern University:

“Every public has its own universe of discourse and…humanly speaking, a fact is only a fact in some universe of discourse.”

Writing those words three quarters of a century before the Oxford Dictionaries named “post-truth” the 2016 word of the year, Robert Park — a former newspaper journalist and one of the founders of the Chicago School of sociology — understood fake news to be an intrinsic element of any information ecology. Long before Mark Zuckerberg came to be treated as a rapacious businessman, noted real and fictitious publishers such as William Randolph Hearst and Charles Foster Kane aimed to exploit the commercial potential of fake news, as did others who predated and succeeded them. On top of intentional attempts to distort or misinform, the many unintentional mistakes caught by the public — and the suspicion that more remain unidentified — have further reinforced a stance of skepticism among media audiences about the inherent veracity of the news report.

Read the full article.

 

A Media Ecosystem for an Age of Fracture

by Shreeharsh Kelkar, University of California at Berkeley

As the dust settles after Donald Trump’s shocking upset of Hillary Clinton in the US Presidential Election, and we await the long-term fallout, I am left wondering about what “objectivity” in media institutions will—and should—look like in the future in this age of political polarization. How might we—social scientists interested in understanding how knowledge is made credible—use our insights and contribute to this debate about the future of media institutions in the United States?

The United States today is heavily polarized along political lines. As political scientists have shown time and time again, the polity of the United States experienced a realignment after the Civil Rights movement. The two major parties no longer have substantial overlap, if any; they cater to entirely different constituencies. Republicans are the party of managers, evangelicals, and the white working class; Democrats, of organized labor, affluent professionals, and minorities. The Democrats have a wider tent and therefore more disagreements, but both parties and their constituencies have sorted themselves into competing and opposing positions on a number of vital issues: the size and role of the welfare state, the scope and status of abortion, and the place of religion in public life.

This fracture was bound to have some effects on the media ecosystem. As Paul Starr has shown, American media institutions have constructed themselves as independent organizations, free of political bias—partly because their distribution was heavily subsidized by the federal government and partly because they came to rely on advertising as their main source of revenue. This is especially true of some of the largest-circulation media institutions: the New York Times, the Washington Post, Time Magazine, Newsweek, the Atlantic Monthly, CBS, NBC, CNN, and others. These institutions produce this objectivity through a variety of practices: separating their business and editorial divisions, keeping fact and opinion separate, and scrupulously reporting (political) conflicts by giving equal coverage to every side. This arrangement worked very well for a pre-polarization age in which both political parties presided over coalitions that spanned the ideological spectrum; indeed, it was constitutive of that age.

Polarization caused the first crack in this media ecosystem: it led to the creation, at the margins, of new media institutions that did not adhere to (and indeed flouted) this form of objectivity, as well as a public that consumed them avidly; conservative talk radio is a prime example. Internet-enabled business models multiplied and scrambled this effect. The early Internet bloggers, attuned to the age of polarization, despised the he-said-she-said model of media objectivity, which they sometimes characterized as “false equivalence.” Their blog posts combined an obsession with facts with an emphasis on a personal, partisan voice. Blogging has long since become institutionalized, with many early bloggers now writing for more established media publications. What has stayed constant is that their model of writing (embodied in outlets like Vox and Talking Points Memo) is attuned to a different model of objectivity: while obsessed with facts, figures, and truth, it proudly spurns the bipartisan style of, say, the NYT or CNN. Facts and opinions are mixed, and every side does not get equal (or similar) coverage.

But perhaps nothing has been more consequential for media institutions than the rise of curatorial “platforms”: the Facebooks and the Googles that remain many Americans’ gateway to the news they consume. Curiously, Facebook and Google have adopted the same stance of media objectivity even as their role in guiding public discourse has increased: they stress that they are non-political arbiters of political discourse. They have invested their organizational identity in becoming “platforms”: as platforms, they have no politics and are simply channeling the voice of their users. As Tarleton Gillespie has argued, the word “platform” sometimes refers to “technical platforms, sometimes as platforms from which to speak, sometimes as platforms of opportunity. Whatever tensions exist in serving all of these constituencies are carefully elided.” Unlike media institutions, which maintained their objectivity through specific organizing practices like separating fact from opinion, the platforms’ main justification is that they use algorithms. In their telling, the mechanical objectivity of algorithms—and the relative absence of human judgment in the process—is what makes their curatorial decisions objective.

Why have Facebook and Google studiously resisted being styled as media institutions and stuck to their guns in calling themselves mere platforms? A couple of reasons are clear: their reliance on advertising for revenue, and the fact that, in this age of polarization, being labeled “political” turns away people of the opposite political affiliation. But there is a third, perhaps more insidious, reason. Facebook and Google are committed to the principle of what has been called “context collapse”: Google wants to index all the information in the world, and Facebook wants to make all activities “social” all the way down. These aspirations are sweeping in that they seek to dismantle the boundaries between activities so that everything is mediated through these platforms; the distinctions between the social and the commercial, the personal and the public, and the political and the non-political are blurred. To slightly twist a phrase coined by Helen Nissenbaum, platforms seek to blur, even exterminate, the “contextual integrity” of particular activities so that they all come to be mediated by the technological machinery these platforms specialize in. Expanding their activities to cover more and more domains is the key to Facebook and Google’s Silicon Valley-inspired business models. Explicitly regulating content or content producers is not conducive to the growth of these platforms.

To understand this, consider two recent episodes: the dispute over Facebook Trends and the current debate about “fake news.” Both raise the question of how, in this age of polarization, we might restore a meaningful public sphere of reasonable political debate.

Facebook Trends, Fake News and Objectivity

After Trump’s victory, an increasing amount of liberal dismay focused on the rise of “fake news”: transparently false and outrageous claims about Hillary Clinton, manufactured sometimes in Macedonia, at other times in California (and, in a demonstration that commerce sometimes knows no politics, by gung-ho entrepreneurial Democrats!), that nonetheless managed to circulate widely among conservative audiences, primarily through sharing on Facebook. Critics alleged that these fake stories circulated more widely than real stories did, that this played some role in the election results, and that Facebook should do something about it.

But before fake news, there was Facebook Trends. A few months ago, the website Gizmodo unveiled what seemed to many a scandalous revelation. Facebook’s Trending Topics were not really algorithmically generated; rather, they were curated by a group of people (to be sure, a very small group, neither trained in the practices of journalism nor paid very well) who worked with what Facebook’s massive programs discovered. Even more damning, this group was making choices that systematically suppressed conservative news headlines while privileging more liberal ones. Or so alleged the main source for the article: a conservative Trends curator.

Scholars have expressed dismay about what the Trending fracas revealed: first, that the general public sometimes seems unaware that there are people behind Facebook’s algorithms; and second, that Facebook’s Trends curators were abysmally paid employees with little or no formal training in any journalistic enterprise, an indication that Facebook treated a fundamental job as contingent work. Conservatives expressed a different kind of dismay: that Facebook had a liberal bias, just like the New York Times, and was systematically discriminating against them.

Facebook’s reaction to the controversy was instructive. Conservative luminaries were invited to Facebook’s headquarters to have their concerns registered. And then Facebook fired its Trending staff and left the algorithms essentially unsupervised. The results have not been good: as the Washington Post has shown, the output of the Trends feature has steadily grown more incoherent.

Facebook could certainly have dealt with the Trends controversy in a different way. They could have strengthened the job description of the Trends curators and hired journalists rather than contingent workers. They might have made the process through which a Trend was captured and published transparent (at a level of abstraction similar to the one they use for their much-vaunted EdgeRank algorithm, so that no proprietary information was revealed). They could have hired more conservative curators and explicitly claimed that their curators were as representative as possible of the general US population. Instead, they fell back on the mechanical objectivity of algorithms. This allowed Facebook to remain a neutral, non-political platform, and it seemed to assuage affronted conservatives.

Facebook’s approach to dealing with fake news has been notably different, perhaps because influential Facebook employees felt unease at their own complicity in the election of Donald Trump. After some initial denials by Mark Zuckerberg, Facebook recently announced that independent fact-checkers would get privileged access to Facebook and would be able to flag egregious articles. Shared Facebook items would thus carry some sort of mark verifying their authenticity. Facebook is also clearly at work on more computational mechanisms to identify false news items. Time will tell whether these methods provoke an outcry from conservatives who argue that the fact-checkers are unfair and harbor some sort of liberal bias.

Rather than doubling down once again on the objectivity of its algorithms, Facebook created what is essentially a political solution to the fake news problem: it outsourced the verification of shared links to third parties with established practices for adjudicating the truth claims made by media institutions. Certain issues remain unresolved: what happens when particular parties dispute the judgment of the fact-checkers? How is the judgment of the fact-checkers incorporated into the EdgeRank algorithm, given the brute fact that items at the top of the Facebook newsfeed tend to be read the most? Time will tell.

We—and Facebook—might conceivably think of other models of objectivity in our efforts to create trustworthy media institutions (i.e., institutions trusted by both liberals and conservatives) in an age of polarization and platforms. In her comparison of American, British, and German regulation of biotechnology, Sheila Jasanoff characterizes the three national regulatory styles as “contentious,” “communitarian,” and “consensus-seeking,” respectively. In the German polity, for example, she finds an emphasis on the representativeness of knowledge-making bodies, while the United States hopes to resolve regulatory issues by having the different parties face off in an adversarial manner through disputes between experts. That Facebook chose to tackle its fake news problem (inasmuch as it can be “tackled”) not by emphasizing the mechanical objectivity of its algorithms, but rather by drawing on the prestige of fact-checking institutions, is a hopeful first step. Platforms are new; there is a great deal of interpretive flexibility in their underlying technical and institutional infrastructure. The future of media objectivity in an age of polarization, Trump, and social media platforms may lie in drawing on alternative civic epistemologies. That is an area where we social scientists might make a useful contribution.

Contact Dr. Kelkar at: skelkar[at]berkeley.edu

Knowledge and Expertise after the Election

By Dan Morrison, Vanderbilt University

Note: This is a revised version of the lead essay in the November 2016 issue of SKATology. Dan Morrison takes full responsibility for its content, and thanks Scott Frickel for his comments on an earlier draft.

Donald J. Trump is President-elect here in the United States. Articles like those Joe Waggle wrote in the November 2016 issue of SKATology on science policy under Republican, Democrat, Libertarian, and Green administrations are now both an artifact of pre-election history and an important document of what might have been. We do know that Mr. Trump garnered over 270 electoral votes and thus will be the next President.

As Waggle recognizes, we do not know much about what science policy under a Trump administration will look like, except to the extent that any decisions will be made with an eye towards economic competitiveness and market dominance. We do know that Myron Ebell, Mr. Trump’s choice to oversee the EPA transition from President Obama to Trump, is a well-known denier of the overwhelming scientific consensus on climate change. His nominee for EPA administrator, Oklahoma Attorney General Scott Pruitt, has sued the agency he has been nominated to run 14 times; one suit aimed to stop protections for his state’s air. Pruitt is known to deny the scientific consensus on climate change, and his selection has potentially devastating consequences for the recent Paris Climate Accord.

Based on many discussions with colleagues in the immediate post-election period, I think it is likely that those who rely on federal funding agencies such as the NSF, NIH, and NEH are experiencing a profound level of anxiety. Those with “soft-money” jobs are concerned either that their grants will be cut or that funding for their granting agency will be slashed to such an extent that future work is in peril. There are just too many unknowns at this point. Past Republican-controlled Congresses have voted to cut funding for political science research. It seems likely that the incoming administration will finance its other priorities by reducing or eliminating several federally funded research programs, with the possible exception of research aimed at protecting national security or increasing economic competitiveness.

We may well be in an era of retrenchment. But we may also be in an era that is ready for sociological analyses of expertise and knowledge. Our area of the discipline may be more important than ever. We have studied the rise of new professions, the creation of academic disciplines, and the construction of expertise. Sociologists of science and knowledge have been active for decades in investigating how expertise is legitimated, and the links between legitimation and power. What might we do within the public sphere to advocate for justified beliefs without turning to naïve positivism?

Related to the problem of expertise is the problem of low information, or active ignorance. In a 2008 article for Sociology Compass, Robert Evans wrote:

… how are we to understand decision-making in the absence of information? This problem is particularly acute for the political sphere where a disinterested or uninformed public can undermine the legitimacy of democratic institutions based on mass participation (228).

I have been reflecting on the earliest sociologists in America, those of the Atlanta and Chicago schools, seeking inspiration for what may be a difficult four years for those of us who would foster democratic values and want America to become America for all.

W. E. B. Du Bois wrote in The Souls of Black Folk, “Honest and earnest criticism from those whose interests are most nearly touched,—criticism of writers by readers, of government by those governed, of leaders by those led,—this is the soul of democracy and the safeguard of modern society” (1903: 45-46). I think that as scholars we have a great deal of responsibility moving ahead. If you are an American citizen, you may have special duties as well. We all must take up that responsibility and defend our society and our institutions, including our colleges and universities as sanctuaries for critical reflection and action. The philosopher and pragmatist John Dewey once wrote:

Society exists through a process of transmission… this transmission occurs by means of communication of habits of doing, thinking, and feeling from the older to the younger. Without this communication of ideals, hopes, expectations, standards, opinions, from those members of society who are passing out of the group life to those who are coming into it, social life could not survive… Unless pains are taken to see that genuine and thorough transmission takes place, the most civilized group will relapse into barbarism and then into savagery (1916: 3).

As always, we have much to do, and several SKAT section members have written extensively about these issues. I am thinking specifically of scholars such as Alondra Nelson, Ruha Benjamin, and Tony Hatch.

Let us begin again in our sociological work that is also political and, if we take up the challenge, oriented towards justice.

References

Dewey, John. 1916. Democracy and Education. New York: Macmillan.

Evans, Robert. 2008. “The Sociology of Expertise: The Distribution of Social Fluency.” Sociology Compass 2: 281-298.

Note: This post has been updated to include information about Scott Pruitt, Mr. Trump’s nominee to lead the Environmental Protection Agency.

In This Together

By Elise Paradis†, Michael W. Freeman*, Michael Kim*, Patricia J. Leake*, and Umesh Poopalarajah*

University of Toronto

† Corresponding author; * equal contributors, in alphabetical order

We, the authors of this text, were brought together in a research methods class taught by Elise Paradis at the University of Toronto, Canada. We are a diverse, interdisciplinary group that includes one American citizen. In this essay, we share a condensed version of our collective thoughts on the 2016 U.S. Presidential Election, as we experienced it individually on November 9th, and collectively in the classroom on November 10th. We see this essay as an act of resistance against disinformation and the degrading of science and truth, and wish to share our personal and scientific journey, as well as our hopes for the future. This text was shortened to meet guidelines; the full-length essay can be found at http://j.mp/ITT_2016E.

November 10, 2016: 1PM. Qualitative research methods class begins.

Mike Freeman

While the confusion, discomfort, and distrust were clear on the faces of fellow Toronto commuters, engaging in conversation about the election, given the jarring result, seemed almost taboo. Sensing this tension and trepidation, Elise decided that conversation “therapy” was not only necessary before learning could begin, but that the analysis we engaged in would also be beautifully aligned with the curriculum of our course. How did members of our class make sense of this election? What themes did we recognize as integral to this election’s result?

Michael Kim

This semester, our Thursday afternoon class was a favourite part of my week. Starting graduate studies for a third time, I felt that I was learning a new approach to how I view and interact with knowledge production. I never needed this approach more than that afternoon in early November. Once we acknowledged that the recent election had had an emotional impact on all of us, documenting, analyzing, and interpreting those thoughts seemed appropriate not only as an exercise in learning methodology, but also as a systematic, scientific, and therapeutic approach to the rhetoric, controversy, and fallacies of the election. Whether we made sense of the senseless is arguable, but I do believe that we jointly created meaning through that exercise. While I wish the campaign and the results of the election had been different, there may be no better illustration of the social construction of knowledge than the 2016 U.S. Presidential Election.

Umesh Poopalarajah

As I entered our class two days after the election, it became very clear that its results had impacted the lives of my peers: facial expressions alone were enough to indicate that confusion and discomfort resonated through each of us in different ways. Elise engaged us in discussions about our feelings surrounding the results of the election to help alleviate any existing tensions, and I found solace in hearing others describe similar reactions. Could it be that half of all Americans are racist and hate women? Did they share these prejudices? Elise jumped up to jot our ideas on the whiteboard, hoping to identify overlaps that we could further explore (see Photos). Through inductive coding, we came up with several ideas and themes that could be used to explain why Americans voted the way they did. Clearly, there was much more to consider.

Elise Paradis

That afternoon it was impossible to avoid discussing the election. Our conversations on qualitative research methods and analytic techniques seemed absurdly irrelevant in the face of such a momentous event. What we had felt to be an impossibility only two days earlier had come true: our neighbours elected someone who fundamentally disregards facts and promised policies that we felt would negatively impact immigrants, women, the poor, people of colour, LGBTQ-identified individuals, religious minorities, etc.

We started with individual responses to the election, listening to everyone as they spoke. Over the next 40 minutes we explored and conceptually mapped issues of class, money, inequality, and the economy; “building walls” as a metaphor; and the blurred distinction between fictions and facts in the election.

From our perspective, neither Clinton nor Trump was particularly well suited to be the darling of the poor and disenfranchised. Trump, however, seemed to have channelled Americans’ self-made aspirations better than Clinton, in spite of his wealth being, ironically, inherited. We consequently questioned whether perceptions of both candidates’ undeniable class privilege had been mediated by deep-seated gender beliefs, racism, and a growing rejection of the left-leaning educated elites. Americans seemed to embrace the idea of a Twitter-accessible President, and though Clinton engaged with social media, voters seemed to have seen these efforts as mere strategy. Trump’s imperfect, meandering online rhetoric perhaps rang closer to the experiences of his supporters—more human, more American. Indeed, Trump’s position of “saying it like it is” aligned well with the backlash against Washington’s out-of-touch, established “elites”, and with what Van Jones, on CNN on November 8th, called “whitelash”: a rebellion of white voters “against a changing country” and against “a black president”.

Finally, we brainstormed several words that we could associate with this basic idea of a “lack of truth” in the election: fictions, myths, fallacies, lies, and fantasies. We were puzzled by how many outright lies and fallacies had circulated, and contrasted them with facts, science, and evidence. We agreed that the Republicans’ rejection of facts and misrepresentations of reality helped explain the Democrats’ failure to win the Presidential Election. We thus summarized the study we might conduct to explore these themes under the following title: “Facts, fantasies, fallacies, and failure in the U.S. Presidential Election.”

Patti Leake

Our class has had a profound effect on how I view my own environment. Our conversations about epistemology and paradigms helped me see and try to understand how people’s diverse experiences and knowledge are denied in pursuit of one “objective truth.” The entitlement to disregard others and their perspectives was a theme that I saw repeated in Trump’s campaign. The topic of our class that week was data analysis, and more specifically, thematic analysis. This was real-world learning—teaching us how to be critical thinkers and meaning-makers.

I did not understand how someone who, from my point of view, had no actionable policy plans and who had expressed discriminatory attitudes towards more than half of the population of the U.S.A. (considering race and gender alone) could be elected President. My friends living in the U.S.A. were already sharing their post-election experiences, which included reminders of past sexual and racial violence. Clearly, those in power have an impact on our lives; the intolerance of Donald Trump had reopened old wounds.

December 29th, 2016. In this together.

As we finish writing this, the United States is a deeply divided country. More Americans voted for Clinton than for Trump, and yet Trump won. The Electoral College has shown its limitations and once again elected a Republican who did not win the popular vote. Trump’s cabinet is a plutocratic line-up of intolerance, bigotry, and climate change denial, which puts us all at risk.

Varied and muddy understandings of ‘truth’ and the broad use of narrative have played roles in politics since its conception, but the gap between narrative and fact has never been clearer than during this most recent American election. Through the direct access and repetition afforded by modern media, today’s politicians can readily and frequently manipulate what we believe to be real and verifiable. This raises the question of how we can approach knowledge given the fluidity of truth in this modern age of technology and politics—what is our working epistemology in an age that is increasingly wary of facts?

In this context it is easy to be fearful or defeatist; to give up on hope and on some of the key values that have made contemporary Western democracies the thriving places that they are: a focus on equality, freedom, and progress. As Canadians, we have weathered the near-decade-long storm of Stephen Harper’s conservative anti-science government to re-emerge as a progressive force on the world stage. Our new Prime Minister, Justin Trudeau, identifies as a feminist and appointed a racially diverse cabinet that is 50% female. He acknowledges climate change, reinstated the long-form census, and chose to reinvest in science and scientists. As an interdisciplinary group of scientists, we believe that we need to speak up and bring all of our analytical tools to the challenges that lie ahead. We need to continue to resist post-truth politics, and continue to engage in critical knowledge production. We have to listen, question, think, learn, and act upon our findings. We must reassert the principles we stand by—knowledge, integrity, equality, social accountability—and be ready to fight.

We hope that you, our neighbours to the south, will learn faster than we did here in Canada when faced with an anti-science government. We hope that you will protect and rebuild your institutions to reflect the values that have been overshadowed by a barrage of lies and a fear of others. We are in this together; we are stronger together.