Blog

Working for All of Us: We Need Democratic Control over Technology

David A. Banks, Rensselaer Polytechnic Institute

It is laudable that in 2016 our leaders are thinking about technology as something that could work against some humans’ interests. When the President of the United States asks how to “make technology work for us, and not against us—especially when it comes to solving urgent challenges,” our first inclination as social scientists should be to define the “us” under investigation. Presumably, the president would reply with “every American,” but even the most cursory reading of science and technology studies literature would suggest that this is impossible. Technologies are neither inherently good nor bad for all humans; rather, they are the artifacts of social action and often participate in political controversy and social stratification before, during, and after their initial creation. The levees in New Orleans, the drones that fly over the U.S. southern border and throughout the Middle East, and the corroded pipes of Flint, Michigan stand in testament to the ways in which technology participates in politics.

A far more realistic question might be: “How do we redistribute control over and allocation of technologies?” Such a reframing decenters the artifact itself and brings analytic focus to the methods we employ to make technology in the first place. If decades of research into innovation and regulation by social scientists have taught us anything, it is that our current paradigm of “design first and regulate later” makes for bad products and even worse policy. The former sets loose unintended consequences that cause real harm to people, and the latter is ham-fisted, too late, or, most commonly, both. Worse yet, as social scientists we may be over-prescribing DIY, citizen-led science and technology at the expense of social movements that would otherwise be changing the landscape of scientific research and technological innovation.

Consider, for example, the issue of fracking. Energy companies have tried to convince the public that they are merely scaling up old and trusted technology when in fact we are entering a categorically new era in humans’ relationship to terra firma. The deleterious consequences of fracking range from the contamination of communities’ water supplies to regular earthquakes where there were none in living memory. The U.S. government’s inability, and unwillingness, to maintain safe drilling practices has been met with a peculiar popular reaction: despite immediate and direct harms, Americans have chosen to carefully but steadfastly appeal to broken regulatory frameworks while also spending considerable time and effort cataloging and monitoring environmental impacts. This reaction seems to have no historical analogue. I feel confident in saying that, historically, if a large outside party arrived and inflicted massive damage on a basic resource, that party would be met with pitchforks and bricks, not test tubes and spreadsheets.

While I am not prepared to unconditionally advocate for pitchforks and bricks, I do think we as social scientists should think twice about our role in advocating for scientific inquiry and technological development by non-experts. The hidden danger in citizen science is that it lends credence to the notion that certainty is needed for collective action. Instead of collecting data, we should be deconstructing the kinds of evidentiary demands that must be satisfied before action is taken on disasters that stem from environmental racism, organized ignorance around issues important to minority groups, and unchecked innovation more broadly. Instead of replacing necessary environmental monitoring with crowdsourced DIY methods, we must absolutely demand that those institutions charged with protecting the common good do their jobs. At some point, we need to know when communities should stop collecting data and start demanding the resources that they are owed.

None of this is meant to disparage the excellent work done by many of the scholars who make up SKAT and Science and Technology Studies. Rather, I am pointing to the simple fact that our institutions of science and governance are increasingly found to be criminally negligent in their duties.

The pipes in Flint are a particularly maddening example of both the government’s and the academy’s indifference to human life. In February, the Chronicle published an interview with Virginia Tech scientist Marc Edwards, who stated unequivocally that environmental crises are the direct result of the “perverse incentives that are given to young faculty,” and that scientists are so reliant on fragile funding networks that calling out wrongdoing in the private and public sectors is a career killer. I identified a similar dynamic in a 2014 Tikkun Magazine essay in which I suggested that a way around it would be a no-strings-attached block grant program for communities paired with “a clearinghouse of sociologists, water chemists, lawyers, economists, and geologists all fully paid by the federal government and willing to work with a community to solve problems identified by its residents.”

It is no longer enough to work on methods and tools that let communities fix part time what well-funded governments and universities are ruining full time. We need programs that aid in producing swift but certain knowledge claims that activate reformed or radically altered governance mechanisms. Making technology work for more people more often starts with building better organizations, not filling in holes in (sometimes intentionally) tattered governance regimes. If, as Bruno Latour says, technology truly is “society made durable,” then creating better technology means creating a better society. This begins not at the design table, but in communities and in the governing institutions that purport to serve them.

David A. Banks is an editor for Cyborgology, a member of the Theorizing the Web organizing committee, and a PhD candidate at Rensselaer’s Science and Technology Studies Department.