Interviewed by Larry Au on Nov 21, 2022 for the Fall 2022 issue of SKATOLOGY, award announcement below provided by Diane Vaughan
Diane Vaughan’s book, Dead Reckoning: Air Traffic Control, System Effects and Risk (Chicago, 2021) has been selected for the 2023 American Institute of Aeronautics and Astronautics Gardner-Lasser Aerospace History Literature Prize, presented annually to the best original contribution to the field of aeronautical or astronautical non-fiction literature published in the last five years dealing with science, technology, and/or impact on society.
Dead Reckoning is an historical ethnography of the life course of the air traffic control system from system emergence through 2017. Based on archival research and fieldwork in four air traffic control facilities, the book focuses on how historical institutional conditions, assemblages of social actors, and events in the system’s external environment – political, economic, technical, cultural – impact the air traffic organization, changing it, and how in turn those changes affect not only the social, technological, and material arrangements of the workplace, but also controllers’ interpretive work, cultural understandings, and work practices. Far from a top-down model, the analysis shows how controllers respond to these events, implementing repairs in response to the liabilities of technological and organizational innovation. It expands what we know about knowledge production, boundaries and boundary work, culture and cognition, expertise, and the changing nature of technical work over time.
Q: At the beginning of the book, you tell this remarkable story about how you gained entry to your field site in the most direct way possible: by signing up for a tour of the Boston Center, a large high-altitude radar facility in New Hampshire, in 1998, where you were the only person who showed up. Could you tell us how you first became interested in air traffic control as a possible negative case and a field site?
A: I had spent most of my career looking at how things went wrong in organizations large and small. Of course, I didn’t realize that I was going to do that when I started out. But some of the things that I saw in my first book, I saw again with similar patterns in the other two books: early warning signs that were either missed, misinterpreted, or ignored until something was seriously wrong. After I did the Challenger case, which was about a large-scale sociotechnical system failure where history mattered, I wanted to be in a situation where I could watch people working, where the work involved technology, and there was some risk involved. I wanted a case where they got things mostly right, and air traffic control was the only place I could envision where the work was standardized enough that I could go into the workplace and sit with them and watch what they did and also interview. It would be my negative case about how people were trained to recognize early warning signs and correct them before there was a catastrophe. So that was what started me off on the project.
Q: I also remember reading in the book that you originally thought it was going to be an article or two rather than a whole book!
A: Little did I know. That’s where history came into this case, because once I was in the setting, I saw the impact of history on the workplace. Controllers would come up to me and tell me incidents in their work history. Also, in 2000 I was wearing a headset from the 1960s, just like they were. All computers and keyboards were ‘60s models too. At the smallest of the four facilities that I studied, a tower, they were connecting and disconnecting their mics with foot pedals that they had to tape together. I saw some changes in technology and work rules and how that had changed the work over time. I left the field in June 2001 to write all this up. And then September 11 happened. I witnessed the effect of history on the system—not even being in it—but I knew how, once the hijackers were discovered, controllers had accomplished the incredible feat of bringing down all the planes in the sky in a little over two hours and fifteen minutes. Handling planes they’d never handled before and without incident. So I went back, and the few articles became an historical ethnography and a book.
Q: One of the things that struck me from the book was the intensity of training for air traffic controllers during boot camp in Oklahoma City. In your account, men would call their wives in tears over the phone after training (p. 138). Did training have to be this way? Are there ways to learn, acquire, and embody the expertise of air traffic control without this hazing?
A: They were really tough on them. Some of them talked about it as being like a military bootcamp. One of the effects was that the training was so rigorous, it transformed them as people. They developed a superior vision and the ability to scan the periphery and look at the sequencing of traffic, not focusing specifically on one plane, but on all of them. And they developed hearing beyond our normal range, which they needed in order to hear everybody else in the room. Dead reckoning was an early navigational skill based on predicting the position of objects in time and space without benefit of observation or evidence of any kind. Dead reckoning for controllers calls for interpretive work and a shared cultural system of knowledge. This is a kind of modern dead reckoning, where they had to predict in advance what airplanes were doing, as well as what their colleagues in other facilities were doing, because they were handing off airplanes from their airspace to the next. This added a serious level of complexity to the work that they were doing at the moment. It was important because they handled so many planes in a minute that they had to be able to do things “without thinking”. I don’t mean that they “weren’t thinking”; they were. But the result of the training was that expertise became so embodied that they could enact the basics while working the anomalies. It saved them time. But to answer your question, later the FAA changed the training, becoming “a kinder, gentler FAA,” affecting dead reckoning.
Q: You wrote about the cultural imagery of air traffic controllers as stressed out and aggressive. But you found that, in contrast to this imagery and to training, stress wasn’t part of the day-to-day work of air traffic controllers. One of your “a-ha” moments came when an air traffic controller overheard you asking about stress and shouted: “Hey, Diane. Do I look stressed to you?” (p. 308). Could you elaborate on how “risk strategies” (p. 375) were deployed by air traffic controllers to manage tense encounters and their emotions?
A: Two things struck me once I was in the facilities. One is that the air traffic controllers would say things like, “this job is not risky”. They would say, “it’s 99% boredom, 1% sheer terror,” and “the stress in this job is not the airplanes, it’s the people you work with”. I was really surprised because I came in with that cultural imagery of job stress. But they use cultural devices to distance themselves from the risk of the work. The way they deal with it differs depending on whether they work in a tower or in a radar facility. People at, for example, Boston Logan Tower would look out the window down at the planes and say, “I never think of there being people on that. It’s just me and the pilot” or, “I look out the window and I see all those planes down there, and I just pretend that it’s like my little train set from when I was a kid”. They find other ways to deal with stress by redefining the feelings they have.
One of the most significant questions that I asked them was about their training and how they learned from mistakes. I asked, “Can you think of some mistake you made and what you learned from it?” They would recall everything: the people they had worked with; the planes in the sky; what other pilots were doing; their emotions. They talked about their heart: “My heart stopped”; “my heart was in my throat”; “I couldn’t speak”; “I stood up in my seat”. The fact that they remembered every detail, often years later, told me about its importance. I followed up with this question about risk and stress. I asked, “Do you think that your work is risky?” And they said, “Oh no. If you follow all the rules, it’s not risky”. And then, “Do you think it’s stressful?” Someone said, “The people who thought it was stressful and risky left a long time ago”, and that “It’s only risky and stressful some of the times, and that’s when you lose control”. Mistakes were clearly moments when they lost control. To deal with stressful experiences, they redefine them culturally. That feeling of their heart in their throat, many people described as, “it’s like a high”, or “it’s like skiing when you come down the slope”. They would redefine this in relation to some normal experience. They lessened it, normalized it. There are other kinds of comments in which they said, “We don’t have time to feel emotion because we have to deal with emergencies and there are rules”. They’re tied up with executing the rules. For example, at a tower, getting ambulances on the field or getting other planes out of the way. But they feel it afterward. This is also, I think, a common bond between them.
Q: A theme that surprised me was the inability of regulators to consider what working air traffic control was like, with regulations imposed on air traffic control without consideration for its effect on existing organizational practices and norms. What might be a lesson that you hope policymakers and regulators take from your book?
A: First of all, the system suffers from a lack of funding and from being politically vulnerable as a public agency. The most crucial thing that happened while I was there was a staffing shortage. Prior to my coming in, the FAA had not been getting enough money to hire new air traffic controllers. Hiring was shut down from 1992 through 2004. Also, controllers who had been hired in the 1980s were retiring. During the Obama administration, when the government shut down, many controllers were laid off and not working. Even when funding was available, the FAA didn’t really start hiring. When they finally did start to hire and train again, it was catch-up ball. New controllers were coming in, but not fast enough, and so the FAA changed the training to speed things up, but the new controllers didn’t have the embodied experiential expertise that the others did. Also, all controllers were learning to work automated air traffic control, and planes were flying according to standardized routes in the sky without pilots or controllers managing them. But there were always incidents when planes would come off of the automated routes and pilots couldn’t handle it, and neither could new controllers. So this is a continuing risk.
Second, I think even FAA officials can’t predict the effects of changes. Controllers have always been responsible for the system’s survival. When I went back in 2017, after they had automated, they were dealing with the liabilities of organizational and technological innovation. Controllers who work at a TRACON (Terminal Radar Approach Control Facility) work in radar. This is a middle-altitude facility, and they work in very small rooms that are dark, maybe six to eight people in a room, so they can hear and see each other. In an effort to modernize and save money, the FAA decided to consolidate TRACONs, mixing small and large facilities. They built new buildings with huge control rooms. Controllers said, “it was like moving from a shoebox to an airplane hangar”. The room was light, not dark. Their workstations were bigger, so they weren’t sitting elbow to elbow and couldn’t hear each other. Also, there was a lot of conflict because each facility had its own way of doing the job. Controllers reorganized their practices, repairing by redesign and by creating a common culture, which is important for coordinated activity among the very different TRACONs that moved in. This alerted me to the concept of resilience: air traffic controllers have always been the people at the bottom of the hierarchy who have supplied the resilience that keeps the system going, and likely workers in other large-scale or small-scale systems do the same. So resilience is a general concept.
Q: On that point about improvisation. The chapters about September 11 and its aftermath were so vividly written (as with the rest of the book). As a respondent clearly puts it, what allowed the air traffic controllers to work that day and safely land the 4,395 planes in two hours and fifteen minutes without incident was: “structure and routine, structure and routine […] if we have to improvise, we improvise from the base” (p. 396). How might structure, routines, and improvisation play a role in other emergencies and disasters?
A: The resilience during that event was based on embodied expertise. Their job involves a great deal of mixing standardization with improvisation. Because it’s a large-scale system, very standardized, with a lot of people doing the same work, you get to see things that you wouldn’t be able to see ordinarily. But there’s lots of variation. All the rules, regulations, and equipment don’t work the same in every facility because their airspace is different. These standardized changes come through, and they have to make them work locally. So that’s a question that they always ask: “How can we make this work here?” We think of this in terms of workarounds and people not conforming to the rules. Some of the informal ways that they resolved a problem actually worked so well that they became formalized. This wasn’t their intent, but to make the system work, they had to improvise. So they had a long history of mixing standardization and improvisation. And I think on September 11th, that worked for them because they were able to improvise and do the job in an unprecedented situation.
Q: This book took many years of careful, patient, and meticulous research to write. When I was reading the book, I could picture you shadowing and chatting with your respondents, learning their craft. The empirical material is incredibly rich. But Dead Reckoning, along with The Challenger Launch Decision, were both such technical books too. What is your advice to researchers in the SKAT Section, working in highly technical topics, on how best to “learn the science” and gain some kind of interactional expertise?
A: Unless you have tenure, don’t do a long book! Seriously, this is a big topic, and I have been writing about what I call “insights in place.” So here are just a few suggestions. Well known among qualitative researchers is that every time we enter an organization and technical system, we learn a new language and new ways of being and doing. This can lead to a new variation of a well-known concept. In both of these books, I define technology broadly, not just computers, scientific instruments and tests, and headsets. Following Latour and Gieryn, the training, the rules and procedures, and the architecture of the room are also technologies of coordination and control that can constrain or enable a controller. Also, temporality matters. Combining observation and interviews is best, because you can’t see what people are thinking. Sitting and observing was crucial, but the time of day and day of the week vary, so being there enough to catch the variation was important. Being able to just hang out was a big advantage, in contrast to Challenger, for which I relied mostly on engineering documents and official investigation interview transcripts. Also, I couldn’t separate interactional expertise from what’s going on in the larger organization that affects the workplace. So if you want to understand that, you have to take into account how a person’s movement through an organization affects expertise. If you look at a hierarchy that’s shaped like a Christmas tree, people at the bottom enter at the outside of a low branch, and expertise is developed as they work toward the trunk. The greatest socialization and intensity of training is right at the trunk of that Christmas tree. But as you move up to the next branch, you also get farther away from what workers do. One of the major problems internally is how information and expertise get passed down or up a hierarchy, and how orders that come from the top get enacted on the way to the bottom.
The problem at NASA was structural secrecy, which worked against engineers’ interests even when everyone was trying to convey information. Developing research strategies depends very much on the type of organization it is, and on understanding the dynamics apart from what people tell us, because their expertise is limited by their position. They don’t have the ability that we do to go to other layers and find out how an order is executed or implemented.
Q: So do you think you would be able to give directions for planes to land and take off if you were handed the controls? Do you think you would be able to act as an air traffic controller if you’re given the opportunity?
A: I could sit with them a long time, but unless I really took the training and worked airplanes, I wouldn’t be able to do the job. I don’t think I could ever even talk that fast. It’s really a young person’s job. They feel that they are slowing down when they are 40 and change their moves for working airplanes to assure safety, “not running them as close.” Mandatory retirement is 56.
Q: Your next book is on Theorizing: Analogy, Cases, and Comparative Social Organization. You’ve published on analogical theorizing in other venues, but could you give SKATOLOGY readers a brief preview of what you hope to accomplish through that book?
A: Analogical theorizing relies on cross-case comparisons of similar events that happen in different socially organized settings, searching for similarities and differences. I think people learn by comparing, based on analogy and difference, and psychology confirms that is the main way children learn. A child touches something hot and immediately connects it when they see something else hot. We have a lot of theories in our heads that we carry around with us. Intuitively, we learn and write on the basis of analogical comparison. What is a citation? Citing a similarity or difference with another work. The book is going to be about that, using comparisons that I have made based on insights in place, and the analogical theorizing other scholars have done using cross-case comparisons, often without even acknowledging it, as it comes naturally. I hope to be able to show how it works and when it does not, so we can teach it.