Organizational disaster and organizational decay: the case of the National Aeronautics and Space Administration

Howard S. Schwartz

Oakland University, Rochester, MI 48309-4401, USA

Schwartz, H.S., 1989. Organizational disaster and organizational decay: the case of the National Aeronautics and Space Administration. Industrial Crisis Quarterly, 3: 319-334.

 

Abstract

 

Organizational decay is a condition of generalized and systemic ineffectiveness. It develops when an organization shifts its activities from coping with reality to presenting a dramatization of its own ideal character. In the decadent organization, flawed decision making of the sort that leads to disaster is normal activity, not an aberration. Three aspects of the development of organizational decay are illustrated in the case of the US National Aeronautics and Space Administration. They are (1) the institutionalization of the fiction, (2) personnel changes in parallel with the institutionalization of the fiction, and (3) the narcissistic loss of reality among management.

Introduction

Explanations of disasters often assume that the disaster was the result of a single, isolated decision that was wrongly made. Indeed, it is typically asserted that the decision making process employed was one which is ordinarily valid but which, in the specific case, crossed over some vague boundary and led to the disaster.

Explanations like this take for granted that the organizational context of the decision was basically sound. Set against the presumed backdrop of the organization's continuing healthy activity, the decision and the disaster that followed from it are seen as an aberration, an unfortunate accident — as much a tragedy for the well-meaning and generally competent individuals who made the decision as for its more direct victims.

While this scenario is certainly accurate in many instances, there are other cases in which an opposing vision may be closer to the facts. Here, the specific decision is seen as fundamentally flawed and as taking place within an organizational context that is generally unsound. Indeed, within this vision the decision is seen as being only one of many bad decisions which the unhealthy organization generates naturally and almost inexorably.

But we have no theories of organization that enable us to understand organizations that are fundamentally unhealthy. Our theories of organization are basically functionalist theories which assume that organizational processes make sense in terms of the overall purposes of the organization. Within this paradigm, these overall purposes go unquestioned and the validity of the fundamental organizational processes which represent them is taken for granted. Thus, within this paradigm, organizational disasters and the bad decisions which lead up to them must be seen as aberrations.

The purpose of the present paper is to show how organizations can decay — how they can become basically unsound; how rational process can become the aberration, rather than faulty decision making and disaster. I will illustrate this process of decay by an analysis of the history of the United States National Aeronautics and Space Administration (NASA) as it led up to the Challenger disaster.

The concept of organizational decay that I present here is a product of psychoanalytic thinking. Psychoanalytic theory is emerging as an important current of thought concerning organizations and especially concerning organization dysfunctions (Kets de Vries and Miller, 1984; Baum, 1987; Hirschhorn, 1988). One reason for this, as I have suggested above, is that orthodox functionalist theories are bound to a conception of organizational rationality. Within them, dysfunctions are necessarily seen as aberrations. For a rich theory of dysfunction, one needs to go beyond the assumption of rationality, and this is the domain of psychoanalytic theory.

I might add that psychoanalytic theory may be described as the study of those truths about ourselves that we do not want to know. As the study of disaster and dysfunction becomes more important for us, I suspect that we will find more and more that what we need to know is what we do not want to know.

The Challenger disaster as an aberration

It will be useful to begin our analysis by considering more traditional orientations to the Challenger disaster. In an article on organizational culture and reliability, Weick (1988) reasoned that:

When people think they have a problem solved, they often let up, which means they stop making continuous adjustments. When the shuttle flights continued to depart and return successfully, the criterion for a launch — convince me that I should send the Challenger — was dropped. Underestimating the dynamic nature of reliability, managers inserted a new criterion — convince me that I shouldn't send Challenger. (p. 25)

Similarly, Starbuck and Milliken (1988) maintained that the catastrophe was the result of "fine-tuning" that had gone too far. For them, the disaster arose in the context of a natural intra-organizational conflict between managers and engineers:

Engineers are taught to place very high priority on quality and safety. If engineers are not sure whether a product is safe enough, they are supposed to make it much safer than they believe necessary...

[But] safety factors are, by definition, supposed to be unnecessary... To reduce waste and to make good use of capacity, an organization needs to cut safety factors down...

...successful experiences make safety factors look more and more wasteful...

Although engineers may propose cost savings, their emphasis on quality and safety relegates cost to a subordinate priority. Managers, on the other hand, are expected to pursue cost reduction and capacity utilization, so it is managers who usually propose cuts in safety factors. (pp. 24-25)

Thus, incremental reduction in safety factors on the basis of successful experience, a form of what Starbuck and Milliken call "fine-tuning," is a normal and natural organizational process — a part, indeed, of the manager's job. And it is natural, normal and even commonplace to pursue it until disaster happens. The point is not to stop fine-tuning, but to learn from the disasters that it inevitably creates on the road to progress.

In all of this, there is no hint that there was anything wrong with NASA. Indeed, Weick is even worried that a loss of faith in NASA's reliability will have the paradoxical effect of decreasing its reliability. For reliability is "dynamic" and grows out of faith in the reliability of the system. This faith makes it possible for the system to act. Then, vigilance in the course of the action creates the reliability that had been assumed. Thus:

The importance of faith in holding a system together in ways that reduce errors has been discussed for some time as "The Right Stuff"...

While this mechanism is sometimes interpreted as macho bravado, it is important to remember that confidence is just as important in the production of reliability as is doubt. The mutually exclusive character of these two determinants can be seen in the growing doubt among astronauts that they have been flying the safe system they thought they were. Notice that the system itself has not suddenly changed character... (pp. 27-28)

But a closer look at the context of the Challenger disaster reveals difficulties with these analyses. It reveals, first of all, that the decision to launch the Challenger was not based on a sound principle that was overapplied. Second, it reveals that NASA was far from being healthy. Let us take these one at a time.

To begin with, Starbuck and Milliken's claim that managers were trying to remove unnecessary safety factors is incorrect: there were no safety factors, and the managers were blind in not knowing this. The fact is that the shuttle flights were not successes. Many of them were near-catastrophes and had been so for a long time. Below I note a number of system components which were regularly experiencing serious problems. Considerations of space prevent me from going into these problems in detail. Perhaps it will be more useful to quote from the Rogers Commission (RC, 1986) on what the idea of a "safety factor" meant to NASA:

From the beginning, Thiokol had suspected the putty was a contributing factor in O-ring erosion, even after STS-2. In April 1983, Thiokol reported on tests conducted to study the behavior of the joint putty. One conclusion of the report was that the STS-2 erosion was probably caused by blow holes in the putty, which allowed a jet of hot gas to focus on the primary O-ring. Thiokol discovered the focused jet ate away or "impinged" on portions of the O-ring. Thiokol calculated that the maximum possible impingement erosion was .090 inch, and that lab tests proved that an O-ring would seal at 3,000 psi when erosion of .095 was simulated. This "safety margin" was the basis for approving Shuttle flights while accepting the possibility of O-ring erosion. (p. 133)

The second premise, that the system was healthy, also turns out to be false. Rather, a closer look at the organizational context shows that, despite Weick's claim, there certainly was something wrong at NASA. Indeed, the system had changed its character. To be sure, it had not changed suddenly. Nonetheless, over the years, NASA had become a hollow shell of its former self.

Consider the problems that had arisen in four cross-cutting dimensions:

(1) Hardware problems: the solid rocket booster joints that were found to have caused the Challenger explosion were far from being the only unreliable items in the shuttle system. On the contrary, the Rogers Commission found that the wheel, braking and steering systems were all faulty; and that the main engines had a number of serious problems, including cracks in the turbine blades, valve failures and leaks from heat exchangers.

(2) Loss of administrative control: NASA had virtually lost control of its spending and had wasted, according to federal audits, at least $3.5 billion:

In the last 15 years...bad administration and spending abuses have been found in virtually every aspect of the NASA operations, from running the shuttle to developing planetary probes, from satellites to construction of buildings, from space experiments to employee overtime, from headquarters to field centers, according to the [General Accounting Office] documents. (New York Times, April 23, 1986)

(3) Loss of technical control: in its early years, NASA had maintained the technological capability and the staff to oversee its contractors. Indeed:

[James Webb, NASA Administrator from 1961 to 1968] would not allow NASA to fall behind its contractors technically. He demanded that NASA employees always know more about their programs than the contractors working for them. When the electronics of Apollo seemed to go beyond the agency's knowledge, Webb pushed through a NASA electronics center at MIT. (Trento, 1987, p. 56)

By 1982 this capability had been lost, and contractors had become free to do whatever they wanted with impunity (New York Times, Business Section, June 29, 1986; Trento, 1987, pp. 208-209, 239).

(4) Loss of control over operations: NASA came to have extreme and increasing difficulty in conducting and coordinating the complex processes involved in shuttle operations. The Rogers Commission, in assessing NASA's difficulties in this area, maintained that:

An assessment of the system's overall performance is best made by studying the process at the end of the production chain: crew training... (p. 166)

And, in this regard, they quoted astronaut Henry Hartsfield:

"Had we not had the accident, we were going to be up against a wall; STS 61-H... would have had to average 31 hours in the simulator to accomplish their required training, and STS 61-K would have to average 33 hours [note: normal time was 77 hours]. That is ridiculous. For the first time, somebody was going to have to stand up and say we have got to slip the launch because we are not going to have the crew trained." (p. 170)

On the whole, the picture of NASA that emerges from thorough investigation is of an organization characterized by generalized and systemic ineffectiveness: an organization in which the flawed decision to launch Challenger was not an aberration but a normal and ordinary way of doing business. James Webb — the man who, more than any other single person, had built NASA — put it this way: "There was an organization that was regarded as being perfect, that suddenly doesn't do the simplest thing" (Trento, 1987, p. vii).

Under the circumstances, the focus of inquiry into the Challenger disaster changes. It becomes not a question of how a specific decision could be made at a specific time, but a question about the organization as a whole. Specifically, it becomes a question as to how an organization which "was regarded as being perfect," that placed men on the moon, became an organization that "doesn't do the simplest thing."

The concept of organizational decay

I have argued elsewhere that, for the committed organizational participant, the idea of the organization represents an ego ideal — a symbol of the person one ought to become such that, if one were to become that person, one would again be the center of a loving world as one experienced oneself as a child (Schwartz, 1987a, b, c). The ego ideal represents a return to narcissism (Freud, 1914, 1921; Chasseguet-Smirgel, 1985). It represents an end to the anxiety that entered our lives when we experienced ourselves as separate from our apparently all-powerful mothers.

With regard to organizations, this means that individuals redefine themselves as part of an organization, conceived as perfect: an organization in which members are perfectly integrated into a collectivity which is perfectly adapted to its environment. An image of an organization serving as an ego ideal may be called an "organization ideal" (Schwartz, 1987a, b, c).

The problem with the organization ideal, like any ego ideal, is that it can never be attained. It represents a denial of our separation, finitude, vulnerability and mortality; but these remain with us by virtue of our existence as concrete individual human beings (Becker, 1973; Chasseguet-Smirgel, 1985; Schwartz, 1987c).

Given the power of the belief in the possibility of attaining the ego ideal, organizations often attempt to generate a way of preserving the illusion of the organization ideal in the face of the failure of the organization to exemplify it. The attempt to manage an organization by imposing this illusion is what I call, following Shorris (1981), organizational totalitarianism (Schwartz, 1987a).

The problem is that such symbolic manipulation places falsehood right at the core of organizational functioning and therefore cannot help but lead to a loss of rationality. For the return to narcissism is impossible, short of psychosis (Chasseguet-Smirgel, 1985), and therefore organizational totalitarianism means the superimposition of a psychosis upon organizational functioning. Ultimately, whatever the gains in motivation, such a loss of rationality must lead to generalized and systemic organizational ineffectiveness of the sort we found at NASA.

Moreover, I suggest that this condition of generalized and systemic ineffectiveness has a unity to it, and therefore represents something like an organizational disease. I give it the name organizational decay (see also Schwartz, 1989).

Some aspects of the decay process

There are a number of aspects of organizational decay. In this paper, I shall consider three of them. First is what I call the institutionalization of the fiction, which represents the redirection of the organization's approved beliefs and discourse from the acknowledgement of reality to the maintenance of an image of itself as the organization ideal. Second is the change in personnel that parallels the institutionalization of the fiction. Third is the narcissistic loss of reality which represents the mental state of management in the decadent organization.

The institutionalization of the fiction

The commitment to a bad decision

If the organization were the organization ideal, it would never make a bad decision. Since no organization is or can be the organization ideal, this means that they all make bad decisions sooner or later. The institutionalization of the fiction of the organization ideal begins when the organization, trying to justify its bad decision, becomes committed to it (see Staw, 1980).

In the case of NASA, the original bad decision was the decision to build the shuttle on the cheap. What made this a bad decision was that the low figure for development that NASA accepted ruled out the original idea of a reusable shuttle system that could inexpensively and reliably carry payloads into orbit.

Denial of reality through the idealization of the organization

The underfunding that began at this point need not have been fatal. What was fatal was that, having made a deal to develop the shuttle cheaply, NASA management magnified their sense of competence into believing that, since they were NASA, they could still realize the original idea of the space shuttle.[1]

NASA had two strategies that could have led to a viable shuttle program. The original plan, which would certainly have been the best in the long run, was to build an adequately funded shuttle system that would have permitted cheap operation. The second acceptable alternative was to build the shuttle cheaply and compensate with high operating costs. The Nixon administration cancelled the first possibility, but the second remained. However, the second strategy would have required NASA to recognize the severe limitations that the restricted developmental budget had placed on the shuttle, and it was this sense of limitation they could not accept. Hence, they chose a third strategy: building the shuttle cheaply and operating it cheaply. This strategy involved the denial of the reality of the shuttle system's limitations, countering it with a fantasy of the shuttle's perfection as a product of perfect NASA.

Senator and former astronaut John Glenn, interviewed on the news program This Week with David Brinkley, described this cultural transition as follows:

Well, I think there has been, and I think back in the days when I was in the program I think there was a can-do attitude, a go-for-it attitude, and safety was paramount. Bob Gilruth, when we first got in the program, told us back in those days, "You know, any time you have a question about safety, let me know and we'll change, we'll do additional tests, we'll do whatever." And I think that can-do attitude, perhaps at least with some people at NASA... was replaced by a can't-fail attitude, and I think that's unfortunate that that crept into the program.

And Eugene Cernan, another former astronaut, said on the same program, "I think they were just caught up with the fact that 'Hey, we're infallible. We can't help but succeed.'"

Moreover, there is evidence that the choice of totalitarianism, as described above, developed at NASA over the issue of maintaining the idea of building the shuttle cheaply.

Trento quotes NASA veteran John Naugle to this effect:

I think Fletcher [NASA Administrator under Nixon] felt sincerely that if he couldn't justify the shuttle economically, he couldn't make it go. And that was where my feeling was that if he had gone back to Nixon and said, "There is no way Jose that I can justify this economically; we either do it as an R&D program because it ought to be done, or we go out of the manned space flight business." (p. 119)

But Fletcher did not do that. Rather, when reality intruded upon NASA's idealization of itself, it appears that NASA suppressed reality. Thus, Naugle says:

Up until that era there, I never worried about saying what I felt. I always felt my bosses...while they might not agree with me, they might slap me down, they might quarrel with me, but they were not going to throw me out just because I brought them bad news. And somewhere between the time Fletcher came on board and the time he left, I no longer felt that way. (p. 121)

Further decision making on the basis of the idealization of the organization

If beliefs existed only in the mind, there would be no problem caused by the belief in the organization ideal, but in fact they determine actions. The fate of the Challenger was sealed by the decisions made on the basis of NASA's self-idealization.

A good example of the systemic nature of organizational decay was the decision to declare the shuttle, after only four flights, "operational." The Rogers Commission observed that the use of the term "operational"

...has encountered some criticism because it erroneously suggests that the shuttle had attained an airline-like degree of routine operation. (p. 5)

This connotation of the term "operational" is one that NASA bought into entirely, and as a result placed demands on the shuttle system that simply could not be met and that would, according to many experts, have resulted in disaster even if flight 51-L had been postponed (see, for example, Murray, 1986).

The Rogers Commission documents numerous problems that arose from the declaration of the shuttle as operational. One example which will serve for all is this:

...the capabilities of the Shuttle processing and facilities support work force became increasingly strained as the Orbiter turnaround time decreased to accommodate the accelerated launch schedule. This factor has resulted in overtime percentages of almost 28 percent in some directorates. Numerous contract employees have worked 72 hours per week or longer and frequent 12-hour shifts. The potential implications of such overtime for safety were made apparent during the attempted launch of mission 61-C on January 6, 1986, when fatigue and shiftwork were cited as major contributing factors to a serious incident involving a liquid oxygen depletion that occurred less than five minutes before scheduled lift-off. (p. 171)

From the point of view of our theory, the concept that the shuttle was operational was a specification and extension of the organization ideal and had the effect of ramifying the denial of reality. On this we may quote Bruce Murray (cited above):

I think they were caught up with an unexamined assumption by this point of time, which was that the shuttle could be operational and everybody was doing his best to make what I think was a myth be true...Not stupid, I think it was bad judgment...because the people were so under pressure and so blended into a mold that the shuttle was really a safe, reliable vehicle, that they no longer questioned that assumption. I think they stopped questioning that about four or five years earlier. (pp. 6-7)

Thus, we may observe that commitment to bad decisions leads to the denial of reality through self-idealization, which leads to further bad decisions, and so on.

Personnel changes in parallel with the institutionalization of the fiction

Advancement of incompetent individuals on the basis of ideology

To the extent that the core organizational process becomes the dramatization of the organization as ideal, the evaluation of individuals for promotion and even for continued inclusion must be made on the basis of how much they contribute to this dramatization. This means that, increasingly, promotion criteria shift from competence to ideological purity. Those individuals who are retained and promoted will therefore be those who know very well how things are supposed to be, according to the dominant ideology, but who will know less and less about reality insofar as it conflicts with, or simply is independent of, ideology.

In government service, since the promotion system, especially at high levels, is controlled by individuals in the government itself, the relevant ideology is not so much the organization ideal, but the national ideology of the governing group.[2] In this case the organization ideal is understood as a representation and example of the broader national ideology of the ruling group.[3]


Thus, partisan politics on the national level comes to be the determinant of promotability.

Trento provides evidence that the shift from competence to political suitability increasingly determined who would be promoted to top level administration at NASA. Consider this progression:

[James Beggs, NASA administrator under Reagan] spent a year at NASA under Webb. Beggs' wife Mary remembers Webb introducing them to President Johnson. "Jim was a Republican in NASA and they knew it. Jim Webb knew it...He [Webb] said, 'I want you to know we look for people who can do the job in NASA, and we don't look for party affiliation,'" Mrs. Beggs remembers. (pp. 179-180)

But,

NASA was changing under Nixon. Paine agreed to accept political appointees to take over the NASA legal and legislative affairs offices. After that he found the White House pushing for more and more political appointees. (p. 90)

And,

During the Nixon administration, people looking for political jobs had to be more than true-blue Republicans. They had to be Nixon loyalists. (p. 96)

Then, with the advent of the Reagan administration, things went from bad to horrible. In the light of the idealization of business in Reagan's administration, consider what the following passage suggests about the reasons behind the choice of Beggs:

[Former NASA Comptroller] Lilly described Beggs as a "nonentity" in his earlier stint at NASA. After all, to Lilly, Beggs was first and foremost a contractor. Unlike old NASA hands, Beggs believed that the contractor and government were a partnership and not even occasionally adversaries. Such a relationship was the ideal born out of a free-enterprise system and representative democracy. (p. 184)

However,

Although he worked for Reagan's election, he was not one of the new, ultraconservative Reaganite true believers. As a lifelong Republican businessman, Beggs did not realize that the conservatives' agenda was not subject to the kind of compromise that he was used to. If you were not one of them, you were against them. If Jim Beggs was an obstacle, he would be removed. (p. 184)

And,

For all his experience in the corporate and political world, Jim Beggs was not prepared for the Reagan White House. He did not understand that appearance meant more than substance. That outward adherence to doctrinaire conservative philosophy meant more than the quality of the work. (p. 253)

Indeed, Trento suggests that the sin that finally led to Beggs' removal, through the vehicle of a spurious indictment engineered by the Reagan administration, was that, while he was Administrator, former radicals Jane Fonda and Tom Hayden were invited to a shuttle launch.

As the political criteria for NASA selection became more important, Trento shows, the managerial and administrative competence of its high officials steadily decreased. The ultimate act in this tragedy came with the appointment of William R. Graham as Beggs' deputy, a position which led to his appointment as acting administrator eight days later when Beggs was forced to take a leave of absence following his indictment.

Graham was forced on Beggs, who was tricked and browbeaten into taking him even though (1) Graham's background was not in the space program but as a nuclear weapons expert, (2) the largest group he had ever managed was 12 analysts at the RAND Corporation, (3) "I [Beggs] had been warned by this time that the guy was a right-wing kook, a nut..." (p. 261), and (4) NASA was under terrible pressure and the job of deputy administrator was no place for on-the-job training...

Since the explosion of the shuttle occurred only two months after his appointment as acting administrator, Graham's unsuitability for the top NASA post was not long in publicly emerging. He demonstrated that his knowledge of shuttle operations was deeply inadequate. This is from the New York Times of February 3, 1986:

William R. Graham, Acting Administrator of the National Aeronautics and Space Administration, said that solid-fuel booster rockets were "some of the sturdiest parts of the entire shuttle system."

"They are considered primary structure, and not susceptible to failure," Dr Graham said...

Dr Graham also said that, if there had been some warning, the seven-member crew might have had time to attempt an emergency landing here at the Kennedy Space Center. (p. 1)

All of these statements were false and would have been known to be false by anyone who had more than a passing acquaintance with NASA's operations. But Graham's lack of knowledge was at least partly due to his lack of experience. While regrettable, this would not necessarily have been his fault, and conceivably could have been corrected if he had had the time.

What was much more disturbing was an apparent failure to recognize his limitations that would have precluded a serious attempt to correct these inadequacies. Thus, for example, when Graham's name came from the White House as a candidate for the job, Beggs said that he had offered Graham another job in NASA that would be consistent with his qualifications. But, according to Beggs, Graham refused to take any other position. Again, on the day he took over from Beggs, according to Trento, "Graham...told a reporter, 'I'm in full charge and I intend to run this agency as though I am'" (p. 272). This arrogance is what seems to me to be the fatal flaw here, both in Graham and in the system that promoted him to his position. To explain it we may return to our theory.

Remember that the promotion system in the totalitarian organization is geared to advance individuals who idealize the organization. Then note that this idealization will be most pronounced toward those who most represent the organization: its highest officials (Schwartz, 1987a, b, c, d, 1989; see also Klein and Ritti, 1984, pp. 170-172; Sievers, 1986). Thus, Graham's arrogance may be understood as a natural concomitant, and even a requirement, of his high position in an organization like this. His position meant that he could, and even should, idealize himself and require that others do the same. He was, according to his ideology and the ideology of the increasingly totalitarian NASA, the ego ideal, and that meant to him that his ideological agenda was the meaning of NASA. Accordingly we find, at a time when NASA was burdened by perhaps the greatest degree of pressure that it had ever experienced, this:

I [Beggs] did go about twice a week to pick up my mail and answer phone calls which were numerous. All the NASA people wanted to tell me what he [Graham] did today. The first thing he did was issue a lot of directives. He acted like a typical analyst. He sat in his office with the door closed and wrote directives. The first directive he wrote described how you were supposed to wear your NASA badge. The second directive he wrote was a standards of conduct memo on what was permissible and what was not permissible for NASA employees. (Trento, 1987, p. 277)

And this:

Beggs, Kennedy Space Center Director Dick Smith, and others remember that Graham was very concerned about the guest list for the 51-L launch. Beggs said he received a phone call a few days before the scheduled launch of 51-L from the Public Affairs Office. "They said, 'What's with this guy Graham?' And I said, 'I don't know. What's he doing now?' And they said, 'Well, he [the Public Affairs Officer] says, "He's [Graham] scratching names out he says he is going to get in trouble with on the Hill."'" The public affairs people told Beggs that Graham was taking Democrats and any others he perceived to have liberal leanings off the list. (p. 282)

Discouragement and alienation of competent individuals

Another result of this sort of selection must be that realistic and competent persons who are committed to their work must lose the belief that the organization's real purpose is productive work and come to the conclusion that its real purpose is self-idealization. They then are likely to see their work as being alien to the purposes of the organization. Some will withdraw from the organization psychologically. Others will buy into the nonsense around them, cynically or through self-deception (Goffman, 1959), and abandon their concern with reality. Still others will conclude that the only way to save their self-esteem is to leave the organization. Arguably, it is these last individuals who, because of their commitment to productive work and their firm grasp of reality, are the most productive members of the organization. Trento cites a number of examples of this happening at NASA. Considerations of space preclude detailed discussion here.

The narcissistic loss of reality among management

As we saw exemplified in the case of William Graham, management in the totalitarian organization comes to believe that it has attained its goal of becoming again the center of a loving world. It takes itself to be the ego ideal and insists that it be taken as such by subordinates, even to the extent that information which conflicts with management's overvaluation of itself will be withheld. This is the case with information concerning the state of the environment, but it is true as well of information concerning the internal state of the organization.

Thus, subordinates will know that their security and advancement depend on the success of their portrayal of the organization as the organization ideal: of its management as perfect management and of themselves as perfectly integrated employees. Given their need to believe in the organization ideal, and as part of the cultural transformation, they may even repress their own perceptions and come to believe this. Whether they believe it or not, this dramatization will further decrease management's hold on reality and render it increasingly incapable and the organization increasingly ineffective. This in turn will increase the demands on the subordinates to assist management in divorcing itself from reality. This narcissistic state of NASA management was revealed in its response to the Cook memorandum.

Richard C. Cook, a budget analyst for NASA, was assigned to assess the impact of any problems with the solid rocket boosters (SRBs). In a memorandum written July 23, 1985, he warned that flight safety was being compromised by erosion of the O-ring seals and that failure would be catastrophic. After the explosion, he wrote another memo, referring back to his first, which was leaked to the Rogers Commission, which called him to testify on February 12. Cook claimed that his information was based solely on what the engineers working on the SRBs had told him, but the Rogers Commission was dismissive and called NASA witnesses to refute his charges. In evaluating their response, recall that this was before it had been established that erosion of the O-rings was, in fact, the problem.

Rogers: Ah, it's fair then to say that after or at about the same time Mr. Cook's memorandum was written in July '85, that you and your team were, had been and were at that time conducting a lot of investigations, doing a lot of work about the O-rings.

[David] Winterhalter [Acting director of NASA's shuttle propulsion division]: That's correct, sir.

Rogers: But in the final analysis, the qualified people, the engineers and others who were assigned responsibility of their decisions have to make the decisions.

Winterhalter: That's true. And I pride, I prided myself on our division to be particularly good team workers. We have our differences, we work 'em out... At no time... during that period did any of my people come to me, give any indication that they felt like there was any, any safety of flight problems in their area.

Q: Was it the view of your division, the propulsion group, that the seal design, as it was installed and operating in the shuttle system was ah, safe and adequate?

Winterhalter: It was. (New York Times, February 13, 1986, p. B1)

The Times went on to say:

A parade of agency witnesses testified that Mr. Cook's concerns were overstated, that the issue of seal erosion had been dealt with carefully by NASA engineering experts and managers, and that seal problems had diminished in 1985.


The NASA officials did not specifically dispute the budget analyst's contention that seals had eroded but argued that more competent professionals than he had judged them adequately safe... [His boss Michael B.] Mann said he checked with the engineers and concluded that "maybe the memo overstated their concerns."... David Winterhalter... said that in his department, "People are not afraid to speak up... At no time did any of my people come to me and say they thought there was an issue of flight safety." (p. B12)

The next day, the Times gave Cook a chance to respond. In evaluating his interpretation, bear in mind that, as subsequent investigation has shown, he was entirely correct in his apprehensions:

In his first major interview since publication of his internal memorandum warning that rocket seals might leak and destroy the shuttle, the budget analyst, Richard C. Cook, said that propulsion engineers at the National Aeronautics and Space Administration "whispered" in his ear ever since he arrived last July that the seals were unsafe and even "held their breath" when earlier shuttles were launched.

But he said such concerns got submerged because the "whole culture of the place" calls for a "can-do attitude that NASA can do whatever it tries to do, can solve any problem that comes up" as it "roars ahead toward 24 shuttle flights a year."

Today, L. Michael Weeks, deputy associate Administrator for space flight, the space agency's second-ranking shuttle official, said that the climate at the agency actually encouraged individuals two or three levels below him to speak their minds on safety concerns. He said that working-level engineers "don't hesitate to tell Mike Weeks anything" and "quite often will argue on the spot at a significant meeting with me or with Jesse," a reference to Jesse W. Moore, the top shuttle official...

Mr. Cook said he based his warning memorandum last July on conversations with engineers in the agency's propulsion division who were concerned about erosion of the rocket's safety seals. "They began to tell me that some of these things were being eaten away," he said, "and rather innocently I asked what does that mean?"

"They said to me, almost in a whisper in my ear, that the thing could blow up," he continued. "I was shocked." In his July memorandum, Mr. Cook explained, "I was simply paraphrasing what this engineering group was telling me. I was not making it up that flight safety was being compromised and the results could be catastrophic. I didn't put it in my memorandum, but one of them said to me, 'when this thing goes up, we hold our breath.'"...

Mr. Cook said that, in meetings called by the shuttle program managers, a middle-level engineer with safety concerns is "just a little guy."

"You aren't going to find an engineer with 20 years' experience and a livelihood to protect stand up and say, 'Excuse me, but we might have an explosion on the next shuttle flight because the O-ring might break.' It's just not going to happen."

"If some did get up, he would quickly be branded a nay-sayer," Mr. Cook said. "I never said a word in these meetings. I was a nobody, more junior than the veteran engineers. And there is always the nagging thought in the engineers' minds that, 'Gee, we may be wrong. Maybe nothing will happen.'" (New York Times, February 14, 1986, p. B4)

Conclusion

Seeing organizational disasters as a natural by-product of organizational decay poses a serious problem for the organization that is concerned to prevent them. If a disaster is the product of a flawed organizational process within an organizational context that is healthy, then it is at least conceivable that the process can be changed to avert further disasters of the same sort. On the other hand, if the disaster is the result of systemic decay, its cause cannot be isolated within a specific area of organizational functioning and repaired there. Moreover, the very means that are required in the repair of an organizational process, e.g. feedback, problem identification, even reflexive action of the organization upon itself, are likely to be as much in decay as the original problematic process. Preventing the occurrence and recurrence of disasters, to the extent that they are caused by organizational decay, requires an organizational strategy of a different sort. Indeed, it requires an organizational strategy that is not even an organizational strategy.

Organizational decay is the result of a denial of reality and a concomitant addiction to fantasy. The reality that is denied is the reality of the individual’s separation, limitation and mortality. It seems inevitable that the solution to the problem of organizational decay must involve the acceptance of this reality. Within this context, the idea of a solution to organizational decay does not look like a specific program that powerful executives can impose on, and through, a powerful, potentially limitless organization. Rather, it comes to look like a group of limited men and women, trying hard each day to reclaim, within the terrible constraints that each one faces, a little bit of the hold on reality that they, themselves, threw away.

Acknowledgements

Some time for this research was provided through a course reduction from the Department of Management and Marketing at Oakland University. I would like to thank Dan Braunstein for making this time available. I would also like to thank David Doane, Chris Argyris, Harry Levinson, Carole Barnett and Joanne Martin for useful comments on earlier drafts.

References

Baum, H.S., 1987. The Invisible Bureaucracy: the Unconscious in Organizational Problem Solving. Oxford University Press, New York.

Becker, E., 1973. The Denial of Death. Free Press, New York.

Chasseguet-Smirgel, J., 1985. The Ego Ideal: a Psychoanalytic Essay on the Malady of the Ideal (1st American ed.), translated by Paul Barrows. Norton, New York.

Freud, S., 1914. On Narcissism: an Introduction (standard ed., Vol. 14). Hogarth, London, 1957.

Freud, S., 1921. Group Psychology and the Analysis of the Ego (standard ed., Vol. 18). Hogarth, London, 1955.

Goffman, E., 1959. The Presentation of Self in Everyday Life. Doubleday Anchor, Garden City, NY.

Hirschhorn, L., 1988. The Workplace Within: Psychodynamics of Organizational Life. MIT Press, Cambridge, MA.

Kets de Vries, M.F.R. and Miller, D., 1984. The Neurotic Organization. Jossey-Bass, San Francisco, CA.

Klein, M., 1984. Contributions to Psychoanalysis. Hogarth, London.

Klein, S.M. and Ritti, R.R., 1984. Understanding Organizational Behavior (2nd ed.). Kent, Boston, MA.

Murray, B., 1986. Interview on This Week With David Brinkley, ABC Television, June 8, 1986.

Rogers Commission, 1986 (William P. Rogers, Chair, Presidential Commission on the Space Shuttle Challenger Accident). Report of the Presidential Commission (Vol. I). US Government Printing Office, Washington, DC.

Schwartz, H.S., 1987a. On the psychodynamics of organizational totalitarianism. Journal of Management, 13 (1): 41-54.

Schwartz, H.S., 1987b. On the psychodynamics of organizational disaster: the case of the space shuttle Challenger. The Columbia Journal of World Business, XXII (1): 59-67.

Schwartz, H.S., 1987c. Anti-social actions of committed organizational participants: an existential psychoanalytic perspective. Organization Studies, 8 (4): 327-340.

Trento, J.J., 1987. Prescription for Disaster: From the Glory of Apollo to the Betrayal of the Shuttle. Crown, New York.

Wright, J.P., 1979. On a Clear Day You Can See General Motors: John Z. De Lorean's Look Inside the Automotive Giant. Wright Enterprises, Grosse Pointe, MI.

[1] For an account of how NASA’s self-idealization contributed to the decision to launch the ill-fated Challenger flight, see Schwartz (1987b).

[2] De Lorean (Wright, 1979, p. 40; Schwartz, 1989) gives an account of the parallel process in private industry.

[3] See Schwartz (1989) for a discussion of the place of NASA in American ideology.