“The post-antibiotic apocalypse” and the “war on superbugs”: catastrophe discourse in microbiology, its rhetorical form and political function

Discourses evoking an antibiotic apocalypse and a war on superbugs are emerging just at a time when so-called “catastrophe discourses” are undergoing critical and reflexive scrutiny in the context of global warming and climate change. This article combines insights from social science research into climate change discourses with applied metaphor research based on recent advances in cognitive linguistics, especially with relation to “discourse metaphors.” It traces the emergence of a new apocalyptic discourse in microbiology and health care, examines its rhetorical and political function and discusses its advantages and disadvantages. It contains a reply by the author of the central discourse metaphor, “the post-antibiotic apocalypse,” examined in the article.


Introduction
An announcement for a course on "Disaster Discourse" at Michigan State University reads as follows: "Chernobyl. Challenger. Bhopal. Exxon Valdez. AIDS. People who were adults in the 1980s would have no trouble identifying the common element linking those terms: disaster" 3 (Larabee, 2000). People living at the beginning of the twenty-first century would probably react in similar ways to the terms climate change, pandemic influenza and superbugs. The five events studied by Larabee (2000) "had in common an accidental and unexpected failure of …". In climatology, meanwhile, catastrophe discourse has come under reflexive scrutiny from within the profession itself: "… we, the professional climate scientists, who are now the (catastrophe) sceptics. How the wheel turns" (Hulme, 2006).

Hulme asks whether using the language of fear and terror to frame climate change operates as an ever-weakening vehicle for effective communication and might therefore not be the inducement for behavioral change that it was intended to be (see Lowe et al., 2006). This is a question we have to ask yet again when studying the framing of the rise in antibiotic resistance and the increase in superbugs in terms of "war" and "apocalypse."

In 1998, the year that Weingart reflected on catastrophe discourse in climatology, Richard Smith, the then editor of the British Medical Journal, wrote in an editorial that "[i]ncreasing resistance to antimicrobial agents is health care's version of global warming. The serious problems are global and lie in the future" (Smith, 1998: 764). A decade later these serious problems no longer lie in the future. As Richard James, Director of the Centre for Healthcare Associated Infections at Nottingham University, points out: I believe that the problem of healthcare-associated infections … now is rather like global warming was a few years ago … People were arguing about the science and whether it was going to happen. 
But now everyone is on board in the UK: all the politicians believe it is serious, that we have got to do something, that it's going to cost money, but if we don't it will be horrendous. (Daily Mail, 16 January 2007) He and others would like to see an integrated strategy in place that will deal effectively with the problem. Significant investment in resources would be needed to achieve this but, they argue, the present piecemeal strategy is not working. Using apocalyptic language to effect this change in perspective seems a promising strategy, and one that appeared to work in the case of climate change. There might be some drawbacks, however, that need to be discussed and which are already being debated with relation to attitudinal and behavioral changes regarding climate change (Lowe et al., 2006).
In the following sections of the article, I shall first provide an overview of the conceptual tools used to study the emergence of a "catastrophe discourse" in microbiology and sketch the historical background from which it emerged. This will set the scene for the empirical part of the article which studies the use of catastrophe metaphors in a small corpus of articles. The article ends with some preliminary conclusions and a reply by the microbiologist whose discourse has been studied in the empirical part.

Conceptual framework
In a recent interview with Newswise (15 October 2007), the frame analyst and public communication expert Matthew C. Nisbet is quoted: "The reason Gore and others have failed to unite public opinion across political lines … is because they have focused on factors that frighten, an approach Nisbet terms 'the catastrophe frame'" (Newswise, 2007).
This article draws on concepts and theories from frame and storyline analysis (Hajer, 1995), conceptual metaphor analysis (Lakoff and Johnson, 1980) and the study of "discourse metaphors" (see Zinken, 2007; Zinken et al., in press; Romaine, 1997) to study how the catastrophe frame is used, not in the context of global warming, but in the context of health care associated infections. According to Entman (1993: 53), to frame is "to select some aspects of a perceived reality and make them more salient in a communicating text, in such a way as to promote a particular problem definition, causal interpretation, moral evaluation, and/or treatment recommendation for the item described." The catastrophe frame can be seen as part of a repertoire of frames used "by journalists to craft interesting and appealing news reports, and by policymakers to define policy options and reach decisions"-and, one should add, by scientists to draw attention to issues of concern (Nisbet and Scheufele, 2007).
Frames are linked to both storylines and metaphors in intricate and complex ways. Storylines are narratives about social reality through which elements from many different domains are combined and that provide actors with a set of symbolic references that suggest a common understanding. Stories around wars, disasters and the apocalypse are such symbolic devices. "The point of the story-line approach is that by uttering a specific element one effectively reinvokes the story-line as a whole. It thus essentially works as a metaphor" (Hajer, 1995: 47). Conversely, a specific metaphor, such as a disaster or catastrophe metaphor can evoke a whole storyline. Together, metaphors and storylines frame a specific approach to what seems to be a coherent problem, be it global warming or the rise in health care associated infections.
Storylines and metaphors are intricately linked. Conceptual metaphors of war and contest evoke storylines and scenarios that go well beyond them. But whereas war metaphors, such as "fighting a war against cancer" go almost unnoticed, discourse metaphors such as "we are approaching an antibiotic apocalypse" are more salient discursive and political framing devices. Unlike conceptual metaphors, which are deemed to be universal and used tacitly (Schön and Rein, 1994), discourse metaphors are metaphorical projections that function as key framing devices within a particular discourse over a certain period of time. Many of them have definite authors, such as "the selfish gene," coined by Richard Dawkins, "silent spring," coined by Rachel Carson and "the post-antibiotic apocalypse," coined by Richard James. Unlike the source concepts of conceptual metaphors, supposed to be based on links between cognition and embodied patterns of experience, discourse metaphors refer to phenomenologically salient real or fictitious objects that are part of interactional space (i.e., can be pointed at, like MACHINES or HOUSES) and/or occupy an important place in cultural imagination (such as Frankenstein's monster or the apocalypse); and, conversely, discourse metaphors themselves highlight salient aspects of a socially, culturally or politically relevant topic. Like conceptual metaphors, discourse metaphors activate specific emotional commitments. They evoke an emotional or affective stance, position or reaction in the listener and user-what Aristotle called "pathos." As we will see, with relation to the metaphors used in catastrophe discourses, this can have social, political and behavioral consequences, especially since such metaphors, intended to persuade policy makers and the public, may also dissuade and demotivate.

The emergence of an antibiotic apocalypse discourse-a historical overview
Over the last century antibiotics have turned from miracle to menace (Raman, in press). The miracle began with the discovery of penicillin in 1928 and its mass production in 1944 (Rotbart, 2007). In 1949 medicine had as its goal "[t]o live in a world without menacing microbes, to have all disease-producing microbes rendered harmless and domesticated; to see infectious diseases vanish from the earth, or at least be easily controlled, to make this planet free from the dangers of death from infectious diseases" (Sokoloff, 1949: 94, emphasis in the original). In the 1960s, "it seemed as if the war against bacterial infections was over. In fact, by 1967 it looked so promising that the US Surgeon General confidently declared: 'It's time to close the book on infectious disease'" (James, 2005). However, the rise in antibiotic resistance, or the ability of bacteria and other microorganisms to withstand an antibiotic to which they were once sensitive, led to the emergence of health care associated infections and so-called superbugs. The "war" against microbes, which many thought had been won, had to start all over again. A vision of a disease-free utopia, which had been heralded by the discovery of antibiotics, gave way to a vision of a dystopian future where bacteria reign supreme, unchecked and uncheckable.
One of the best-known infections linked with antibiotic resistance is MRSA (methicillin-resistant Staphylococcus aureus). The incidence of MRSA has risen sharply in most Western countries and multiple resistance to different classes of antibiotic has also become more common. As a result, the first-choice antibiotic for MRSA is now vancomycin, which is often called the drug of last resort.
The first warnings of an impending catastrophe appeared in popular science books published in the mid 1990s bearing titles such as The Coming Plague: Newly Emerging Diseases in a World Out of Balance (Garrett, 1994) and Superbug: Nature's Revenge (Cannon, 1995), 4 but articles in scientific journals and the media at large soon reflected similar concerns (see Nerlich and Koteyko, in press). In 1996 Ash published an article in Trends in Microbiology entitled "Antibiotic resistance: the new apocalypse?" She argued that an apocalypse was perhaps not yet round the corner but that "in the 1990s the public perception of the threat from new infectious agents has been fuelled by fear of AIDS, Ebola virus, Salmonella and mad-cow disease," and she stressed that a "problem that is more likely to affect us personally is infection by antibiotic-resistant bacteria" (Ash, 1996: 371). She quotes Levy as saying that "antibiotics are societal drugs and that antibiotic resistance is an ecological problem" (Ash, 1996: 371), a perspective on antibiotic resistance that is all too often overlooked in the "battle" against bacteria (Levy, 1998, 2002). If antibiotic resistance is an ecological problem then "war" might not be the best way to solve it, an issue I shall explore further in the following sections. In 1997, Hiramatsu et al. published an article in The Lancet entitled "Dissemination in Japanese hospitals of strains of Staphylococcus aureus heterogeneously resistant to vancomycin" (Hiramatsu et al., 1997), which alerted scientists to the first cases of reduced susceptibility to vancomycin. The article was accompanied by another by Tabaqchali entitled "Vancomycin-resistant Staphylococcus aureus: apocalypse now?" (Tabaqchali, 1997). In 2001 Sharp published an article for Science Watch asking: "Are we returning to a pre-antibiotic age?" and in 2002 a vancomycin-resistant strain of MRSA was discovered in the United States. 
The situation at the turn of the twentieth to the twenty-first century (when Stephen Hawking predicted a "biological apocalypse," Daily Mail, 17 October 2001) was summarized by Varaldo in the following terms: Little by little, yesterday's perhaps ingenuous hopes and the early dream of all-powerful antibiotics have been eroded and progressively replaced with deep distrust. The repeated warnings of microbial wars, new plagues, worldwide calamities, new apocalypses and even an impending post-antimicrobial era are a disquieting confirmation of this profound lack of confidence. (Varaldo, 2002: 1) This view was echoed by the Institute of Biology and the Royal Pharmaceutical Society, which warned of an impending "Pharmageddon" (Microbiologist, December 2002, http://www.sfam.org.uk).
In 2003, Goldstein and Kitzis published an article entitled "Vancomycin-resistant Staphylococcus aureus: no apocalypse now" (Goldstein and Kitzis, 2003), which mentions that unprecedented agitation had followed the publication of Hiramatsu et al.'s 1997 article, and they end their own article by saying that the apocalypse may occur under certain conditions. At about the same time, a popular science book, The Killers Within (Shnayerson and Plotkin, 2002), was reviewed under the title "Apocalypse soon?" (Gratzer, 2003). Dr. Martin Westwell, a chemist at the University of Oxford who has carried out research into new and resistant forms of bacteria, explored this issue in an interview with the Daily Mail entitled "Superbug apocalypse" (Daily Mail, 30 September 2003). In 2003, Fitzpatrick published an article in The Lancet entitled "Apocalypse from now on." This paper focused on the SARS (severe acute respiratory syndrome) crisis but pointed out that Fears about an explosive epidemic of a lethal infectious disease compound existing anxieties about bioterrorism, nuclear war, or some environmental disaster, to create a peculiarly modern kind of apocalypse. The end is believed to be nigh, but this is a protracted condition rather than a terminal event, a state that looms but never happens. It is a case … not of "Apocalypse Now", but of "Apocalypse From Now On". (Fitzpatrick, 2003: 1310)

Talking about microbial wars, apocalypse, nemesis (Humphrey, 2000) and Armageddon raised the profile of the problems posed by a rise in antimicrobial resistance and the increased appearance of superbugs and stimulated various political campaigns, between 2000 and 2005, intended to deal with this threat. The main thrust of these campaigns was to increase "cleanliness" in hospitals (see Koteyko et al., in press). This political focus on hygiene was questioned by many microbiologists, including James, who used a new rhetorical strategy to promote a more science-led approach.

Preparing the ground for a new apocalyptic discourse
In the spring of 2005, James, Professor of Microbiology at the University of Nottingham, wrote an article for the University's Vision magazine entitled "Battling bacteria," in which he talks for the first time of a post-antibiotic apocalypse (although he qualifies this by saying: "what people call post-antibiotic apocalypse"-and some of these "people" have been quoted above) (James, 2005). James' article ends on a rather hopeful note though: "It's early days, but there is some hope that a future antibiotic apocalypse may be averted." This hope seems to fade in 2006. On 7 January 2006 The Guardian published a lengthy interview with James entitled "War on Terror" in which he outlines "his vision of an apocalypse." His aim, it seems, was to change policy makers' perceptions of how to deal with antibiotic resistance and superbugs and to promote new research into this issue.
A year later, on 5 January 2007 the University of Nottingham opened a new Centre for Healthcare Associated Infections (of which I am a member), intended, as indicated on its website, to carry out scientific research in an area "which is now recognized … as being under-funded and lacking recognized Centres of Excellence in the UK, even though there is an urgent need for such centres that carry out both basic and translational research and also offer advice and information" (http://hcai.nottingham.ac.uk/HCAI_home.htm).
The University issued a press release that quotes James as saying: "Quite frankly, the impending crisis on the horizon can be called the 'post-antibiotic apocalypse'." This phrase reverberated through the regional, national and international press. In the following I want to discuss the advantages and disadvantages of the language of war and apocalypse in the context of dealing with the rise in hospital acquired infections.

Apocalyptic war discourse: form and function
Lexis Nexis Professional was searched using the keywords "antibiotic apocalypse," and 19 articles published in English-language news sources between 1995 and 2007 were retrieved. Most of the articles appeared shortly before and after the opening of the Centre for Healthcare Associated Infections on 5 January 2007. Two articles were stored twice on Lexis Nexis, which brings the total number of articles studied to 17. One article was published in 1995, one article and three short notices were published in 2006 and 12 articles were published in 2007, of which two appeared on 5 January, five on 6 January, two on 17 January, and one each on 31 January, 1 February and 30 March. Overall, five articles were published by the local paper, the Nottingham Evening Post, four by the Guardian/Observer (national newspapers with a left/environmental agenda), two by other local papers, one each by The Express and The Sun (both tabloids), one by the Toronto Star and three by the Associated Press. The longest article dealing with the whole issue of superbugs was published in The Guardian on 17 January. A long article on quorum sensing (more below) was published on 1 February by the Associated Press.
The first article mentioning the phrase "antibiotic apocalypse" appeared on 24 July 1995 in the Toronto Star, entitled "A looming antibiotic 'apocalypse'." This was the time when apocalyptic bestsellers about new plagues and new superbugs had just been published (see previous section). The next article appeared on 1 February 2006 in the Nottingham Evening Post and was entitled "Our future at mercy of deadly superbugs." This article contains the first use of the phrase "post-antibiotic apocalypse," attributed to Richard James as its author; the phrase was then used in all the remaining articles. A few months later, in May 2006, James appeared on a science radio program where he used the phrase again, which was reported in The Observer, The Sentinel (Stoke), and The Guardian. The rest of the articles studied appeared on the day of or in the days after the opening of the Centre on 5 January 2007.
The majority of these articles quote the following section of an interview given by Richard James: "Quite frankly, the impending crisis can be called the post-antibiotic apocalypse. We are facing a future where there will be no antibiotics and hospital will be the last place to be if you want to avoid picking up a dangerous bacterial infection. In effect, cut your finger on Monday and you'll be dead by Friday if there's nothing to prevent it." James' vision of a post-antibiotic apocalypse was questioned by some experts, such as the Chief Nursing Officer, Professor Christine Beasley, who accused him of sensationalism, and by Professor Brian Duerden, Inspector of Microbiology and Infection Control for the Department of Health, who said (as quoted in an article for the Nottingham Evening Post, 6 January 2007): "It is misleading to give the public the idea that there are currently only a few effective antibiotics remaining." Ten days later James defended his stance in the Nottingham Evening Post by saying: "The problem is in not admitting the scale … Perhaps the aim is not to scare people." In an article for the University of Nottingham's Exchange magazine (which was not part of our sample) he stressed that: We are in a war against HCAI [health care associated infections] and a new front has recently been opened by the realisation that potentially more dangerous strains of MRSA are now circulating in the community. Politicians have consistently underrated the threat posed by these formidable bacterial enemies. Having been accused of sensationalism and scaremongering by the Department of Health, I await their inevitable conversion to policies that fully acknowledge the scale of the threat of HCAI. The first sign that this is happening will be the announcement of mandatory screening for MRSA of all patients admitted to hospital. (James, 2007a) 5

Let us now take a closer look at the frames and metaphors used in the 17 articles of our corpus. 
The corpus contains one mention of a "looming antibiotic apocalypse," a warning issued in 1995 and 15 mentions of a "post-antibiotic apocalypse" which, James warned, the world was "facing," which was an "impending crisis," was "on the horizon," was "forecast" and would happen unless "we do something to change things." The use of a startling discourse metaphor such as this has definite performative and emotive implications. "The work of metaphor," Bono argues, and, I would add, especially that of discourse metaphors, "is not so much to represent features of the world, as to invite us to act upon the world as if it were configured in a specific way like that of some already known entity or process" (Bono, 2001: 227). In this case the known entity or process is the apocalypse, a term now often used to mean "the end of the world" and a term, more importantly, evoking many fictional narratives that portray end of the world scenarios. The metaphor has what speech act analysts call an illocutionary force and a perlocutionary effect, namely warning people of a possible end of the world scenario and getting the audience to do something to avoid it. The metaphor also evokes an emotion, namely being frightened or alarmed. Entman (1993) wrote that frames diagnose, evaluate and prescribe. Framing the issue of antibiotic resistance in terms of a catastrophe metaphor serves to diagnose a pressing problem in health care, evaluates it as needing urgent attention and prescribes a course of action. However, by using a frame that evokes negative emotions, this call for action might not serve its purpose as well as the speaker intends it to do, as I will discuss later.
These forecasts and this call for action were, however, offset by journalists quoting from a statement by the Department of Health that said that "talk of a 'post-antibiotic apocalypse'" was sensationalist and scaremongering (this appeared eight times in the corpus, six times together with the above quote by James-an act of media "balancing" which is quite common).
James used the phrase "post-antibiotic apocalypse" as a deliberate discursive move to attract attention to a situation that needed urgent political consideration and action. While talking about this apocalyptic scenario, framed by his well-chosen discourse metaphor, James and others also used conceptual metaphors which are more commonplace in microbiological and infectious disease discourse, namely the conceptual metaphor FIGHTING DISEASES IS WAR, and, as it turned out, he used these metaphors quite unconsciously and tacitly (see James, 2007a). However, the combination of the consciously chosen discourse metaphor and the unconsciously used war metaphors was a potent mix.
The corpus contains metaphors of war, race and contest, even one of "plague" (The Sun, 6 January 2007, headline: "'Plague' fears at bug HQ"). More importantly, bacteria are personified as "smart" agents whose ingenuity scientists can all but admire, albeit rather grudgingly (see Nerlich and Koteyko, in press). They are "the greatest survivors on the planet," which evade medicine by mutating, adapting and multiplying, that is, by using evolution cleverly and ingeniously; they won't "simply roll over." They attack "frail, elderly, very sick people in hospitals," they "strike down" celebrities, they are "a new killer in our midst," but not any new killer, they are a killer "in a new deadly garb." Here the almost dead killer metaphor is injected with new life by giving the bacterial agent clothes to wear. Bacteria as personified agents refuse to be "wiped out" or "stamped out" and always "regain the upper hand." They "are increasingly defeating standard antibiotics" and "may come back with a vengeance." However, scientists are fighting back and show "no mercy." Some are developing a cream which "will give … bacteria a sledgehammer blow." But there will always be a race or contest between the scientists trying to "halt" superbugs and their "microbial foes," "powerful adversaries" or "dangerous enemies." But in a post-antibiotic world scientists might be fighting "a losing battle" and they may soon "run out of ammunition." Another kind of war is also metaphorically explored, this time not between bacteria and scientists, but between bacteria themselves. One article, published by the Associated Press on 1 February 2007, especially focuses on this topic, namely "quorum sensing"-that is the use of antibiotics to break down bacteria's "lines of communication," which is a research focus at the Centre. As pointed out by James in an article for the University of Nottingham's Vision magazine: "It's like a battlefield communication system. 
When bacteria like Staphylococcus aureus infect the body, their toxin genes are switched off under the control of the quorum sensing system. Only when there are enough bacteria to cause a serious infection do they switch on the toxin genes and go all out to attack" (James, 2007b). The article in my corpus is entitled "Chatter key to fight superbugs." It explains how bacteria "communicate" and how researchers, by interrupting this communication, can find new ways "to fight killer superbugs." Paul Williams, professor of molecular biology at the Centre, is quoted as saying: "Bacteria are a bit like an army going into battle … Only when they've got strength in numbers do they tell their troops to start firing." Another microbiologist, Dr. Peter Greenberg from the University of Washington, is quoted as saying: "If we can break them up, we can kill them." Here not only is the war metaphor used "incidentally" and tacitly, as in "fighting superbugs," but the whole scenario of war is exploited to tell a story about the fight against the bugs (on metaphor scenarios, see Musolff, 2006). The almost dead war metaphor is brought back to life in order to tell a story about battle that is entirely new and based on new science.
"War" and "competition" metaphors have long been common currency in medical discourse (Hodgkin, 1985; Warren, 1991; Annas, 1995; Larson et al., 2005; and many more). From the times of Louis Pasteur onwards, dealing with bacteria or germs has been framed in terms of waging war-what Montgomery (1996) calls "biomilitarism." From the 1940s onwards, when antibiotics became widely available, the use of antibiotics too was framed in terms of war against invading bacteria-they seemed to be a "silver" or "magic" bullet in the fight against infectious diseases. And, in a sense, they were literally a weapon in a war, as the first really significant antibiotic, penicillin, was seen as vital to the Allies winning the Second World War. For a time they were hugely successful, to such an extent that the dominant war frame that accompanied the use of antibiotics might have obscured the exploration of and investment in other technologies and therapies (see Barry, 2005).

Apocalypse discourse: pros and cons
The new discourse metaphor of the post-antibiotic apocalypse can be useful for highlighting the diminishing power of antibiotics in the war against bacteria. However, although it raises the profile of this problem and gets it onto the public agenda, it might be counterproductive in the long term, as the apocalypse is usually seen as something inevitable, "the end of the world," against which one can do nothing. There are, however, other, more modern and positive associations surrounding the term apocalypse which stress the importance of human agency in averting disaster. The two connotations, that one can do nothing and that one can do something, pull against each other and might distract from the message that microbiologists wish to convey.
In common usage, the word "apocalypse" means "end of the world" or "mega-disaster." Such visions have been used from Nostradamus to films such as The Day After Tomorrow (2004) to spur people into action, either to turn to God, or, after the secularization of the apocalypse in modern cinematographic culture, to turn to human ingenuity and political action and investment. This is, I believe, the tradition in which James' apocalyptic vision inscribes itself, rather than the "end of the world" one. As in a number of films, from Apocalypse Now (1979) to Al Gore's An Inconvenient Truth (2006), this apocalyptic discourse focuses "on human ingenuity in avoiding the end rather than on the inevitability of cosmic cataclysm" (Ostwald, 1998). Ingenuity is needed to develop new diagnostic technologies to improve and speed up the detection of pathogens, but political acceptance of the scale of the problem and then implementing the strategy that will significantly reduce the scale of the problem are also essential. This is quite different from the use of catastrophic visions surrounding terrorism and climate change as discussed recently by Allenby (2007). He is right in saying that "the catastrophic vision becomes valuable for its wielders, for the insistence that we face an overwhelming threat is a powerful rhetorical and political device." In our case this vision is supposed to shift attention from "cleanliness," which has been the focus of government policies, to screening and isolating patients, as well as to investment in research. The rest of Allenby's analysis of catastrophe discourses surrounding climate change fits perhaps less well, especially his claim that catastrophic visions are used "for stifling the discussion of alternatives." James and other microbiologists want to stimulate the discussion of alternatives and argue against an oversimplified method for dealing with superbugs, namely cleaning. 
They stress the multifactorial problem posed by hospital acquired infections and highlight the importance of scientific innovation and change. In this they might be more closely comparable to ancient rather than modern apocalyptic visionaries. In fact "apocalyptic discourse may wedge in room for innovation, even radical change of practice" (Meek, 2000: 468).
In this context, war metaphors have a specific function. They draw attention to the many ways in which bacteria pose a threat and the many weapons that have to be used to defend human health from their actions. This includes scientific innovation but also a radical change of practice. However, if antibiotic resistance and the emergence of superbugs is an "ecological problem," as Levy surmised in 2002, and if the focus switches from the bacterial populations to human populations as the "culprits" in this rise of antibiotic resistance, then framing the management of this (potentially apocalyptic) threat in terms of war may not work so well. It might still draw attention to the problem and induce governments to invest in research and strategies other than cleaning, but it might impede important behavioral changes. As Hulme has pointed out: The language of fear and terror operates as an ever-weakening vehicle for effective communication or inducement for behavioural change.
This has been seen in other areas of public health risk. Empirical work in relation to climate change communication and public perception shows that it operates here too.
Framing climate change as an issue which evokes fear and personal stress becomes a self-fulfilling prophecy. By "sexing it up" we exacerbate, through psychological amplifiers, the very risks we are trying to ward off. (Hulme, 2006) This feeling was echoed by an expert we interviewed as part of our research project. 6 He pointed out that it is important to take away "some of the fear." He admitted that using catastrophe discourse was "designed to catch the attention," but stressed that this is also "frightening and people tend to back off then. They want something done about it but they're scared and they say oh you do something about it." He then went on to explore a more "ecological" image of bacteria and humans: I think it's probably important to say we live in this world with bacteria, some can cause damage to us, some of them are drug resistant but this is an interaction between the whole human population and the whole bacterial population. We as professionals will do our best to limit the risks in our practice but you know we need everybody's help in terms of hygiene and reducing risk of transmission between people, you know it's not something that's so frightening, you know we all live with bugs and most of us live quite happily with them. You know this is the sort of message that I would want to give rather than frightening them off.
In recent years, some scientists in the fields of microbiology and infection control have called for an "end to war metaphors" and "to drop the manichaean view of microbes: 'We good; they evil.' … We should think of each host and its parasites as a superorganism with the respective genomes yoked into a chimera of sorts" (Lederberg, 2000). This is reminiscent of the Gaia hypothesis, in which the Earth is seen as a super-organism and climate change/global warming as an illness (see http://www.ecolo.org/lovelock/). Just as the relationship between humans and germs is out of balance, so is the relationship between humans and the Earth. What is needed, some argue, is more "ecological balance." In 2004, a workshop was organized entitled "Ending the War Metaphor: The Changing Agenda for Unraveling the Host-Microbe Relationship." A commentator writing for Microbe magazine applauded this enterprise but also hinted at some problems with this approach: Let's assume some microbiologists agree to study war no more, figuring that is no longer the appropriate way to describe what happens between pathogens and hosts. However, no perfect alternative has emerged to replace that long-used, if also shopworn metaphor. … Will some brave microbiologist come forward and volunteer to craft those welcome new metaphors, perhaps aspiring to be the poet laureate of infectious diseases? (Fox, 2005) This task will not be easy, as research into genomics discourse has shown (Nerlich and Hellsten, 2004). Genetics and genomics discourses have long been dominated by information and code metaphors. After the decipherment of the human genome and various discoveries that overthrew old assumptions, biologists tried to change the old metaphorical framing devices, but this did not result in a major metaphorical paradigm shift. Information and code metaphors that frame genetic and genomic research have become ontologies, and ontologies are not easy to shift (Kay, 2000: 3).
This might also be the case for war metaphors in microbiology.

Conclusions
In 1998, Weingart observed that exaggeration and discursive overbidding were tools used by scientists, such as climatologists, to gain public support and public funding. He speculated that [w]hat appears here as a recent and unique development can be demonstrated to be a recurrent pattern. In policy-relevant areas the emergence of new research fields follows the path of climate change research: In the beginning is the claim of an impending danger if not catastrophe. A small group of scientists (from different disciplines) who proclaims this danger also provides suggestions for a solution. The promise to be able to avert the threat comes with the authority of scientific expertise in a brand new research area and is tied to the condition of needed financial support … (p. 878) This article has shown that the discourse on antibiotic resistance and the war on superbugs follows this template. It is stepping into the shoes that climate change discourse is trying to leave behind when reflecting on its own "discursive overbidding." Just as in the case of global warming, talking about the situation regarding health care associated infections in terms of apocalypse and war has advantages and disadvantages. It alerts politicians to a situation that needs urgent attention, attracts funding bodies' attention to new lines of scientific research, and might reverse ordinary people's expectations regarding "miracle drugs" and the miraculous power of cleanliness. In the competition of ideas for research and political attention, an apocalyptic discourse may provide a winning edge in securing resources. However, it might also induce fears which could stifle behavioral change.
It might be time for microbiologists to take a closer look at the debate about catastrophe discourse that is emerging in climate research. In a recent article investigating the relatively low priority given to issues of climate change in the United States, Goodman writes: It may be, paradoxically, that framing this issue in catastrophic terms ends up paralyzing instead of motivating us. … As Ross Gelbspan, author of "The Heat is On," says, "when people are confronted with an overwhelming threat and don't see a solution, it makes them feel impotent. So they shrug it off or go into deliberate denial." Michael Shellenberger, co-author of "The Death of Environmentalism," adds, "The dominant narrative of global warming has been that we're responsible and have to make changes or we're all going to die. It's tailor-made to ensure inaction." (Goodman, 2007, italics added) "So to better communicate about global warming, scientists must discover frames that redefine the issue while also remaining true to current understanding" (Nisbet, pers. commun.). As shown with reference to genomics discourse, this will not be easy. This might be why campaigns to improve hand hygiene and hospital cleanliness might still be given priority when trying to deal with the threat of superbugs. Talk of a post-antibiotic apocalypse has its merits in galvanizing policy makers' and funding agencies' attention, but might be less well suited when trying to change ordinary people's and ordinary policy makers' behavior (i.e. make them focus on screening and isolating over and above cleaning). Research into factors that might affect attitudinal and behavioral change with relation to climate change has shown, for example, that images of a climate catastrophe do not work (Nicholson-Cole, 2005; Lowe et al., 2006), but here too searches for different ways of framing the issue are still ongoing (see Brahic, 2007; Futerra, 2007).

Author
Brigitte Nerlich did her Ph.D. in French linguistics in Germany.
After three years as a Junior Research Fellow in general linguistics at the University of Oxford she moved to Nottingham, where she has worked in the departments of linguistics and psychology. Her current research focuses on the cultural and political contexts in which metaphors are used in public debates about science and emerging diseases. She has written books and articles on the history of linguistics, historical linguistics, cognitive linguistics and, more recently, the social study of science and technology with a focus on cultural and linguistic framing. Correspondence: Institute for Science and Society, Law and Social Sciences Building, West Wing, University of Nottingham, Nottingham NG8 2FY, UK; e-mail: brigitte.nerlich@nottingham.ac.uk

Reply from Richard James
I plead guilty to using metaphors but offer the following mitigating circumstances.
(a) Some metaphors, especially those used in newspaper headlines, are inserted by the journalist and not the scientist; (b) Scientists are often accused of talking to other scientists rather than having a dialogue with the public. The use of metaphors makes it easier to present complex biological systems, such as why inhibition of quorum sensing in bacteria is an exciting alternative to conventional antibiotics; (c) I have studied the problems caused by antibiotic-resistant superbugs for 31 years and, despite the multitude of objective scientific reports that describe the problem and the strategies needed to contain it, I still await an integrated strategy by the UK government to significantly reduce the problem of health care associated infections; (d) We have direct experience of a world without antibiotics, since they became available less than 70 years ago. I have a photograph that I use in my lectures on the problem of antibiotic resistance, taken in 1932 outside Springfield House Open Air School in London, that shows children sleeping out in the open in three rows of beds stretching away into the distance. This was the treatment for tuberculosis in the age before antibiotics: fresh air!; (e) Politicians are faced all the time with requests from pressure groups to spend large sums of money in order to solve a serious problem. Getting an issue onto the political agenda therefore requires considerable skill in harnessing the power of the media, a skill that scientists rarely if ever need when writing research grant applications. If that process has to include the use of metaphors that present the problem in easy-to-understand terms, then I will play the game. It should not be forgotten that, despite the criticism of the use of war metaphors by advocates of global warming, all the political parties in the UK now accept the scale of the problem of global warming and the significant financial cost of the solutions.
A detailed analysis of the problem of hospital infections in the USA was presented in a book entitled "Unnecessary Deaths: The Human and Financial Costs of Hospital Infections" (2nd edition) by Betsy McCaughey, who is Chairman of the US Committee to Reduce Infection Deaths. 1 The essential facts quoted in the book are that the estimated number of deaths from hospital infections is now 103,000 every year, making hospital infections the fourth largest killer in America, equal to the number of deaths from AIDS, breast cancer and auto accidents combined. The estimate of $30.5 billion in additional medical care costs does not include the additional impact on the wider economy, which may be significantly larger. The really shocking claim in the book is that "nearly all these infections are preventable." Evidence is presented in the book on the well-documented strategies that are needed to reduce deaths from hospital infections.
From the UK perspective, deaths from hospital acquired infections are estimated at 5,000 per year, and are increasing due to the problems caused by MRSA and Clostridium difficile, whilst the additional costs to the NHS are £1 billion per year. It is widely believed by medical microbiologists that, despite the recent large increase in funding for the UK health service, the overriding government attention on achieving targets to reduce hospital waiting lists is incompatible with the requirements for effective infection control procedures. Compelling support for this comes from the UK Healthcare Commission investigation into outbreaks of Clostridium difficile at Stoke Mandeville Hospital. 2 McCaughey argues that the problem of hospital acquired infections is the next asbestos, where worker exposure to this agent eventually led to enormous legal settlements. A similar situation happened in the UK with respect to coal miners exposed to coal dust. Solicitors in the UK are now trying to use health and safety legislation to make claims against individual hospitals on behalf of patients who have been infected. It would be sad if it requires high legal payouts against single hospitals to force UK politicians to change their priorities and concentrate on the quality of treatment outcomes rather than the number of patients who are operated upon.
Comparisons might be made between the billions of pounds made available in the last few years in the UK to improve train safety and to contain the foot and mouth outbreak and the much smaller amounts that have been found for hospital acquired infections. Why is it that the tragic deaths of commuters in the recent UK train crashes can open the Treasury coffers, whereas the death of more than 100 patients per week does not seem to merit such attention? To return to the war analogy: if the UK army suffered deaths at that rate in Iraq, then that war would have been over some time ago.
Perhaps I could offer a new analogy for hospital acquired infections. If we imagine that the millions of people who are treated in UK hospitals each year are the equivalent of the great herd migrations over the plains of Africa then hospital infections are the crocodiles waiting at the river crossing points to pick off the weak. Wildlife photographers filming the river crossing always make the point that this is nature and we should not interfere. I and others argue that we need to "interfere" and make a significant reduction in the number of deaths due to hospital acquired infections by an integrated strategy that includes (a) screening of patients for MRSA on/before hospital admission; (b) isolation of MRSA carriers; (c) reintroducing small isolation rooms into existing hospitals; (d) better buy-in of staff for the need for improved hand washing compliance; (e) wide availability of hand wash gels that do not dry the skin; (f) a reduction in bed occupancy rates to < 85 percent; (g) improvement in the resourcing of NHS diagnostic microbiology laboratories to allow earlier detection of infections; and (h) ring-fenced funding for a UK health care associated infections research program. The last would include funding for the development of novel antibiotics that do not kill the pathogen but disable it and thus would not be expected to select for antibiotic resistance; a laudable homage to the Gaia hypothesis.
I believe that it is now only a matter of time before politicians finally have to accept the real scale of this struggle with bacterial pathogens like MRSA and C. difficile that are a very significant threat to our health care system.

Notes
1 www.ncpa.org/pub/special/pdf/RIDBooklet_120605.pdf
2 www.healthcarecommission.org.uk/_db/_documents/Stoke_Mandeville.pdf

Author
Richard James is Professor of Microbiology at the University of Nottingham and Director of the Centre for Healthcare Associated Infections (CHAI). Drawing on researchers from nine Schools of the University, from microbiology through to sociology, CHAI's breadth of expertise allows a unique holistic approach to the problem of health care associated infections. His research interests are in the development of novel antibiotics and of rapid diagnostic tests for MRSA. He has lectured on the problems of health care associated infections and antibiotic-resistant bacteria for more than 30 years.