Operation Pied Piper: a geographical reappraisal of the impact of wartime evacuation on scarlet fever and diphtheria rates in England and Wales, 1939–1945

SUMMARY This paper examines the geographical impact of the British Government's wartime evacuation scheme on notified rates of two common acute childhood diseases (scarlet fever and diphtheria) in the 1470 local government districts of England and Wales, 1939–1945. Drawing on the notifications of communicable diseases collated by the General Register Office (GRO), we establish pre-war (baseline) disease rates for the 1470 districts. For the war years, techniques of binary logistic regression analysis are used to assess the associations between (a) above-baseline (‘raised’) disease rates in evacuation, neutral and reception districts and (b) the major phases of the evacuation scheme. The analysis demonstrates that the evacuation was temporally associated with distinct national and regional effects on notified levels of disease activity. These effects were most pronounced in the early years of the dispersal (1939–1941) and corresponded with initial levels of evacuation-related population change at the regional and district scales.


INTRODUCTION
The nightmare conditions of life since September 1939, the peripeteia of war, in the first winter a theatrical unreality, then a sense of imminent, overwhelming danger, followed by the miseries of the second autumn-winter, disturbed the judgments of the wisest, even epidemiologists, and some forgot the sound rule of only prophesying after the event [1, p. 333].
Monday 1 September 2014 marked the seventy-fifth anniversary of the start of Operation Pied Piper, the British Government's scheme for the evacuation of inner-city children and other vulnerable classes to the relative safety of the countryside at the outset of World War II (Fig. 1). Within 96 h, the operation had spawned the largest civilian population flux in British history [3]. Beginning with the evacuation of unaccompanied school children from London, Birmingham, Liverpool, Manchester and other large urban centres on 1 September 1939, followed by accompanied infants and younger children, pregnant women and certain classes of disabled person in subsequent days, some 1·47 million inner-city residents had been dispersed to the relative security of the British countryside within the month [4]. Although the number of evacuees began to dwindle in the absence of the anticipated aerial bombardment of British cities by the Luftwaffe in the latter part of 1939, further major waves of evacuation followed the German occupation of France and the Low Countries (May-June 1940), the Blitz (September 1940-May 1941) and the V-1 flying bomb attacks on London and south-eastern England (June-September 1944) [5]. By the time that official approval for the final return of evacuees to London was given in June 1945, a total of four million people had been relocated at some time or another under the evacuation scheme [4,5].
As described elsewhere [6], the evacuation scheme was greeted with considerable apprehension by the medical press [7,8], local medical officers [9,10], medical statisticians [1] and prominent staff within the Ministry of Health [1,11]. It was feared that the mass dispersal of children from the endemic disease foci of large cities would result in the carriage of a range of common acute childhood infections (including diphtheria, measles, scarlet fever and whooping cough) to rural areas where the infections were less frequently encountered and where levels of immunity in the local children were correspondingly low [7,8]. Sensitive to these concerns, the Ministry of Health's provisional investigation of national disease trends during the first 4 months of the evacuation (September-December 1939) concluded that 'the incidence of infectious diseases . . . was remarkably low' [11, p. 405]. However, adequate assessment of the situation awaited the careful analysis of Dr Percy Stocks, Chief Medical Statistician in the General Register Office (GRO). As Stocks explained at the time: 'No satisfactory answer can be given to the question how the dispersal affected the incidence of infectious diseases in children, without dividing the country into all its component areas, reassembling them into evacuation, neutral and reception groups, and comparing the trends of . . . notifications in these groups with due regard to the changing populations at risk' [1, p. 312].
Stocks' own two-part examination of the evidence for the early months of the war, published in 1941 and 1942, demonstrated that the initial evacuation of September 1939 was temporally associated with a brief inflation in the incidence of certain common acute childhood infections in some reception districts [1,12]. The nature and weight of the evidence, however, varied by disease, time period and geographical location. As Professor Major Greenwood concluded of the work, 'Dr. Stocks has given us much information, but much remains which only the leisure of historians can provide' [1, p. 333]. A fundamental question follows on from the contemporary studies of Stocks and colleagues: in what ways did the major phases of wartime evacuation (1939-1941 and 1944-1945) impact on the underpinning geography of common acute childhood infections in Britain?
In an earlier study, we examined the effects of childhood evacuation from the Greater London area on diphtheria, poliomyelitis and scarlet fever activity in 14 counties of south-eastern England [6]. In this paper, we undertake a systematic geographical analysis of two of these diseases (diphtheria and scarlet fever) in the entire set of 1470 local government districts of England and Wales. For the seven calendar years of World War II (1939–1945), we scale the rates of notified disease activity in each of the 1470 districts to a pre-war (baseline) rate. Standard techniques of binary logistic regression analysis, with multi-level predictors in the time dimension, are then used to examine the associations between (a) above-baseline disease rates in the operationally classified (evacuation, neutral and reception) districts of the Government's evacuation scheme and (b) the major phases of the evacuation.
The analysis will demonstrate that, at the national level, the major phases of wartime evacuation were associated with a deflationary effect on levels of scarlet fever activity. In the evacuation and neutral districts, this effect was most evident in the early years of the dispersal (1939–1941) and manifested as a significantly lower odds of above-baseline scarlet fever rates. Regionally, the same deflationary effect was signalled to varying degrees in the evacuation, neutral and reception districts of two regions (North and South East) that experienced a net population outflow as a consequence of the evacuation, with a corresponding inflationary epidemiological effect in the reception districts of one region that experienced a net population inflow (South West). Similar geographical patterns are identified for diphtheria in the early years of the dispersal. More generally, the analysis points to the need for sensitivity to the differential geographical effects of the Government's evacuation scheme on patterns of common acute childhood infections in wartime England and Wales.

Background to the evacuation scheme
Details of the Government's evacuation scheme are provided by the Ministry of Health [5, pp. 107-110], (see also Titmuss [4] and Smallman-Raynor et al. [6]). The scheme was based on a three-category division of the (then) 1470 local government districts (boroughs and county districts) of England and Wales. A total of 110 districts that were deemed to be militarily vulnerable, and from which movements were organized, were classified as evacuation districts, while a total of 1102 'safe' districts, to which the evacuees were moved, were classified as reception districts. The remaining 258 districts were classified as neutral (Table 1) [13]. Figure 2a shows the primary geographical focus of the evacuation and neutral districts in and around the major urban agglomerations of Greater London and the Midland and North regions, while Figure 2b shows the national scatter of the reception districts. Although some adjustments were made to the original district designations as the war progressed, the changes were minor and the core structure of the original scheme as summarized in Table 1 and Figure 2 was maintained throughout the war years.

Major phases of evacuation
As noted in the Introduction, the initial evacuation of September 1939 was the first and most substantial of several waves of evacuation that occurred as the war progressed. Two principal phases of dispersal can be defined [6]. The first (1939–1941) encompassed the initial evacuation of September 1939 and the further waves that followed the German occupation of France and the Low Countries and the onset of the Blitz. The second (1944–1945) was prompted by the V-1 flying bomb attacks on London and south-eastern England; between June and September 1944, some 1·25 million people were moved to the safety of reception areas [4,5]. As the Allied armies advanced northwards through Europe, the bombing decreased, although the threat of aerial strikes on the South East continued into the late winter of 1944–1945 with the advent of the V-2 rocket attacks.
For convenience, we refer to these two major phases of evacuation activity by the abbreviations EP-I (Evacuation Phase I) and EP-II (Evacuation Phase II) in the remainder of this paper.

Data sources and disease matrices
We follow Stocks [1] in our selection of scarlet fever and diphtheria as the common acute childhood infections for detailed examination in the present paper. While this selection is informed by contemporary concerns regarding the specific impact of evacuation on these two diseases [1,10], we note that the statutory notification of some other potential candidate diseases for examination (measles and whooping cough) did not begin until the latter months of 1939 [14,15]. Practical considerations, including the establishment of pre-war baselines against which to assess wartime trends, have precluded these latter diseases from the present analysis. Summary overviews of the nature and epidemiology of scarlet fever and diphtheria in England and Wales are provided by Smallman-Raynor & Cliff [16, pp. 44-50]. As described there, the annual count of disease notifications fluctuated around an approximately stable mean of 100 000 (scarlet fever) and 55 000 (diphtheria) in the 1920s and 1930s. While there was a progressive reduction in scarlet fever notifications during the 1940s, the implementation of the wartime diphtheria immunization campaign resulted in a sharp and sustained fall in recorded diphtheria activity. By the late 1940s, diphtheria notifications were less than one-tenth of their pre-war level [16].

Disease data and district categorizations
To examine the epidemiological impact of the evacuation scheme in the 1470 standard local government districts of England and Wales, we draw on the notifications of communicable diseases collated by the GRO, London, and published in the annual volumes of the Registrar-General's Statistical Review (London: HMSO). To establish a baseline against which to assess wartime levels of disease activity, the year 1931 was selected as the start of a 17-year time 'window' that straddled World War II (1939–1945) and ended in 1947. For this observation period, annual disease counts and annual mid-point population estimates for local government districts were abstracted from the Statistical Review to form 1470 (geographical unit) × 17 (year) space-time matrices of notification rates per 100 000 population for scarlet fever and diphtheria. Within each disease matrix, the 1470 districts were coded according to: (a) the operational classification of districts in the Government's evacuation scheme as evacuation, neutral and reception districts at the outset of the war [13]; and (b) the geographical distribution of districts in a contemporary six-category regional division (East, Midland, North, South East, South West, Wales) of England and Wales, adopted for statistical purposes by the Registrar-General for England and Wales [17, p. 263] and mapped in Figure 2. Districts were then cross-categorized according to the two coding schemes, (a) and (b), to yield the 28 national and regional sets of district categories in Table 2.
For reference, the table gives the total number of districts associated with each region and evacuation scheme class, along with the resident civil population as recorded in the National Register (29 September 1939) and the associated counts of scarlet fever and diphtheria notifications for the entire observation period (1931–1947) and the war years (1939–1945).
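The construction of the district-level baselines described above can be sketched in outline. The district, counts and population below are invented for illustration; only the coding rule (annual rate above the 1931–1938 mean rate → 1, otherwise 0) follows the text:

```python
# Sketch: convert annual notification counts to rates per 100 000 and
# flag years whose rate exceeds the district's 1931-1938 (baseline)
# mean. All figures are invented for illustration.

BASELINE_YEARS = range(1931, 1939)

def rate_per_100k(cases, population):
    return cases / population * 100_000

def above_baseline_flags(cases_by_year, pop_by_year):
    """Return {year: 1 if the rate exceeds the pre-war mean rate, else 0}."""
    rates = {y: rate_per_100k(cases_by_year[y], pop_by_year[y])
             for y in cases_by_year}
    baseline = sum(rates[y] for y in BASELINE_YEARS) / len(BASELINE_YEARS)
    return {y: int(r > baseline) for y, r in rates.items()}

# One invented district, 1931-1947: flat pre-war counts, wartime
# deflation in 1939-1941, a rebound in 1942-1943.
cases = {y: 120 for y in range(1931, 1939)}
cases.update({1939: 60, 1940: 40, 1941: 55, 1942: 150, 1943: 160,
              1944: 70, 1945: 65, 1946: 110, 1947: 100})
pop = {y: 50_000 for y in range(1931, 1948)}
flags = above_baseline_flags(cases, pop)
```

Applied across all 1470 districts and 17 years, this coding yields the binary response series used in the regression analysis below.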

Quality of disease data
Insights into the quality and completeness of the scarlet fever and diphtheria data contained in the wartime publications of the GRO are provided by Stocks [1] and Smallman-Raynor et al. [6]. Here, we note that clinical diagnoses accounted for the majority of notified cases, and that errant diagnoses (arising from confusion with other diseases) and missed diagnoses (subclinical and mild cases) represent potential sources of error in the national notification records; see, for example, Noah [18, p. 50] and Russell [19, p. 18]. The exigencies of war served to further complicate the picture. In particular, contemporary epidemiologists and medical statisticians were alert to the possible impact of the evacuation scheme and the war on routine disease surveillance, including the increased likelihood of disease under-reporting, reporting delays and misdiagnoses in the provisional case reports [1]. To circumvent some of these data-associated uncertainties, the present paper draws on the corrected (annual) notifications included in the Registrar-General's Statistical Review, rather than the provisional (weekly and quarterly) notifications included in the Registrar-General's Weekly Return and Quarterly Return. Notwithstanding this precaution, all results presented in this paper are subject to the caveat of data quality.

Binary logistic regression: experimental design and application
To determine whether the major phases of wartime evacuation mentioned earlier were associated with underpinning shifts in notified levels of disease activity in the national and regional sets of evacuation, neutral and reception districts, we use binary logistic regression [20]. This is used routinely in epidemiological analysis to assess the degree of association between a binary disease response ('outcome') variable and one or more predictor ('exposure') variables [21,22]. The response variable in the regression model is expressed as a binary classification in which 1 signifies a positive outcome (in the present analysis, above-baseline disease rates) and 0 signifies a negative outcome (at- or below-baseline disease rates). The predictor variables are continuous or categorical variables that describe the exposure(s) of interest. The univariate version of the model can be written as

logit (Y) = ln [Y/(1 − Y)] = β0 + β1X, (2)

where Y is the probability of the response variable being equal to 1, X is a predictor variable and β0 and β1 are coefficients to be estimated. The exponential function of β1 (odds ratio, OR) provides a measure of association between the response and predictor variables. In epidemiological investigations, OR = 1 indicates that the predictor variable does not influence the odds of disease outcome; OR > 1 indicates that the predictor variable is associated with a higher odds of outcome, while OR < 1 indicates that the predictor variable is associated with a lower odds of outcome.
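The link between the fitted coefficient and the OR can be illustrated for the simplest case of a single binary predictor (1 = wartime period, 0 = pre-war referent), where the maximum-likelihood logistic fit can be read directly off a 2 × 2 table. This is a minimal sketch, not the paper's fitting procedure, and all counts are invented:

```python
import math

# For a single binary predictor X, the MLE logistic fit
#   logit(Y) = ln[Y/(1 - Y)] = b0 + b1*X
# has a closed form in the 2x2 table: b0 is the log-odds of the outcome
# in the referent group, and exp(b1) is the cross-product odds ratio.

def logistic_2x2(above_1, below_1, above_0, below_0):
    """Closed-form MLE coefficients for one binary predictor."""
    b0 = math.log(above_0 / below_0)                           # referent log-odds
    b1 = math.log((above_1 / below_1) / (above_0 / below_0))   # ln(OR)
    return b0, b1

# Invented counts of district-years with above-baseline rates (vs not)
# in a wartime phase (X = 1) and the pre-war referent (X = 0).
b0, b1 = logistic_2x2(20, 90, 600, 280)
odds_ratio = math.exp(b1)            # OR < 1 here: a deflationary association

# Sanity check: the fitted probability for X = 1 reproduces the
# observed proportion 20/110.
p1 = 1 / (1 + math.exp(-(b0 + b1)))
```

With a saturated categorical predictor, the fitted ORs similarly reproduce the observed odds in each level relative to the referent.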
Experimental design. In using equation (2) to determine evidence for evacuation-related effects on levels of disease activity, an important issue arises as to the selection of a suitable 'control group' or 'referent' against which to assess statistically these effects. In his original analysis of the impact of evacuation on scarlet fever and diphtheria, for example, Stocks adopted an experimental design in which neutral districts formed the referent against which evacuation and reception districts were compared [1]. Stocks' approach was based on the assumption that, as neutral districts did not send or receive evacuees as part of the Government's public evacuation scheme, their patterns of disease activity would be (relatively) unaffected by the population flux. Such an assumption is, however, incorrect. It overlooks the substantial levels of private evacuation from neutral districts [13], whose geographical distribution is unknown, and the correspondingly marked reductions in disease levels that we identify for these districts in the Results section. In the context of the present study, the statistical effect of using neutral districts as the referent for evacuation and reception districts would be (a) to under-represent any deflationary epidemiological effects, (b) to over-represent substantially any inflationary epidemiological effects and, by design, (c) to preclude a consideration of any epidemiological effects in neutral districts. Because neutral districts would represent a biased control group in a standard ANOVA design, the present analysis adopts an alternative approach in which the referent is set as the pre-war period (1931-38) for a given category of districts. In the context of equation (2), this methodology has the particular advantage of establishing a common pre-war OR (=1) against which to compare patterns across time periods, district categories and diseases. 
Analytical issues arising in consequence of the use of time-based predictors are considered in the Discussion.
Model application. For each of the district categories in Table 2, equation (2) was used to determine whether the interval of wartime evacuation was associated with above-baseline rates of scarlet fever and/or diphtheria. The binary classification of districts within a given category as above-baseline (1) or otherwise (0) disease rates was entered as the response variable in a series of logistic regression models in which time was treated as a single categorical predictor (X) variable with, variously, (a) ten levels (1931-1938, 1939, 1940, 1941, . . ., 1947) and (b) five levels (1931-1938, 1939-1941, 1942-1943, 1944-1945, 1946-1947). Here, the ten-level predictor permits an examination of annual associations while the five-level predictor measures the aggregate associations for EP-I and EP-II and the adjacent time periods. As noted above, specification of the 1931-1938 (pre-war) level of either predictor as the referent (OR = 1) in the modelling procedure allows the direct comparison of associations across time periods, district categories and diseases.
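The per-level comparison against the common pre-war referent can be sketched as follows. The OR for each level of the time predictor is the cross-product ratio against the 1931–1938 referent, here paired with a Woolf (log-based) 95% confidence interval; the district-year counts are invented for illustration:

```python
import math

# Sketch: per-level odds ratios for a five-level time predictor with
# the pre-war years (1931-1938) as the referent (OR = 1), using Woolf's
# log-based 95% confidence interval. Counts are invented.

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR of (a:b) vs the referent (c:d), with Woolf 95% CI.
    a, b: above / at-or-below baseline district-years in the period;
    c, d: the same counts in the 1931-1938 referent."""
    or_ = (a / b) / (c / d)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)          # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

referent = (600, 280)                  # invented 1931-1938 district-years
periods = {
    "1939-1941": (40, 290),            # EP-I
    "1942-1943": (170, 50),            # inter-phase rebound
    "1944-1945": (60, 160),            # EP-II
    "1946-1947": (100, 120),
}
results = {p: odds_ratio_ci(a, b, *referent) for p, (a, b) in periods.items()}
```

A level is judged significantly different from the referent when its 95% CI excludes 1, matching the two-tailed P = 0·05 criterion used in the paper.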
In recognition of the small number of districts associated with some of the district categories in Table 2, model fitting was limited to the 21 national and regional sets of district categories with ≥30 constituent districts. Analysis was undertaken for each district category (n = 21), disease (n = 2) and multi-level predictor (n = 2) to yield a total of 84 regression models. All model fitting was undertaken in Minitab® v. 16.2.4 (Minitab Inc., USA), with the pre-war level (1931–1938) of the predictor (X) variables specified as the referent period. The results of the analysis are presented as the OR, 95% confidence intervals (95% CI) and associated P values for the sets of models run using the temporally aggregated (five-level) predictor variable (Table 3) and the temporally disaggregated (10-level) predictor variable (Figures 4, 6, 7). For all analyses, statistical significance of the OR was judged at the P = 0·05 level (two-tailed test).

Figure 3 plots the annual series of scarlet fever and diphtheria notifications per 100 000 population in England and Wales, 1931–1947. The extension of evacuation scheme designations to the pre- and post-war years captures the effects of the wartime dispersal on long-term disease trends in the national sets of evacuation, neutral and reception districts. Set against the uniformly higher levels of notified disease activity in the evacuation and neutral districts in the pre-war years, Figure 3a shows that scarlet fever rates in these categories collapsed to, and below, the corresponding rates for the reception districts during EP-I. A pronounced rebound in scarlet fever activity in 1942–1943 was superseded by a secondary reduction of rates in all district categories during and after EP-II. Figure 3b identifies a similar, if less pronounced, reduction in diphtheria rates in evacuation districts during EP-I.
Thereafter, the roll-out of the wartime diphtheria immunization campaign resulted in a fall in diphtheria rates to low levels in all three categories of district by the early post-war years.
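The scale of the model-fitting exercise follows directly from the design: 21 retained district categories, two diseases and two time predictors give 84 regressions. A minimal sketch (with placeholder category names, not the paper's actual labels) enumerates the grid:

```python
from itertools import product

# Enumerate the regression grid: 21 retained district categories x
# 2 diseases x 2 time predictors = 84 models. Category names are
# placeholders for illustration only.

categories = [f"category_{i:02d}" for i in range(1, 22)]
diseases = ["scarlet fever", "diphtheria"]
predictors = ["ten-level (annual)", "five-level (aggregated)"]

model_specs = list(product(categories, diseases, predictors))
```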

RESULTS
Logistic regression, I: national analysis

Table 3 relates to the national sets of local government districts and summarizes, for scarlet fever (models 1-4) and diphtheria (models 5-8), the results of the logistic regression analysis using the temporally aggregated (five-level) predictor variable. We consider each disease in turn.

Scarlet fever
Model 1 in Table 3 shows that EP-I was associated with significantly lower odds of above-baseline scarlet fever rates in the entire set of local government districts. This implies a deflationary epidemiological effect in the first phase of the dispersal. While this deflationary effect was pronounced for evacuation (model 2) and neutral (model 3) districts, no similar or countervailing effect is evident for reception districts (model 4). The general lull in evacuation activities in 1942-1943 corresponded with a national upturn to significantly higher odds for the entire set of 1470 districts (model 1) and the subsets of neutral (model 3) and reception (model 4) districts. Thereafter, the primary feature of EP-II was a secondary reduction to significantly lower odds of above-baseline scarlet fever rates in evacuation districts (model 2). Figure 4 captures the principal features of this national pattern by plotting the annual OR and 95% CI for scarlet fever in the sets of evacuation, neutral and reception districts. Figure 4(a, b) shows that evacuation and neutral districts shared a common pattern of downswings (lower odds) and upswings (higher odds). These correspond with the wartime shifts in scarlet fever rates in Figure 3a. EP-I was associated with a sharp and sustained deflationary effect on disease activity in these districts. This effect was especially intense in evacuation districts, where the odds of above-baseline scarlet fever rates reached its nadir in 1940 (OR 0·04, 95% CI 0·01-0·12). Thereafter, a rebound to significantly higher odds in 1943 was followed, in EP-II, by a secondary reduction which was most pronounced in evacuation districts. By contrast, Figure 4c highlights the marginal statistical effect of the major phases of evacuation on reception districts.

Diphtheria
As was the case with scarlet fever, Table 3 shows that EP-I was associated with significantly lower odds of above-baseline diphtheria rates for the national set of evacuation districts (model 6). Unlike scarlet fever, however, the odds for neutral (model 7) districts are not significantly different to the referent, while reception districts display significantly higher odds (model 8). These principal features are captured by the plots of the annual OR for diphtheria in Figure 4, where sharp downward trends to significantly lower odds from 1942 to 1943 are also evident.

Logistic regression, II: regional analysis
The results of the regional analysis are distilled in Figure 5(a-c) (scarlet fever) and Figure 5(d-f) (diphtheria). The maps identify, for the entire set of districts in each of the six regions, those regions with significantly higher and lower odds of above-baseline disease rates in EP-I, the evacuation lull of 1942-1943 and EP-II.

[Table 3 caption: Summary results of logistic regression to determine the odds of above-baseline scarlet fever and diphtheria rates in the national sets of local government districts, England and Wales, World War II*.]

Scarlet fever

Figure 5a shows that EP-I was associated with: (i) a deflationary epidemiological effect which manifested as significantly lower odds of above-baseline scarlet fever rates in two of the primary evacuee source regions (North and South East); and (ii) a corresponding inflationary epidemiological effect which produced significantly higher odds of above-baseline disease rates in one of the primary reception regions (South West). The odds for the remaining regions (East, Midland and Wales) did not differ significantly from the pre-war period in this first phase of the dispersal. Figure 5b depicts a switch in the epidemiological pattern in 1942-1943 that manifested as significantly higher odds of above-baseline disease rates throughout the English regions. Finally, Figure 5c shows that EP-II was associated with a reversion to significantly lower odds in the South East, significantly higher odds in the South West and a general subsidence of odds to the pre-war level for all other English regions. The dominant regional effects in Figure 5(a-c) are highlighted in the corresponding plots of the annual OR in Figure 6.
The graphs portray: (a) the collapse to significantly lower odds in the North and South East and a countervailing increase to significantly higher odds in the South West in EP-I; (b) the upturn in odds to a high peak in the South East and lesser peaks in the East, Midland and North regions in 1943; (c) the general downturn in odds in the English regions in EP-II; and (d) the apparent lack of any statistical signal for scarlet fever in Wales during the war years. These features are underscored by the bar charts, which plot the number of districts with above-baseline disease rates (Δx_it > 0, where Δx_it denotes the deviation of the notification rate of district i in year t from its pre-war baseline) in excess of the mean annual number for the referent period (1931–1938); zero values mark years with counts at, or below, the mean. So formed, the South West is singled out by non-zero scores in consecutive years of the war, indicative of a sustained wartime inflation in the number of districts with above-baseline disease rates in this region.
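The bar-chart quantity just described (the yearly count of above-baseline districts in excess of the referent-period mean, floored at zero) can be sketched as follows; the yearly counts are invented for illustration:

```python
# Sketch of the bar-chart metric: for each year, score the number of
# districts with above-baseline rates only insofar as it exceeds the
# mean annual number for the 1931-1938 referent; years at or below the
# referent mean score zero. Counts are invented.

REFERENT_YEARS = range(1931, 1939)

def excess_over_referent(counts_by_year):
    ref_mean = (sum(counts_by_year[y] for y in REFERENT_YEARS)
                / len(REFERENT_YEARS))
    return {y: max(0.0, n - ref_mean) for y, n in counts_by_year.items()}

counts = {y: 10 for y in range(1931, 1939)}        # stable pre-war counts
counts.update({1939: 12, 1940: 18, 1941: 25,       # invented wartime surge
               1942: 9, 1943: 8, 1944: 14, 1945: 16})
excess = excess_over_referent(counts)
```

A run of consecutive non-zero scores, as for the invented 1939–1941 surge here, is the signature the paper reads as a sustained wartime inflation.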

Diphtheria
In common with scarlet fever (Fig. 5a), EP-I was associated with a deflationary epidemiological effect for diphtheria, manifested as significantly lower odds in the South East (Fig. 5d). But, unlike scarlet fever, there is no evidence of a similar deflationary effect for diphtheria in the North. Elsewhere, an inflationary epidemiological effect is implied by the significantly higher odds of above-baseline diphtheria rates in a geographically expansive area of central and western England and Wales (Midland, South West and Wales regions). As the corresponding regional plots of the annual OR in Figure 7 show, this inflationary effect reached a maximum in the South West (OR 2·75, 95% CI 1·98-3·83) and Wales (OR 2·51, 95% CI 1·82-3·48) in 1941. For later time periods, Figures 5 and 7 show that the odds fall to, and below, the referent level in all six regions.

DISCUSSION
While the principal motivating factor for the British Government's wartime evacuation scheme was to alleviate the threat posed to the young and vulnerable by enemy bombs, epidemiologists found an additional justification for the dispersal in the disease risks of the anticipated air war [2,23]. Forewarning of the possible spread of infections in the overcrowded air raid shelters of London (Fig. 1), Greenwood [2] urged that the primary need was to reduce the shelter populations by evacuating women and children. Evacuation, however, posed its own disease risks and public health officials were alert to the possible spread of infectious agents among the young evacuees and, more especially, among their young counterparts in the reception areas [11]. Recognizing the provisional and inconclusive nature of wartime investigations into these latter epidemiological effects, the present study has sought to elucidate the impact of the evacuation on the underpinning geography of two common acute childhood diseases (scarlet fever and diphtheria) in the local government districts of England and Wales.
In his classic study of the first year of the war, Stocks [1] focused on the percentage deviation of disease rates from a pre-war baseline that was set as the second quarter of 1939. Sensitive to concerns over data quality, we have adopted an alternative approach with a binary classification of districts as above the baseline disease rate (1) or otherwise (0) for an 8-year baseline period (1931–1938). Our selection of the length of the baseline period was a balance between a sufficiently long interval to avoid the potentially skewing effects of single epidemic periods, and the need to circumvent the effects of any long-term trends in disease notification rates.
In his original investigation of the epidemiological impact of evacuation, Stocks pointed to the potential confounding effects of the regular epidemic cycles of the diseases under investigation. 'The trend of [disease] incidence in the country as a whole', Stocks observed, 'is affected by cyclical changes which affect the notification rates very considerably, apart from the wars and disturbances caused thereby' [1, p. 312]. For diphtheria, at least, the available evidence suggests that such confounding effects were limited by the wartime immunization programme. According to the Ministry of Health, the pre-war incidence of diphtheria showed 'a marked inclination to epidemicity every 5 to 7 years' [24, p. 23]. In the event, the immunization programme served to disrupt the established cycle of diphtheria activity and the anticipated epidemic of 1942-1943 did not materialize [24].
Our use of binary logistic regression, with the predictor formed in the time dimension and with the pre-war period as the referent, has permitted a direct comparison of associations across diseases, time periods and geographical areas. One corollary of this analytical approach, which draws on disease reports for the same districts in sequential time periods, is the possible presence of temporal autocorrelation in the response variable [20]. We note here, however, that our use of aggregated (annual) disease data, with time-based predictors formed for periods of ≥1 year, is likely to have reduced some of the more severe effects of autocorrelation in the regression residuals.
[Fig. 4 caption: … , 1939–1947. The graphs are based on the results of logistic regression analysis using the 10-level predictor (X) variable and plot the odds ratio (OR) (circles) and associated 95% confidence intervals (lines); the pre-war years (1931–1938) form the referent (OR = 1·00). ORs that are significantly different to 1·00 at the P = 0·05 level are represented by the solid circles and denote periods of significantly higher (OR > 1·00) and significantly lower (OR < 1·00) odds of above-baseline disease rates. Major phases of wartime evacuation [Evacuation Phase I (EP-I) and EP-II] are indicated for reference, as is the period of evacuee drift back associated with the first phase.]

As compared to the pre-war years, our analysis has shown that EP-I was associated with significantly lower odds of above-baseline scarlet fever and diphtheria rates in the national set of evacuation districts (Table 3), indicative of a deflationary epidemiological effect in these areas. A similar deflationary effect is apparent in the national set of neutral districts for scarlet fever. These effects are consistent with the documented declines in infectious disease notifications in Birmingham, Liverpool, London, Manchester and other major towns and cities [25]. Contemporary observers attributed this to the attenuation of the school-age population through evacuation and the effects of emergency school closure on the children left behind [1,3,11,26]. The role of the latter was emphasized by Dr J. Alison Glover in his Presidential Address to the Section of Epidemiology and State Medicine, Royal Society of Medicine, on 5 April 1940: 'Even the much-deplored school closure in the evacuation and neutral areas helped to reduce the cases of diphtheria and scarlet fever by lessening the risk of school infection' [11, p. 411].
A noteworthy feature of the analysis in Table 3 and Figure 4c is the significantly higher odds of diphtheria in the national set of reception districts in EP-I. This feature is not mirrored by scarlet fever and may reflect: (a) the spread of a virulent (gravis) strain of diphtheria in some parts of the country at this time [3]; and (b) the over-diagnosis of the disease in newly arrived evacuees as a consequence of the concerns of anxious foster parents, teachers and domiciliary health workers [1]. As we have noted earlier, most notified cases of diphtheria were based on a clinical diagnosis and were not subject to bacteriological confirmation. As described by Russell [19], the Emergency Public Health Laboratory Service began the typing of diphtheria infections in 1940, although typing was not universally conducted until 1941. In that year, 8457 specimens (equivalent to 14% of diphtheria notifications) were typed by the Emergency Public Health Laboratory Centres.

[Fig. 5. Maps of odds ratios (OR) for above-baseline disease rates in the standard regions of England and Wales by major phase of the evacuation scheme. The maps identify, for scarlet fever (a-c) and diphtheria (d-f), regions with significantly higher and significantly lower odds of above-baseline disease rates for Evacuation Phase I (EP-I) and EP-II. Maps for the inter-phase period (1942-1943) are also shown. The pre-war years (1931-1938) form the referent (OR = 1·00) in all instances. The maps are based on the OR and associated P values for the entire set of districts (all evacuation scheme classes) in each region. Evacuation and neutral districts are re-plotted from Figure 2, while the vectors on panels (a) and (d) provide a general indication of the direction of movement of evacuees at the outset of the evacuation scheme.]
While gravis accounted for a relatively high proportion (>50%) of diphtheria infections in some central and northern areas, and a relatively low proportion (<20%) in parts of the South West and South Wales, there was no evident geographical association with the diphtheria case-fatality rate.

[Fig. 6. Annual odds ratios (OR) for above-baseline scarlet fever rates in the standard regions of England and Wales, 1939-1947. The graphs are based on the results of logistic regression analysis using the 10-level predictor (X) variable and plot the odds ratio (OR) (circles) and associated 95% confidence intervals (lines) for each region; the pre-war years (1931-1938) form the referent (OR = 1·00) in all instances. The bar charts plot the number of districts with above-baseline disease rates (Δx_it > 0) in excess of the mean annual number for the referent period (1931-1938); zero values mark years with counts at, or below, the referent mean. Standard regions are mapped in Figure 2. See the caption to Figure 4 for other plotting conventions.]
Geographically, our analysis has identified a marked regional effect for both scarlet fever and diphtheria in EP-I. For scarlet fever, two major source regions of evacuees (North and South East) had significantly lower odds of above-baseline disease rates, while one major reception region for evacuees from all parts of England (South West) had significantly higher odds of above-baseline disease rates (Figs 5a, 6). Diphtheria shares some of the salient features of scarlet fever (significantly lower odds, South East; significantly higher odds, South West), with the additional facet of significantly higher odds in the Midland and Wales regions (Figs 5d, 7). Any attempt to examine the correspondence between these geographical patterns and the population flux generated by the evacuation is complicated by the lack of longitudinal information on the population changes caused by the dispersal [1]. Recognizing this data limitation, Figure 8 uses estimates of evacuation-related population change, derived from the National Register [13] and relating to the first calendar month of EP-I (September 1939), as a proxy for the degree of involvement of geographical areas in the evacuation scheme. The graphs plot the regional OR (all districts) for scarlet fever and diphtheria against two proxy measures of regional population flux: (a) the net population change for each region, providing a measure of the population flux associated with inter-regional evacuee movements; (b) the mean population change of the constituent districts of each region, providing a measure of both inter- and intra-regional evacuee movements.
In both instances, the measures are expressed as a percentage of regional and district populations in the period preceding the onset of evacuation. In interpreting measure (b), we note that the universally positive values of the regional means in Figure 8(b, d) arise from the scaling effects that accrue from the inclusion of relatively small districts with relatively large and positive population increments in the computation of mean population change.
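The distinction between measures (a) and (b), and the scaling effect noted above, can be sketched as follows. The district records and numbers are hypothetical illustrations, not National Register figures; the point is that a region can show a net population loss on measure (a) while measure (b) is pulled positive by small districts with large percentage gains:

```python
# Hypothetical district records: (region, pre-evacuation population,
# population after the first month of EP-I). Illustrative values only.
districts = [
    ("South West", 20_000, 26_000),    # small reception district, large % gain
    ("South West", 150_000, 158_000),
    ("South East", 400_000, 340_000),  # large evacuation district, % loss
    ("South East", 30_000, 36_000),    # small reception district in-region
]

def regional_flux(records, region):
    rows = [(pre, post) for r, pre, post in records if r == region]
    # Measure (a): net regional change as % of pre-evacuation population,
    # capturing inter-regional evacuee movements only.
    pre_sum = sum(pre for pre, _ in rows)
    net_pct = 100.0 * (sum(post for _, post in rows) - pre_sum) / pre_sum
    # Measure (b): mean of district-level % changes, capturing both
    # inter- and intra-regional movements; small districts with large
    # positive increments pull this mean upward (the scaling effect).
    mean_pct = sum(100.0 * (post - pre) / pre for pre, post in rows) / len(rows)
    return net_pct, mean_pct
```

For the hypothetical "South East" records, measure (a) is negative (a net outflow) while measure (b) is positive, reproducing the asymmetry between Figure 8(a, c) and Figure 8(b, d) described in the text.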
Although the small number of regional units in Figure 8 precludes statistical inference, visually there is a positive correspondence between the measures of population flux and the regional OR for both scarlet fever (Fig. 8a, b) and diphtheria (Fig. 8c, d). While this correspondence is consistent with an association between evacuation-related population change and reported levels of disease activity, additional analyses at finer geographical scales (sub-region or county) are required to verify these general observations.
A distinctive feature of Table 3 and Figures 3-6 is the sharp rebound in levels of scarlet fever in the English regions that followed EP-I. This rebound was especially pronounced in the national sets of evacuation and neutral districts (Fig. 4a, b) and in the East, Midland, North and South East regions (Fig. 6) where, in all instances, the OR reached a peak in 1943. While 1943 stands out as an epidemic year for scarlet fever in the national curve [16, p. 49], the focus of the most pronounced aspects of this rebound in evacuation and neutral districts (Figs 3a, 4) merits further investigation as a possible corollary of the return of many evacuees to the major towns and cities.
For scarlet fever, we have shown that EP-II marked a partial reversion to the regional pattern observed for EP-I, with significantly lower odds of above-baseline disease activity in the South East, significantly higher odds in the South West, and a general subsidence of odds to the referent level for all other English regions (Fig. 5c). This spatial pattern corresponds with the shifting geographical locus of evacuation activities and, in particular, the concerns generated by the V-1 flying bomb attacks on London and the South East [5]. In contrast to EP-I, the North was largely unaffected by this second phase of the dispersal and levels of disease activity approximated the referent level.

[Fig. 8. Odds ratios (OR) for above-baseline disease rates in relation to estimates of evacuation-related population change in the regions of England and Wales, Evacuation Phase I (EP-I). The OR for each of the six regions (all districts) are plotted for scarlet fever (a, b) and diphtheria (c, d) against the two measures of evacuation-related population change defined in the text. ORs that are significantly different to 1·00 (P = 0·05 level) are represented by the solid circles and denote significantly higher (OR > 1·00) and significantly lower (OR < 1·00) odds of above-baseline disease rates.]
The latter part of EP-I coincided with the launch of the Ministry of Health's diphtheria immunization campaign in the winter of 1940-1941 [16, pp. 44-48]. While the effects of the inaugural year of the campaign on the results reported in this paper are difficult to decipher, major gains from the immunization campaign soon followed. The immunization coverage of children aged <15 years in England and Wales had reached almost 50% by the end of 1942, rising to 62% by the end of 1945 [27]. The dramatic impact of these developments on the odds ratios in Figures 4, 5 and 7 is evident.
An interesting feature of the regional analysis for scarlet fever is the apparent lack of any statistical signal for Wales during the war years. To account for this observation, which contrasts with the evidence for significantly higher odds of above-baseline diphtheria rates in EP-I, we note that inflated levels of scarlet fever were recorded in many Welsh counties in the years immediately preceding the evacuation. The pre-war maxima in scarlet fever notifications were recorded in 1936 (Denbigh and Pembroke), 1937 (Brecknock, Caernarvon and Merioneth) and 1938 (Anglesey, Cardigan, Glamorgan and Monmouth) with, presumably, a corresponding rise in levels of acquired immunity that would offer protection against any chance importations of scarlet fever by evacuees.
In addition to the data limitations noted earlier, two further data-related issues merit comment here. First, the wartime publications of the Registrar-General do not include age-specific case data at the level of individual districts, thereby precluding an age-adjusted analysis in the present study. Second, our use of the corrected (annual) disease notifications in the Registrar-General's Statistical Review reflects concerns over both the accuracy and completeness of the provisional notifications in the Weekly and Quarterly Returns and the complexities engendered by the seasonal vicissitudes of the diseases under examination [1]. A corollary of our use of annual data is that the initial year of EP-I (1939) includes an 8-month interval (January-August) that preceded the onset of the Government's evacuation scheme. Likewise, the final year of EP-II (1945) includes a 6-month interval (July-December) that followed the official end of the Government's evacuation scheme. While the expected effect of the inclusion of these 'additional' months would be to dampen the evacuation-related signal in the analysis presented, we note that private evacuation from the major towns and cities extended beyond the time-frame of the official scheme (September 1939-June 1945) and this factor will have been captured in our results [13].