Spatial characteristics of spring phenological development and freeze damage days
Before assessing the impact of spring freeze damage on cherries in each phenological stage, we first examined the climatology of the onset and end of the spring season, as well as the length of spring, defined based on phenological stages. The beginning date of the second phenological stage, "side green," was used to indicate the onset of spring, and the beginning date of the eighth phenological stage, "full bloom," was used to indicate the end of spring. Figure 2 shows the mean, the interannual standard deviation, and the temporal trend over the study period for the side green dates, the full bloom dates, and their difference, which represents the spring duration.
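To make this construction concrete, the sketch below computes the spring duration and the per-grid-point climatology mapped in Fig. 2. It is a minimal illustration only: the input arrays, file names, and shapes are hypothetical stand-ins for output from the GDD-based phenology simulation.

```python
import numpy as np

# Hypothetical inputs: Julian-day onset dates from the GDD-based phenology
# simulation, shaped (n_years, n_lat, n_lon).
side_green = np.load("side_green_dates.npy")  # stage 2 onset = start of spring
full_bloom = np.load("full_bloom_dates.npy")  # stage 8 onset = end of spring

# Spring duration is simply the difference in days (Fig. 2g).
duration = full_bloom - side_green

# Per-grid-point climatology over the study period.
climatology = {}
for name, arr in [("side_green", side_green),
                  ("full_bloom", full_bloom),
                  ("duration", duration)]:
    climatology[name] = {
        "mean": arr.mean(axis=0),        # mean date or length (Figs. 2a, d, g)
        "std": arr.std(axis=0, ddof=1),  # interannual variability (Figs. 2b, e, h)
    }
```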
In general, the mean dates of side green (Fig. 2a) display a latitude-dependent pattern, ranging from approximately Julian Day 80 in the southern sections of the study domain to Julian Day 130 in the north, likely in direct response to mean temperatures. The interannual variations of the side green date (Fig. 2b), as measured by the standard deviations, are larger in the northern parts of the domain than in the southern parts, with maximum standard deviations of over 12 days in northern Michigan and Wisconsin versus minimum values of less than 7 days in the south. Local maxima in both mean and standard deviation are noticeable in the Black Hills region of South Dakota and sections of the Appalachian Mountain Range from Pennsylvania to North Carolina, suggesting that the onset of side green (and early spring vegetative development) is more variable at higher-elevation locations. Trends in the date of side green vary in sign, with positive (later with time) trends in the northern Plains, the Upper Midwest, and portions of the middle Mississippi and Ohio Valleys, and negative (earlier with time) trends across the rest of the domain. The only statistically significant trends are found in the Southern Great Plains, where side green dates advanced toward earlier dates by approximately 0.3 days per year over the study period.
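The paper does not state which test underlies the significance stippling in the trend maps; as one plausible reading, the sketch below fits a least-squares trend to a single grid point's yearly dates and flags it with a two-sided t-test on the regression slope.

```python
import numpy as np
from scipy import stats

def date_trend(dates, alpha=0.05):
    """Linear trend (days per year) in a yearly series of Julian-day dates.

    A negative slope indicates an advance toward earlier dates. The
    significance test used here (two-sided t-test on the slope) is an
    assumption; the paper does not specify its method.
    """
    years = np.arange(dates.size)
    fit = stats.linregress(years, dates)
    return fit.slope, fit.pvalue < alpha

# e.g. the significant Southern Great Plains advance would appear as
# roughly (-0.3, True)
```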
The patterns of the means and standard deviations of full bloom dates (Figs. 2d, e) across the domain are very similar to those of the side green dates. The means generally vary from approximately Julian Day 90 in the southern sections to Julian Day 160 in the northern sections. The highest standard deviations, around 10 days, are observed in the northern sections of the domain, with minimum values of around 4 days in the south. Although both phenological stages are based on temperature-derived GDD totals, the trend pattern for the full bloom dates (Fig. 2f) differs somewhat from that of the side green dates. The positive trends in side green dates over the lower Ohio Valley are replaced by negative trends in full bloom dates. Statistically significant trends appear not only in the Southern Great Plains, as for the side green dates, but also in areas of the Appalachian Mountains. Positive trends at the northern edge of the domain also extend southward to cover most of the Northern Great Plains and are greater in magnitude than their side green counterparts.
The mean values of spring duration, estimated as the difference in days between the full bloom and side green dates, range from 12 to 30 days across the study domain (Fig. 2g), with standard deviation values of less than 7 days (Fig. 2h). Overall, the time between tart cherry side green and full bloom is decreasing in the southern part of the domain while growing longer in the northern part (Fig. 2i). Decreases are statistically significant in the lower Ohio Valley and the Texas Panhandle area due to the combination of later side green dates and earlier full bloom dates (Figs. 2c, f, i).
Next, we examine spring freeze damage in terms of the number of damage days and the fraction of buds remaining in a given year, with the mean, standard deviation, and trend over the 40-year study period shown in Fig. 3. At each grid point, the remaining bud fraction for each year starts at 100 percent (1.0) on day 305 (November 1st) of the previous fall. The daily damage percentage at a given point is determined by the daily minimum temperature relative to the threshold temperature for the current phenological stage at that location, from dormancy through the growing season. When damage occurs, the remaining bud fraction on that day is the previous day's remaining fraction minus the daily damage percentage. Local minima of the remaining bud fraction over the season occur in the Southern and Central Great Plains, northern sections of the Upper Midwest, and the Appalachian Mountains, where standard deviations are also higher. The highest remaining bud fractions and lowest standard deviations are observed across the central and eastern Great Lakes region, the Middle Mississippi Valley, and portions of the Mid-Atlantic region. An inverse relationship between remaining bud fraction and damage days is evident across the domain: areas with high remaining bud fractions tend to have relatively few damage days, and vice versa. Areas with a greater frequency of damage (more than 4 days per year) are found in the Southern Great Plains, the northern edge of the Upper Midwest, and the Appalachian Mountains. Standard deviations of damage days are also relatively high in these areas, indicating large year-to-year variability in damage compared with other parts of the study domain.
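The bookkeeping described above can be summarized in a short sketch. Here `damage_fn` is a hypothetical stand-in for the stage-specific damage function: the paper derives daily damage from stage threshold temperatures, but its exact functional form is not reproduced here.

```python
def simulate_remaining_buds(tmin, stage, damage_fn):
    """Track the remaining bud fraction through one season at a grid point.

    tmin      : daily minimum temperatures (degC) from day 305 of the
                previous fall through the growing season.
    stage     : simulated phenological stage on each day (0 = dormancy).
    damage_fn : hypothetical lookup returning the daily damage fraction
                for a given (tmin, stage); zero when tmin stays above
                the stage's threshold temperature.
    """
    remaining = 1.0              # 100% of buds on November 1 (day 305)
    damage_days = 0
    for t, s in zip(tmin, stage):
        loss = damage_fn(t, s)
        if loss > 0.0:
            # today's fraction = yesterday's fraction minus the daily damage
            remaining = max(remaining - loss, 0.0)
            damage_days += 1
    return remaining, damage_days
```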
Trends in the remaining bud fraction and the number of damage days vary greatly across the domain. Significant increases in the remaining bud fraction (1% per year or greater) are observed in the Ohio Valley and northern portions of the Upper Midwest, with significant decreases (1% per year or greater) across southwestern portions of the Great Plains. Consistent with the inverse relationship noted above, significant decreases in damage day occurrence (0.1 days per year) are observed over almost the entire Ohio Valley and the northern edges of the Upper Midwest and the Northern Great Plains, while significant increases are seen in the Southern Great Plains, particularly over western Oklahoma. Between these areas of greatest change in the southern and northern sections of the domain lie areas of smaller, non-significant changes, with general decreases in the remaining bud fraction and increases in the number of damage days.
Given the changing vulnerability of tart cherry buds to freeze damage as the crop develops during the early growing season, it is important to consider the timing of the damage days. Figure 4 shows, for each simulated phenological stage, the total number of damage days summed over the years within the 40-year study period in which freeze damage occurred in that stage. It is worth noting that, because the values cover damage-occurring years only, small values may simply reflect that only a few of the 40 years experienced freeze damage. Collectively, the number of damage days decreases with increasing phenological stage, reflecting the interplay of two underlying annual cycles: the seasonal development of the crop (driven by early-season air temperatures), which brings increasing vulnerability to cold damage as the crop develops, and the climatological occurrence of freezing temperatures, which decreases in both frequency and severity as the season progresses.
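As a reading aid for Fig. 4, the sketch below tallies damage days by stage and makes the caveat above explicit: alongside the stage totals, it counts how many of the 40 years actually contributed damage, which separates "rare but severe" from "frequent but light" stages. The input array is hypothetical.

```python
import numpy as np

# Hypothetical input: damage days per year and per phenological stage at
# one grid point, shaped (n_years, n_stages).
damage = np.load("damage_days_by_stage.npy")

totals = damage.sum(axis=0)                   # Fig. 4-style stage totals
years_with_damage = (damage > 0).sum(axis=0)  # how many years contributed

# Mean damage days per damage-occurring year, guarding against stages
# with no damage at all (where the mean is left at zero).
mean_when_damaged = np.divide(
    totals.astype(float), years_with_damage,
    out=np.zeros(damage.shape[1]), where=years_with_damage > 0)
```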
In Stage 0 (dormancy), damage occurs only in far northern sections of the domain, as this is the only portion of the study area that experiences the temperatures of -34.4°C or lower necessary to cause damage during that growth stage. Moving to Stage 2, most regions show summed damage days ranging from zero to thirty days. Compared to the other stages, Stage 2 shows the largest area with freeze damage, likely because fruit trees are most vulnerable in the early growing stages. From Stage 3 to Stage 8, the patterns and magnitudes are similar, with larger values across the western Southern Great Plains and the Appalachian Mountains. Stage 9 shows the smallest area of freeze damage, since fruit trees are more resistant at this mature stage. Because damage days are determined by the daily minimum temperature, their spatial pattern differs from that of the mean temperature (see the Discussion section); it is the daily minimum temperature, not the mean temperature, together with its cumulative effects over time, that ultimately determines the risk of freeze damage to tree fruits.
Regional characteristics of freeze damage
As shown above, there are considerable spatial variations in the frequency and severity of freeze damage across the study domain. To quantify these regional differences statistically, we divided the study domain into six subregions following the USA regional divisions of Karl and Knight (1998) (Fig. 1). For each subregion, we computed statistics for damage occurrence and severity. The results, displayed in Fig. 5, show annual mean numbers of damage days (with standard deviations in parentheses) ranging from 0.74 (0.84) days per year in the VA-NC region to 2.28 (1.62) days per year in the Northern Great Plains.
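A minimal sketch of the regional aggregation follows, assuming an integer mask that assigns each grid point to one of the six Karl and Knight (1998) subregions; the region IDs, file names, and averaging choice (spatial mean per year before the temporal statistics) are illustrative assumptions.

```python
import numpy as np

# Hypothetical inputs: annual damage days (n_years, n_lat, n_lon) and a
# 2-D integer mask of subregion IDs (0 = outside all six subregions).
damage_days = np.load("annual_damage_days.npy")
region_mask = np.load("region_mask.npy")

regions = {1: "Northern Great Plains", 2: "Southern Great Plains",
           3: "Upper Midwest", 4: "Ohio Valley", 5: "NY-PA", 6: "VA-NC"}
for rid, name in regions.items():
    # one regional value per year: average over the region's grid points
    series = damage_days[:, region_mask == rid].mean(axis=1)
    print(f"{name}: {series.mean():.2f} ({series.std(ddof=1):.2f}) days/yr")
```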
Notably, the Upper Midwest and the NY-PA subregions experienced maximum damage in 2012, consistent with reported severe freeze damage to crops for that year (Labe et al., 2017; Kistner et al., 2018). Interannual variability, indicated by standard deviations, is greatest across western sections of the domain in the Northern and Southern Great Plains. Regarding trends in damage over the 40-year simulation period, four of the six subregions exhibit no significant trends. However, the Ohio Valley shows a significant decreasing trend of -0.3 freeze damage days per decade, consistent with previous findings (Easterling 2002). A similar declining trend and interannual variation pattern are observed in the VA-NC region, albeit with smaller damage day values compared to the Ohio Valley.
The annual mean numbers of damage days and damage severity values are shown by region for each phenological growth stage in Fig. 6a. Results indicate that the most frequent and severe damage tends to occur during dormancy (Stage 0) and the first two vegetative growth stages (Stages 2 and 3). Stage 2 emerges as the growth stage with the highest likelihood of significant damage occurrence, with severity generally decreasing with increasing growth stage. The Northern Great Plains and the Southern Great Plains exhibit more frequent damage occurrence but moderate severity values across all nine stages compared to other regions. These regions, along with the Upper Midwest, show general increasing trends in both damage occurrence and severity over the study period. Conversely, NY-PA and VA-NC show upward trends in damage occurrence but downward trends in severity. The Ohio Valley experiences downward trends in both occurrence and severity, with the latter significant from Stage 2 through Stage 8, indicating an overall decrease in freeze damage risk in recent decades.
Regional differences in damage frequency and severity may be attributed to variations in the minimum temperatures associated with damage events, shown by subregion in Fig. 6b. Damage events during dormancy are most common in the Upper Midwest and Northern Great Plains and least common in VA-NC and the Southern Great Plains. For vegetative Stages 2–9, mean damaging minimum temperatures generally increase from approximately -6°C at Stage 2 to -3°C at Stage 9. Notably, damaging temperatures decrease throughout the 40 years in the Northern Great Plains, Southern Great Plains, and Upper Midwest, consistent with the increasing freeze damage risks observed earlier. Conversely, significant increases in damaging temperatures are noted in the Ohio Valley, with non-significant increases in NY-PA and VA-NC. Overall, freeze damage events tend to occur earlier with time across most of the study region, except in the Ohio Valley and VA-NC, where certain stages trend toward later occurrences.
Air temperature climatology
An essential factor influencing springtime freeze damage is air temperature, with the relationship between temperature and damage occurrence and severity varying across the crop's phenological stages. Here, we explore the spatial patterns of air temperature and their associations with spring freeze damage. Figure 7 illustrates the mean values, standard deviations, and trends of springtime daily maximum and minimum temperatures from 1981 to 2020. The mean values of both daily minimum and maximum temperatures exhibit latitude- and terrain-dependent patterns, mirroring the spatial distribution of spring onset dates (Fig. 2).
Notably, regions such as the Black Hills and the Appalachian Mountains show lower temperatures and larger interannual standard deviations, consistent with the more frequent occurrence of freeze damage and its high interannual variability in these areas (Fig. 3). Moreover, the Northern Great Plains exhibit higher interannual standard deviations of daily maximum and minimum temperatures than other regions, likely due to the region's strongly continental climate and the potential for contrasting warm and cold periods during the transitional spring season.
Warming trends are evident across most of the study domain for both maximum and minimum temperatures, with significant warming observed particularly in many southern and eastern areas of the domain, reaching as high as 0.06°C per year. Notable exceptions include the Northern Great Plains and portions of the Upper Midwest, where maximum temperatures show a downward trend over the study period. Changes are more pronounced and spatially widespread for minimum temperatures than for maximum temperatures, consistent with earlier studies highlighting decreases in the daily temperature range over the record period (Angel et al., 2018).
The significant increases in daily minimum temperatures over the Ohio Valley correspond to the decreasing risk of freeze damage in the Valley over the past four decades (1981–2020), aligning with the downward trend in springtime frost days reported by Easterling (2002) for 1948 to 1999. These findings underscore the complex relationship between air temperature dynamics and the occurrence and severity of spring freeze damage, emphasizing the importance of understanding temperature patterns for effective risk assessment and management strategies.
False spring occurrence
The notable warming trends observed across much of the study domain raise concerns about the potential for more frequent false springs, characterized by unseasonably warm temperatures that prematurely bring crops out of dormancy and increase their susceptibility to subsequent freeze damage. Despite the historical association of false springs with considerable crop damage (Kistner et al., 2018), research on their occurrence and trends over time remains limited. Peterson and Abatzoglou (2014) introduced a False Spring Exposure Index measuring the likelihood of severe freeze damage following early crop development. However, that index did not account for the relative earliness of spring onset. Here, we propose a refined definition of false spring that incorporates the relative timing of spring onset alongside freeze damage.
Specifically, a false spring is considered to occur when the side green date, signifying the first major vegetative stage of development, falls within the earliest one-third of dates across the 40-year (1981–2020) study period and is followed by at least one damage day. Figures 8a, b illustrate the spatial distribution of total false spring occurrences under this definition, together with a more stringent criterion requiring at least three damage days after the early arrival of spring. Across the domain, false spring occurrences generally decrease from south to north, with peak frequencies observed in the Southern Great Plains and the lower Ohio Valley.
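A compact sketch of this definition at a single grid point follows; the quantile-based cutoff is one reasonable way to operationalize "earliest one-third," not necessarily the paper's exact implementation.

```python
import numpy as np

def false_spring_years(side_green, damage_days_after, min_damage_days=1):
    """Flag false-spring years at one grid point.

    side_green        : yearly side green Julian days (40 values here).
    damage_days_after : damage days occurring after side green each year.
    A year counts as a false spring when side green falls in the earliest
    one-third of the 40-year distribution and is followed by at least
    `min_damage_days` damage days (use 3 for the stricter criterion).
    """
    early_cutoff = np.quantile(side_green, 1.0 / 3.0)  # earliest one-third
    early = side_green <= early_cutoff
    return early & (damage_days_after >= min_damage_days)
```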
Adjusting the criterion from at least one damage day to three or more yields an overall reduction in occurrences, though not uniformly across the domain. Substantial reductions, particularly in the lower Ohio Valley and VA-NC regions, suggest a lower risk of severe freeze damage in these areas. In contrast, frequencies remain relatively high across the Southern Great Plains, indicating greater vulnerability to severe freeze damage.
The latitudinal pattern of false spring frequency appears more closely linked to the timing of side green dates than to the frequency of freeze damage events. Skewness values of side green dates, shown in Fig. 8c, reveal a positively skewed distribution in the south, indicating a higher occurrence of early-season warm temperatures and advanced phenological development. This is consistent with the higher frequencies of false springs in the Southern Great Plains and lower Ohio Valley. In contrast, the Upper Midwest and NY-PA regions exhibit a strongly negative skew, suggesting fewer instances of early, advanced seasons and false springs. Overall, these results suggest that the earlier the onset of prolonged warm weather during the spring, the greater the risk of a false spring event.
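For reference, skewness values like those in Fig. 8c can be computed directly from the yearly side green series at each grid point; a positive value indicates that the bulk of dates falls on the early side of the distribution with a long tail of late dates, and vice versa. The input file is hypothetical.

```python
import numpy as np
from scipy.stats import skew

# Hypothetical 40-year side green series at one grid point.
side_green = np.load("side_green_point.npy")
print(skew(side_green))  # > 0: bulk of dates early (southern pattern)
                         # < 0: bulk of dates late (northern pattern)
```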