In this paper, we estimated disease state durations and transmission parameters from data presented in de Carvalho Ferreira et al. (2013) to update ASF within-herd transmission models used to inform surveillance and risk assessment. The estimated parameters were used in an individual-based stochastic disease transmission model to predict the time to detect ASF in a herd via increased morbidity or mortality. Although de Carvalho Ferreira et al. (2013) provided transmission parameter and infectious period estimates, other simulation model parameters, such as statistical distributions for the time to onset of clinical signs and the latently infected period, were not estimated previously. A key model extension is the inclusion of separate infectious periods for pigs that recover and those that die, to capture the time to death accurately. Experimental data for moderately virulent ASFV strains indicate that infected pigs either die during the acute infection phase or recover from clinical disease and continue to shed at lower levels via oropharyngeal fluid for an extended period [4, 7].
Our simulation results indicate that it may take more than two weeks post-exposure to detect moderately virulent ASF under most mortality or morbidity trigger thresholds evaluated. One of the factors contributing to an extended time to detection is the relatively long latently infected period at the individual pig level (mean 4.5 days), which results in relatively slower transmission during the initial stages of the herd infection. For instance, only 2.3 (95% P.I., 1–8) pigs were infectious and 1.1 (95% P.I., 1–3) pigs had mild clinical signs at 8 days post exposure under the baseline scenario. Nonetheless, ASF infection in the herd was detected via clinical signs and increased mortality in almost all simulation iterations. This is because the force of infection and the incidence of new cases eventually pick up as the number of infectious pigs increases, resulting in rapidly rising morbidity and mortality during the exponential transmission phase. The predicted times to detection are useful for outbreak response planning and for informing between-premises transmission models used to evaluate regional epidemiological outcomes. In addition, the results provide baseline predictions against which the relative performance of additional active or passive surveillance protocols can be evaluated.
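The mechanics of detection via a daily mortality trigger can be illustrated with a minimal individual-based sketch. This is not the model used in the study: all parameter values (herd size, transmission rate, trigger threshold) are illustrative placeholders, and disease state durations are drawn from exponential distributions purely for simplicity. The sketch does retain the key structural feature discussed above, namely separate infectious periods for pigs that die versus recover, and it starts from a single exposed pig.

```python
import random

def simulate_detection(n_pigs=1000, beta=0.6, latent_mean=4.5,
                       p_die=0.5, inf_mean_die=8.3, inf_mean_rec=30.0,
                       daily_trigger=5, max_days=120, seed=1):
    """Daily-time-step, individual-based within-herd sketch.

    Pigs move S -> E -> I -> dead or recovered, with separate infectious
    period means for pigs that die versus recover. Detection fires when
    the number of deaths on a single day reaches `daily_trigger`.
    Returns the detection day, or None if no trigger within `max_days`.
    """
    rng = random.Random(seed)
    susceptible = n_pigs - 1
    exposed = [rng.expovariate(1.0 / latent_mean)]  # index case: remaining latent days
    infectious = []                                  # [remaining days, will_die]
    for day in range(1, max_days + 1):
        n_inf = len(infectious)
        # Per-susceptible daily infection probability (frequency-dependent).
        p_inf = 1.0 - (1.0 - beta / n_pigs) ** n_inf
        new_exposed = sum(rng.random() < p_inf for _ in range(susceptible))
        susceptible -= new_exposed
        # Progress latently infected pigs; those finishing become infectious.
        still_exposed = []
        for t in exposed:
            t -= 1.0
            if t <= 0.0:
                will_die = rng.random() < p_die
                mean = inf_mean_die if will_die else inf_mean_rec
                infectious.append([rng.expovariate(1.0 / mean), will_die])
            else:
                still_exposed.append(t)
        exposed = still_exposed + [rng.expovariate(1.0 / latent_mean)
                                   for _ in range(new_exposed)]
        # Progress infectious pigs; count today's deaths.
        deaths_today = 0
        still_infectious = []
        for t, will_die in infectious:
            t -= 1.0
            if t <= 0.0:
                deaths_today += will_die
            else:
                still_infectious.append([t, will_die])
        infectious = still_infectious
        if deaths_today >= daily_trigger:
            return day
    return None
```

With transmission switched off (`beta=0.0`), the single index case can contribute at most one death per day, so the trigger is never reached and the function returns `None`, mirroring the point that detection depends on the exponential growth phase.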
Risk managers would need to consider the trade-off between earlier detection and excessive false triggers when choosing appropriate morbidity and mortality triggers for detection via daily numbers of dead pigs or pigs with clinical signs of disease. The results for the daily mortality trigger in Fig. 3 show a rapid increase in the false trigger rate as the threshold was reduced below 4 dead per 1000 pigs in both the baseline and slow spread scenarios. For example, the false trigger rate increased from 0.5 to 1.3% when the daily mortality trigger threshold was decreased from 5 per 1000 to 4 per 1000 pigs. A similar relationship can be observed in the weekly mortality trigger results given in Fig. 4, with the inflection point occurring at a trigger threshold of about 15 per 1000.
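The shape of this trade-off can be reproduced with a toy calculation. Assuming, purely for illustration, that daily deaths in a healthy 1000-head herd follow an independent Poisson distribution (the study's empirical baseline is more variable than this), the probability of at least one false trigger over a monitoring period rises steeply as the threshold is lowered:

```python
from math import exp, factorial

def poisson_tail(k, lam):
    """P(X >= k) for a Poisson(lam) random variable."""
    return 1.0 - sum(exp(-lam) * lam ** i / factorial(i) for i in range(k))

def false_trigger_prob(threshold, baseline_mean=1.0, days=30):
    """Probability of at least one false daily-mortality trigger over
    `days` days of normal production, assuming independent Poisson daily
    deaths. `threshold` and `baseline_mean` are dead pigs per day in a
    1000-head herd; the baseline value is an illustrative placeholder.
    """
    p_day = poisson_tail(threshold, baseline_mean)
    return 1.0 - (1.0 - p_day) ** days

# Lowering the threshold sharply increases the false trigger probability.
for thr in (3, 4, 5):
    print(thr, round(false_trigger_prob(thr), 4))
```

Because the Poisson tail grows quickly as the threshold approaches the baseline mean, each one-unit reduction in the threshold multiplies the false trigger probability, which is the qualitative pattern seen in Figs. 3 and 4.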
We used swine industry expert opinion to estimate the frequency of mild clinical signs, as production data on this aspect were not available. Swine industry experts indicated that the average proportion of pigs with mild clinical signs would range from 0.5 to 2.0 percent, while 4.0 to 4.5 percent of the herd would represent a high value under routine production. In addition, a morbidity trigger threshold of 9% was suggested as a conservative criterion for an abnormally high proportion of pigs with mild clinical signs (i.e., this morbidity trigger threshold would be expected to have a very low false trigger rate). The predicted time to detection with a morbidity trigger threshold of 9% (mean 20 days; 95% P.I., 16–25 days under the baseline scenario) was shorter than the time to detection with a daily mortality trigger threshold of 5 per 1000. Even though mild clinical signs are non-specific, they occur earlier in the course of ASFV infection and hence result in earlier detection at the herd level. Data on severe clinical signs during routine production were not available, and hence the false trigger rate could not be calculated for severe clinical signs. However, swine industry subject matter experts opined that one percent of the herd would represent a conservative threshold, as severe signs occur at a much lower frequency during routine production.
The time to detection via increased mortality from our study (mean 22 days; 95% P.I., 17–28 days with a daily mortality trigger threshold of 5 per 1000 pigs and the baseline contact rate) is similar to that of Halasa et al., 2016 (median 21–28 days); however, the predicted time to detection from our study is substantially longer than the values in Faverjon et al., 2020 based on room-level mortality thresholds (8 days) [14, 15]. Possible factors contributing to the relatively shorter time to detection in Faverjon et al., 2020 include (1) using highly virulent ASF strain characteristics, (2) starting disease transmission simulations with an infectious pig, in contrast with initiating transmission with an exposed pig as in this study, and (3) employing lower trigger thresholds for increased mortality. The infectious period for pigs dying due to ASF for moderately virulent strains, 8.3 days (95% P.I., 3.9–14.3), was substantially longer than that for the highly virulent strains used in Faverjon et al., 2020. Furthermore, the false trigger frequency was higher in this study due to the greater variability in the mortality data. The mean weekly mortality among the 248 different herds varied considerably (5th and 95th percentiles 0.6 and 6.7 per 1000, respectively). In addition, some herds had a significant positive autocorrelation in mortality between weeks at 1 or 2 lags.
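The lag autocorrelation referred to above is straightforward to compute for any herd's mortality series. The weekly values below are hypothetical, used only to show the calculation; positive autocorrelation at short lags means high-mortality weeks cluster, which inflates the false trigger rate relative to what an independence assumption would predict.

```python
def autocorr(x, lag):
    """Sample autocorrelation of the series x at the given lag."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    cov = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag))
    return cov / var

# Hypothetical weekly mortality (dead per 1000 pigs) for one herd.
weekly = [0.8, 1.2, 2.5, 3.0, 2.8, 1.1, 0.9, 1.0, 2.2, 2.9]
r1, r2 = autocorr(weekly, 1), autocorr(weekly, 2)
```

In practice, a herd-level series with significant `r1` or `r2` violates the independent-days assumption behind simple threshold calculations, so false trigger rates estimated from such data will be higher and more variable.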
Active surveillance via rRT-PCR testing is a key outbreak measure for early detection of ASF. Currently proposed active surveillance protocols for an ASF Control Area are often based on targeted sampling of sick and dead pigs [16]. Several articles report an aggregate clinical score based on the severity of different types of clinical signs to capture disease progression [2, 11]. However, it is not straightforward to model the distribution of such a clinical score at the herd level to inform simulation models used for surveillance design. Parameters for the time to onset and duration of clinical signs estimated from individual pig level observations are more directly applicable in disease transmission and surveillance models.
There are several alternative approaches to estimating transmission parameters from experimental data. Several studies have used a reconstruction of the transmission process in conjunction with Generalized Linear Models or maximum likelihood estimation to estimate the transmission parameters [7]. However, the reconstruction process requires an important simplifying assumption of deterministic, integer-valued disease state durations. Recently, some studies have used Markov Chain Monte Carlo (MCMC) methods to jointly estimate the disease state durations and the transmission parameters [17]. While this approach has fewer approximations, it necessitates including additional variables for the unobserved disease state transition times and may require a longer computer run time for convergence in some instances [17]. We utilized an acceptance-rejection-based approximate Bayesian computation (ABC) algorithm to estimate the transmission parameter from experimental data. This method accounts for the impact of variability in the infectious state durations and the latent period on the estimate. In addition, the algorithm is easily parallelizable and enables the efficient use of high-performance computing resources.
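The acceptance-rejection scheme can be sketched as follows. The toy chain-binomial simulator, the summary statistic (daily infectious counts), the L1 distance, the tolerance, and the uniform prior are all illustrative choices for exposition, not the study's exact implementation. Because state durations are drawn stochastically inside the simulator, their variability propagates into the accepted parameter sample, which is the property highlighted above.

```python
import random

def chain_binomial(beta, n=10, days=20, latent_mean=4.5, inf_mean=8.3, rng=None):
    """Toy stochastic SEIR simulator returning daily infectious counts.
    All parameter values are illustrative placeholders."""
    rng = rng or random.Random()
    susceptible = n - 1
    exposed = [rng.expovariate(1.0 / latent_mean)]  # index case
    infectious = []
    series = []
    for _ in range(days):
        p = 1.0 - (1.0 - beta / n) ** len(infectious)
        new = sum(rng.random() < p for _ in range(susceptible))
        susceptible -= new
        exposed = [t - 1.0 for t in exposed]
        infectious = [t - 1.0 for t in infectious if t > 1.0] + \
                     [rng.expovariate(1.0 / inf_mean) for t in exposed if t <= 0.0]
        exposed = [t for t in exposed if t > 0.0] + \
                  [rng.expovariate(1.0 / latent_mean) for _ in range(new)]
        series.append(len(infectious))
    return series

def abc_rejection(observed, n_draws=500, eps=10.0, prior=(0.0, 2.0), seed=0):
    """Acceptance-rejection ABC: draw beta from a uniform prior, simulate,
    and keep draws whose simulated series lies within `eps` (L1 distance)
    of the observed data. Each draw is independent, so the loop is
    embarrassingly parallel across workers."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        beta = rng.uniform(*prior)
        sim = chain_binomial(beta, rng=rng)
        dist = sum(abs(a - b) for a, b in zip(sim, observed))
        if dist <= eps:
            accepted.append(beta)
    return accepted  # a sample from the approximate posterior
```

Shrinking `eps` trades acceptance rate for posterior accuracy; parallelizing amounts to splitting `n_draws` across processes with independent random seeds and concatenating the accepted draws.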
Although the experimental data used in this study were focused on genotype I moderately virulent ASFV strains, the mortality percent for genotype II moderately virulent strains from Estonia (50%) was in a similar range to the disease mortality percent in the current study (95% P.I., 24–57%) [2]. Moreover, the time to detection via increased daily mortality is fairly insensitive to the ASF disease mortality percent used as a model input. For instance, the mean time to detection only decreased from 22.4 to 20.6 days when the disease mortality percent was increased to 90% in an additional scenario. Therefore, it is possible that the time to detection for genotype II moderately virulent strains has a range similar to that for genotype I strains in this study, although further evaluation may be necessary.
There are some limitations that must be considered while interpreting the study results. We assumed a constant transmission rate even though the level of shedding is possibly reduced beyond 30 days post infection, after pigs have recovered from the acute infection phase. However, a potentially reduced transmission rate from recovered pigs would arguably have a lesser impact on the time to detection via clinical signs, which mostly depends on the transmission dynamics during the initial stage of herd infection [18]. For example, the time to detection via increased daily mortality under the fast spread scenario remained virtually unchanged even when the infectious period for pigs that recover was reduced to 25 days in an alternative scenario. Several pigs in de Carvalho Ferreira et al. (2013) were intermittently shedding above 1.92 TCID50 per swab after recovering from acute ASF infection.
The transmission model in this study assumed homogeneous transmission within the herd, as the experimental data used did not include heterogeneous transmission rates within and between animal sub-units such as pens or rooms. This may be particularly relevant for large swine operations with multiple barns and multiple rooms within a barn. Evaluating the time to detection for moderately virulent strains using a heterogeneous transmission model that incorporates the premises and barn structure is an important area for future study.