Iterative process, staff & capacity modelling
Following the establishment of the facility in May 2020, the focus turned to enhancing the efficiency and effectiveness of our operation. The critical first step in this journey was to establish and agree Key Performance Indicators (KPI), setting a baseline against which to measure performance. These KPI focused on four areas:
- Quality: an In-Process (IP) void rate of ≤0.5% of samples voided through process errors in the laboratory
- Capacity: The ability to process up to 22,000 samples in a 24-hour period
- Turnaround Time (TAT): >80% of samples having results reported within 24 hours of bio-sampling (in response to the target set by the UK Prime Minister)7,14.
- Safety: No reportable Safety, Health and Environmental incidents of any description
In comparison to other testing facilities within the Lighthouse Laboratory Network and globally9,10, the CCTC had a relatively constrained footprint of separate rooms within an already operational laboratory facility. With this restricted footprint, process modelling was essential to ascertain the optimal number of automated platforms and the staffing deployment needed to deliver the workflow at each station across the facility (Fig. 1), with the model initially based on best estimates. As the laboratory process matured over the first months of operation, the model was refined through feedback of real-world empirical data, building in further rulesets that highlighted weaknesses in the logic and iteratively improved its predictive power15. The initial CCTC laboratory process achieved a capacity of more than 10,000 samples/day. Over the summer months of 2020, the CCTC strove to double its daily capacity to 22,000 samples/day. We were, however, conscious that simply adding more staff would not be the most efficient way to deliver against our four KPI. Social distancing, both within the laboratory and the wider site, had to be maintained, and beyond a certain team size the addition of further resource can make processes less efficient due to sub-optimal communication and reporting16,17.
Our modelling determined that, for continuous CCTC operation, the optimal staffing should be as described in Table 1a, equipped with 17 class-II Biological Safety Cabinets (BSC), 11 Beckman Coulter Biomek liquid handlers (i5/i7), and 9 Roche RT-qPCR LightCycler® 480 II instruments. We initially deployed two nine-hour shifts of ~40 staff across all days of the week; however, after refining our model to reflect the change in sample delivery regime to a ‘start/stop’ pattern, we calculated that performance would average 17,000 samples/day with a maximum of 20,000 samples/day. This shortfall against our target capacity was tightly linked to the assumption in our initial modelling that a constant flow of samples would be maintained, providing an endless stream of work during operating hours. In reality, this was rarely achieved due to the varying time of sample delivery to the lab; for example, Testing Sites would perform a large amount of bio-sampling towards the end of each day, so large consignments of samples would be received by the lab at the end of the evening shift, leaving no time for them to be fully processed during that working day.
In our focus on efficiency, we exploited our modelling to identify bottlenecks in the laboratory process and strategically implement improvements to it. The introduction of a night shift allowed 24-hour operation that avoided in-process samples being held overnight, and our process model could therefore be adapted back from ‘start/stop’ to ‘continuous’ – now predicting a maximum operating capacity of 24,000 samples/day (Table 1b).
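The core logic of such a throughput model can be sketched as follows: each station processes samples at some rate, the slowest station sets the pipeline's capacity, and daily capacity scales with operating hours. The station names, rates, and shift lengths below are hypothetical placeholders for illustration, not the CCTC's actual figures.

```python
from dataclasses import dataclass

@dataclass
class Station:
    name: str
    rate_per_hour: float  # samples/hour the station can clear

def daily_capacity(stations, operating_hours):
    """Pipeline capacity is set by the slowest station (the bottleneck)."""
    bottleneck = min(stations, key=lambda s: s.rate_per_hour)
    return bottleneck.name, bottleneck.rate_per_hour * operating_hours

# Illustrative rates only.
stations = [
    Station("Sample Preparation", 1200),
    Station("RNA Extraction", 1000),   # liquid-handler limited
    Station("RT-qPCR", 1500),
]

# 'Start/stop': two nine-hour shifts; 'continuous': 24-hour operation.
for label, hours in [("start/stop", 18), ("continuous", 24)]:
    name, cap = daily_capacity(stations, hours)
    print(f"{label}: bottleneck={name}, capacity={cap:.0f} samples/day")
```

Even this toy version reproduces the qualitative behaviour described above: moving from two shifts to 24-hour operation raises capacity proportionally, while the bottleneck station remains the same until it is re-engineered.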
The modelling continued to highlight a major bottleneck in the process at the RNA extraction step, driven by the fixed number of liquid-handling robots in the RNA extraction lab. Removing the requirement for RNA extraction altogether would both reduce the laboratory footprint and make the process more economical, transferring the bottleneck to the labour-intensive step of removing secondary packaging within a BSC. A theoretical removal of the requirement for BSC containment at the secondary-packaging stage, allowing this to occur on the open bench, was predicted to expand our capacity to an average of 29,000 samples/day. Intrinsically linked to capacity, the theoretical TAT of a sample was calculated as 3 h 50 min – 5 h 10 min. However, empirical timing data gathered through the initial phase of CCTC operation showed that our mean end-to-end laboratory processing time was in fact 8 h 35 min.
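The theoretical TAT is simply the sum of per-station processing windows. The breakdown below is invented for illustration; only the 70–115 min extraction run-time and the 3 h 50 min – 5 h 10 min total appear in the text, and the remaining station timings are assumptions chosen to be consistent with that total.

```python
# Hypothetical per-station time windows in minutes (min, max).
station_minutes = {
    "sample receipt & preparation": (45, 60),
    "RNA extraction": (70, 115),   # run-time range quoted in the text
    "RT-qPCR setup": (30, 45),
    "RT-qPCR run & analysis": (85, 90),
}

lo = sum(a for a, _ in station_minutes.values())  # best-case total
hi = sum(b for _, b in station_minutes.values())  # worst-case total
print(f"theoretical TAT: {lo//60} h {lo%60:02d} min – {hi//60} h {hi%60:02d} min")
```

The gap between this theoretical range and the observed 8 h 35 min mean is the dead time between stations (queues, handovers, held plates), which is exactly what the informatics tools described later were built to expose.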
To address both the capacity and TAT bottlenecks, our technology development focussed towards two key innovations:
1) The removal of RNA extraction (so called Direct to PCR; D2PCR) to create a more economical and efficient process whilst reducing the laboratory footprint.
2) Heat Inactivation of viable samples upon receipt prior to entry into the laboratory environment to circumvent the requirement of BSC containment at the point of secondary packaging removal18.
We describe the validation of Heat Inactivation of viral samples at scale in a separate manuscript currently in preparation, and it is not discussed further here.
Direct to PCR (D2PCR)
Alongside the introduction of Heat Inactivation, we also explored experimentally the scope for a Direct to PCR assay (D2PCR) that removed the requirement for RNA extraction, shown through our modelling to be a capacity- and rate-limiting step due to the physical laboratory space available to accommodate the required liquid-handling robotic platforms and the long run-time of the protocol (70–115 min).
The RNA extraction step of the COVID-19 testing workflow serves two purposes. First, the Guanidinium Isothiocyanate (GITC) content of the RNA extraction lysis buffer inactivates potentially viable virus; at the time of establishing the CCTC, GITC-mediated virus inactivation was the best understood, and therefore the preferred, method19,20. Second, RNA extraction purifies and concentrates viral RNA in advance of RT-qPCR detection. However, advances in RT-qPCR reagents led to the possibility of performing RT-qPCR directly on crude samples and, when coupled with Heat Inactivation, the very real possibility of removing the RNA extraction step completely. In our workflow, the omission of RNA extraction was calculated to reduce the laboratory TAT by 2 hours, whilst simultaneously increasing the overall capacity of the laboratory by repurposing staff and facilities into sample receipt and preparation. The D2PCR approach has other significant advantages, in particular a reduction in the use of laboratory consumables, including a 50% reduction in the number of pipette tips; this further reduced the cost of the assay and the waste generated, and facilitated centre operation at a point when the global supply chain for reagents, labware, and equipment could not keep up with demand. Further to this, RNA extraction methods use large amounts of solvents that require bespoke storage and disposal. Use of D2PCR for detection of COVID-19 has been demonstrated previously21,22,23. Herein we present a clinically approved high-throughput methodology, developed using the Genesig® Real Time PCR COVID-19 High Throughput HT-CE kit V2.0, containing an optimised buffer formulation which overcomes sample-mediated PCR inhibition.
Validation of the D2PCR process for clinical testing was carried out as described in the methods, comparing the D2PCR method directly with the standard RNA extraction-based protocol. All samples positive in the standard assay with a Cq value of 33 or lower also tested positive using the D2PCR assay (Fig. 2a, 2b), a concordance rate of 100%. For weaker positive samples with Cq values between 33 and 36, the concordance rate was 52.6%, while very weak positives (Cq > 36 in the standard assay) were mostly not detected (6.25% positive-to-positive detection rate) (Fig. 2b). This shift in the limit of detection was expected, given that D2PCR uses 4-fold less RNA input than the standard assay (due to the lack of the concentration effect of RNA extraction), along with some likely interference with PCR efficiency from the crude sample matrix. The significance of individuals with high-Cq positive results for the wider public health response is a matter of current debate; however, such results likely reflect low-level viral RNA in individuals late in their course of infection, even when they are no longer infectious to others24. Data were reviewed by our Clinical Lead and wider Public Health England boards, where it was agreed that the reduced sensitivity at extremely low viral loads was acceptable, and the D2PCR methodology was formally approved for clinical sample testing.
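The concordance analysis above amounts to binning samples by their standard-assay Cq value and computing the positive-to-positive detection rate per bin. A minimal sketch of that calculation, using invented toy data rather than the validation dataset, might look like:

```python
def concordance_by_bin(results, bins=((0, 33), (33, 36), (36, 45))):
    """Positive-to-positive concordance of D2PCR vs the standard assay,
    grouped by the standard assay's Cq value.

    `results` is a list of (standard_cq, d2pcr_positive) pairs for samples
    that were positive in the standard assay. Bins are (lo, hi], so a Cq
    of exactly 33 falls in the strong-positive bin.
    """
    out = {}
    for lo, hi in bins:
        group = [pos for cq, pos in results if lo < cq <= hi]
        out[(lo, hi)] = sum(group) / len(group) if group else None
    return out

# Toy data: strong positives all detected, weaker ones only partially.
toy = [(25, True), (30, True), (32, True),
       (34, True), (35, False),
       (37, False), (38, False)]
print(concordance_by_bin(toy))
```

The bin edges (33 and 36) follow the thresholds reported above; everything else in the example is hypothetical.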
Beyond the benefits of cost, reagent, footprint, and waste reduction, we assessed the effect that the D2PCR method would have on TAT in our operational laboratory. When we examined the laboratory TAT in this pilot study, we found the samples had a median time to completion of 3 h 32 min. When compared against all other samples processed in the same month (March 2021) using the standard laboratory process, this represented a median time saving of 1 h 52 min (Fig. 2c). As mentioned above, we have also developed and deployed a method for Heat Inactivation of samples before they enter the lab. The combination of D2PCR with Heat Inactivation led to a further median in-lab time saving of 33 min (Fig. 2d).
Exploiting operational informatics
Across the Lighthouse Laboratory Network, the end-to-end laboratory process was supported by a Laboratory Information Management System (LIMS) that provided the backbone of data management within the labs. A LIMS is fundamental to the management of data flow within a testing laboratory such as the CCTC, dealing with several thousand samples per day – mapping the lifecycle of individual patient samples as they progress through the physical laboratory process. As patient samples undergo transformation and compression from individual vials to multiwell microtitre plates, and onwards through plate-to-plate transfers, the LIMS records that lineage and captures various timestamps throughout the process (Fig. 1). These timestamps are not only imperative to the detailed tracking of individual samples via an anonymised barcode, but also provide a rich data set with which to view the performance of the process in real time. However, because the LIMS environment required change control to be centralised across the lab network, the ability to develop aligned local IT tools in an agile fashion to exploit these data was crucial.
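Deriving per-station dwell times from a plate's chronologically ordered LIMS timestamps is a simple differencing operation. The station names and record layout below are assumptions for illustration, not the actual LIMS schema.

```python
from datetime import datetime

def station_dwell_minutes(events):
    """Minutes spent at each station for one plate.

    `events` is a chronologically ordered list of (station, iso_timestamp)
    records; each station's dwell time runs until the next event.
    """
    dwell = {}
    for (station, start), (_, end) in zip(events, events[1:]):
        t0 = datetime.fromisoformat(start)
        t1 = datetime.fromisoformat(end)
        dwell[station] = (t1 - t0).total_seconds() / 60
    return dwell

# Invented example timestamps for a single plate.
plate = [
    ("Sample Preparation", "2021-03-01T09:00:00"),
    ("RNA Extraction",     "2021-03-01T10:05:00"),
    ("RT-qPCR",            "2021-03-01T11:30:00"),
    ("Reported",           "2021-03-01T13:00:00"),
]
print(station_dwell_minutes(plate))
```

Aggregating such per-plate dwell times by station and time of day is what underlies the retrospective and real-time views described below.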
The combination of a core Customisable Off-The-Shelf product with associated tools developed using an agile methodology, bringing immediate benefit in the exploitation of operational data, is well proven to deliver results quickly25. To this end, we targeted the two user bases we thought best placed to interact with these data, delivering tools appropriate for each (Supplementary Fig. S2). First, we provided the laboratory management team with data on past performance to examine areas for improvement (Centre Performance Overview tool & Shift Lead dashboard; Supplementary Fig. S3-S4). Second, we provided the scientists in the laboratory with dashboards giving real-time feedback on performance against key performance indicators (Supplementary Fig. S5).
Retrospective and real-time operational data (informatics dashboards)
The Centre Performance Overview tool provides a retrospective view of the laboratory TAT, broken down by station and time of day, with multiple interactive methods of viewing the data. Visualisation of where and when samples were being delayed focussed our attention, enabling adoption of working practices aimed at reducing any bottleneck. Key process inefficiencies were quickly identified at the handovers between stations and shifts, which could be addressed through process change without requiring significant modification to the SOPs for the individual workstations. Visualising TAT data in this fashion also highlighted the importance of maintaining staff levels at defined minimum numbers in certain teams to avoid new process bottlenecks arising – ensuring that the CCTC management team could work with operational Shift Leads to rebalance resource appropriately. Viewing the flow of data through the centre in this holistic fashion also enabled informed discussion with the upstream Department of Health & Social Care (DHSC) logistics teams around optimal sample delivery schedules to achieve the best TAT.
To complement the retrospective executive view, it was crucial to provide non-interactive dashboards giving the teams on each shift quantitative, real-time feedback on their performance. This approach has previously been documented for the receipt of samples and result reporting at an in-house hospital diagnostic facility, with operational improvements made in light of this visualisation of data26; our efforts, however, concentrated on the laboratory process. Information was broken down for each station into three streams (Supplementary Fig. S4-S5):
- The incoming workload from the previous station to prepare reagents and equipment.
- A real-time view of the workload at the station, where plates experiencing a delay beyond expected process time are highlighted in red.
- A 24-hour analysis of the day’s performance, allowing instant feedback.
The visualisation of current workflow was particularly important in stations containing automated platforms, where dashboards were configured to highlight automation end-times so that plates/data could be expedited to the next step. These tools were specifically designed to ensure completed plates were swiftly moved to the next stage, striving towards a steady flow through the lab.
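The red-highlighting rule on the real-time dashboard reduces to a simple threshold check: a plate is flagged when its time at the current station exceeds the expected process time for that station. The thresholds, barcodes, and data layout below are hypothetical.

```python
from datetime import datetime, timedelta

# Assumed expected process times per station, in minutes.
EXPECTED_MINUTES = {
    "Sample Preparation": 60,
    "RNA Extraction": 120,
    "RT-qPCR": 100,
}

def flag_delayed(plates, now):
    """Return barcodes of plates delayed beyond their station's expected time.

    `plates` maps plate barcode -> (station, arrival datetime).
    """
    delayed = []
    for barcode, (station, arrived) in plates.items():
        if now - arrived > timedelta(minutes=EXPECTED_MINUTES[station]):
            delayed.append(barcode)
    return sorted(delayed)

now = datetime(2021, 3, 1, 12, 0)
plates = {
    "P001": ("Sample Preparation", datetime(2021, 3, 1, 10, 30)),  # 90 min at a 60 min station
    "P002": ("RT-qPCR",            datetime(2021, 3, 1, 11, 15)),  # 45 min at a 100 min station
}
print(flag_delayed(plates, now))  # → ['P001']
```

Re-evaluating this check on each dashboard refresh is what lets delayed plates be identified and expedited rather than sitting unnoticed at a station.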
To investigate any effect on TAT through use of these data management tools, we monitored the time a sample plate spent within Sample Preparation before and after implementation. Here we observed a notable decrease in the time a sample spent at this stage within a few days of introduction (Fig. 3), and an overall significant reduction was observed across the time points studied, with a median reduction of 8.46 min (a 10.6% improvement). In particular, the number of plates spending over three hours in Sample Preparation was substantially reduced through the introduction of this tool, indicating that staff were not necessarily working at a higher speed, but rather that delayed plates were being identified and expedited, thus reducing the variance in time spent at this station.
Reviewing CCTC performance
To review performance of the CCTC against our established KPI, we plotted the seven-day rolling mean of the process timing data collected to quantify our progress (Fig. 4). The laboratory TAT understandably bore a direct relation to the number of samples processed; however, after implementing the strategies described here (excluding D2PCR), the CCTC sustained a high workload, with peaks in both January 2021 and March 2021, without a corresponding detrimental effect on TAT. Indeed, our mean TAT for March – April 2021 was below 6 h, in line with the theoretical time for the process of 3 h 50 min – 5 h 10 min (Fig. 1). This focus on exploiting our operational data to continually drive process efficiency has led to the CCTC consistently achieving its KPI of >80% of samples processed within 24 h (achieved on >73% of days in 2021). Heat Inactivation upon receipt was formally adopted into the CCTC process in early February 2021 and quickly showed positive impact by helping to smooth the flow of samples from receipt into the lab, along with the other advantages described earlier.
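The seven-day rolling mean used for these performance plots can be computed as below; the daily TAT values are invented, and the choice to let the window expand over the first six days (rather than emit no value) is one of several reasonable conventions.

```python
from statistics import mean

def rolling_mean(values, window=7):
    """Trailing rolling mean; the window expands until `window` points exist."""
    return [mean(values[max(0, i - window + 1): i + 1])
            for i in range(len(values))]

# Hypothetical daily median TAT values in hours.
daily_tat = [7.2, 6.8, 6.5, 6.9, 6.1, 5.8, 5.5, 5.9, 5.6]
smoothed = rolling_mean(daily_tat)
print([round(v, 2) for v in smoothed])
```

Smoothing over a week removes the strong day-of-week pattern in sample deliveries, so sustained changes in TAT stand out from routine fluctuation.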
Whilst our efforts in reducing laboratory TAT had an observable impact, this would have been counter-productive had the improvements been detrimental to quality. The seven-day rolling mean of the centre’s In-Process (IP) voids is plotted in Fig. 4c and shows no increase in IP voids throughout our drive for efficiency and effectiveness of process (the average rate remaining at 0.45%, below our target KPI of 0.5%).