Time management

10 May 2021



It’s been decades since developers realised that workflows dominated by hours of repetitive pipetting were market opportunities for pipetting robots, but automation has not impacted all clinical laboratories equally. Now, as even complex, heterogeneous microbiology begins to reap the benefits, Isabel Ellis asks Felix Lenk, head of the SmartLab systems research group at TU Dresden, and Nate Ledeboer, medical director for clinical microbiology and molecular diagnostics at the Medical College of Wisconsin, how ‘total’ automation might reinvent laboratory medicine.


Felix Lenk, the head of the SmartLab systems research group at TU Dresden, makes clinical laboratories sound like anxiety dreams. On crossing the threshold into one, he’s transported back to the early 1990s, to a world “where everything was put on paper”. All around him, spectacularly overqualified technicians squeeze pipettes, prepare samples and funnel data between impassive machines. It seems there’s no end to the drudgery. “You press a button on a machine and it does some small step in a large process, and then you even write down the result before taking the sample to the next device,” he winces. “Everything is very manual, very traditional and very limited.” Then everyone goes home, and nothing happens at all.

Coming from an automation engineer, Lenk’s reaction is more allergic than most. “If I see a process being done more than once,” he admits, “I think about automation.” But his aversion to inefficiency doesn’t mean he’s overstating the issue. In clinical microbiology, traditionally one of the hardest disciplines to automate because of the diversity of specimens, materials and containers it handles, technicians spend more of their time (32%, according to one Clinics in Laboratory Medicine paper) on manual tasks like plate streaking and broth medium inoculation than on any other activity, while a further 10% of their working hours is taken up simply transferring samples from one place to another.

That would be enough of an issue if microbiology labs were fully staffed. They’re not. The growing difficulty laboratory medicine faces in recruiting and retaining specialist staff suggests Lenk is not the only one having issues with the nature of the work. Between 2016 and 2018, according to the Journal of Clinical Microbiology, clinical laboratories in the US were hamstrung by a technician vacancy rate of over 7%, as several thousand staff left the profession and the number of new positions grew at double the rate of other occupations. Covid-19 hasn’t made things any easier.

“Automation has a quality component to it and, quite bluntly, it has a practical component – especially in the US and Canada, where we’re seeing massive shortages of trained medical technologists who work in laboratories,” says Nate Ledeboer, medical director for clinical microbiology and molecular diagnostics at the Medical College of Wisconsin. In 2014, Ledeboer’s lab became one of the first microbiology labs in the US to fully automate its front-end processing (medium selection, medium inoculation, plate streaking and the application of patient information and barcodes for tracking), incubation and plate imaging with the Copan WASPLab platform. Since then, it’s cut a day from clinical workflows, and learned how to field calls from physicians adamant that, as Ledeboer recalls, “You must have screwed up my patient’s results; there’s no possible way you could have got them back so fast.”

To borrow the terms Lenk uses to define the ‘smart’ lab of the future, what Ledeboer had actually done was begin to “bridge the gap between biology and technology”. In the clinical labs of Lenk’s unoptimised nightmares, that chasm is papered over by the wrist-straining bustle and busywork of overstretched technologists, and quality suffers as a result. A recent study – also in the Journal of Clinical Microbiology – of the economic impact of implementing total lab automation (TLA) platforms (either the Copan WASPLab or the BD Kiestra) in four US microbiology labs found productivity increases ranging from 18–93% and cost-per-specimen reductions of 15–47%, resulting in approximate annual labour savings of between $268,000 and $1.2m. Across laboratory medicine, these systems are making the connections necessary for greater use of AI and machine learning in diagnostics, as well as enabling the integration of multiple specialities into a single hub. But the requirements for implementing TLA are just as profound as its potential impacts.

Embrace automation

“If you look at clinical laboratories, what they have is very sophisticated devices for each specific target, because there are good sales for smaller companies that have specific solutions for specific problems,” explains Lenk, “but these device solutions are not normally integrated into the infrastructure. They create a text file with the result, or they have a display saying: ‘Okay, you are ill.’ Now, the existing gap calls for a solution to link all these systems, to organise the workflow with less staff – perhaps even making it 24 hours – and to connect all these island solutions to form the big picture.”

Particularly in larger urban hospitals and more standardised disciplines, we can already begin to sketch that picture. Microbiology, for all the difficulties it presents, is beginning to catch up with chemistry, where almost all samples are automatically verified in the laboratory information system without technologist involvement. Lenk also notes that blood tests and cultures have been highly automated for many years. That doesn’t mean lab staff, or even device manufacturers, are prepared for the changes that come with embracing automation and digitisation.

As Ledeboer explains, the hardware “nuts and bolts” of automation only become useful when labs redesign their processes to make use of them. He doesn’t downplay what that entails. “If you want your automation project to succeed, you can’t try to fit automation into your current workflow,” he says. “You really have to look at your workflows and blow them up, if you will, to make them much more suitable and much more compatible for an automated laboratory.”

To start, that actually means leaving the lab. Prospective automaters need to follow their specimens back to the source to adapt and standardise the way they are collected across the hospital or health system – swapping traditional cotton swabs for flocked swabs that are better able to release material into a liquid state, for instance. So begins a delicate journey of self-discovery. Doctors and nurses are likely to have a few questions about these new commandments, while a number of a clinical laboratory’s SOPs and best practices might have as much to do with the quirks and preferences of the people who have worked there as with the quality of its outputs.

“We had a lot of resistance because every tech has a slight variation in how they manage their bench or how they pull their plates, and we can’t have that anymore when it comes to automation,” says Ledeboer, his words taking on an increasingly insistent rhythm. “Everybody has to do the same thing. You can’t build your automation to have a unique process for tech one, and a unique process for tech two.”

Benefits of an automated future

Since Ledeboer’s lab began its transformation before there was much data about the positive impact of automation on microbiology, he was forced to take a step-by-step approach. This involved starting with the most standardisable cultures (urine and surveillance swabs), and parlaying success in those areas into approval for the more difficult aspects of the implementation. Using automation to move from one-shift to three-shift plate readings for urine cut a day from the clinical workflow by ensuring results reached doctors in time for the early morning slot – when they review lab reports – rather than in the early afternoon. The drop in manual interventions also meant that cultures could be incubated under far more consistent conditions, which, when combined with high-resolution imaging, meant the lab was able to read urine plates after 16 hours rather than 24.

The impacts on surveillance and susceptibility testing more generally were just as impressive. “We really started to see better isolation of organisms,” says Ledeboer. “Our colonies were bigger, so we could do susceptibilities faster. Instead of having to hold the plate for 48 hours, we were seeing [answers] on the first read of the plate – again, because of much better imaging capabilities. We really had a turnaround benefit, a susceptibility turnaround benefit, and we had an improved isolation benefit that was attached to it.” Whether it means finding an infection, preventing transmission or identifying a treatment faster, all of these improvements have direct impacts on patient outcomes.

32%

Proportion of time technicians in clinical microbiology labs spend on manual tasks.

Clinics in Laboratory Medicine

7%

Proportion of unfilled lab technician roles in the US between 2016 and 2018.

$1.2m

Upper limit of the annual labour savings that implementing TLA platforms could bring to US microbiology labs.

Journal of Clinical Microbiology

229

Hours that the Medical College of Wisconsin could save on testing urine samples every year by using digital imaging technology and AI to autoreport negative specimens.

Journal of Clinical Microbiology

“If we can just remove the variation we see between your eyes and my eyes when we both look at a slide, we can be much more consistent at screening for rare events.”

Nate Ledeboer

Now, with so much evidence to justify the initial capital outlay and change in behaviour, Ledeboer notes that it’s much easier for a laboratory to “jump feet first” into automation and update all of its workflows at once. Nonetheless, Lenk, for all his enthusiasm, is careful to break the process of digitally transforming a lab into five tiers. The first, which simply requires the use of devices that deliver data through sensors, is ubiquitous. In the fifth tier, all of those sensors are networked together and feed into a data lake that is structured, contextualised and interpreted for technicians by machine learning and AI tools. Labs that can do that are vanishingly rare. But Lenk doesn’t blame laboratories – which he feels are generally stuck in tier three, feeding data into a unified storage device but struggling to make use of it thereafter – for failing to reach that point. Device and software developers need to work out how to standardise their offerings too. Without shared interfaces for data transfer, digitally transformed lab staff will have to do as much to manage virtual samples as they once did for physical ones.

“The typical business of a lab supplier is just delivering the devices,” he explains. “They say, ‘Okay, you are the researcher, do what you want.’ This gap needs to also be filled right now because all the devices are getting so complicated and specific, [and] somebody has to advise the researchers how to connect these devices.” As his research group at TU Dresden is a hub for smart lab research and implementation in Europe, Lenk has had to do as much work convincing developers to adapt as Ledeboer has had with technologists. “Most of the vendors are small and medium-sized enterprises,” he continues. “So, you’re usually talking to the CEO and founder of the company, who says, ‘We have been doing this for the last 20 years – why do we need to adapt now?’ Not everybody takes these discussions light-heartedly.” Previously, Lenk struggled to get developers to adapt to both OPC Unified Architecture (OPC UA) and Standardisation in Lab Automation (SiLA) consortium standards. Over the past year, the two have cohered into an ecosystem whereby the former is used for devices that communicate primarily with other devices, and the latter for devices that interact more with users, making things considerably easier.
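The practical upshot of such standards is that lab middleware can program against one interface instead of one vendor driver per instrument. As a purely illustrative sketch of that pattern – every class name, endpoint and method below is hypothetical, not drawn from the OPC UA or SiLA specifications – connecting Lenk’s “island solutions” might look something like this in Python:

```python
from abc import ABC, abstractmethod


class LabDevice(ABC):
    """The single interface the lab middleware programs against,
    whichever standard the instrument itself speaks."""

    @abstractmethod
    def read_result(self, sample_id: str) -> dict:
        ...


class OpcUaIncubator(LabDevice):
    """Hypothetical adapter for a device-to-device OPC UA instrument."""

    def __init__(self, endpoint: str):
        self.endpoint = endpoint  # e.g. "opc.tcp://incubator-01:4840" (illustrative)

    def read_result(self, sample_id: str) -> dict:
        # A real adapter would read the server's nodes through an OPC UA
        # client library; a placeholder result stands in for that here.
        return {"sample": sample_id, "source": self.endpoint, "status": "incubating"}


class SilaPlateImager(LabDevice):
    """Hypothetical adapter for a user-facing SiLA instrument."""

    def __init__(self, server_url: str):
        self.server_url = server_url

    def read_result(self, sample_id: str) -> dict:
        # A real adapter would call the instrument's SiLA features over
        # its network interface; again, a placeholder stands in.
        return {"sample": sample_id, "source": self.server_url, "status": "imaged"}


# The middleware no longer cares which "island" a result comes from.
devices: list[LabDevice] = [
    OpcUaIncubator("opc.tcp://incubator-01:4840"),
    SilaPlateImager("https://imager-02.lab.local"),
]
for device in devices:
    print(device.read_result("URN-2021-000123"))
```

The design point is the one Lenk makes: once every device speaks one of a small number of shared interfaces, the adapter layer stays thin and integration becomes a software problem rather than a per-vendor negotiation.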

Measurable results

Since implementing its automation system, Ledeboer’s research team has focused on using built-in digital imaging capabilities and AI algorithms to achieve some of the loftier goals of digital transformation – even turning the job of plate reading over to its machines. His recent Journal of Clinical Microbiology paper on using segregation software to automatically analyse urine cultures on commonly used 5% sheep’s blood and MacConkey agars found it had an overall sensitivity of 99.8% and a specificity of 72% – making it reliable enough to batch-report or autoreport negative specimens. With 43.3% of all specimens registering as negative to both the algorithm and technologists, that could mean 55,000 fewer urine cultures in need of individual technologist review every year in Ledeboer’s lab – a saving of approximately 229 hours.
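Those headline numbers hang together arithmetically. Here is a minimal sketch assuming only the figures reported above – the roughly 15-second review time per negative plate is our back-calculation, not a number the paper states, and the hours_saved helper is purely illustrative:

```python
# Figures reported in the Journal of Clinical Microbiology study.
autoreportable_plates_per_year = 55_000  # negative to both algorithm and technologists
hours_saved_per_year = 229

# Implied technologist review time per negative plate (back-calculated).
seconds_per_plate = hours_saved_per_year * 3600 / autoreportable_plates_per_year
print(f"~{seconds_per_plate:.0f} s of review time per plate")  # ~15 s


def hours_saved(negative_plates: int, review_seconds: float = 15.0) -> float:
    """Annual technologist hours freed by autoreporting negative plates."""
    return negative_plates * review_seconds / 3600


# Reproduces the reported saving for Ledeboer's volumes...
print(f"{hours_saved(55_000):.0f} hours")  # ~229
# ...and lets any lab project its own from its negative-culture volume.
print(f"{hours_saved(20_000):.0f} hours")  # e.g. a smaller lab, ~83
```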

Although Ledeboer’s study did not explicitly address variations in how different technologists interpreted cultures, there’s a growing body of evidence suggesting that machine-learning tools, which don’t get tired or distracted, can be more consistent than lab staff. It’s a possibility Ledeboer intends to pursue. “If we can just remove the variation we see between your eyes and my eyes when we both look at a slide, we can be much more consistent at screening for rare events,” he says. “There’s a huge opportunity to improve quality of care and improve consistency in how we report things.” That said, regulators don’t have to approve whatever it is about the human mind that might make us see things differently. They will have to develop a unified understanding of machine learning – which will not leave our brains the way it found them.


