The path to the autonomous lab: balancing innovation, culture, and compliance
Laboratories across the pharmaceutical industry are moving closer to full autonomy, but progress depends on more than technical capability. True transformation requires breaking down silos between systems, improving data interoperability, and addressing the human and regulatory factors that shape how labs operate.
Hansjoerg Haas, senior director and general manager of laboratory automation at Thermo Fisher Scientific, spoke with Discover Pharma about what’s really holding labs back, how automation is redefining data quality and reproducibility, and why culture and compliance remain just as critical as technology in the journey toward autonomous research.
Q: What do you see as the biggest barrier to labs becoming fully autonomous — is it technological, cultural, or regulatory?
A: Most people point straight to technological barriers, but the real roadblock goes deeper. Yes, old systems slow things down. Labs often work in silos, so connecting hardware, software, and data into one smooth flow is tough. In fact, 93% of leading pharma companies say the heterogeneity of their data makes it hard to share or use under the FAIR principles (findable, accessible, interoperable, and reusable).
But the bigger hurdles are cultural. Many lab technicians and scientists learned to work by hand, driven by their own ideas, and some worry automation could cost them their jobs. Scientific training rarely covers process improvement or how to integrate science with automation and emerging technologies such as artificial intelligence (AI) and digital twins.
Regulatory rules add another layer. Automation in regulated labs is still new. We’re proud to have worked with pharmaceutical companies to help bring automation into regulated spaces, but automation in these settings requires its own set of standards. Key challenges include tracking every step, keeping records, checking results, and managing security with the right access controls. Software is only now moving toward GxP guidelines and regulations, and standardizing software for these environments takes time. Even though automation audit trails log every step, regulators often question automated methods because the test is performed differently from the established manual method. Yet equivalency testing has shown that automation almost always performs tasks better than humans.
So while technology grabs attention, the real work lies in shifting culture and meeting compliance needs. These changes mean rethinking how teams work and how labs are run.
Improving data quality and reproducibility
Q: Many automation tools are designed to improve efficiency, but how do you ensure they also enhance data quality and reproducibility rather than just speed?
A: It’s easy to think automation is only about speed, but that misses the point. Automation brings steady, reliable data because it repeats tasks the same way every time – unlike people. Studies show that more than 80% of process failures come from human error [2].
Organizations must make data quality a core part of every system. First, teams should set clear Standard Operating Procedures (SOPs) and keep track of every change to protocols and instruments. Many systems have controls that warn users if something goes outside the expected range, reducing the risk of errors in the lab. Every step gets logged—who did it, when, and what happened—so you can trace every action. This level of tracking is hard to match with manual work.
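To make the idea concrete, here is a minimal sketch of the kind of controls described above: every action is logged with who, when, and what, and values outside an expected range trigger a warning. The function names and fields are illustrative assumptions, not any vendor’s actual software.

```python
from datetime import datetime, timezone

# Illustrative sketch only (not a real product's API): an audit trail that
# records who/when/what for every step and flags out-of-range values.
AUDIT_LOG = []

def log_step(user, action, value=None, expected_range=None):
    """Append an audit entry; warn if value falls outside expected_range."""
    entry = {
        "user": user,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "action": action,
        "value": value,
    }
    if value is not None and expected_range is not None:
        low, high = expected_range
        entry["in_range"] = low <= value <= high
        if not entry["in_range"]:
            # In a real system this would alert the user, as described above.
            print(f"WARNING: {action} value {value} outside [{low}, {high}]")
    AUDIT_LOG.append(entry)
    return entry

# A hypothetical incubation temperature reading outside its expected window:
entry = log_step("analyst_1", "incubation_temp_C",
                 value=41.2, expected_range=(35.0, 39.0))
```

Because every entry carries a user and a timestamp, the log itself provides the traceability that is hard to match with manual record-keeping.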
Automation also helps cut down on handoffs. When a machine collects data, it should transfer it automatically, not rely on a person to move files. This change alone can boost accuracy, ensure data integrity, and promote full traceability. Systems can also help ensure the appropriate use of equipment, even in the case of an outage. For example, if a robotic setup has five high-performance liquid chromatography (HPLC) units and one fails, the software keeps working with the other four and adjusts the workflow to maximize output seamlessly.
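The pooling behavior described above can be sketched in a few lines. This is a hypothetical round-robin dispatcher, assuming five HPLC units with one marked as failed; it is an illustration of the principle, not the scheduling logic of any actual system.

```python
# Hypothetical sketch of instrument pooling: if one of five HPLC units
# fails, work is redistributed across the remaining four automatically.
def dispatch(samples, instruments, failed):
    """Round-robin samples across instruments that are still online."""
    online = [i for i in instruments if i not in failed]
    if not online:
        raise RuntimeError("no instruments available")
    schedule = {i: [] for i in online}
    for idx, sample in enumerate(samples):
        schedule[online[idx % len(online)]].append(sample)
    return schedule

instruments = ["HPLC-1", "HPLC-2", "HPLC-3", "HPLC-4", "HPLC-5"]
# HPLC-3 is down; all eight samples are shared by the other four units.
schedule = dispatch(list(range(8)), instruments, failed={"HPLC-3"})
```

The key design point is that the failure is absorbed by the scheduler rather than halting the run: throughput drops, but the workflow continues.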
Most importantly, automation captures extra context through Internet of Things (IoT) sensors. For example, it records lab temperature, so you can see whether a cold morning or a failed air conditioner affected results. In one case, a system in Saudi Arabia gave different results because the lab temperature changed. Automation helps you spot these shifts and understand the real context behind your data.
Supporting scientists in evolving their skills
Q: Automation in pharma labs can be seen as a threat to traditional roles. How are you supporting scientists in evolving their skill sets alongside this technology?
A: The labs making the most progress take a more holistic view of technology integration, bringing IT, quality, and operations together. Ensuring this internal alignment from the start helps address global or cultural differences early, breaks down barriers between groups, and makes operations run far more smoothly.
Instead of being seen as a threat, we find it’s more a question of upskilling and reskilling initiatives that companies are driving. We’ve been able to partner with many of our customers to offer technical training that reaches all audiences, from e-learning basics to hands-on operator and advanced courses. Some organizations even have everyone in the department complete basic training to build a shared understanding, including senior leaders, as they need to see how automation changes lab life, not just the daily work.
There’s also a shift toward data analytics and informatics. Labs need people who can handle large data sets, use laboratory information management systems (LIMS), and work with automated analysis tools. Some companies even reward staff for learning software skills, e.g., Python, before joining lab-of-the-future projects. Once people see they can learn these skills and move up, they get excited.
The most in-demand scientists now have both lab and digital backgrounds. While some roles may change, new jobs are opening up, e.g., in validation of automation — making sure automation works right and meets quality standards. The rise of in silico (computer-based) drug discovery, AI, and machine learning is also creating new career paths.
Safeguards for overnight automation
Q: When automation runs overnight or without human oversight, what safeguards are in place to ensure data integrity and prevent costly errors?
A: Research & Development labs have run automated systems overnight – and even for weeks or months at a time – for years. This is key in biologics, where some steps take much longer due to cell-based work and necessary incubation. Applying automation to bioanalytical testing in regulated environments is the logical extension of this.
To ensure data integrity and prevent errors, we start by checking every input. Automation systems run boundary tests, and if something’s off, users get instant alerts on their phones or screens. We use visual cues too. For example, the Smart Handle technology on our robotic systems indicates status to guide user interaction: handles illuminate blue when all is well, turn yellow when a problem is approaching, such as a reagent running low, and turn red when a problem has occurred. The Smart Handles are also touch sensitive and direct the user through processes such as taking an instrument offline or docking and undocking carts between robotic systems.
For overnight runs, our systems have strategies to stay alive and recover from errors. For example, if a robot can’t grab a plate, it tries again. If it still fails, it logs the issue—when it happened, what happened, and what else is affected—and moves on. If one instrument in a pool fails, the system keeps going with the rest to ensure continued operation.
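The retry-then-log-and-continue pattern described above can be sketched as follows. The `grab_plate` function here is a stand-in for a real robot action, and the retry count is an assumption for illustration, not a documented behavior of any specific system.

```python
import logging

logging.basicConfig(level=logging.WARNING)

# Illustrative retry-and-continue pattern: retry a failed action, and if it
# still fails, log the failure and move on so the overnight run keeps going.
def run_step(step_name, action, retries=2):
    """Try `action` up to `retries` times; log and skip on repeated failure."""
    for attempt in range(1, retries + 1):
        try:
            return action()
        except Exception as exc:
            logging.warning("%s failed (attempt %d): %s",
                            step_name, attempt, exc)
    logging.error("%s skipped after %d attempts; downstream steps flagged",
                  step_name, retries)
    return None

attempts = {"count": 0}

def grab_plate():
    """Stand-in for a robot action that keeps failing."""
    attempts["count"] += 1
    raise RuntimeError("gripper missed the plate")

result = run_step("grab_plate", grab_plate)
```

The point is that a single failed pick never brings down the whole run: it is retried, then logged with full context and skipped.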
On the software side, security is built in. Role-based access means users can’t change methods unless they have the right permissions. The software blocks risky actions, whether by mistake or on purpose. Running a smooth process is easy—what matters most is how well the system handles surprises. Good error recovery and deep logging set top systems apart, keeping data safe and reliable.
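A role-based permission check like the one described can be sketched in a few lines. The role and permission names here are assumptions for illustration, not any real product’s access model.

```python
# Minimal role-based access sketch: only roles holding the "method_editor"
# permission may change an analytical method.
ROLES = {
    "operator": {"run_method", "view_results"},
    "scientist": {"run_method", "view_results", "method_editor"},
}

def authorize(role, permission):
    """Return True if the role grants the permission."""
    return permission in ROLES.get(role, set())

def edit_method(role, method, change):
    """Apply `change` to `method` only for roles allowed to edit methods."""
    if not authorize(role, "method_editor"):
        raise PermissionError(f"role '{role}' may not edit methods")
    method.update(change)
    return method

method = {"flow_rate_ml_min": 1.0}
edit_method("scientist", method, {"flow_rate_ml_min": 1.2})  # permitted
```

An operator attempting the same edit would raise `PermissionError`, which is exactly the kind of blocked risky action the software enforces, whether the attempt is a mistake or deliberate.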
Balancing short-term ROI with long-term lab autonomy
Q: As you collaborate on digital transformation, how do you balance short-term ROI pressures with the longer-term goal of full lab autonomy?
A: Digital change in labs has three parts: robotics and automation, digitalization, and AI/ML (machine learning). In the past, automation was mainly used to address bottlenecks. Now, labs aim to automate whole departments or even buildings. The main question isn’t if labs should change, but how to do it right and bring everyone along.
The first step is making sure data flows into data management systems like LIMS or electronic lab notebooks (ELN). About 15% of top pharma companies have already done this and are seeing the benefits because they planned ahead and got buy-in. Most others – about 65% – plan to get there in the next three to five years [3].
Getting results starts with small wins. Show progress, keep things open, make it simple, and bring people along for the ride. Once the first steps work, momentum builds. At least half of the leaders in this space have already cut time to market. The payoff is real—it’s about starting and making sure everyone’s moving in the same direction.
Current progress and remaining milestones
Q: How close are we to a truly autonomous pharma lab, and what major milestones still need to be reached?
A: First, let’s define what we mean by autonomous. You’ll hear terms like “lab in the loop” and “self-driving lab.” Lab in the loop means automation, machine learning, and AI drive the experiment cycle, but people are still in the decision loop. A true self-driving lab removes that human step. Another example of autonomous operation is the use of autonomous mobile robots (AMRs), which move on their own, carrying samples between robotic systems, instruments, and other devices such as sample storage units. These are like driverless cars for labs, blending automation, instruments, and movement.
Right now, we have strong automation and robotics. Machine learning and AI are starting to grow, and integration platforms are just getting off the ground. Digital twins—virtual models of lab processes—are still new.
What’s missing? Connecting every step from start to finish and building strong AI that can make decisions on its own. Good AI needs lots of high-quality, validated data to train the models. Some labs use automation.