Beyond animal testing in drug development

For over a century, the life sciences industry has relied on animal models as the standard for predicting drug safety and efficacy. But today, the industry faces a harsh reality: over 90% of new drugs that appear safe and effective in animal tests ultimately fail in human clinical trials (see, for example, Ineichen et al. 2024 [1] and Marshall et al. 2023 [2]).

A biological gap largely drives this massive failure rate. Natural differences between humans and testing species like mice or monkeys mean that applying animal results to humans is often inaccurate. History provides several high-profile examples of this disconnect. The monoclonal antibody TGN1412 was assessed as completely safe in monkeys at doses 500 times higher than the intended human dose [3]. Yet, it caused a life-threatening immune reaction in healthy human volunteers. Similarly, drugs like Vupanorsen [4] and Fialuridine [5] showed no major toxicity in animals but led to severe liver or cell damage during human trials.

Beyond these clear scientific limits, the economic and ethical pressures on animal testing are ever increasing. The early testing of a single monoclonal antibody can require up to 144 monkeys, with costs recently rising to $50,000 per animal [6]. Combined with growing public pressure to prioritize animal welfare, these factors are driving the industry towards more human-relevant and data-driven alternatives.

 

The regulatory push towards human-relevant data

Regulatory authorities worldwide are actively steering the industry away from animal models and towards the 3 Rs principle: replace, reduce, and refine.

In the United States, the FDA Modernization Act 2.0 [7] recently removed the legal requirement that forced companies to use animal testing for new drug applications. Following this, the FDA published a comprehensive Roadmap to Reducing Animal Testing in Preclinical Safety Studies [6]. This ambitious plan aims to make animal studies the exception rather than the rule within three to five years, starting specifically with monoclonal antibodies.

In Europe, the landscape is also evolving. The European Medicines Agency (EMA) [8] provides structured pathways to help developers include alternative data in their clinical trial applications.

However, pharmaceutical developers are currently facing what industry experts call the NAMs paradox [9]. Laws and roadmaps encourage the use of New Approach Methodologies (NAMs), but a clear guidebook detailing exactly how to format and submit a regulatory package based solely on these new methods is still largely missing. Submissions are currently evaluated on a case-by-case basis, requiring companies to engage early with regulators and build a strong “weight of evidence” approach.

 

New approach methodologies and their current limits

So what exactly are the technologies replacing animal tests? According to a recent feature in Nature [10], these new approaches generally fall into two categories.

The first category includes lab-grown and human-derived systems, such as patient-derived organoids (mini-organs) and organs-on-chips. These bioengineered platforms use human cells combined with fluid flow to mimic the complex environment of human tissues. For example, human intestinal organoids [11] can accurately copy the structure of the gut to test oral drug absorption and gastrointestinal toxicity. Ex vivo human organ perfusion takes this a step further by pumping fluids through actual human organs donated for research, keeping them alive for drug testing.

The second category involves computer-based tools. This includes physiologically-based pharmacokinetic (PBPK) [12] modeling to simulate how a drug is absorbed and processed by the body, as well as machine learning algorithms trained on historical data to predict how the immune system might react before a drug is even made.
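To give a flavour of the idea behind pharmacokinetic modeling, here is a minimal one-compartment oral-dosing sketch with first-order absorption and elimination (the Bateman equation). This is a toy illustration, not a full PBPK model, and every parameter value below is an arbitrary placeholder rather than data for any real drug:

```python
import math

def plasma_concentration(t, dose_mg=100.0, f_oral=0.8,
                         ka=1.2, ke=0.15, v_d=40.0):
    """One-compartment oral model (Bateman equation).

    t      : time after dosing (hours)
    dose_mg: administered dose (mg)
    f_oral : oral bioavailability (fraction absorbed)
    ka, ke : first-order absorption / elimination rates (1/h)
    v_d    : volume of distribution (litres)
    Returns the plasma concentration in mg/L.
    """
    return (f_oral * dose_mg * ka) / (v_d * (ka - ke)) * (
        math.exp(-ke * t) - math.exp(-ka * t)
    )

# Concentration-time profile over the first 24 hours
profile = [(t, round(plasma_concentration(t), 3)) for t in range(0, 25, 4)]
```

Real PBPK models chain many such compartments (gut, liver, plasma, target tissues) with physiological parameters, but the core mechanism is the same system of first-order rate equations.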

It is important to be realistic about where these technologies currently stand. While a liver-on-a-chip is excellent for identifying liver toxicity, it lacks the full interconnection of a complete living organism. Complex reactions that require an intact immune or nervous system remain very difficult to recreate on a chip. Furthermore, advanced computer models still often rely on existing animal data to work accurately. We are not yet at the stage where all animal testing can be eliminated overnight, but the tools to drastically reduce its use are maturing rapidly. To quote a recent Nature news feature article [13]: “we will still need to use animals in basic discovery science for a while to come”.

How can we help you navigate the data?

Replacing the complete environment of a living mammal requires data tools capable of analyzing human cell models in extreme detail. You cannot simply swap an animal test for a cell culture without extracting much more data from that culture.

This is exactly where BioLizard steps in. As a bioinformatics and data science company, we provide the computer infrastructure and expertise required to navigate this transition. We offer realistic, industry-ready solutions that help developers build a compelling case for their regulatory submissions.

Here is how we are currently helping the industry move forward:

Smart reuse of existing data. Before starting any new physical experiments, we help clients mine and structure vast amounts of existing human clinical data and historical animal studies. By using large language models and advanced analytics, we extract useful biological insights from scientific literature and public databases. This allows us to test ideas virtually, significantly reducing the need for new lab tests.

High-resolution single-cell and spatial analytics. When evaluating human mini-organs or tissue samples, standard testing methods are no longer enough. We build modular workflows for single-cell analysis to map how specific cell groups respond to drugs, allowing us to detect extremely rare side effects. The recent rise of single-cell foundation models, which can, for example, be used to predict the effect of gene perturbations [14], is also an extremely relevant and interesting development. Additionally, our spatial biology pipelines track how cells interact within their natural structure, providing the deep visual evidence that regulators need when evaluating alternatives to traditional animal tissue reports.
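The principle behind detecting a rare responding subpopulation can be sketched in a few lines. This is a toy example on simulated data, not one of our actual pipelines: the "stress marker" gene, the 2% responder fraction, and the robust threshold are all illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy expression vector: 1000 cells x 1 stress-marker gene.
# A rare subpopulation (2% of cells) strongly upregulates the
# marker after drug exposure; the rest stay near baseline.
baseline = rng.normal(loc=1.0, scale=0.3, size=980)
responders = rng.normal(loc=5.0, scale=0.5, size=20)
marker = np.concatenate([baseline, responders])

# Flag cells whose expression exceeds a robust outlier threshold
# (median + 5 * median absolute deviation).
med = np.median(marker)
mad = np.median(np.abs(marker - med))
flagged = marker > med + 5 * mad

rare_fraction = float(flagged.mean())  # close to the simulated 2%
```

A bulk (averaged) measurement of the same 1000 cells would barely move, which is exactly why single-cell resolution matters for rare toxicity signals.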

Multi-omics integration. The most successful regulatory submissions combine different types of data (genomics, proteomics, metabolomics) into a single clear story. We help clients simultaneously measure real-time metabolic changes and link them with cellular stress responses, ensuring a full understanding of how a drug works. Using such multi-omics data, we can also help improve the animal-to-human translation for our clients who are still performing animal studies, for example, by employing bioinformatics and AI/ML methods to assess the clinical relevance in humans of the molecular signals observed in a preclinical animal model.
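As a minimal sketch of one common integration strategy (early integration by per-layer standardisation), the snippet below scales each omics layer so that no single layer dominates, then concatenates them into one feature matrix. The matrix sizes are arbitrary, and real integration work involves far more care with batch effects, missing values, and feature selection:

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples = 12

# Toy measurements for the same 12 samples across three layers,
# deliberately on very different scales.
genomics     = rng.normal(size=(n_samples, 50))
proteomics   = rng.normal(loc=10, scale=4, size=(n_samples, 30))
metabolomics = rng.normal(loc=100, scale=25, size=(n_samples, 20))

def zscore(x):
    """Standardise each feature so the layers contribute comparably."""
    return (x - x.mean(axis=0)) / x.std(axis=0)

# Early integration: scale each layer, then concatenate into one
# feature matrix for downstream modelling.
integrated = np.hstack([zscore(m) for m in (genomics, proteomics, metabolomics)])
```

Without the per-layer scaling step, the metabolomics layer (variance ~625) would swamp the genomics layer (variance ~1) in any distance-based downstream analysis.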

Digital twin cells. While whole-patient digital twins are still largely experimental, Digital Twin Cell technology is something we can leverage today to optimize early-stage drug candidates. By integrating deep protein data and known interaction networks, we can use computers to simulate how a new molecule affects complex proteins within specific human cells. This AI-driven approach accurately predicts unintended interactions, allowing developers to improve molecules long before physical testing begins.
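At its simplest, reasoning over a known interaction network starts with reachability: every protein downstream of a drug's direct target is a candidate for an unintended effect. The sketch below illustrates that idea with entirely hypothetical protein names and a plain breadth-first walk; real digital twin models add quantitative effect sizes, directionality, and cell-type-specific expression on top of this skeleton:

```python
# Hypothetical toy interaction network: each protein maps to the
# proteins it directly regulates. All names are made up.
interactions = {
    "TARGET": ["KINASE_A", "KINASE_B"],
    "KINASE_A": ["TF_1"],
    "KINASE_B": ["TF_1", "TRANSPORTER_X"],
    "TF_1": [],
    "TRANSPORTER_X": [],
}

def affected_proteins(network, start):
    """Breadth-first walk: collect every protein reachable from
    the drug's direct target as a candidate downstream effect."""
    seen, queue = set(), [start]
    while queue:
        node = queue.pop(0)
        for neighbour in network.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(neighbour)
    return seen

downstream = affected_proteins(interactions, "TARGET")
```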

Think data, speak biology

The global pharmaceutical industry has arrived at a critical turning point. Moving away from animal models is no longer just an ethical choice; it is a scientific and economic necessity to overcome the 90% clinical failure rate.

To succeed in this new regulatory landscape, developers must ensure their complex data pipelines strictly comply with FAIR data principles (Findable, Accessible, Interoperable, and Reusable) and meet the strict documentation standards expected by the FDA and the EMA.

At BioLizard, we help the life sciences industry move from complex data to confident decisions, guided by clear, biology-driven insights. By combining the details of human-derived cell models with immense computing power, we can help you generate clinical-grade evidence earlier in the development pipeline.

If you are looking to integrate new approach methodologies into your research and need a partner to manage the underlying data science, reach out to our team today.

Contact us >

 

References

1. https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.3002667

2. https://journals.sagepub.com/doi/10.1177/02611929231157756

3. https://pmc.ncbi.nlm.nih.gov/articles/PMC2964774/

4. https://www.ahajournals.org/doi/10.1161/CIRCULATIONAHA.122.059266

5. https://www.nejm.org/doi/full/10.1056/NEJM199510263331702

6. https://www.fda.gov/files/newsroom/published/roadmap_to_reducing_animal_testing_in_preclinical_safety_studies.pdf

7. https://www.congress.gov/bill/117th-congress/senate-bill/5002

8. https://www.ema.europa.eu/en/human-regulatory-overview/research-development/ethical-use-animals-medicine-testing/regulatory-acceptance-new-approach-methodologies-nams-reduce-animal-use-testing

9. https://www.linkedin.com/pulse/nams-paradox-regulators-say-yes-ind-cta-playbook-stefano-gaburro-phd-hu8qf/

10. https://www.nature.com/articles/d41586-026-00563-3

11. https://advanced.onlinelibrary.wiley.com/doi/10.1002/advs.202522276

12. https://accp1.onlinelibrary.wiley.com/doi/10.1002/cpdd.70046

13. https://www.nature.com/articles/d41586-026-00563-3

14. https://www.nature.com/articles/s41592-025-02980-0

 
