“Without animal research, medicine as we know it today wouldn’t exist.”  This is a line taken directly from the website of a campaigning group in favour of continued animal testing. In many respects they’re right; to date, research in animals has provided data instrumental to the development of many life-changing pharmaceutical and non-pharmaceutical chemicals, to understanding their impact on the environment, and to advancing our basic knowledge of human and animal health and disease. However, times are changing, and there is increasing recognition from the industry that animal models may no longer be sophisticated enough to meet the needs of drug development in the 21st century. But according to some groups who oppose animal research, this shouldn’t be a problem because there are “very obvious and more appropriate non-animal methods of research that could have been used instead”. So why aren’t pharmaceutical companies and the biotechnology sector adopting these non-animal approaches more readily? The truth is not as black and white as some of these groups may lead us to believe. Dr Anthony Holmes, Programme Manager – Technology Development at the NC3Rs, explores why.

There are considerable scientific, regulatory, practical and technological hurdles to overcome before non-animal approaches can be adopted. However, driven by the realisation that unless something changes, the pharmaceutical industry is at risk of entering an ice age, the sector is embracing these challenges. The drug development process is long and expensive. High attrition rates – due to safety liabilities and lack of efficacy observed in the clinic, but not identified during preclinical development – are a major contributor to the escalating costs and reduced productivity of the industry. Current preclinical testing paradigms rely predominantly on animal models, including rodents, dogs and non-human primates. However, the high rate of attrition has called into question the utility of some of these models and prompted demands for more predictive tools.

Companies have tried since the 1970s to address this by frontloading their discovery and development processes with in silico and in vitro approaches. The aim is to identify compounds with undesirable characteristics as early as possible and remove them from development before they enter regulatory animal testing and clinical trials – fail fast, fail cheap is the mantra. This has been successful in reducing attrition due to poor pharmacokinetics and bioavailability, but the simple two-dimensional single cell-type cultures (often using transformed cells) are not sufficient for reliably determining efficacy and safety in modern drug development. Increasingly, the research community is saying ‘goodbye flat biology’ and embracing advances in tissue engineering and microfluidic technologies to represent the multicellularity of organs in vivo, creating potentially more physiologically relevant three-dimensional cell and tissue cultures for early screening.

It is outwith the scope of this article to provide detailed case studies of how these technologies are being applied, but two excellent recent reviews, in Nature Reviews Drug Discovery and Advanced Drug Delivery Reviews, cover this in great detail.

It is still early days in terms of the wider application of these technologies, but they hold great potential and there is real interest from the pharmaceutical sector in their development. The Wyss Institute in Boston, USA is leading the way, but there is increasing activity in Europe and the UK to capitalise on the growing excitement. However, for the full potential of these technologies to be realised, three steps need to be taken:

  1. Companies need to engage collaboratively with the international regulatory agencies. Regulators are keen to support companies in applying these new in vitro approaches and shouldn’t be seen as a barrier to innovation. To maximise the benefit of guidance regulators can provide, industry needs to engage with them early, and regularly as data emerges and the technology grows, to demonstrate its utility and justify its use.
  2. Companies need to take a leap of faith. Industry is currently maintaining a watching brief as these technologies develop, but to effect a paradigm shift, greater willingness to adopt and validate these approaches for their specific needs is required. This is starting to happen, and as the results of these studies emerge we will start to see more companies embracing these alternative approaches.
  3. Greater cross-sector collaboration. Academics can be naive about the challenges faced by industry, and industry is often unaware of the advances being made in the science base which could benefit their business. More effective collaboration between these communities will ensure that the technologies emerging from the science base can be translated for industry application.

Opportunities are emerging to support industry in meeting these steps. New guidance from the European Medicines Agency to support companies in applying non-animal models through a ‘safe harbour’ approach has recently been drafted, and examples exist where drugs have entered clinical trials without having to go through standard regulatory animal studies (see Megit S. Immunocore pioneers new safety studies. In MedNous. 2011;5:14–15). The UK Government, through Innovate UK (formerly the Technology Strategy Board) and the major research funding councils, is investing in non-animal technologies to transform business and improve product development across a range of bioscience industries. This new initiative provides funding to support companies in working collaboratively with other businesses and the public sector to advance the development and application of novel, more predictive technologies. Finally, open innovation initiatives like CRACK IT from the NC3Rs are connecting researchers across sectors by highlighting industry challenges and providing considerable funding to support research and development of the solutions, as well as a route to market.