With the time and cost of bringing a new drug to market now conservatively estimated at 12 to 15 years and $800 million, it is worth reflecting on the fact that most new drugs that reach the market are still discovered using the same tried-and-tested "suck it and see" empirical approach by which our ancestors discovered that the bark of the willow tree can relieve headaches.
Obviously, modern drug companies do not rely on people in forests randomly chewing on the odd piece of tree bark to find the next great treatment for high cholesterol. While this approach can claim such notable successes as Aspirin - a derivative of salicin, the active ingredient in willow bark, and still one of the best-selling drugs ever - it required generations of random sampling, reinforced by the oral tradition of folklore, before its utility and efficacy became evident. That is hardly a basis on which to build a business model for a competitive pharmaceutical company.
This basic "suck it and see" approach to drug development is, however, still very much alive and well in modern pharmaceutical companies, although these days it is a far more deliberate and concerted process in which everything possible is done to stack the deck in fortune's favor. Modern high-throughput screening is an accelerated and refined form of this empirical process, but for all its refinement, the underlying methodology is essentially the same: try stuff out and see what works.
The problem inherent in this approach is that you may never fully understand how your drug works, or what other activities it might have in the patient's cells, until after it has reached the clinic - at which point you run the risk of doing actual harm to your patients. It is for this very reason that Vioxx and Avandia are household names, even among the vast majority of the population who have never taken them. A significant portion of the regulatory activities involved in getting a new drug approved is intended to mitigate this very problem, but it is clear that even this demanding regulatory process cannot prevent the harm to patients that results from an incomplete understanding of a drug's mechanism of action.
This lack of mechanistic understanding lies at the core of the pharmaceutical industry's problems: a regulatory process that grows ever longer and more costly as it strives to fill the gaps, and the devastation to both the industry's reputation and its bank balance when nasty surprises occur in clinical trials - or even after a drug has been approved and sold to thousands or millions of patients.
For all of its shortcomings, however, this empirical methodology has been a justifiable and rational approach for the pharmaceutical industry to take because, simply put, biological systems are massively complex, and until now there have never really been any suitable tools with which biologists (or drug developers) could get a handle on this complexity. A civil engineer designing a suspension bridge is working with a system far less complex than even a single living cell, and the bridge's behaviors and properties can be accurately understood by plugging the data into the models that physics and mathematics have provided. Biology, alas, is replete with data but woefully lacking in the kinds of models that would allow its researchers to transform that data into useful insight and to approach living systems (or the design of the drugs that modulate them) with an engineer's perspective.
Where computer science and biology intersect, in the exciting field of digital biology, new approaches to computational modeling are emerging that have been designed specifically with the challenges of biological complexity in mind. These new methods allow biologists, for the first time, to build real models of the massively complex systems inside living cells, free of the constraints on scope and resolution inherent in the traditional approaches to biological modeling that have been borrowed and adapted from other fields such as physics and chemistry.
The ability to approach biological systems with a truly mechanistic perspective will transform basic research for academic biologists. For the pharmaceutical industry, it holds the promise of ushering in an era in which drugs are designed rather than discovered, finally lending an air of veracity to the industry's own TV commercials in which earnest researchers in white coats and safety glasses scribble designs for new molecules on perspex screens.
With the time and cost of drug development spiraling, and the approval rate for new drugs in steady decline even as R&D expenditure increases, something needs to change if we are not to return to measuring the time it takes to discover a new drug in human generations.
The author, Gordon Webster, has spent his career working at the intersection of biology and computation and specializes in computational approaches to life science research and development.
© The Digital Biologist | All Rights Reserved