Business white paper

Speeding drug discovery with AI and Big Data

It takes more than $2.5 billion and 15 years on average to develop a new drug. But some say there's a way to cut costs and shorten that timeline: use artificial intelligence, cloud computing, IoT, high-performance computing, and Big Data.

Drug development today is a lengthy, expensive process, and to a certain extent, a shot in the dark. Only one out of thousands of potential compounds makes it through the research, trial, and review pipeline to be approved by the U.S. Food and Drug Administration (FDA) and put into large-scale manufacturing. The cost of developing a single drug is staggering: a study by the Tufts Center for the Study of Drug Development found that it exceeds $2.5 billion. Other estimates put it higher, at $4 billion, or even as high as $11 billion.
And things may be getting worse, not better. We've become accustomed to the idea that technology and science march inevitably toward greater efficiency and lower costs. The prime example is Moore's Law, the observation that computing power doubles roughly every 18 months. But the reverse seems to be happening in drug development.
"There's something called Eroom's Law in drug discovery, and it's the exact opposite of Moore's Law," says Alex Madama, chief technologist at Hewlett Packard Enterprise. "It says that every nine years, the throughput and productivity of getting drugs through the approval cycle goes down by half."
There are a number of reasons for that, including stricter safety and approval guidelines. It's a problem the pharmaceutical industry has been struggling with, and it hurts not only the industry but also patients hoping for new drugs to help with their medical conditions.