In any industry, innovation is the key to success. But like any other business attribute, innovation has its challenges. Businesses have limited resources, and feeding the innovation process with the right ideas takes considerable time.
The pharma industry is no exception. The longevity of pharma organizations depends on a healthy pipeline of new drugs. In this post, I will share a few observations on how leading pharma companies are experimenting with open innovation. We will then analyze the implications for the Quality Assurance (QA) and Testing function, and explore how pharma QA teams can prepare for the crowd-driven, open innovation revolution.
The drug development lifecycle can take 10 to 15 years and cost anywhere from 1 to 2.5 billion US dollars. The low success rate of new drug development often makes large R&D investments hard to sustain. Pharma companies must therefore either improve the success rate of new drug development or decrease costs. To that end, companies are now exploring other means to speed up innovation. Some examples include:
- Partnering with small-to-mid size biotech firms to invest in a promising new drug – the collaboration addressing aspects such as who will lead the research, options for development and commercialization, and so on. The Sanofi – Regeneron collaboration in immuno-oncology is a good example.
- Another form of collaboration is compound swapping, where companies exchange compounds and give each other's scientists access to them for research and development, without paying any fees. The AstraZeneca-Sanofi deal announced in late 2015 is an example of this kind of collaboration.
- Finally, the idea of open research and innovation is also gaining momentum, wherein the global scientific community is given access to preclinical data on medicines to further their research. AstraZeneca, for example, supported the DREAM challenge for crowd-sourcing scientific and therapeutic techniques.

Open innovation addresses the challenges of conventional innovation management, and delivers big savings on R&D time and cost.
Let's understand the challenges with some industry examples. Pharma companies have invested time and effort in accumulating large libraries of chemical compounds, which must be tested to determine their viability as drugs. Should these companies risk sharing these libraries with competitors? For scientists, this is less of a worry: given the large number of lead compounds they create and refine, the possibility of the resulting drugs matching each other is minimal. A case in point is AstraZeneca and Sanofi's recent announcement of a direct exchange of 210,000 compounds from their respective proprietary libraries. Because the two companies apply different scientific methods to these libraries, the chance of both arriving at the same drug is negligible. Buying a library of this size would cost approximately $50 million, so sharing has resulted in a win-win for both companies.
While there are definite and tangible benefits to open innovation, there are also pitfalls to avoid. Careful planning, project management overheads, regular contact with regulators, and information technology support all need attention. As big pharma companies grow increasingly risk-averse, it is unclear how much risk they will be willing to assume in open innovation drug development projects. There are also open questions about the ownership of intellectual property generated through the open innovation process.
Against this backdrop, IT teams need to ensure the security, standards compliance, and cost effectiveness of open innovation projects. Sooner or later, the constantly growing data volume will need to be moved from storage devices to cluster nodes. To sustain high-performance, scalable I/O and prevent bottlenecks in downstream analysis, IT teams must provide:
- Secure, GxP compliant storage infrastructure, with adequate access controls
- Platforms for sharing libraries
- Frameworks and processes for analyzing and reconciling structured and unstructured data sourced from libraries
- Recommendations on, and deployment of, technologies to accelerate the screening process
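As a minimal illustration of the reconciliation step above, the sketch below assumes each library exports records keyed by a canonical compound identifier (e.g., an InChIKey-style string) and merges two hypothetical libraries, flagging overlaps. All names and data here are illustrative, not any vendor's actual schema:

```python
# Hypothetical sketch: reconciling two compound libraries by a
# canonical identifier. The record layout is an assumption for
# illustration, not a real library format.

def normalize(record):
    """Normalize a raw library record to (identifier, metadata)."""
    ident = record["id"].strip().upper()   # canonical, case-insensitive key
    return ident, {k: v for k, v in record.items() if k != "id"}

def reconcile(library_a, library_b):
    """Merge two libraries; return the merged dict and overlapping ids."""
    merged, overlap = {}, set()
    for record in library_a + library_b:
        ident, meta = normalize(record)
        if ident in merged:
            overlap.add(ident)             # same compound in both libraries
            merged[ident].update(meta)     # reconcile metadata fields
        else:
            merged[ident] = meta
    return merged, overlap

lib_a = [{"id": "abc-001", "assay": "kinase"}]
lib_b = [{"id": "ABC-001", "solubility": "high"}, {"id": "xyz-002"}]
merged, overlap = reconcile(lib_a, lib_b)
print(len(merged), sorted(overlap))  # → 2 ['ABC-001']
```

In practice this step would sit behind the access-controlled, GxP-compliant storage layer listed above, with unstructured data (assay notes, documents) indexed separately and linked back to the same canonical identifier.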
The manifestation of these requirements could take various forms – from setting up a private cloud to leveraging niche technology platforms. For example, UK-based MedChemica, an intermediary company, manages the data sharing for Roche and AstraZeneca. MedChemica's expertise in Matched Molecular Pair Analysis (MMPA) technology, besides adding to the competencies of the pharma companies, also raises the quality and specificity of drug design rules.
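MMPA is conceptually simple: find pairs of compounds that differ by a single, well-defined structural change, and record how that change shifts a measured property. The toy sketch below uses purely illustrative data, with each compound reduced to a shared scaffold plus one variable substituent; real MMPA tooling fragments actual chemical structures:

```python
from collections import defaultdict
from itertools import combinations

# Toy MMPA sketch: each compound is (scaffold, substituent, activity).
# The scaffolds, substituents, and activity values are invented for
# illustration only.
compounds = [
    ("quinoline", "H", 5.1),
    ("quinoline", "F", 6.0),
    ("pyridine",  "H", 4.2),
    ("pyridine",  "F", 5.0),
]

# Group compounds sharing a scaffold, then enumerate matched pairs.
by_scaffold = defaultdict(list)
for scaffold, sub, act in compounds:
    by_scaffold[scaffold].append((sub, act))

rules = defaultdict(list)   # transformation -> observed activity deltas
for members in by_scaffold.values():
    for (sub_a, act_a), (sub_b, act_b) in combinations(members, 2):
        rules[(sub_a, sub_b)].append(act_b - act_a)

for (old, new), deltas in rules.items():
    mean = sum(deltas) / len(deltas)
    print(f"{old} -> {new}: mean activity delta {mean:+.2f}")
```

Aggregating many such deltas across scaffolds is what turns raw pairs into the "drug design rules" mentioned above: recurring transformations with consistent effects on potency, solubility, or other properties.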
Tools such as High Performance Computing (HPC) also contribute to open innovation. In a study run on a 50,000-core Amazon cloud cluster, the computational chemistry firm Schrödinger analyzed 21 million drug compounds in just three hours, at a cost of under USD 4,900. With a traditional on-premise approach, the same analysis would have cost far more and taken an estimated four years to complete.
In all such cases, as IT requirements evolve, QA takes center stage in ensuring GxP compliance, infrastructure qualification, security risk assessment, and data privacy. QA also plays an important role in validating data ingestion, storage formats, migration guidelines, conversion rules, and analytical algorithms. While IT has a clearly defined role – meeting storage and cloud requirements – QA has homework to do: move beyond traditional test cases and adopt modern methodologies that assure not just functionality, but also performance, security, availability, and reliability.
When this happens, QA teams will start unearthing pharma's unstated problems, enabling the industry to take a fresh look at innovation management – move beyond internal boundaries, and open up to the promising new world of open innovation.