I created a proof of concept model for a global pharma technology division, and here's what I learned.
The ask: Evaluate emerging technologies in a regulated environment without breaking the bank or burning resources
A key element of my previous role was global ownership of application portfolio management for the Pharma technology division. That included all applications from Shop Floor to Top Floor and everything in between (i.e., MES, EBR, Packaging, Warehousing, ERP, LIMS, EMPOWER, Quality Management Systems, EH&S, etc.). Among the team's responsibilities were identifying and prioritizing business needs, application selection, project budgeting, business case creation, and user requirements gathering. The project's delivery was managed by the IS function, with my function acting as an interface between the business and IS, effectively keeping both parties honest and ensuring each side delivered against its project commitments. While the model and approach proved very successful for traditional applications, it didn't work well with the new emerging technologies falling under the umbrella of I4.0.
The Challenge: Deconstruct the old model of evaluation and build a better one
The first thing to understand was: why wouldn't the existing model work for I4.0? After all, applications are applications, right? Well, yes and no. The model outlined works well when delivering a tried and tested application, i.e., one that has been successfully implemented in many organizations many times.
For example, when implementing a LIMS, SAP, or MES, the question is not whether the application works, but what variant of the business process we will implement and what degree of customization is required to support it. The difference with I4.0 is that, for the most part, it is a set of technologies that can be used to supplement existing applications or grouped to create new applications. This provides a great deal of flexibility but also brings a significant headache, as you are effectively building an application from scratch. To further complicate matters, the various technologies are at different degrees of maturity.
We are all too aware that Life Sciences doesn't lend itself to the "give it a go" project delivery approach. And for good reason: lives are at stake. So, the question is: how do you rapidly experiment with emerging technologies while maintaining the safety and high standards that are critical to Life Sciences?
Firstly, what attributes do we need to embed into the fundamentals of the I4.0 model?
Evaluating what the old model lacked, we identified these key components:
- Increase Flexibility: We needed to be able to think like a start-up.
- Operate Constraint-Free: We set no rules and refused to be bound by existing limits.
- Be Dynamic: We had to think outside of the box.
Next, we needed to address the larger context: What do 'we – the business' need to change?
- Adopt a Venture Capital Approach: We had to convince the business that, unlike traditional projects and investments in which failure was not an option, we encouraged the team to take risks with the approach, the technology selected, and the solution design. One successful initiative could 'pay' for all of those that didn't succeed.
- Accept Failure (our motto was 'Fail Fast, Succeed Quicker'): there may be no ROI. Most proofs of concept will be written off; we expected only a 20% success rate.
- Business-driven model for focus areas and proof of concept selection (i.e., not IT for IT's sake): By selecting existing business problem areas as the focus of each proof of concept, you automatically bring the business to the table; they have a vested interest in resolving the problem.
- Embrace Speed: All technology is progressing faster than most businesses can accommodate. That said, some of the emerging technologies are genuine game-changers. As such, we needed to devise a way to evaluate and rapidly deploy these technologies to add significant benefit to our business.
- Think outside the box: beyond adopting a venture capital approach and failing fast to succeed quicker, we had to keep asking what other conventions needed to be challenged.
So, how did we do it? Here were our rules of engagement:
- A strict 12-week proof of concept model: twelve weeks to design, build, and evaluate the technology. After 12 weeks, you kill it, regardless.
- Executive C-Suite support: We made sure to get executive buy-in. This is key should you recommend further investment or make a case for rapid deployment.
- Small, dedicated I4.0 team: We supplemented a small team with vendors on a plug-in/plug-out basis, meaning that we didn't build an in-house team of data scientists or I4.0 technologists. Let your partners hire these people so you can source capacity and expertise from them on an as-needed basis.
- Dedicated I4.0 (Opex) budget: this was approved during the annual budgeting cycle.
- No Procurement involvement: we expedited the process with no preferred vendors, no three quotes, and no delays.
- No Quality / Computer Systems Validation: We didn't do any Quality/CSV. You read that correctly. To eliminate constraints and move more quickly, we didn't embark on the validation process. We determined that we could move forward with validation after we confirmed the proof of concept was worth pursuing.
- Business-driven problem statements aligned to the business vision, in turn dictating selection: Typically, the majority of our traditional implementations are multi-year and address traditional business problems. These proofs of concept differ in that they address non-traditional business problems with new technologies, e.g., how do I automate planning using AI?
- Host innovation days: We gathered senior business and functional leaders to demonstrate how some of the I4.0 technologies could address, enhance, or obsolete business processes or activities. These days would typically take place off-site at a vendor's innovation center, and the topics/areas covered would be based on a poll of the business leaders' problem areas.
What were the outcomes and lessons learned?
We tested four to five technologies/scenarios per year over two years, including:
- Artificial Intelligence
- Machine Learning
- Augmented & Virtual Reality
- Natural Language Processing
- Workplace wearables
The result? Of those tested, we adopted three; seven weren't implemented, for a variety of reasons.
But several key learnings and successes came about through the experiment:
- Proof of concept methodology was a massive success, so much so that other functions adopted the model.
- Even with the adoption, scaling the rapid deployment of successful proofs of concept remained an issue, especially since there was more demand for proofs of concept than the group could accommodate.
- Stakeholder relationships are critical: C-Suite engagement is vital, and it was essential to actively manage our IT relationship.
Choosing between game-changing technology and maintaining quality is a false binary; you just have to be willing to think outside the box.
Evun Wyer is an innovative IS & Business leader with over 25 years of experience in a range of roles and industries including Life Sciences and Fast Moving Consumer Goods. In preparation for our early March Quality Leaders Forum, he outlined his experience creating a proof of concept program in a global pharma technology division.