Figuring Out the New Economics of Niche

As smaller-batch products become more prominent within drug development, companies must adopt more agile, modular approaches to ensure commercial viability.

With an increasing proportion of niche therapies entering the development pipeline, the bio/pharma industry is having to pivot to more agile processes that work for smaller batch sizes. As a result, the industry is losing the traditional economies of scale it enjoyed with the high-volume blockbuster drugs of the past.

To gain a comprehensive overview of how flexible capabilities are helping to improve efficiencies and secure patient access to niche modalities, The Pharma Navigator spoke with a panel of experts. The panel comprised: Russell Miller, VP of Global Sales & Marketing, Enzene; Rasmus Pedersen, Head of KojoX, FUJIFILM Biotechnologies; and Vidar van der Meijden, VP of Product, Prolific Machines.

Ensuring Commercial Viability

TPN: What innovative cost-management strategies are being used to ensure the processes required for these novel, orphan/niche therapies remain commercially viable?

Miller (Enzene): The shift toward orphan and niche therapies does indeed mean that we’re routinely working with smaller, more customized batches, where traditional economies of scale just don’t apply. To ensure that these smaller batch programs are commercially viable, we focus on three areas.

The first, which ties directly to our mission of helping our partners bring accessible treatments to patients, is process intensification through continuous manufacturing. Moving from fed-batch to connected perfusion and continuous downstream processing is one of the most effective levers we have. Using fully-connected continuous manufacturing (FCCM) technology, we have demonstrated that continuous modes can reduce equipment footprints, improve cell density, and deliver multi-fold increases in volumetric productivity, offsetting the cost disadvantages of small lots.

To bring agility to upstream development, we emphasize media and feed optimization, cell‑specific perfusion rate tuning, and alternating tangential flow (ATF)‑based perfusion control to quickly maximize productivity. In a recent program involving a complex bispecific, shifting from fed‑batch to perfusion enabled an eight‑fold cumulative productivity increase and a high titer suitable for clinical supply at small scale.

And by integrating our end‑to‑end operational models, smaller batches can benefit from connected process steps, reduced product hold times, and minimal changeovers. When we talk about continuous bioprocessing, we assume a tight link between upstream and downstream operations. This means that we can reduce labor load and variability, and achieve levels of operational efficiency that niche products need to be a commercial success.

Overall, our strategy is to continuously innovate so that we can substitute economies of scale with economies of efficiency: intensify the process, compress timelines, integrate workflows, and engineer yield into the system. This has enabled us to achieve a 50% reduction in cost of goods sold (COGS) commensurate with the productivity gains. This is what makes smaller batches commercially viable, and I think this is how the smaller, specialized therapies of tomorrow will not only be commercially feasible, but will enable affordability and wider access — even without blockbuster‑scale volumes.

Pedersen (FUJIFILM Biotechnologies): This is a major change in our industry, and one that other kinds of industries have gone through before us. So, in addition to the general evolution of manufacturing technologies and digitalization in the pharmaceutical industry, we can also look outside for inspiration. One of the learnings here is that the ecosystems across pharma aren’t really set up for niche products, agility, or resilience today.

To achieve commercial viability, we must make development and manufacturing — as well as the way we collaborate across the industry — much more product-agnostic. The more we can repurpose across products, and the more seamlessly we can configure the manufacturing and supply chain of any one product, the better the cost management in the end. This is true for many types of medicines across the phases of a drug’s life, and it applies to both decentralized manufacturing and larger cohesive network-based approaches.

When demand changes to smaller volumes dispersed across many variants, cost management changes from efficiency ‘within the batch’ to a much more strategic focus on all the things that must be true ‘in-between batches’.

The combination of modularity and digitalization is a powerful way to orchestrate commercially viable manufacturing in this changing environment.

Van der Meijden (Prolific Machines): The blockbuster model's economics relied on large batch volumes. As the field moves toward smaller, more targeted patient populations, scale-up is no longer the most important lever; getting more protein out of smaller volumes is. This means that yield-per-liter becomes a more consequential metric. For innovators it keeps costs down, and for CDMOs it makes it possible to put a larger number of programs through the same amount of capacity. At the same time, we see an increase in complex modalities, while innovators are driving for the fastest time to the clinic. The result is that titer is often sacrificed for speed. We are encountering programs that are looking for yield improvements after reaching the clinic, when, beyond pure speed, program value and drug substance cost come into focus.

At Prolific Machines, our answer to this structural shift is to use optogenetics to enable light-controlled biomanufacturing, making it possible to change the level of expression of the therapeutic protein over time. This allows cells to first allocate resources toward growth rather than protein production and reach high cell densities, and then, when induced by light, to strongly express the therapeutic protein and shift resource allocation to protein production. This level of control over gene expression enables step-change improvements in upstream productivity and control over critical quality attributes, especially for these complex modalities. It changes the manufacturing economics, even at small scale. We have demonstrated titers that effectively reduce the volume or time required to hit a given mass target, lessen risk for innovators, and can rescue programs that are struggling with low titers. A molecule that previously required a 2,000-L batch to produce a commercial supply run can potentially be manufactured in a 500–1,000-L bioreactor, or in seven rather than 14 days. This not only reduces COGS per gram but is also highly compatible with single-use, flexible capacity. For the most complex novel formats, where low titer can be a real barrier to commercial viability, this is truly enabling.
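The volume and time figures quoted above follow from the simple mass balance that per-batch mass equals titer times working volume. A minimal sketch of that arithmetic, with illustrative numbers that are not taken from the interview:

```python
# Back-of-the-envelope batch sizing: for a fixed per-batch mass target,
# the required bioreactor volume scales inversely with titer.
# All numbers below are illustrative assumptions, not interview data.

def required_volume_l(mass_target_g: float, titer_g_per_l: float) -> float:
    """Bioreactor working volume needed to hit a mass target in one batch."""
    return mass_target_g / titer_g_per_l

mass_target = 10_000.0   # g of drug substance per batch (assumed)
baseline_titer = 5.0     # g/L, assumed conventional process
improved_titer = 20.0    # g/L, assumed 4x titer improvement

v_baseline = required_volume_l(mass_target, baseline_titer)  # 2000.0 L
v_improved = required_volume_l(mass_target, improved_titer)  # 500.0 L
print(f"Baseline: {v_baseline:.0f} L -> Improved: {v_improved:.0f} L")
```

Under these assumed figures, a 4x titer gain shrinks the 2,000-L requirement to 500 L; equivalently, holding volume fixed, the same mass target is reached in proportionally less run time.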

Improving Levels of Standardization

TPN: To what extent is the success of modularity dependent on industry-wide hardware and software standardization?

Pedersen (FUJIFILM Biotechnologies): Modularity and standardization go hand in hand, yet they are different concepts. Standardization is all about decreasing variation. Modularity, on the other hand, is all about increasing variation that makes a difference to the customer while decreasing variation internally in products, processes, and supply chains. The key is to standardize the interfaces between modules, and this is where industry-wide standardization becomes powerful: it lets us combine and configure a variety of well-known hardware and software solutions, through standardized interfaces, for new products.

Across our industry and ecosystems, there are current limitations in this underlying standardization. Our process platforms across the industry aren’t really standardized. When you look at the full value chain, we work in many isolated silos: regulatory support for modular systems, recipe management taxonomies for seamless transfer of products, equipment integration and interoperability, and software standards for process development and manufacturing execution.

Modular manufacturing networks will be tuned with the use of generative and agentic AI (artificial intelligence) in the years to come, and the change is already happening. The way we can connect production lines, harvest data from them, and then let any node in the network learn from knowledge harvested at any other node will change fundamentally. The combination of modularity and AI is extra powerful, because the modular architectures for processes and equipment provide a standardized scheme for the AI to learn from and an easier path to apply those learnings across the full network. Moving products across scale, technology, and location will also become much easier with increased virtualization of the knowledge backbone in modular manufacturing networks.

This is an opportunity for all players in the industry, but it is also an opportunity for the industry as a whole. Getting those industry-wide ecosystems in place, and then tuning them with AI, will greatly improve our ability to serve many more patients.

Van der Meijden (Prolific Machines): There is an important distinction that tends to get lost in conversations about modularity, which is that the success of modular systems depends far more on standardized interfaces than on standardized modules themselves. The modules can and should vary according to need — different bioreactor vendors, different single-use systems, different automation schemes — but if the interfaces between them (data formats, control protocols, utility connections, software handoffs) are not agreed upon, modularity becomes complex in the operational context, separate from any process validation and regulatory implications.
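The interfaces-over-modules principle can be sketched in code: modules vary freely as long as they satisfy one shared contract, and the orchestration layer never needs to know which vendor it is talking to. All class and method names below are hypothetical illustrations, not any real vendor or automation API:

```python
# "Standardize interfaces, not modules": any bioreactor module is
# interchangeable so long as it satisfies one shared contract.
# All names here are hypothetical, not a real vendor API.
from typing import Protocol


class BioreactorModule(Protocol):
    """The standardized interface every module must expose."""
    def start_batch(self, recipe_id: str) -> None: ...
    def read_sensors(self) -> dict[str, float]: ...


class VendorASingleUse:
    def start_batch(self, recipe_id: str) -> None:
        print(f"Vendor A (single-use) running {recipe_id}")

    def read_sensors(self) -> dict[str, float]:
        return {"pH": 7.0, "DO_percent": 40.0}


class VendorBStainless:
    def start_batch(self, recipe_id: str) -> None:
        print(f"Vendor B (stainless) running {recipe_id}")

    def read_sensors(self) -> dict[str, float]:
        return {"pH": 6.9, "DO_percent": 42.0}


def orchestrate(module: BioreactorModule, recipe_id: str) -> dict[str, float]:
    # The orchestration layer depends only on the interface,
    # never on the concrete vendor implementation.
    module.start_batch(recipe_id)
    return module.read_sensors()


for module in (VendorASingleUse(), VendorBStainless()):
    print(orchestrate(module, "platform-recipe-v2"))
```

The design point is that swapping a vendor module changes nothing in `orchestrate`; only the shared interface (the analogue of agreed data formats, control protocols, and handoffs) must stay fixed.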

This is precisely why Prolific has designed our illumination and process control layer to be vendor-agnostic from the outset. Rather than requiring a specific automation platform or bioreactor vendor, our system is engineered to interface with all widely-used industrial systems, and to be easily adaptable to new system formats.

Miller (Enzene): Modular facilities give us the flexibility needed for today’s diversified pipelines, but their full value really emerges only when hardware and software systems can plug together seamlessly. In practice, that means common interfaces for single‑use equipment, interoperable automation layers, and data structures that allow batch, quality, and analytics systems to talk to each other without custom engineering.

Progress is being made in single‑use connectors and certain bioreactor formats, but the industry lacks true end‑to‑end standardization, partially because players have differing views on what’s best. Echoing many pre-internet computer systems, automation vendors continue to use proprietary control layers, and data models differ between platforms. These gaps can add to the effort needed for integration and slow down the speed at which modular capacity can be deployed or repurposed.

Where modularity does succeed is when companies invest early in a well‑defined internal standard — a harmonized set of equipment, automation templates, and data interfaces. Combined with continuous or intensified processes, this lets us treat each module almost like a repeatable unit operation rather than a custom build, and that approach has paid dividends to us as our EnzeneX platform can be scaled up, out, or on with relative ease.

Accelerating Tech Transfer

TPN: How are modular approaches shortening the tech transfer timeline between R&D and commercial-scale production?

Van der Meijden (Prolific Machines): A common thread with our partners is a recognition that, as molecules become more complex and the push to clinical data shows no signs of slowing down, they’re looking for ways to hedge against scaling and tech-transfer issues, which can severely delay programs. While these issues were uncommon with conventional mAbs, we’re hearing concerns about scaling complex molecules like bispecifics, because there’s currently no way of adjusting critical attributes like stoichiometry (the correct assembly of the bispecific) after clone selection.

One way Prolific is working to address this challenge is by enabling light control over chain ratios at the clone stage. In other words, having the ability to adjust the relative abundance of the different parts of the bispecific all the way into production. Moreover, we see opportunities to greatly accelerate cell line development for bispecifics by needing to screen far fewer pools. This provides an opportunity to accelerate and de-risk cell line development, tech transfer, and process scale-up. We’re currently piloting this approach with the first couple of partner molecules, and we’re very excited for its potential to support the rapidly growing number of bispecific and multispecific programs.

Miller (Enzene): Modular facilities help tech transfers because they leverage standardized, repeatable unit operations, reduce ‘facility-fit’ work during scale-out (or scale-up, etc.), and offer integrated drug substance-to-drug product transfer pathways.

Together, these elements accelerate tech transfer by turning a bespoke, site-dependent exercise into a replicable, low-variability deployment model. That’s especially valuable in a global network where advanced modalities need to move quickly from development to commercial readiness, and, given today’s supply chain uncertainties, it can help ‘shore up’ regional supply to patients.

Pedersen (FUJIFILM Biotechnologies): Modular approaches fundamentally change a tech transfer from a translation project to a configuration activity. A classic tech transfer has its uncertainties, unknowns, and surprises. A modular tech transfer is largely based on the repurposing of known solutions, providing fewer surprises and much better plan adherence. Within a family of products, one can sometimes repurpose close to 90% of the settings, documents, validation, etc., and focus on the small differences between products instead of redoing a massive amount of work that is in fact the same from product to product. Doing so across the phases of drug development also speeds up the timelines when you move from one regulatory milestone to the next.

Balancing Disruption with Advancement

TPN: For organizations with significant stainless steel legacy infrastructure, what are the most effective hybrid models for integrating flexible, modular capacity?

Miller (Enzene): Organizations with established supply patterns will want to avoid disrupting steady-state commercial supply. But when expansion is needed to accommodate new products, an investment in advanced bioprocessing could smooth the transition of end-of-lifecycle products to the advanced bioprocessing model, taking advantage of its relative flexibility, smaller footprint, and so on at lower volumes. Even with the cost of process transfer, an advanced bioprocessing model would support continued, economical supply of that product for longer. It would also free the large-scale equipment — say, 20,000-L vessels — either for new products as they approach commercial maturity, or for conversion of that ‘stainless-steel’ space to advanced bioprocessing instead. That way, the organization may end up with a more flexible facility, both in terms of the equipment on hand and space that can be repurposed as needed.

Pedersen (FUJIFILM Biotechnologies): There are two strategic perspectives here: 1) evaluating the hybrid mix across the full network of manufacturing assets, mainly the balance between your single-use capacity and your stainless-steel capacity; 2) evaluating the mix of technologies within each asset, and understanding how stainless steel and single use, and batch and continuous production, can come together on the same shop floor.

Modularity applies in both cases. Across the network, it becomes a question of pointing the right kind of demand towards the right kind of manufacturing capability, and then reusing process modules that are agnostic of manufacturing technology across that network. Mixing technologies within an asset is also seen, with single-use seed trains, buffer preps, and selected parts of the flow paths as examples. One can also add single-use-based flexible options and add-ons to the often notoriously inflexible stainless-steel plants, for instance for perfusion. In either case, one must be careful to utilize the asset for what it is best at: on one hand, avoid polluting a high-volume production plan with too many changeovers; on the other, avoid making too much volume based on expensive consumables.

A Learning Opportunity

TPN: Does a transition toward decentralized and modular manufacturing require a fundamental retraining of the pharmaceutical workforce?

Pedersen (FUJIFILM Biotechnologies): Yes. One key change is that work becomes increasingly cross-functional, as day-to-day operations move from line-specific specialization to a cross-functional, network-wide approach.

Possible ways to match the skills of the workforce with increased decentralization and modular manufacturing could be to strengthen proficiency in changeovers, cross-functional problem solving, and execution across modular platforms, as well as AI literacy.

Miller (Enzene): Assuming you already have a team that understands bioprocessing, then a partner can quickly provide supplemental training and can relatively rapidly help you transition to advanced bioprocessing, potentially saving months off your timeline to validation and qualification.

If you don’t have the basics, then it may take 12–16 months to develop the skills needed for advanced bioprocessing. The industry knows how to support that skill development and there are multiple universities and other learning centers that offer bespoke training on manufacturing, GMP, processing and so on, so to me it’s about having the right strategy in place to develop an appropriate skillset that will work for your team and your circumstances.

Van der Meijden (Prolific Machines): The transition toward decentralized and modular manufacturing does not make traditional bioprocess knowledge obsolete – cell biology, cell culture science, and downstream purification expertise remain foundational. Moreover, it’s important to acknowledge and celebrate the deep institutional and platform expertise embedded within the pharmaceutical workforce from discovery to manufacturing.

However, there are some new developments that I think are worth watching, and an important one in my view is process control. My background is in the semiconductor industry, where the level of process analytics and process control is light-years ahead of the state of the art in most other industries, including biomanufacturing. Part of that is due to the economics and the technical manufacturing performance being so closely linked: the value of your product is determined by how many transistors you can fit on a chip. That has driven the industry not just to measure more, but also to create systems to act on those measurements: systems that enable control authority and control specificity over the critical quality attributes that drive performance.

Biomanufacturing is rapidly innovating on the monitoring side, with process analytical technologies like Raman spectroscopy, better in-line and at-line analytics solutions, and so on. However, there’s a real gap on the control authority side. Oxygen, CO2, feed, maybe some small molecules: if you think about it, these are weakly coupled to the outputs we actually care about and strongly coupled to everything we don't. They offer no direct control authority and low control specificity over the core processes of transcription, growth, translation, and post-translational modification. To use an analogy that may resonate with pharma execs: the current state of the art is like trying to control the quality of a Swiss watch by regulating what the assembly line workers have for lunch and the humidity of the factory floor. I could invest in systems to meticulously measure every aspect of the watch and use extensive design-of-experiment to correlate lunch nutrients to, say, the level of polish on the watch gears. Maybe I would even find a marginal gain from serving Gruyere over Gouda (they are Swiss, after all). Dissatisfied with the level of improvement, I could decide to throw AI at the problem, because surely, if we feed it enough data about the lunches and the resulting quality, it can find some hidden correlation, right? I would argue all that time and money would have been better spent figuring out how to control the actual manufacturing and assembly processes of the watch's mechanism. That is what control authority is about: the ability to control the thing you care about.

It's my strong belief that in order to truly leverage all the data advanced PAT solutions are generating, the industry needs to shift its focus to actually closing the loop around the core processes that drive protein production and quality — gaining direct, specific control authority over transcription rates, cellular states, and post-translational machinery, to name a few — rather than acting on them through poorly conditioned, indirect inputs. I haven’t seen a solution other than optogenetics that enables that level of dynamic control at industrial scales while retaining all the value of the institutional knowledge and investments in current platforms. So, I believe the skills gap is in smart process control: developing and leveraging these new control abilities to transform the way we think about controlling product quality and driving productivity, and learning from other industries in the process.

Image Credit: © adragan - stock.adobe.com
