Innovation Transfer


By Dave Harrold April 1, 2004
AT A GLANCE
  • Innovation/engineering value chain

  • Compress process workflow

  • Contract R&D

  • Analyze and synthesize data

Sidebars:
Process data delivers wisdom for capital spending

Over the past two decades, significant socio-economic changes have occurred to impact how the life-science industry discovers, develops, and delivers new products. Most significant among these changes are:

  • An aging population;

  • Changing disease patterns;

  • Increasing antibiotic resistance;

  • Competitive and shareholder pressures;

  • Expanding local, state, federal, and country regulations;

  • Growing price scrutiny; and

  • Rapidly decreasing periods of product exclusivity.

For the life-sciences industry, the direct consequences of these changes include skyrocketing R&D costs, especially in the discovery and development of more complex molecules and, subsequently, more complex manufacturing processes.

A study by the Tufts University Center for the Study of Drug Development ( https://csdd.tufts.edu/ ) indicates that from phase-one clinical trials through new drug application, life science companies expend $30,000 per day per drug. That clearly makes compressing time-to-market a critical issue in addressing the challenges listed above.
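The arithmetic behind that urgency is simple. A minimal sketch, assuming the Tufts figure of a constant $30,000 per drug per day, of what schedule compression is worth:

```python
# Estimate development cost avoided by schedule compression,
# assuming the Tufts CSDD figure of $30,000 per drug per day.
DAILY_COST = 30_000  # USD per drug per day

def savings(days_saved: int, drugs_in_pipeline: int = 1) -> int:
    """Direct development cost avoided by shortening the timeline."""
    return DAILY_COST * days_saved * drugs_in_pipeline

# Shaving 90 days off one drug's timeline avoids $2.7 million in
# development cost alone -- before counting the extra months of
# patent-protected revenue gained at the other end.
print(savings(90))      # 2700000
print(savings(90, 5))   # across a five-drug pipeline: 13500000
```

The second figure illustrates why even modest workflow improvements compound quickly across a development portfolio.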

ARC Advisory Group ( www.arcgroup.com ) vice president, Richard Hill says, “One area where a positive time-to-market impact can be achieved is to improve the product introduction timeline. Producers need to address how potential products are identified and how processes are defined.”

To achieve Hill’s suggestion, life science companies are paying particular attention to improving efficiencies within the innovation/engineering value chain, including more aggressive use of available technologies and more efficient communications between R&D and production personnel.

Innovation/engineering

The innovation/engineering value chain begins with the discovery of a new molecule. Life science companies are investing significant resources to expand their knowledge of sciences, such as recombinant DNA and genomics.

In the life-sciences industry, information flow about a new product's process development generally follows the series of activities shown in the accompanying diagram.

To increase the discovery rate of new chemical entities, several life science companies have extended their R&D capabilities by establishing alliances with contract research companies like Avantium Technologies.

Avantium Technologies claims to have assembled the world’s largest catalyst library and, when combined with its proprietary high-throughput technology and advanced software, Avantium’s scientists can efficiently and quickly deliver rational designs, automated experimentations, data analysis, and simulation services to the life science and chemical industries.

To further assist clients, Avantium’s life sciences division recently added dedicated capabilities for screening class IV compounds for salt formation and polymorphic behavior, facilities few companies can afford to develop and support on their own.

Using such contract companies to supplement in-house R&D efforts does more than accelerate experimental throughput, it also accelerates the per experiment knowledge gains. Additionally, third-party R&D contractors can assist with process development and optimization, and provide accurate, concise, well-documented results, all aimed at helping reduce a product’s R&D-to-production transition.

Critical path

Following molecule discovery, the critical time-to-market path is typically product and process development. Parallel to the product development workflow—discovery; biological testing; preparation of investigational new drug application; clinical trials; and new drug application—is the process development workflow (see diagram).

Life science companies that have successfully unified and compressed process workflow report manufacturing savings of as much as 40% for processes optimized in a way that retains the same synthetic route. Manufacturing savings increase to as much as 65% for processes optimized for changing synthetic routes.

Such savings have prompted more life science companies to increase resources on streamlining time-to-market process workflow and the mechanics of knowledge/technology transfer.

Though terminology issues may exist between R&D and production personnel, the greatest challenges associated with knowledge/technology transfer include:

  • Ensuring consistent data, methods, models, and tools are used (such as calculation of density or heat capacity, or modeling of non-ideal azeotropic systems);

  • Guaranteeing a process can safely be scaled up, especially where solvents are involved;

  • Identifying the influences of heat, light, and/or physical agitation to product shelf-life;

  • Addressing equipment process actions (shake versus stir versus tumble, or determining appropriate milling speeds to produce the correct particle sizes);

  • Assessing the impact of batch size and time on product quality (time to heat or cool a laboratory or pilot-plant-sized product versus the time to heat or cool the product in a production-sized batch); and

  • Confirming that equipment capabilities are adequate (chiller size required to meet cooling requirements, or solvent recovery capacity).
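The batch-size challenge above has a well-known physical basis: under geometric similarity, jacket heat-transfer area grows more slowly than batch volume. A rough sketch of the resulting scale-up check, using hypothetical numbers for illustration:

```python
# Rough scale-up check for heat/cool time in a jacketed, geometrically
# similar stirred tank (all numbers hypothetical). Jacket area scales
# as V**(2/3), so for the same driving force and heat-transfer
# coefficient, the time to heat or cool a batch scales roughly as
# volume / area ~ V**(1/3).

def cool_time_scaleup(lab_time_min: float, lab_vol_l: float,
                      plant_vol_l: float) -> float:
    """Estimated plant-scale heat/cool time from a lab measurement."""
    return lab_time_min * (plant_vol_l / lab_vol_l) ** (1.0 / 3.0)

# A 20-minute cool-down in a 10 L lab reactor suggests roughly
# 20 * (10000/10)**(1/3) = 200 minutes in a 10,000 L production vessel
# -- a tenfold stretch that can change product quality.
print(round(cool_time_scaleup(20.0, 10.0, 10_000.0)))  # 200
```

A lab step that cools in minutes can take hours at production scale, which is exactly the kind of surprise that integrated knowledge transfer is meant to catch early.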

Compress workflow

Product development consists of unique and often isolated domains: asset planning; environment planning and reporting; scale-up; flexibility; technology communications; and operations.

Developing a more integrated means of achieving knowledge/technology transfer among these domains compresses workflow. However, success requires life science companies to overcome two major hurdles.

First, they must modify the ingrained culture that tends to reinforce the isolated domain development process.

Second, they must invest in and insist upon using well-designed and tightly integrated software technologies. These would include Aspen Technology’s Batch Plus or OSIsoft’s ProcessPoint software. These tools allow the development of a single repository for a company’s core intellectual property, eliminating time- and cost-intensive searches and duplications. Benefits of these tools and the single repository of knowledge begin at the laboratory stage and extend well into production.

In the laboratory , process chemists can use route screening and selection features to rapidly compare process alternatives, evaluate “make versus buy” decisions, and select synthetic routes for new products. During early stages of product development, when minimum process information is available, chemists can use embedded tools to conduct mass balances, obtain quick cost estimates, and use molecular structures in reactions for physical property estimation and detailed kinetic reaction modeling.
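The early mass-balance and cost-estimate calculations described above can be sketched in a few lines. The stoichiometry, yields, and prices below are invented purely for illustration:

```python
# Minimal sketch of an early-stage mass balance and raw-material cost
# estimate for comparing synthetic routes. All figures are illustrative.

def step_mass_balance(feed_kg: float, step_yield: float,
                      mass_ratio: float) -> float:
    """Product mass out of one synthetic step.
    mass_ratio = (product MW x stoichiometric coefficient) / feed MW."""
    return feed_kg * mass_ratio * step_yield

def route_cost_per_kg(feed_kg: float, feed_price_per_kg: float,
                      steps: list) -> float:
    """Raw-material cost per kg of final product over a route,
    where steps is a list of (yield, mass_ratio) tuples."""
    mass = feed_kg
    for y, r in steps:
        mass = step_mass_balance(mass, y, r)
    return feed_kg * feed_price_per_kg / mass

# Two-step route: 85% and 90% yields, mass ratios 1.2 and 0.8.
# 100 kg of $50/kg feed gives 73.44 kg product, about $68.08/kg.
print(round(route_cost_per_kg(100.0, 50.0, [(0.85, 1.2), (0.90, 0.8)]), 2))
```

Even a back-of-the-envelope model like this lets a chemist rank alternative routes before committing pilot-plant resources.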

During early process development , chemists and engineers can use equipment-independent recipe features to further refine the synthetic route provided by the laboratory staff and map conceptual processes to recipes. Robust workflow compression technologies also permit quicker evaluations of process alternatives, calculate material balances, estimate processing cycle times, and produce early budget estimates.
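The idea of an "equipment-independent" recipe can be made concrete: operations are specified against capability requirements, then mapped to whatever site equipment satisfies them. A hypothetical sketch (vessel tags and requirements are invented):

```python
# Hypothetical sketch of an equipment-independent recipe: each
# operation states what it needs, and a mapping step assigns it to
# the smallest adequate vessel at a given site.
from dataclasses import dataclass

@dataclass
class Operation:
    name: str
    min_volume_l: float
    needs_jacket: bool

@dataclass
class Vessel:
    tag: str
    volume_l: float
    jacketed: bool

def map_recipe(recipe: list, site: list) -> dict:
    """Assign each operation to the smallest adequate vessel on site."""
    assignment = {}
    for op in recipe:
        candidates = [v for v in site
                      if v.volume_l >= op.min_volume_l
                      and (v.jacketed or not op.needs_jacket)]
        if not candidates:
            raise ValueError(f"no vessel fits operation {op.name!r}")
        assignment[op.name] = min(candidates, key=lambda v: v.volume_l).tag
    return assignment

recipe = [Operation("react", 500, True), Operation("crystallize", 1000, True)]
site = [Vessel("R-101", 600, True), Vessel("R-102", 2000, True)]
print(map_recipe(recipe, site))  # {'react': 'R-101', 'crystallize': 'R-102'}
```

Because the recipe itself never names a vessel, the same definition can be re-mapped to a different site during "process fit" analysis.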

In the pilot plant , robust workflow integration technologies permit engineers and chemists to apply scale-up algorithms; perform “process fit” analyses (where a process is matched to the most suitable site); generate plant operating instructions and process flow diagrams; and perform advanced emission calculations from a single recipe.

At the process engineering stage, engineers can use workflow compression software to scale-up pilot plant successes and determine where to physically locate the new process. Software also can help engineers further refine economic analysis, and assist in determining shared resource peak-load requirements (such as utilities, waste treatment, raw materials, and labor), helping to avoid equipment and facility over-sizing and unnecessary capital expenditures.
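The peak-load question comes down to overlap: two batch steps drawing the same utility at the same time set the sizing requirement, not either step alone. A small sketch, with an invented schedule and duties:

```python
# Sketch of a peak-load check for a shared utility (e.g. chilled
# water) across overlapping batch steps. Schedule and duties are
# made up for illustration.

def peak_load(steps):
    """steps: list of (start_h, end_h, load_kw) tuples. Returns the
    peak simultaneous load by scanning start/end events in order."""
    events = []
    for start, end, load in steps:
        events.append((start, load))
        events.append((end, -load))
    events.sort()
    current = peak = 0.0
    for _, delta in events:
        current += delta
        peak = max(peak, current)
    return peak

# Three cooling steps; the first two overlap between hours 2 and 3,
# so the chiller must cover 40 + 60 = 100 kW, not the largest single
# duty of 60 kW.
print(peak_load([(0, 3, 40.0), (2, 5, 60.0), (6, 8, 50.0)]))  # 100.0
```

Sizing to the true peak rather than the sum of all duties is precisely how this kind of analysis avoids over-sized equipment and unnecessary capital spending.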

In production , software technologies help increase return on assets by helping identify bottlenecks and investigate capacity expansions. Also, simulation of single- and multi-product campaigns and in-plant emission profiles can be calculated and time- and size-limiting equipment explorations conducted.
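At its simplest, bottleneck identification means finding the stage whose cycle time limits the whole train. A toy sketch with invented step names and times:

```python
# Toy bottleneck check: for a multi-step batch train, throughput is
# set by the slowest stage's cycle time. Steps and times are invented.

def bottleneck(cycle_times: dict) -> tuple:
    """Return (stage, cycle_time_h) of the rate-limiting step."""
    stage = max(cycle_times, key=cycle_times.get)
    return stage, cycle_times[stage]

def batches_per_week(cycle_times: dict) -> float:
    """Maximum weekly batches, limited by the bottleneck stage."""
    _, t_hours = bottleneck(cycle_times)
    return 168.0 / t_hours  # hours in a week / limiting cycle time

times = {"charge": 2.0, "react": 12.0, "crystallize": 8.0, "dry": 10.0}
print(bottleneck(times))        # ('react', 12.0)
print(batches_per_week(times))  # 14.0
```

Once the limiting step is known, capacity-expansion studies can focus spending where it actually raises throughput.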

All in all, creating a more tightly integrated environment among development domains opens up significant opportunities to compress process workflow. Even when introduced later, these software technologies offer significantly improved workflow compression opportunities. Greater visibility across stages also facilitates revisions earlier in the process, when they’re less expensive.

Time is money

According to ARC’s Hill, process industries also could learn a lot from the semiconductor industry where new products are introduced every 12 to 18 months. “A couple of decades ago the semiconductor industry began developing and using standardized design tools and processes designed to move a product from design to production quickly and efficiently. As a result of concerted workflow compression activities, semiconductor producers are now able to begin producing a new product in weeks or days.”

Obviously significant differences exist between the semiconductor and process industries, but precedent exists for developing and using efficient ways for transferring knowledge/technology into production specifications.

Contract alliances can help. If outsourcing isn’t the answer, then thousands of in-house experiments will likely produce tens-of-thousands of data records that must be analyzed and synthesized, and various tools can help with that.

MATLAB from The MathWorks is a well-recognized development software application. Expanding into the life sciences area, The MathWorks recently introduced its Bioinformatics Toolbox for MATLAB.

Bioinformatics Toolbox offers computational molecular biologists and other research scientists an open and extensible environment for exploring ideas; prototyping new algorithms; and building drug research, genetic engineering, and other genomic and proteomic projects.

Once developed, MATLAB-based algorithms can be included in advanced process control applications, such as neural networks, and deployed into many of today's open control systems using C/C++, COM, and/or Microsoft Excel add-ins.

More than more data

Once a product reaches production, there's a dramatic increase in data sources and a greater need to analyze data for internal and external consumption, adding complexity to support and manufacturing operations. The U.S. Food and Drug Administration recognized related challenges in July 2002, when Janet Woodcock, director of the FDA's Center for Drug Evaluation and Research, said, "U.S. drug products are of high quality, but there is an increasing trend towards manufacturing problems resulting in recalls, disruption of operations, and drug shortages."

In many companies, the data needed to ensure compliance and identify operational streamlining opportunities exist in many different systems, dispersed across the company’s network. Repositories of product-related yield, quality, who, when, why, and how data include systems for:

  • Laboratory information management;

  • Supervisory control and distributed control;

  • Manufacturing execution;

  • Batch record; and

  • Enterprise operations.

Companies best able to manage data, the ones able to quickly, accurately, and efficiently analyze it, avoid drug shortages and are least impacted by recalls.

Working in cooperation with the FDA’s Process Analytical Technology (PAT) subcommittee, Aegis Analytical is among companies helping examine why problems identified by Woodcock occur and how new and existing products can improve the production process and product quality.

Aegis Analytical’s Discoverant is process-centric software that uses point-and-click conventions to access read-only data from multiple repositories, then displays the results in a unified view.

High-performance analysis tools, such as Discoverant, provide users with easy to use methods to plot and animate data in control charts, histograms, Pareto charts, Pearson correlations, multivariate techniques, and more.

However, seeing data is only the first step. Robust data analysis software should also:

  • Test each measured value against approved ranges;

  • Provide interactive rules, and setting of actions and control limits;

  • Monitor production and automatically reveal process trends;

  • Facilitate preparation of mandatory annual product reviews and identify root causes of atypical performance, even when interacting parameters are involved; and

  • Assist production to maintain process control and provide appropriate audit documentation.
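The first two requirements above, testing measured values against approved ranges and flagging points outside control limits, can be sketched in a few lines. Limits and data below are illustrative, not drawn from any product:

```python
# Minimal sketch of range testing and Shewhart-style control limits
# for a batch quality parameter. All numbers are illustrative.
from statistics import mean, stdev

def out_of_spec(values, low, high):
    """Indices of values outside the approved range."""
    return [i for i, v in enumerate(values) if not (low <= v <= high)]

def control_limits(values, n_sigma=3.0):
    """Lower and upper control limits from historical data."""
    m, s = mean(values), stdev(values)
    return m - n_sigma * s, m + n_sigma * s

yields = [91.2, 90.8, 91.5, 90.9, 95.0, 91.1]  # batch yields, %
print(out_of_spec(yields, 89.0, 93.0))   # [4] -- batch 5 is out of range
lcl, ucl = control_limits(yields[:4])    # limits from the first 4 batches
print(yields[4] > ucl)                   # True -- flag for investigation
```

In a real system, a flagged batch would trigger the root-cause and audit-documentation steps the list goes on to describe.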

Alliances, methods, and technologies can help the transfer of innovation and compress the workflow of batch processes, but adoption remains somewhat spotty, mostly confined to pilot-plant and production areas. No doubt these adoptions have reaped, and will continue to reap, benefits; however, significant additional benefits remain dormant in the upstream discovery and development phases.

Companies that can overcome cultural resistance to change and remove barriers to innovation transfer will join those already enjoying 40-65% gains in process development optimization.

Online extra

Tools help with material properties, simulation, analysis, reports

Additional software technologies available to help the transfer of innovation and compress workflow include:
From Aspen Technology:

  • Aspen Properties, a thermophysical properties calculation system; and

  • Process Manual and Tools, a Web-based knowledge base for hard-to-simulate solids processing/handling and separation processes.

From OSIsoft:

  • BatchView toolset monitors and analyzes multiple batches or events;

  • ProcessTemplates measure key performance indicators over repeatable time segments and use the information to create templates of expected behavior and to secure data from recurring processes; and

  • RtReports support current Good Manufacturing Practices by providing simplified 21 CFR Part 11 compliant reporting using secure access to process data and by managing changes to reporting rules.

Process data delivers wisdom for capital spending

Which is the most cost-effective solution when faced with:

  • Sold-out production capacity;

  • Marketing estimates that indicate a need to double production in the next two years; and

  • Marketing estimates that indicate a need to double production again in the following two years?

Should you invest in a new plant or invest in "de-bottlenecking," revamping, and optimizing the current process? That was the decision a major fine chemical producer faced. The company assembled in-house product and process expertise and applied Aspen Technology's Batch Plus software. After Batch Plus was configured with current product and process data, including shared utilities, several "what-if" scenarios were developed.

The final recommendation was to add four new reactors and revamp the piping to several existing reactors.

Twelve months after completing the additions and revamp, the company experienced a 30% decrease in manufacturing costs and a 130% increase in plant throughput.

Too frequently, process companies assume current production operations are optimized and thus make unnecessary investments. This experience illustrates the benefit of assembling in-house expertise and using robust process modeling software to ensure monies are wisely invested.