Balancing Act
Collaboration reduces risk in quality data management. Quality has long been an initiative, a goal and—especially in regulated industries—a requirement for staying in business. Today’s quality professionals are pushing the envelope to take quality to new levels and make it a way of life for their organizations.
Risk reduction is a driver for quality improvements across corporate and global boundaries. Standards, such as ISO 9001, have been widely adopted to minimize risks to consumers. The ISO 9000 series of standards has been interpreted to fit specific requirements in various industries, from automotive and aerospace manufacturers to telecommunications and software providers.
These standards and methods are in the early stages of being adopted in the pharmaceutical industry as part of the implementation of quality by design (QbD). In parallel with implementing risk-reducing quality improvements, it is also important to undertake a technical evaluation program of product and process improvements to ensure they remain in place and that new risks do not unexpectedly arise.
With common methods and standards in place, manufacturing organizations also share a daunting challenge: an increased volume of electronic and paper-based data collected during process development and manufacturing. Overwhelming amounts of data stored in disparate systems result in the inability to easily access pertinent data for quality investigations. Quality teams need easy ways to get accurate views of product quality, and better manage the time and resources required to produce standard quality reports.
Global pharmaceutical manufacturers often point to data silos such as corrective and preventive action (CAPA) systems, laboratory information management systems (LIMS), enterprise resource planning, manufacturing execution systems, data historians and paper records. When quality and manufacturing teams investigate a problem using data from multiple siloed systems, the result is often spreadsheet madness, leaving individuals to sift through piles of out-of-context data that may not even pinpoint the problem.
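A minimal sketch of what that aggregation can look like, assuming each silo can be exported or queried and that a batch ID is the shared key. The system extracts, column names and values below are hypothetical, not drawn from any specific product:

```python
# Minimal sketch: joining records from siloed systems on a shared batch ID so an
# investigation starts from one contextualized table instead of separate
# spreadsheets. All extracts, columns and values are illustrative.
import pandas as pd

# Hypothetical extracts from each silo (in practice, exports or database queries)
mes = pd.DataFrame({
    "batch_id": ["B001", "B002"],
    "product": ["Product A", "Product A"],
    "site": ["Plant 1", "Plant 2"],
})
lims = pd.DataFrame({
    "batch_id": ["B001", "B001", "B002"],
    "test": ["assay", "pH", "assay"],
    "result": [99.2, 7.1, 101.5],
})
capa = pd.DataFrame({
    "batch_id": ["B002"],
    "capa_id": ["CAPA-17"],
    "status": ["open"],
})

# One process-centric view keyed by batch, spanning manufacturing, lab and CAPA data
unified = (
    mes.merge(lims, on="batch_id", how="left")
       .merge(capa, on="batch_id", how="left")
)
print(unified)
```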
An added challenge is the lack of collaboration among most manufacturing and quality teams when it comes to data management. The sidebar, “Signs of Trouble,” describes four additional indicators that suggest an organization’s quality data management system needs improving.
Signs of Trouble
There are four telltale signs an organization’s quality data management approach needs a tune-up:
- Quality teams are manually gathering data from static data sets stored in spreadsheets to generate quality reports.
- Quality teams are spending significant hours standardizing and verifying data stored in various systems.
- Source system owners do not allow direct access to source databases, resulting in long delays or incomplete reports.
- The quality data maintained in spreadsheets do not have documented, traceable sources.
Establishing shared goals and a mutual understanding of the problem can help overcome the practical difficulties of accessing process development and manufacturing data from separate databases and paper records. When collaboration occurs, significant paybacks result from connecting to data in real time for trending analysis.
Life science drivers
The life sciences industry serves as an interesting example because it is relatively behind the curve when it comes to reducing risk through the management of quality data. According to Pharmaceutical Technology’s Equipment and Processing Report: “The industry’s approach to risk is more sophisticated now, but still has room for improvement. Drug companies sometimes lack certain capabilities that could help enhance product and process knowledge.” Other industries, such as food and beverage, and specialty chemicals, are further along and operate with smaller profit margins.
Imagine a continuum of manufactured goods (Figure 1). One end includes products that are still saleable when variance occurs. These items can be repurposed for an identified market when quality lies outside an initial acceptance range (for example, textiles sold to discount retailers). The other end holds process manufacturing-based products that, for safety and legal reasons, cannot be sold when variance exists.
Until recently, larger margins on products that made it to pharmacy shelves compensated for large numbers of lost batches. But recent market forces have created pressure for reduced pricing, driving the need for life sciences manufacturers to invest in measures to reduce process variability, increase yield, and improve quality and robustness.
Economic factors, such as fewer blockbuster drug pipelines, loss of patents and outsourcing, are also driving the life sciences industry’s need to control variability to reduce manufacturing risks and supply chain costs. Regulators are helping with recommendations, such as QbD, that present an opportunity to improve manufacturing processes based on new real-time process measurements and by taking advantage of previous experiences with similar processes.
A decade ago, process analytical technology was introduced by the U.S. Food and Drug Administration (FDA) as a mechanism for improving real-time process control and, later, for achieving QbD in pharmaceutical and biotechnology processes. Since then, the volume of electronic and paper-based data collected during process development and manufacturing has increased dramatically.
With risk reduction and improved product quality in mind, these efforts have helped make QbD a way of life for the life sciences industry. As a result, more of the processes that make life sciences products have quality designed into them, rather than relying solely on testing the product at the end of the production line.
Regulators encourage collaboration among manufacturing and quality teams with guidelines that integrate the two areas. In its 2008 Q10 document, the International Conference on Harmonization of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH) said, “ICH Q10 demonstrates industry and regulatory authorities’ support of an effective pharmaceutical quality system to enhance the quality and availability of medicines around the world in the interest of public health.
Implementation of ICH Q10 throughout the product life cycle should facilitate innovation and continual improvement and strengthen the link between pharmaceutical development and manufacturing activities.”2
ICH Q10 outlines a quality management system model—the Pharmaceutical Quality System—and states that “implementation of the Q10 model should result in achievement of three main objectives which complement or enhance regional good manufacturing practice requirements”:
- Achieve product realization: supports operation of processes that meet appropriate quality attributes.
- Establish and maintain a state of control: enables risk-based monitoring and control systems for process performance and product quality, providing assurance of continued suitability (for example, the ability for the process to continue producing the intended outcomes) and capability of processes.
- Facilitate continual improvement: strengthens the knowledge or data link between development, manufacturing and quality.
ICH Q10 also facilitates a life-cycle approach to quality management and risk reduction. “It was decided to go to ICH Q10 to bring the ICH vision to fruition over the product life cycle,” explained Joseph Famulare, former deputy director of the Office of Compliance in the FDA’s Center for Drug Evaluation and Research. “The result is that firms that employ QbD and use risk-management principles (ICH Q8 and ICH Q9) can use a robust quality system to manage change and continual improvement, and this would lessen the need for post-approval (regulatory) filings, putting quality more in the hands of the manufacturer,” he said.4
Intersection of quality and manufacturing
When manufacturing organizations implement quality improvement initiatives, quality and manufacturing teams can interact in new ways to achieve shared goals. Today, most quality systems are designed to serve the needs of the quality team, such as verifying assessments and publishing results. By design, LIMS alone are not focused on making a manufacturing process more successful.
Collaboration in the name of quality data management requires a paradigm shift that organizes data around the manufacturing process and resulting product control strategy, rather than around samples, control of analytical workflows, control of standards and analysts—the current organizing principles on which LIMS are based.
For example, results may be posted and verified about the selenium content of a particular sample, but why would someone want to see the data for an individual sample? Ultimately, the goal is to better control the process that produces the samples, so there needs to be a combined or unified process-centric view of product development, quality and manufacturing across the IT systems. This shift in thinking must be made attractive to team members who may be resistant to it, and championed by those who value a shared purpose between quality and manufacturing teams.
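As an illustration of that shift, here is a minimal sketch that rolls individual sample results up into a process-step-by-batch view so the team trends the process rather than inspecting single samples. The column names and selenium values are hypothetical:

```python
# Minimal sketch: pivoting sample-level results (one row per sample) into a
# batch- and process-step-centric view. Columns and values are illustrative.
import pandas as pd

samples = pd.DataFrame({
    "batch_id":     ["B001", "B001", "B002", "B002", "B003", "B003"],
    "process_step": ["blend", "dry",  "blend", "dry",  "blend", "dry"],
    "analyte":      ["selenium"] * 6,
    "result_ppm":   [0.42, 0.40, 0.47, 0.45, 0.39, 0.41],
})

# Batch-level mean per process step: the object of interest becomes the process,
# not the individual sample.
process_view = (
    samples.groupby(["process_step", "batch_id"])["result_ppm"]
           .mean()
           .unstack("batch_id")
)
print(process_view)  # one row per process step, one column per batch
```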
As another example, annual product reviews (APR) typically are quality’s responsibility, and they are organized around individual drugs. In a more collaborative environment, the APR process itself can be the organizing principle for data. Information about specification failures and process deviations is shared among the combined process development, quality and manufacturing teams.
A quality team might see trend charts and ask how the problem will be solved. Too often, the next step before corrective action can be taken is to dig through data from disparate sources and reorganize it using error-prone manual methods.
An added complexity is that different quality teams often speak different data languages, shaped by their unique needs, which makes it impractical to organize the data in a single way that satisfies everyone. For example, one team may use deviations as the driver for batch release while another uses control charts as the driver for change control.
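A small sketch of those two languages side by side: deviations are judged against registered specification limits for batch release, while change control watches statistical control limits (here, an individuals chart with limits estimated from the moving range). The specification limits and assay values are made up for illustration:

```python
# Minimal sketch contrasting specification-based deviations (batch release) with
# statistical control limits (change control). Limits and data are illustrative.
import statistics

assay_results = [99.1, 100.4, 98.7, 101.2, 99.8, 97.9, 100.9, 102.6]  # % label claim
spec_low, spec_high = 95.0, 105.0  # illustrative registered specification limits

# Language 1: deviations -- any result outside specification triggers a deviation
deviations = [x for x in assay_results if not (spec_low <= x <= spec_high)]

# Language 2: control chart -- individuals chart with 3-sigma limits estimated
# from the average moving range (d2 = 1.128 for a moving range of 2)
moving_ranges = [abs(b - a) for a, b in zip(assay_results, assay_results[1:])]
center = statistics.mean(assay_results)
sigma_hat = statistics.mean(moving_ranges) / 1.128
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

out_of_control = [x for x in assay_results if x > ucl or x < lcl]
print(f"Deviations vs. spec: {deviations}")
print(f"Control limits: {lcl:.2f} to {ucl:.2f}; out-of-control points: {out_of_control}")
```

The same batch can look acceptable in one language (no deviations) and worth investigating in the other (out-of-control points), which is why a shared, process-centric data view matters.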
The increased use of contract manufacturing organizations (CMO) on a global scale (up to 40% of drugs Americans take are currently manufactured outside the United States) increases the challenge of achieving this unified, process-centric view across all aspects of the organization.5 But in these situations, it is even more critical that all aspects of sponsor and contract or supplier organizations are aligned on the strategy and tactics for ensuring ongoing product quality.
Quality cannot execute on its own; it needs the ability to collaborate in a world where teams share a common language (data and facts in context) organized around a process. For example, the temperature might be X, but investigators also might need to know the characteristics of the previous process step or input material before they can recommend appropriate action.
Critical success factors include providing end users with on-demand access to data in a collaborative analytics, trending and reporting environment in which the multidisciplinary team can work together productively.
The most important requirements for this system are:
- A simple user-centric interface for direct, on-demand access to all the data from disparate sources.
- A practical way to capture paper-based data and make it easily available in electronic form.
- The ability to work with continuous and discrete data together, sharing data, analysis results and reports across disciplines, scales of operation and geographically dispersed sites.
- Simplified preparation and distribution of analysis results.
- Automated generation of periodic reviews and reports of batches and campaigns.
The ICH Q10 process performance and product quality monitoring system section says: “Pharmaceutical companies should plan and execute a system for the monitoring of process performance and product quality to ensure a state of control is maintained. An effective monitoring system provides assurance of the continued capability of processes and controls to produce a product of desired quality and to identify areas for continual improvement.”6
The right tools, such as data aggregation, analysis, contextualization and reporting, will work with existing quality systems by adding a layer on top to span quality and manufacturing systems and provide access to the necessary data.
In October 2010, the International Society for Pharmaceutical Engineering (ISPE) released “Overview of Product Design, Development, and Realization: A Science- and Risk-Based Approach to Implementation.” According to PharmaQbD.com, “ISPE’s product quality life cycle implementation initiative has been one of the leading efforts to translate ICH guidance on science and risk-based manufacturing into more practical language and to outline how Q8, Q9 and Q10 might be applied by industry.”
Meanwhile, the American Society for Testing and Materials is developing standards for industry that provide practical approaches to implementing ICH Q8-10 and the FDA’s process validation guidance.
Technology tools (such as software) used for data analytics should work with quality assurance data systems to enable and support ICH Q10 product life cycle management goals in the following areas:
Global. Provide a Part 11-compliant (21 CFR Part 11 is the Code of Federal Regulations requirement for the integrity and security of electronic records), audit-trailed capability to capture paper-based data in electronic form in a network-accessible database available to all users; enable centralized experts to prepare and distribute outputs for widespread use and updating by globalized teams of diverse, nonstatistical users; and provide a collaborative data access and analytics platform that spans geographic and organizational boundaries (between sponsors and CMOs) so teams with shared concerns can achieve their joint goals.
Process performance and quality monitoring. Provide observational users an updated portal or dashboard to display status data anywhere, anytime. Simplify and reduce the cost of periodic quality review, APR and production report preparation and distribution; provide views of test method processes to study and reduce test method variability; provide a collaborative platform for comparison of internal and outsourced operations and quality data.
CAPA. Provide a link between customer-complaint data and product testing and production data to understand cause-and-effect relationships; enable a link between process development and manufacturing sample testing to make comparisons; enable testing and implementation of process improvements.
Change management. Provide a platform for immediate access, organization and flow of data for internal communications.
Management review. Provide management users an updated portal or dashboard to display status data anywhere, anytime; provide easy access to and organization of data for timely preparation of management reports; provide a collaborative platform to enable staff to implement management actions.
Tips for success
Based on first-hand practical experience, the top five tips for change management with quality and manufacturing teams are:
Minimize time requirements. No one resists the concept of quality trend monitoring, but some fear the perceived effort and time it might take. It is important to reserve the time needed for initial benchmarking and conduct ongoing collaborative data analysis.
Centralize. Having a central project group to relieve the workload and evangelize the approach can drive success. It can help if a central group is charged with doing baseline analysis to minimize the burden placed on other groups.
Train. Provide basic statistical training across groups to ensure uniform capabilities. Guidance documents assist in eliminating subjectivity that might affect results.
Communicate. Teams must meet regularly to review data so quality, plant and manufacturing operations groups can communicate consistently. There may be initial resistance to global visibility of data when producing the same product at multiple sites and contract sites, but the payoff is a more thorough and routine examination of variation and a consequent reduction in variability across sites.
Automate. The right technology tools enable routine monitoring without interruption. Manual data must be incorporated using automation and technology that minimizes error. Automating data access, aggregation and contextualization (organizing related data into groups or “batches” in preparation for analysis) is a massive time-saver; a sketch of this step follows. While monthly analysis is a minimum requirement to ensure trends in high-volume products are identified and controlled, technology tools make daily updates a practical option because they reduce the cost and effort required.
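One way such contextualization can be automated is sketched below: continuous readings are tagged with the batch that was running when each value was recorded, after which batch-level summaries become a one-line operation. The timestamps, tags and batch windows are hypothetical:

```python
# Minimal sketch of automated contextualization: assigning continuous historian
# readings to the batch whose time window contains them, then summarizing by
# batch. Timestamps, tags and batch windows are illustrative.
import pandas as pd

readings = pd.DataFrame({
    "timestamp": pd.to_datetime(
        ["2024-01-01 01:00", "2024-01-01 05:00", "2024-01-01 09:00", "2024-01-01 13:00"]),
    "temperature_c": [37.1, 37.4, 36.8, 37.0],
})
batches = pd.DataFrame({
    "batch_id": ["B001", "B002"],
    "start": pd.to_datetime(["2024-01-01 00:00", "2024-01-01 08:00"]),
    "end":   pd.to_datetime(["2024-01-01 08:00", "2024-01-01 16:00"]),
})

# Match each reading with the most recent batch start, then keep only readings
# that fall inside that batch's window
contextualized = pd.merge_asof(
    readings.sort_values("timestamp"),
    batches.sort_values("start"),
    left_on="timestamp", right_on="start",
)
contextualized = contextualized[contextualized["timestamp"] <= contextualized["end"]]

# Batch-level summaries are now a one-liner instead of a manual exercise
print(contextualized.groupby("batch_id")["temperature_c"].agg(["mean", "min", "max"]))
```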
Quality and manufacturing teams can apply these lessons for better quality data management. Ultimately, life sciences companies can use up-front analysis to surface continuous improvement opportunities by identifying the products with the least acceptable outcomes (for example, high variability in pH values), then triage that list to set goals and priorities for the manufacturing and quality organizations.
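A minimal sketch of that triage, assuming batch-level pH results have already been aggregated as described earlier; the products and values are hypothetical:

```python
# Minimal sketch: ranking products by pH variability so the most variable ones
# can be triaged into improvement projects first. Data are illustrative.
import pandas as pd

ph_results = pd.DataFrame({
    "product": ["A", "A", "A", "B", "B", "B", "C", "C", "C"],
    "ph":      [6.9, 7.1, 7.0, 6.5, 7.4, 6.8, 7.0, 7.0, 7.1],
})

# Standard deviation of pH per product, most variable (least acceptable) first
triage = (
    ph_results.groupby("product")["ph"]
              .std()
              .sort_values(ascending=False)
              .rename("ph_std_dev")
)
print(triage)  # products at the top are candidates for improvement projects
```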
New products, as well as additional manufacturing sites, can be brought into the quality data trend monitoring program. The same approach can be used to develop joint goals and implement continuous improvement activities between sponsor companies and CMOs.
Quality can truly be a way of life with the right management structures and technology. A collaborative approach that meets ICH Q10 guidelines and adds a data access contextualization layer over existing technology systems can improve overall quality data management.
Spreadsheets and paper records will never be completely eliminated, but the right technology system helps significantly reduce the errors and risks associated with these components. It enables companies to leverage existing IT systems in a way that invites management, IT, quality and manufacturing to support important risk mitigation initiatives.
Reference: ASQ