The Boston Area Chapter educational program entitled “Cloud Computing: New Challenges in Data Integrity and Security” was held on Thursday, November 13 at Cubist Pharmaceuticals in Lexington with a simulcast planned for the Crowne Plaza in Providence, Rhode Island. This topic attracted over 60 attendees in Lexington and another 25 attendees in Rhode Island, including a number of nonmembers whom we may have convinced to become ISPE Members.
Attendees chatted with peers while enjoying a light dinner at the opening reception.
(The Chapter extends its apologies to the Rhode Island attendees for the technical issues which could not be resolved that night. Plans are underway to make a video of the presentation available for these attendees. For more information, contact the Chapter office at firstname.lastname@example.org .)
Following an excellent networking reception with a very impressive spread supplied through Cubist’s Cubistro, the program began with opening remarks by Boston Area Chapter President Christopher Opolski who welcomed the guests to Cubist and thanked our generous host for providing a first-class venue. He also made sure to promote a number of upcoming Chapter events including the “Future Trends” educational program on January 15 and the New Year's Social on January 22 at Flat Top Johnny's in Cambridge.
Binesh Prabhakar, founding partner and manager of Cambridge IT Compliance, was then introduced as the program manager and moderator for the panel discussion. Binesh introduced the four panelists and thanked them for providing their insights on cloud hosting technologies in a regulated environment. Each panelist has in-depth, real-world experience in leveraging, deploying or validating cloud-hosted solutions, and each opened the discussion with a short series of slides designed to inspire questions from the audience.
An extended Q&A session allowed attendees to get one-on-one feedback from the panelists.
First up was Tracy Lampula, Associate Director of GIS Compliance, Vertex Pharmaceuticals. Tracy has been deploying GxP IT solutions for a number of years and has experience leveraging cloud-based solutions for Vertex. She believes the short evolutionary step from application provider, through shared resources and "security as a service," to the cloud should not deter regulated companies from adopting these new services, provided they make a risk-based assessment and manage those risks as they would for any new deployment model.
Tracy went on to say that cloud models can offer great benefits over traditional deployment models, including an agile response to changing requirements, exceptional disaster recovery and redundancy (something traditional deployments struggle to provide both practically and with financial justification) and excellent elasticity, which can be critical for high-powered research computing.
But she warned that companies should not be complacent about the cost: cloud is rarely a cost-effective solution when you need compliance, validation and security to be guaranteed. Vendor management and setting expectations early will ensure you are not paying for controls you don't need while making sure all your needs really are met, including, "How do I get my data back should I need it or want to change my cloud vendor?" Testing a cloud vendor with smaller, less critical projects is a good way to gauge accurately what they really can provide.
William Sanborn, Director of Information Technology, LFB-USA, followed Tracy's talk. He outlined the layers of the "cloud technology stack" - the client (normally a browser), the application layer, the platform layer and the infrastructure layer - then walked us through how these layers have spawned the IaaS, PaaS and SaaS terms for services that cloud vendors can provide. Despite Tracy's practical advice on the relative costs, William pointed out that these services were born of the goal of reducing costs to the end-user community, enabling them to focus on their core business.
Next up, Robert Streit, Sr. Manager, Q&C Architecture and Design for IT Systems, Johnson & Johnson, covered the particular concerns involved with hosting an application containing data regulated by health authorities. Bob was specifically asked to join our panel as a representative of the ISPE GAMP Special Interest Group on cloud computing. His opening slide addressed the key question in the minds of audience members, "If you build a virtual environment in your own data center, FDA expects validation and qualification of the system. How does this expectation change by putting that system in somebody else's building?”
His answer was that a cloud-based GxP system requires exactly the same validation approach (such as that found in GAMP 5) and validation documentation as any other internal system. Care should be taken, however, to consider the risks and risk mitigation with the cloud architecture in mind, paying particular attention to roles, responsibilities and where critical validation documents are managed. Key is the vendor's understanding of life science predicate rules and the expected qualification and validation efforts, specifically backup and restore expectations.
And what other "services" are they including in the package you are paying for? When you are using an application as a service, not only your specific data but the entire application and configuration requires protection.
Finally, Robert Wherry, CPIP, Principal Consultant, Strategic Compliance Services, PAREXEL International, turned our attention to the current challenge of demonstrating data integrity to regulators, ensuring that they trust that data created or managed in the cloud is as truthful as any other data produced. Having a third party manage security and access introduces risks to the integrity of that data; the tricky part is assessing how well the vendor is managing those risks.
For a normal third-party arrangement, a key aspect of supplier assessment is conducting on-site audits of their procedures and processes, but cloud vendors rarely permit that level of access. And what if a regulator insists on that same level of access? This is where a company can use their validation of the application and their data review to show that the data is secure. Just as you would run validation testing for critical data integrity processes (such as security, audit trails and backup and restore) in your own hosted solutions, you should do exactly the same for a vendor-hosted solution or have your vendor do that for you.
Equally, data integrity can be assured much more easily if you can demonstrate control over the environment and manage change appropriately. Ensure you have purchased, and actively manage, test/development and validation environments as well as production environments, especially for cloud-based applications.
How do companies prepare for a detailed data integrity focused audit when the application is in the cloud? The key is to know and understand what causes regulators to lose trust in the company’s data and make sure to address any new risks that the cloud deployment might be introducing. Remember, the FDA itself stores critical records in the cloud!
A lively and extended question and answer session followed as the audience raised their specific concerns about moving regulated life science data into a cloud environment.
How to get quality agreements set up that include details of how the vendor would implement software and hardware changes? What kinds of notification should a company expect (ideally, before the changes occur and in time to evaluate those changes)?
How to prepare for a regulator who may insist on visiting a cloud provider?
What kind of cloud vendors are out there? Is it better to look to a large provider with broad experience across many different applications; or focus on smaller, niche companies who have specific experience hosting life science applications for pharmaceutical or medical device companies?
What about access to the backups of critical data? Those also contain proprietary information. Where are they kept? How secure are they and who "owns" the information the backups contain? Most cloud vendors have no interest in having responsibility for your data, so perhaps sending you the backups to store locally is a good solution especially since then you do not need to pay the vendor to store the backups as well as the live data.
When a hosted solution is solely for security verification, or when security is part of the hosted solution, who is responsible for updates such as removing access for former users?
When is it cost effective to use cloud services? Is there a “sweet spot?” Should we avoid use of the cloud for regulated data and simply use it for non-regulated data?
What about the security of the vendor’s data centers? Even though the cloud is private and secure, there is still physical security to consider.
And speaking of physical security, where IS my data? For most regulated pharmaceutical data, it seems key to ensure that the data stays in the US for a US-based company. This ensures compliance with local laws and regulations. Data that resides outside the US could become subject to the laws and regulations of the hosting country which would seriously add to costs.
Bob Streit had talked about certifications for vendors, but what does that mean? In reality, the exact certifications may not guarantee your data is protected but the more quality processes the company has in place that have been verified by a third party, the more confidence you can have that they have the internal quality knowledge and procedures that should protect your data.
What is a reasonable time to allow in an SLA for retrieval of data if, for example, you wanted to end an agreement and search for an alternative cloud hosting service? This could be a critical aspect when the original vendor holds your data. Accessibility to the data would be simpler for an IaaS or PaaS hosting model than for a SaaS service, where you might get your data back but be unable to read it without the original application. Perhaps spinning it up into your own internal virtual environment would give you time to find a new hosting vendor.
Just as with internal computerized systems, validation and qualification based on risk is the key. A companywide policy for data classification should determine and demonstrate what is permitted to be stored in the cloud and what should never go into a hosted virtual environment. This should include instructing employees (as well as vendors and subcontractors) that such data should never be kept, even unofficially, on any cloud service, including the likes of Google Drive and Dropbox.
Further discussion arose about whether cloud-hosted applications should be considered "open" or "closed" in the context of 21 CFR Part 11, with some difference of opinion amongst the attendees. All agreed, however, that transmission of data should use, at a minimum, a VPN link, and that encrypting transmitted data to prevent interception might also be a reasonable mitigation.
The evening’s program, as a whole, was a resounding success, delivering quality information on a topic of great interest and relevance to Chapter Members. The Boston Area Chapter and Program Managers Binesh Prabhakar, Christopher Ciampa and Heather Longden would like to thank the panelists and audience members for their valuable contributions to this program and Cubist for providing the venue and catering for the event.
The ISPE Boston Area Chapter educational program entitled “Lean Six Sigma - Theory, Applications and Lessons Learned in the Biopharma Industry” was held Thursday, December 11 at Lantheus Medical Imaging headquarters in North Billerica and was offered via simulcast at the Crowne Plaza Providence in Warwick, RI. The goal of this program was to present an overview of the Lean Six Sigma tools and techniques and illustrate how they can be successfully applied to biopharmaceutical operations using actual case studies.
The opening networking reception gave attendees the opportunity to introduce themselves and interact with the speakers while enjoying a light dinner. The program began with opening remarks by Boston Area Chapter President Christopher Opolski who welcomed the attendees, followed by Lantheus Manufacturing and Operations Vice President William Dawes who gave a brief presentation about Lantheus and its product portfolio. After that, Meeting Manager and Lantheus Validation Specialist Juan Espinal introduced the presentations and the speakers.
Speakers (l to r) Rui Coelho, Dan Fleming and Niranjan Kulkarni joined forces
The first speaker, Dan Fleming, Continuous Improvement Manager, GBMP, gave an introduction explaining how Lean connects the company, processes, employees and customers. He emphasized how the use of Lean tools like JIT (just in time), Jidoka and standardization can improve product quality and reduce cost and delivery time. Dan mentioned that Lean focuses 10 percent on techniques and 90 percent on people and emphasized that management support plays a big role in its successful implementation.
Next up was Rui Coelho, Associate Director, Operational Excellence, Biogen Idec, who has over 25 years of management experience applying various product and process improvement approaches. Rui presented on several Lean case studies including one at Biogen called the grass roots improvement process (GRIP) which involved training the workforce in Lean methodologies then empowering them to apply the tools. Tools used included 5S, standard work, visual management, employee idea submission/implementation, and Kanban. The results included gains in efficiency and cost savings. He also presented a lab case study where the use of Kanban produced significant savings.
Following the introduction to Lean presented by Dan and Rui, Niranjan Kulkarni, PhD, Sr. Operations Specialist at CRB, covered the Six Sigma part of the program. Niranjan began with an introduction to Six Sigma describing its philosophy, methodology, metrics and tools and presented several biopharm case studies where applying the Six Sigma model resulted in benefits such as a 35 percent reduction in changeover times, a 60 percent increase in manufacturing throughput (i.e., number of batches) and a 43 percent increase in QC lab throughput (i.e., number of tests). He pointed out that in spite of successes such as these, Six Sigma cannot be used for everything and should only be applied in the areas where it is well suited.
The evening’s three speakers did a great job of explaining how the industry can benefit from the intersection of these powerful methodologies in combination with employee engagement. Lean and Six Sigma are not competing approaches. Instead they are complementary and together can have a significant positive impact on efficiency and indirectly on the patients who rely on the industry and its products. Selecting the right tools and techniques for solving the problem at hand is very important. However, even more critical for a successful Lean Six Sigma journey is management support combined with employee involvement and cultural change.
Following the presentations, the audience demonstrated their interest and enthusiasm by asking many fantastic questions including:
“How do you manage project overload?” Utilize Policy Deployment and set up a PMO.
“How do I get started?” Get educated through books, webinars, workshops and DVDs but, most important, find an expert in the field who can recommend the best approach and provide coaching while you “learn by doing.”
The Boston Area Chapter and Program Managers Juan Espinal and John Sheridan would like to thank the speakers and audience members for their valuable contributions to this program and Lantheus Medical Imaging for providing the venue and great preparation for the event.
On December 4, the Boston Area Chapter Young Professionals hosted a social event at Night Shift Brewery in Everett. Nearly 40 people representing some 34 companies and universities turned out for an engaging evening of food, networking and craft beer.
An evening filled with fun, laughter, and networking greeted the Chapter's Young Professionals at the Night Shift Brewery.
Night Shift Brewery was started in 2012 by a group of friends who began brewing craft beers in a 5-gallon pot in their apartment. They experimented with unorthodox brewing ingredients and yeasts, creating some distinctive and flavorful beers. After some success in brewing competitions, the friends decided to commercialize their products and have had great success providing the greater Boston area with unique beers.
The company moved to a new Everett location within the past year. We were fortunate enough to get an up-close tour of the brewing facility where all of Night Shift’s beers are brewed, fermented and bottled. We saw the vessels where different types of malts are crushed and mixed with hot water to release the fermentable sugars within. The resulting liquid, known as wort, is separated and combined with hops prior to fermentation in large airtight tanks. Once fermentation is complete, the beer is aged and bottled by hand on site.
Each attendee got the chance to sample a handful of the seven beers on tap before choosing their favorite to enjoy with the food provided, a very hearty spread including chicken Caesar salad wraps, pasta salad, and buffalo chicken fingers. The event spanned two rooms: the taproom, with the bar and sitting area, and the barrel room, where you can see dozens of barrels of aging beers as well as play a few rounds of Cornhole!
The evening was filled with fun, laughter and networking. Overall the event was a big success, offering many opportunities to meet new people and build connections, and we look forward to holding similar events at other Boston-area microbreweries in the future!
Page last updated: 7 January 2015