
BioProcess Technical

Approaches to Debottlenecking and Process Optimization

Rick Johnston

Two major challenges associated with optimizing biomanufacturing operations remain unresolved. The first is variability: how to understand and improve manufacturing with significant variation in process times throughout all unit operations. The second is complexity: Modern biomanufacturing facilities are complex and interconnected, with piping segments, transfer panels, and valve arrays, as well as water for injection (WFI) and other shared-resource constraints. That complexity is becoming even greater with the need for process standardization and the processing of higher (and more variable) titers and additional products.
In such an environment, debottlenecking is becoming increasingly important as a means of quantitative process optimization. This technique allows biomanufacturing facilities to run new products with minimal retrofits and to increase the run rate of existing legacy products without significant regulatory impact.

Product Focus: All biologics
Process Focus: Manufacturing
Who Should Read: Process development, manufacturing, project management, and operations
Keywords: Downstream processing, chromatography, data management, design of experiments, variability
Level: Intermediate

Figure 1: Subject matter experts are notoriously bad at comparing facility performance in their own areas with performance in areas they don't know as well. ("Clearly the production bioreactor is the bottleneck!" "Clearly the protein A chromatography is the bottleneck!")

Debottlenecking therefore allows a biomanufacturing facility to significantly extend its useful life by making the site more flexible and efficient. When judiciously applied in the correct areas, retrofits can be an order of magnitude less expensive than building new capacity and can be performed in one year or less (compared with four to six years for building a new facility).

Here we discuss the impact of variability and complexity on debottlenecking and optimization. We show historical approaches to debottlenecking based on resource use and explain why they can lead to incorrect assessments of the real bottlenecks in biomanufacturing facilities. We also look at alternatives to collecting data solely from subject matter expert (SME) opinions, which can be biased: We have observed that SMEs in biomanufacturing facilities usually have detailed knowledge about one area of a process but often find it difficult to agree on bottlenecks among multiple unit operations.

Figure 2: A two-stage serial production process: fermentation (300 hours) followed by purification (72 hours)

The gold standard for debottlenecking methodology today is based on perturbing cycle times or resources in a discrete event simulation model and observing the resulting impact on some performance indicator: e.g., cycle time, labor, or throughput. This involves collecting data from process historians (using software from Emerson Process Management, Rockwell Automation, and/or PI, for example) and performing sensitivity analysis on those cycle times.
Debottlenecking is important both in resource-constrained facilities (for which increasing throughput provides a positive return on investment, ROI) and in throughput-constrained facilities (where companies seek to make the same amount of product with increased efficiency and at lower cost). In either case, debottlenecking provides a quantitative method of accurately finding the critical processes in a facility and assessing potential improvements to its overall performance.

Figure 3: Clean-in-place (CIP) times at one biomanufacturer, June 2002 through January 2008 (hours)

"Debottlenecking was critical because it gave our facility a nonbiased list of issues that we needed to work on. It wasn't the loudest engineer who got the most attention; it was the most important issue." (upstream manager at a large-scale facility)

Figure 4: How full-time employees interact with a bioprocess

Introduction to Debottlenecking

Debottlenecking is the process of improving efficiency by finding the rate-limiting steps in a facility and correcting them. Addressing those steps will improve performance, whereas changing other (non-rate-limiting) steps will not. A complete manufacturing system may have several hundred key activities, but only a small number of them (typically fewer than a dozen) define a facility's run rate.
Consider a simple two-stage production process (Figure 2). In the first stage, a single bioreactor produces material and is then cleaned, steamed, and prepped for the next batch, taking 300 hours in all. In the second stage, a single downstream train purifies material produced by that production bioreactor over 72 hours. Each unit operation can process only one batch at a time, so the maximum velocity (processing rate) of batches through the first step is one every 300 hours (about 12.5 days). The downstream processing rate is 72 hours (about three days), so for the remaining 9.5 days a purification skid will sit waiting for the bioreactor to finish. No improvement to the purification train (e.g., reducing its cycle time from 72 to 60 hours) will affect throughput or run rate because the rate-limiting step (bottleneck) is upstream, in production. Because of that, the only way to improve throughput is to fix the production step, either by adding equipment in parallel or by decreasing its cycle time to something shorter than 300 hours.
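
That arithmetic is simple enough to capture in a few lines. The sketch below is a minimal illustration, not the simulation software discussed later, and the stage names and numbers simply restate this example: the run rate of a serial line is set by its slowest stage, so shortening the purification step changes nothing, whereas shortening the bioreactor cycle does.

```python
# Minimal sketch: in a serial line with one unit per stage, the slowest
# stage sets the steady-state run rate.
HOURS_PER_WEEK = 168.0

def runs_per_week(stage_hours):
    """Run rate when each stage holds one batch at a time."""
    bottleneck = max(stage_hours.values())   # slowest stage limits flow
    return HOURS_PER_WEEK / bottleneck

base = {"production_bioreactor": 300.0, "purification_train": 72.0}
print(runs_per_week(base))                                      # ~0.56 runs/week

# Improving the non-bottleneck (72 h -> 60 h) changes nothing:
print(runs_per_week({**base, "purification_train": 60.0}))      # still ~0.56

# Improving the bottleneck (300 h -> 250 h) raises the run rate:
print(runs_per_week({**base, "production_bioreactor": 250.0}))  # ~0.67 runs/week
```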

Figure 7: Spike chart shows key facility bottlenecks (x-axis: facility activities; y-axis: key performance indicator, mean throughput)

Figure 8: Detail of Pareto spike chart (the highest-impact activities from Figure 7)

Debottlenecking is typically a two-step process. First, identify the rate-limiting steps among the hundreds or thousands of potential resources and activities in the facility (bottleneck identification). Second, make changes to those rate-limiting steps, whether by adding equipment, reducing cycle time, or some other means, to improve the process (bottleneck alleviation). Both are discussed below.

Data, Complexity, and Variability in Biomanufacturing

Figure 3 shows data from a biomanufacturer for clean-in-place (CIP) times from 2002 to 2008. Those data came directly from a manufacturing execution system (MES) to provide an unbiased estimate of facility performance.


(Bio-G software integrates directly with most common control systems in biotech facilities.) Note the significant variability in this process step, which takes between three and six hours. There is also process drift: The amount of time the activity takes is not constant from year to year.
The data suggest that using a single number (say, the average time) for planning and optimization may not identify the correct bottleneck. My experience in this area has been that running the same model with and without variability produces different bottlenecks. Such an analysis is especially of concern because finding and fixing bottlenecks can be a capital-intensive process when retrofitting a validated facility. Fixing the wrong area will not improve run rate, no matter how much improvement is made in that particular area.

"We never thought variability was important until we spent $500,000 on improvements and got no improvement. Variability was the reason why." (vice president and general manager of a 60,000-L biomanufacturing facility)
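
A toy calculation makes the point concrete. The sketch below uses invented numbers and a deliberately simplified handoff rule rather than the facility model described in this article: when a downstream skid takes each batch directly from the bioreactor, the time between consecutive batch starts is governed by whichever finishes later, the current upstream batch or the previous downstream batch. Averaging that maximum shows how downstream variability stretches the effective cycle even though the downstream mean is shorter than the upstream time.

```python
# Illustrative only: downstream variability in a two-step line with a direct
# handoff (upstream cannot restart until downstream accepts the batch).
import random

random.seed(1)

def mean_cycle(upstream_h, downstream_sampler, batches=100_000):
    """Average hours between consecutive batch starts."""
    total, prev_downstream = 0.0, 0.0
    for _ in range(batches):
        total += max(upstream_h, prev_downstream)  # handoff waits for both
        prev_downstream = downstream_sampler()
    return total / batches

def variable_downstream():
    return random.uniform(40, 140)   # mean 90 h, wide spread

def steady_downstream():
    return 90.0                      # same mean, no spread

print(mean_cycle(100, variable_downstream))  # ~108 h, not the "obvious" 100 h
print(mean_cycle(90, variable_downstream))   # ~102 h: cutting 10 h upstream helps a little
print(mean_cycle(100, steady_downstream))    # 100 h: removing downstream variability helps more
```

With average times alone, the 100-hour bioreactor looks like the only constraint; with variability included, the nominally faster downstream step is what deserves attention, which is the kind of shift a with-and-without-variability comparison reveals.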

Another issue that must be overcome in debottlenecking and process optimization is process complexity. Figure 4 shows a map of the places in one facility where people interact with the process. A delay in the availability of a full-time employee (FTE) or a piece of shared equipment (e.g., a CIP skid, transfer line, valve array, transfer panel, or even a WFI or utility system) will delay that process. Such hidden periods are not normally considered in cycle-time calculations. Complex interactions make it difficult even for highly trained operators to understand which hidden cycle times and waiting periods will slow a facility down.
Our software uses discrete event simulation and real-time data feeds to manage both variability and complexity. Discrete event simulators incorporate variability as well as constraints on how an activity can start. (So, just like an automation system, an activity will not start until certain prerequisite conditions are met.) This approach produces accurate models of biopharmaceutical facilities that incorporate variability and hidden wait times.
Bottleneck Identification

The traditional approach to identifying bottlenecks (made popular in the 1960s) focused attention on the facility resources that get the most use. The theory was that such resources are the busiest and are therefore those in which improvements would have the most effect. But this approach is not guaranteed to identify the real bottlenecks in a facility, and empirical evidence suggests that it is particularly ill suited to biomanufacturing facilities.

Figure 5: A buffer-preparation tank supports the first two purification steps, protein A and cation exchange, in a train of protein A, cation exchange, viral filtration, anion exchange, and ultrafiltration/diafiltration.

Figure 6: Use of a buffer-preparation tank (busy and idle periods); overall use is about 20%.

Consider as an example a facility with one large buffer-preparation tank used to prepare buffer for the first two chromatography steps in downstream purification (Figure 5). Because the tank is used only in the first two high-volume steps of the process, its overall use is low (Figure 6). However, using that tank for two sequential chromatography steps makes the second step wait for the tank to be cleaned and its buffer reprepped. If that process takes longer than the protein A chromatography step, it can delay the start of the cation-exchange step. Thus, even though the tank has an overall use of only 20%, it delays production and therefore represents a bottleneck.
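
A back-of-the-envelope version of that example, with hypothetical hours chosen only to illustrate the point, shows how a resource can sit idle most of the time and still delay every batch:

```python
# Hypothetical timings (hours) for one batch; illustrative only.
protein_a_duration = 12.0
tank_cip_and_reprep = 18.0  # clean the shared tank, then prep the next buffer
batch_interval = 180.0      # time between batches reaching purification

# Cation exchange cannot start until both the protein A step and the buffer
# re-prep on the shared tank have finished.
delay_per_batch = max(0.0, tank_cip_and_reprep - protein_a_duration)  # 6 h late

# Yet a classic utilization calculation looks harmless:
tank_busy_per_batch = 2 * tank_cip_and_reprep       # two buffer preps per batch
utilization = tank_busy_per_batch / batch_interval  # 0.2, i.e., "only 20% used"

print(f"delay per batch: {delay_per_batch} h, tank utilization: {utilization:.0%}")
```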

Debottlenecking Detection

So a simplistic use-based approach to debottlenecking does not find the true process constraints in a biomanufacturing facility. The current gold standard is to use a simulation model to perturb (make controlled changes to) a model of the facility and observe the results according to some metric.
One popular approach is to reduce operation times in each process step, one at a time, and observe the impact on throughput. By simulating a facility in which a particular process step (e.g., CIP) takes zero time, we can examine the effect of having that activity "for free." By repeating this experiment for every activity, we can understand the impact of reducing each potential project's cycle time. This sensitivity analysis can correctly identify even complex bottlenecks because it makes an actual change to a system model and illustrates the effect of that change.
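
The sketch below illustrates the idea on a drastically simplified stand-in for a facility model, using hypothetical activities and hours rather than a full discrete event simulation: zero out one activity at a time, recompute the throughput KPI, and sort the results to produce the kind of spike chart shown in Figure 7.

```python
# Simplified stand-in for a facility model (hypothetical activities and hours):
# the batch interval is set by the busiest equipment unit, and the KPI is the
# number of batches completed in a 100-day campaign.
CAMPAIGN_HOURS = 100 * 24

recipe = {
    "production_bioreactor": {"SIP": 8, "fermentation": 260, "CIP": 12},
    "protein_a_skid": {"preuse": 6, "protein_a": 30, "postuse": 8},
    "buffer_prep_tank": {"buffer_prep": 20, "tank_CIP": 10},
}

def throughput(model):
    """Batches per campaign when the busiest unit limits the batch interval."""
    interval = max(sum(activities.values()) for activities in model.values())
    return CAMPAIGN_HOURS / interval

baseline = throughput(recipe)

# Zero out one activity at a time and record the resulting KPI.
results = []
for unit, activities in recipe.items():
    for name in activities:
        perturbed = {u: dict(a) for u, a in recipe.items()}
        perturbed[unit][name] = 0.0
        results.append((f"{unit}:{name}", throughput(perturbed)))

# Sorted "spike chart": only activities on the bottleneck unit move the KPI.
for name, kpi in sorted(results, key=lambda r: r[1]):
    print(f"{name:40s} {kpi:6.2f}  (baseline {baseline:.2f})")
```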

Figure 7 shows the output from just such a sensitivity analysis. In this case, the metric chosen was throughput, so a higher number indicates better performance. Each point in the graph represents reduction of a particular activity's cycle time to zero hours and the resulting throughput observed over a 100-day campaign. The experiments are sorted with the most significant impacts on the right and the least significant on the left.
As that analysis illustrates, only a small percentage of the several hundred activities in the manufacturing facility actually improve throughput. Figure 8 shows detail of the four activities that most affected throughput in the Figure 7 model (those on the far right). This approach detects bottlenecks very rapidly and is guaranteed to identify real process constraints in a system because it simulates the actual impact of a change to each constraint in turn.

"Software designed for widget manufacturing or troop rotation just doesn't cut it in biotech." (industrial engineering analyst at a top-10 biopharmaceutical company)

Bottleneck Evaluation and Alleviation

After identifying bottlenecks, alleviating them becomes the central focus of analysis. The perturbation analysis above found that the key process constraint is the cycle time of the production bioreactor; however, the simplistic proposed change (reducing the fermentation time to zero hours) is not a feasible engineering solution. Fortunately, it may be sufficient to improve the production bioreactor's cycle time by a smaller amount to achieve the desired result.
To quantify how much change is needed before an activity or resource is no longer on the critical path, we can use a similar form of sensitivity analysis that changes one or a small number of activities or resources, but at a finer level of granularity. For example, we could reduce a bioreactor's setup time by 1, 2, 3 hours, and so on, then observe the impact of each reduction on throughput or cycle time.
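
A sketch of that finer-grained sweep, again with hypothetical numbers: shrink the bottleneck unit's cycle time in steps and watch the KPI respond. The gains flatten once the next-busiest constraint takes over, which is exactly the knee that Figure 9 exposes.

```python
# Hypothetical numbers: the production bioreactor currently occupies 280 h per
# batch; the next-busiest unit needs 250 h. Sweep reductions in small steps.
CAMPAIGN_HOURS = 100 * 24
NEXT_CONSTRAINT_HOURS = 250.0
BIOREACTOR_HOURS = 280.0

def throughput(bioreactor_hours):
    """Batches per campaign; the busier of the two constraints sets the pace."""
    return CAMPAIGN_HOURS / max(bioreactor_hours, NEXT_CONSTRAINT_HOURS)

for pct in range(0, 55, 5):
    hours = BIOREACTOR_HOURS * (1 - pct / 100)
    print(f"{pct:3d}% reduction -> {throughput(hours):5.2f} batches/campaign")
# The printout flattens beyond roughly an 11% reduction: once the bioreactor
# drops below 250 h, another unit becomes rate-limiting and further cuts buy
# nothing.
```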

Figure 9: Level of change required to achieve a target run rate (KPI: mean throughput versus percentage of cycle time changed). A 6% change yields about a one-run improvement over the baseline run rate; a further 50% change adds only about 0.6 run.

That approach provides valuable quantitative information to engineering teams because it makes explicit the tradeoff between key performance indicators (KPIs) at different levels of change. In Figure 9, a 6% reduction in cycle time improves throughput from 44.1 to 45.0 kg, but further reductions increase it only a little more (0.1 to 0.6 kg). If 45.0 kg were sufficient throughput, then we would not pay more to further reduce the cycle time. This gives engineering teams critical information to evaluate how significant facility modifications and investments will need to be.

"Debottlenecking comes down to this: Where do we focus, and how much do we need to change? No more back-of-the-envelope calculations and second guessing." (shift supervisor at a bulk facility)

Iterative Debottlenecking

The approach of bottleneck detection, evaluation, and removal presented here is typically repeated iteratively to provide a series of incremental improvements for a biomanufacturing facility. As we identify and fix bottlenecks, the associated items move off the critical path, and other activities or resources become the rate-limiting steps. The process of incrementally debottlenecking a facility, in which we find the first bottleneck and correct it, then move on to the second, and so on, produces a debottlenecking pathway. That pathway describes the sequence of steps required to correct bottlenecks, along with the KPI changes achieved by each step. Figure 10 outlines the results from one such analysis.

Figure 10: Debottlenecking pathway (throughput in runs per week, from baseline through three iterations): add another upstream CIP skid; reduce buffer-prep times by 10%; move small-scale preps to disposables.
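
In outline, the pathway falls out of a simple loop: find the current constraint, apply a candidate fix, record the KPI, and repeat. The sketch below uses hypothetical unit times and candidate fixes that loosely echo Figure 10; a real analysis would rerun the full simulation model at each step.

```python
# Hypothetical unit occupancies (hours per batch) and candidate fixes.
CAMPAIGN_HOURS = 100 * 24
unit_hours = {"upstream_cip_skid": 300.0, "buffer_prep": 280.0, "protein_a": 250.0}
fixes = {  # candidate engineering change per unit, as a fractional time reduction
    "upstream_cip_skid": ("add another upstream CIP skid", 0.50),
    "buffer_prep": ("reduce buffer-prep times by 10%", 0.10),
    "protein_a": ("move small-scale preps to disposables", 0.05),
}

def throughput(hours):
    return CAMPAIGN_HOURS / max(hours.values())

pathway = []
for iteration in range(1, 4):
    bottleneck = max(unit_hours, key=unit_hours.get)  # current rate-limiting unit
    change, cut = fixes[bottleneck]
    unit_hours[bottleneck] *= 1 - cut                 # apply the candidate fix
    pathway.append((iteration, change, round(throughput(unit_hours), 2)))

for step in pathway:
    print(step)  # each entry: (iteration, change applied, resulting KPI)
```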

That debottlenecking pathway is a sequence of three engineering changes, and each iteration increases the run rate by a different amount. Multiple engineering changes are generally required in different areas of a facility to increase its run rate. Note that the sequence order matters: In this case, moving to disposables without first reducing buffer-preparation times will yield no improvement. The pathway must be executed in the order shown to achieve the target run rate.

The iterative approach outlined here can be very successful in identifying and correcting bottlenecks in biomanufacturing facilities. But a design-of-experiments (DoE) approach can yield similar results and is better suited to a highly complex facility. DoE makes multiple simultaneous changes to different aspects of a facility, and the results of those changes are monitored and analyzed.

In the iterative analysis, we saw that shortening protein A and fermentation cycle times would probably have the greatest impact on improving run rates. How could improving those factors in combination help the company achieve its desired targets? Figures 11 and 12 illustrate the effects of simultaneously reducing the duration of both activities on the critical path. A throughput of 51 kg could come from reducing fermentation time alone by 60%. The same result could also be achieved by reducing fermentation time by just 5% if the protein A cycle time is also reduced by 10%. So the DoE approach can help you understand the tradeoffs between different possible improvements and combinations thereof, which do not simply add together.

Figure 11: Surface plot compares improvements in production fermentation and protein A cycle time.

Figure 12: Heat map compares improvements in production fermentation and protein A cycle time (a multiple-factor surface chart).

One issue with DoE is that it produces a combinatorial number of scenarios that must be examined. For example, examining two possible factors (say, fermentation time and purification time) requires four scenarios, examining three factors requires eight, four factors require 16, and so on. The advantage of this approach is that it makes the tradeoff between factors explicit, provided that the associated modeling engine can evaluate those scenarios automatically.
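
A minimal sketch of that combinatorial sweep, with hypothetical durations and improvement levels: each factor is toggled between its baseline and an improved value, giving 2^n scenarios for n factors, which a modeling engine would evaluate automatically.

```python
# Two-level full-factorial scenario sweep over hypothetical cycle-time factors.
from itertools import product

CAMPAIGN_HOURS = 100 * 24
baseline_hours = {"fermentation": 280.0, "protein_a": 250.0}
improvement = {"fermentation": 0.10, "protein_a": 0.10}  # candidate 10% reductions

def throughput(hours):
    return CAMPAIGN_HOURS / max(hours.values())

factors = list(baseline_hours)
for levels in product([0, 1], repeat=len(factors)):  # 2**n scenarios
    hours = {
        f: baseline_hours[f] * (1 - improvement[f] * on)
        for f, on in zip(factors, levels)
    }
    label = ", ".join(f + (" -10%" if on else "") for f, on in zip(factors, levels))
    print(f"{label:40s} -> {throughput(hours):5.2f} batches/campaign")
```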

The Future of Biomanufacturing Operations

Biomanufacturing facilities are asked to produce increasingly high-titer products in shorter campaigns, with a larger mix of products. The challenge in such an environment is understanding the real capacity of a facility, where its critical process constraints are, and what will improve those processes. The best-in-class biomanufacturing companies we work with understand that bottlenecks are dynamic: As a facility evolves, they move. Repeating debottlenecking analyses monthly helps align manufacturing and operational excellence groups around a single goal. Manufacturing floor staff, managers, and the plant head can all focus on key process areas without spending time, energy, and money in areas that do not affect performance.
Decision-making is based on
quantitative data from models that
incorporate the variability and
complexity of a facility. This approach
delivers continual improvements
guided by factual observation rather
than SME opinion. Biomanufacturing
facilities that implement
debottlenecking and process
optimization using methodologies
described herein will consistently outperform those using more traditional
approaches.

Further Reading

Johnston R, Zhang D. Garbage In, Garbage Out: The Case for More Accurate Process Modeling in Manufacturing Economics. BioPharm Int. 22(8) 2009.

Rick Johnston, PhD, is principal at Bioproduction Group Inc., 1250 Addison Street, Suite 107, Berkeley, CA 94702; 1-510-704-1803; rick@bio-g.com, www.bio-g.com.

This article is adapted with permission from the Bio-G white paper, Biomanufacturing Debottlenecking and Process Optimization, which is available online at www.bio-g.com/whitepapers/download/debottlenecking-and-process-optimization.
