
A Quality-by-Design Based Methodology

for the
Rapid Development of Robust LC Methods

Richard Verseput, President


Graham Shelver, Ph.D., V.P. Sales and Marketing

S-Matrix Corporation
www.smatrix.com

Quality-by-Design?

Am I doing QbD?

[Figure: Experimental region defined by pH (2.0 to 7.0) and Final % Organic (65 to 85).]
KNOWLEDGE SPACE
The information and knowledge gained from pharmaceutical development
studies and manufacturing experience provide scientific understanding to support the
establishment of the design space, specifications, and manufacturing controls.
[Q8(R1) - Pharmaceutical Development Revision 1, November 2007]
2

Quality-by-Design DOE Experimental Approach


1. Define the Experimental Design Region
a) Select study variables
b) Define study ranges or levels

2. Develop the Knowledge Space


a) Conduct a formal experimental design
b) Analyze results - build equations of variable effects

3. Establish the Design Space


a) Define best conditions (setpoint optimization)
b) Define process robustness (operating space)
3

Quality-by-Design DOE Experimental Approach


1. Define the Experimental Design Region
a) Select study variables
b) Define study ranges or levels

FORMAL EXPERIMENTAL DESIGN
A structured, organized method for determining the relationship between factors affecting a process and the output of that process. Also known as Design of Experiments.
[ICH Q8 - Guidance for Industry, Pharmaceutical Development, May 2006]

[Figure: Experimental design region defined by pH (2.0 to 7.0) and Final % Organic (50 to 100), screened across Columns 1 through 5.]

4

Quality-by-Design DOE Experimental Approach


KNOWLEDGE SPACE

2. Develop the Knowledge Space


a) Conduct a formal experimental design
b) Analyze results - build equations of variable effects

The information and knowledge gained from pharmaceutical development studies and manufacturing experience provide scientific understanding to support the establishment of the design space, specifications, and manufacturing controls.
[Q8(R1) - Pharmaceutical Development Revision 1, November 2007]

[Figure: Knowledge Space response surface over pH (2.0 to 7.0) and Final % Organic (50 to 100).]

Rs = b0 + b1(X1) + b11(X1)² + b2(X2) + b22(X2)² + b12(X1*X2)

5
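Below is a minimal, hypothetical sketch (not the Fusion AE software) of how the Knowledge Space equation above can be fit to DOE results by ordinary least squares. The trial settings, responses, and resulting coefficients are illustrative assumptions only.

```python
# Illustrative sketch: fitting the full quadratic Knowledge Space model
# Rs = b0 + b1*x1 + b11*x1^2 + b2*x2 + b22*x2^2 + b12*x1*x2 to DOE results.
import numpy as np

# Hypothetical DOE trials: x1 = pH, x2 = Final % Organic, y = critical-pair Rs
x1 = np.array([2.0, 2.0, 7.0, 7.0, 4.5, 4.5, 4.5, 2.0, 7.0])
x2 = np.array([50., 100., 50., 100., 75., 50., 100., 75., 75.])
y  = np.array([1.1, 0.8, 1.6, 2.2, 1.9, 1.4, 1.7, 1.3, 2.0])

# Model matrix: intercept, linear, quadratic, and interaction terms
X = np.column_stack([np.ones_like(x1), x1, x1**2, x2, x2**2, x1 * x2])

# Least-squares estimates of b0, b1, b11, b2, b22, b12
b, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict_rs(ph, pct_organic):
    """Predicted mean Rs at any point inside the experimental design region."""
    return (b[0] + b[1] * ph + b[2] * ph**2
            + b[3] * pct_organic + b[4] * pct_organic**2
            + b[5] * ph * pct_organic)

print(dict(zip(["b0", "b1", "b11", "b2", "b22", "b12"], b.round(4))))
print("Predicted Rs at pH 6.5, 90% organic:", round(predict_rs(6.5, 90.0), 2))
```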

Quality-by-Design DOE Experimental Approach


DESIGN SPACE

3. Establish the Design Space


a) Define optimum operating conditions (setpoints)
b) Define robust operating ranges (control limits)

The multidimensional combination and interaction of input variables (e.g., material attributes) and process parameters that have been demonstrated to provide assurance of quality.
[ICH Q8 - Guidance for Industry, Pharmaceutical Development, May 2006]

All study factor combinations within the Design Space have:
Acceptable mean performance
Acceptable robustness

[Figure: Design Space shown as a sub-region of the Knowledge Space over pH (2.0 to 7.0) and Final % Organic (50 to 100); the rest of the Knowledge Space gives acceptable mean performance only.]

LC Method Development - Traditional vs. Quality-by-Design

Traditional Trial-and-Error Approach:

Phase 1 - Easy-to-Control Parameters = Instrument method editing
Solvent Strength (% Strong Solvent)
Temperature
Organic Solvent Type (exclusive or)

Phase 2 - Difficult-to-Control Parameters = Instrument hardware change-out
pH (online mixing not advised)
Ion Pairing Agents (not universal, slow column equilibration)
Column Type (cost, switching required)

Conventional LC - Example of Automation Support

[Diagram: Fusion AE + CDS automation configuration for a conventional LC system.]

UPLC - Example of Automation Support

[Diagram: Fusion AE + CDS connected over the network (LAN, LAC/E card) to the UPLC, with a solvent valve assembly and 4-relay panel for automated switching.]

Method Development - Traditional vs. Quality-by-Design

Automation-supported Formal Experimental Design:

Phase 1 - Column/Solvent Screening = Major Effectors
Gradient Slope (5%-95%, vary gradient time)
pH (3 or more levels - automated solvent switching)
Column Type (6 columns - automated column switching)
Solvent Type (can be included, with/without blending)

Phase 2 - Remaining Parameters = Secondary Effectors
Pump Flow Rate
Gradient Conditions (optimization)
Column Temperature
Ion Pairing / Suppressing Agents
10

Knowledge Required to Constitute a QbD Approach

A valid experimental strategy should provide a data set from which all
significant parameter effects can be identified and quantified:

Linear additive effects.

Simple interaction effects.

Complex interaction and non-linear effects.


These effects drive analytical method robustness.

11

Examples of Important Effects

Linear Additive Effect (no slope change)

Area API = b0 + b1(x1) + b2(x2)

12

Examples of Important Effects

Linear Additive Effect (no slope change)

13

Examples of Important Effects

Pairwise Interaction Effect (slope change)

Rs = b0 + b1(x1) + b2(x2) + b12(x1*x2)

14

Examples of Important Effects

Pairwise Interaction Effect (slope change)

15
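The short sketch below, with hypothetical coefficients, illustrates the "slope change" that the interaction term b12(x1*x2) produces: the effect of pH on Rs depends on the level of Final % Organic.

```python
# Illustrative sketch (hypothetical coefficients): a pairwise interaction term
# means the effect (slope) of one factor depends on the level of the other.
b0, b1, b2, b12 = 1.0, 0.20, 0.010, 0.004

def rs(x1, x2):
    """Rs = b0 + b1*x1 + b2*x2 + b12*x1*x2 (two-factor interaction model)."""
    return b0 + b1 * x1 + b2 * x2 + b12 * x1 * x2

# Effect of moving pH (x1) from 2.0 to 7.0 depends on Final % Organic (x2):
for x2 in (50.0, 100.0):
    delta = rs(7.0, x2) - rs(2.0, x2)
    print(f"x2 = {x2:5.1f}  ->  Rs change over the pH range = {delta:.2f}")
# With b12 = 0 the two printed changes would be identical (a purely additive model).
```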

Examples of Important Effects

Complex Interaction and Non-linear Effects

Rs = b0 + b1(x1) + b11(x1)² + b2(x2) + b112(x1)²(x2)

16

Examples of Important Effects

Complex Interaction and Non-linear Effects

17

Risk Factors in Current Practice

Method Development Phase 1 - Column/Solvent Screening

Current Approaches:
One-factor-at-a-time (OFAT)
First Principles Equation
Simplex (Iterative) Studies
Traditional Design of Experiments (DOE)

As we will discuss, these approaches can lack the experimental design region coverage and quantitation necessary for Quality-by-Design (QbD)-based practice.

18

One Factor at a Time (OFAT)


A standard approach that varies one parameter at a time and assesses the effect of these changes on responses such as resolution (Rs).
OFAT in Column Screening:
Step 1: Multiple columns are evaluated at constant method conditions
(e.g. constant pH and gradient conditions).
Identify best column by evaluating the various chromatograms.
Step 2: Select a 2nd parameter, say pH, and vary it across a series of
injections using the best column.
Step 3: Select a 3rd parameter, say Final % Organic, and vary it across
a series of injections while using the best pH from Step 2.
And so on ...
19

OFAT and Experimental Design Region


True Experimental Region defined by typical study ranges of two study factors
[Figure: Experimental design region defined by pH (2.0 to 7.0) and Final % Organic (50 to 100).]
20

OFAT Design Region Coverage - Step 1

[Figure: OFAT Step 1 coverage of the design region - Columns 1 through 5 evaluated at fixed conditions (pH 5.5, Final % Organic 85) within the full experimental design region (pH 2.0 to 7.0, Final % Organic 50 to 100).]
21

Column/Solvent Screening - Steps 2 and 3


Table and graph show an OFAT study in which pH and Final % Organic*
were investigated sequentially to increase resolution of a critical pair.

* Reflects the slope, since gradient time is held constant at 30 min.

22

Column/Solvent Screening - Steps 2 and 3


Table and graph show an OFAT study in which pH and Final % Organic*
were investigated sequentially to increase resolution of a critical pair.

OFAT Completely Missed True Optimum Method Conditions


23

Risks in the OFAT Approach

Extremely Limited Coverage of Design Region

No Ability to Study Interaction Effects

24

First Principles Equation Approach


Sequential Studies for One of the Following:
Percent organic (reversed-phase isocratic)
Normal-phase conditions (reversed-phase isocratic)
pH
Mobile phase blend (isocratic)
Ionic strength
Additive/buffer concentration
Gradient conditions

Temperature
Temperature with gradient conditions (reversed-phase)
25

Normalize the Equation Using Tuning Run Data


[Figure: Rs (1.00 to 1.50) vs. Final % Organic (65 to 85), comparing the original first-principles model prediction line with the normalized (data-adjusted) model prediction line.]
26

Risk Factor 1 - Disjoint Factor Space

[Figure: Two disjoint paired-factor studies - Study #1: Gradient Time (15 to 45 min) vs. Temperature (30.0 to 50.0 °C); Study #2: Gradient Time (15 to 45 min) vs. pH.]

pH with Temperature?
27

Risk Factor 2 - No Interaction Effects

Linear Additive Effects Model - 2 Factors:

Rs = b0 + b1(x1) + b2(x2)

Pure Curvilinear Effects Model - 2 Factors (Pure Quadratic):

Rs = b0 + b1(x1) + b11(x1)² + b2(x2) + b22(x2)²

No Interaction Term - neither model includes the b12(x1*x2) term.


28

Risks in the First Principles Equation Approach

Limited kinds of paired-variable studies


Limited coverage of design region
Provides no knowledge of:
Interaction Effects
Complex Effects

Lots of experiments - very little knowledge yield:


(5 Columns)*(3 pH levels)*(2 Gradient Runs) = 30 Runs

29

Simplex Optimization Approach

[Figure: Simplex search path over pH (2.0 to 7.0) and Final % Organic (65 to 85), progressing from the first runs to the last run.]
30

Risks in the Simplex Optimization Approach


Limited coverage of design region
Provides no knowledge of variable effects
- Variables are too highly correlated to accurately identify effects

Lots of experiments - no real knowledge yield

31

Traditional Design of Experiments (DOE)

Can provide a data set from which all significant parameter effects
may be identified and quantified:

Linear additive effects.

Simple interaction effects.

Complex interaction and non-linear effects.

But there are risks ...

32

Risk Factor 1 - Small Study Ranges

An effect will be poorly estimated when the study range is no larger than the experimental error range.

[Figure: Actual mean effect compared with the effects estimation error.]
33

Risk Factor 1 - Small Study Ranges

Use large variable ranges = good signal-to-noise ratio.

34
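A small simulation (hypothetical slope and noise values) illustrates the signal-to-noise point: the same true effect is estimated far more precisely from a wide study range than from a narrow one.

```python
# Illustrative sketch: why small study ranges give poorly estimated effects.
# The same true slope is estimated from a narrow range and from a wide range
# of the factor, with identical experimental (measurement) noise.
import numpy as np

rng = np.random.default_rng(0)
true_slope, noise_sd, n_reps = 0.05, 0.10, 2000

def estimated_slopes(low, high):
    """Repeatedly fit a straight line to noisy two-level data; return the slope estimates."""
    x = np.array([low, high] * 3)                      # 6 runs at two levels
    slopes = []
    for _ in range(n_reps):
        y = true_slope * x + rng.normal(0.0, noise_sd, size=x.size)
        slopes.append(np.polyfit(x, y, 1)[0])
    return np.array(slopes)

narrow = estimated_slopes(74.0, 76.0)   # study range comparable to the noise
wide   = estimated_slopes(50.0, 100.0)  # large study range
print(f"true slope {true_slope:.3f}")
print(f"narrow range: mean {narrow.mean():.3f}, sd {narrow.std():.3f}")
print(f"wide range:   mean {wide.mean():.3f}, sd {wide.std():.3f}")
```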

Risk Factor 2 - Wrong Design Selection


Not all DOE Design Types contain equal knowledge potential.
Example - Plackett-Burman Designs:
Typically are Resolution III
Main effects aliased with pairwise interaction effects
Effective for Robustness Confirmation study
Risky for studies where goal is Knowledge
[Table: 4-trial design matrix with +/- level columns for X1, X2, X3 and the interaction columns X1*X2, X1*X3, X2*X3, illustrating the aliasing of main effects with pairwise interactions.]
This limitation is often misunderstood, and often forgotten in the data analysis and interpretation of results.
35
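The aliasing risk can be shown with a minimal sketch. For brevity it uses the standard 2^(3-1) half fraction with generator X3 = X1*X2 as a stand-in for a Resolution III Plackett-Burman design; the confounding pattern, not the specific design, is the point.

```python
# Illustrative sketch: in a 4-run Resolution III two-level design each main-effect
# column is identical to a two-factor interaction column, so their effects cannot
# be separated in the data analysis.
import numpy as np

x1 = np.array([-1, +1, -1, +1])
x2 = np.array([-1, -1, +1, +1])
x3 = x1 * x2                      # design generator: X3 aliased with X1*X2

mains = {"X1": x1, "X2": x2, "X3": x3}
interactions = {"X1*X2": x1 * x2, "X1*X3": x1 * x3, "X2*X3": x2 * x3}

for m_name, m_col in mains.items():
    for i_name, i_col in interactions.items():
        if np.array_equal(m_col, i_col):
            print(f"{m_name} column == {i_name} column  ->  effects are confounded")
```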

Risk Factor 3 - Transcription Errors

Setting up the experiment:
Generate experiments - manual, off-line, using non-validated DOE software or best guess.
Calculate sample amounts - manual, off-line, using non-validated tools such as MS Excel.

Conducting the experiment:
Prepare samples and run experiments manually - manual setup and operation of equipment.
36

Risk Factor 3 - Transcription Errors

Processing the Data and Results:
Enter data - manual, error-prone.
Statistically analyze and interpret the data - using off-line generic DOE software.
Examine the need for more experiments.
Write report - manual, off-line, using non-validated tools such as MS Excel.

37

Risk Factor 4 - Inherent Data Loss


Inherent Data Loss in Traditional Performance Metrics:
Column screening experiments, even those done by DOE, often have significant
inherent data loss in critical results such as resolution.
The data loss is due both to compound co-elution and to changes in compound elution order (peak exchange) between experiment trials.
These changes are due to the major effects that pH and organic solvent type can
have on column selectivity.

38

Inherent Data Loss - Co-elution and Peak Exchange

[Figure: Chromatograms from Trial 13 and Trial 19 - resolution of Impurity C from API 1 vs. resolution of Impurity C from Impurity G after peak exchange.]
39

Inherent Data Loss - Co-elution and Peak Exchange


Integration Traditionally Involves Peak Tracking

40

Risk Factor 4 - Inherent Data Loss

[Table: 26-run screening design - Run No., Gradient Time, pH (2, 4.5, 7), Column Type (C18, Phenyl, RP), and USP Resolution results for Impurities E, F, and G. Many resolution cells are blank because of co-elution and peak exchange between trials.]
41

Inherent Data Loss - Cannot Model the Chromatography

Regression Model Statistics

Compound Name    R²-Adj. Value
Impurity F       0.4785
Impurity G       0.9725

The table above shows that the data cannot be accurately modeled.
Result: Phase 1 is reduced to a "pick the winner" strategy.
Solution: Trend Responses - independent of peak tracking (i.e., peak-count based and peak-results based).

42
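A minimal sketch of the kind of trend responses described above: peak-count and peak-results summaries computed per trial, so they do not depend on tracking individual compounds across trials. The peak-table format and threshold values are assumptions for illustration.

```python
# Illustrative sketch (hypothetical peak table): "trend responses" that do not
# depend on peak tracking. Each trial's integrated peak list is reduced to counts
# and aggregate results, so co-elution or peak exchange cannot create the missing
# values that compound-specific resolutions suffer from.
def trend_responses(peaks, rs_target=1.5):
    """peaks: list of dicts with 'usp_resolution' and 'usp_tailing' per integrated peak."""
    resolutions = [p["usp_resolution"] for p in peaks if p["usp_resolution"] is not None]
    tailings = [p["usp_tailing"] for p in peaks if p["usp_tailing"] is not None]
    return {
        "peak_count": len(peaks),
        "peaks_with_rs_ge_target": sum(rs >= rs_target for rs in resolutions),
        "min_resolution": min(resolutions) if resolutions else None,
        "max_tailing": max(tailings) if tailings else None,
    }

# Example: one screening trial in which two compounds co-elute (13 peaks, not 14)
trial_peaks = [{"usp_resolution": 1.8, "usp_tailing": 1.1}] * 12 + \
              [{"usp_resolution": 0.6, "usp_tailing": 1.4}]
print(trend_responses(trial_peaks))
```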

Risks in the Traditional DOE Approach


Traditional Use of Small Study Variable Ranges
- Low Signal-to-Noise Ratio.

Traditional Use of Highly Fractionated Factorial Designs


- Variable effects are confounded; cannot identify the true effectors.

Manual experiment design building and results data entry


- Transcription errors can corrupt experiment design and data analysis.

Limitations in Traditional Performance Metrics


- Too much missing data for accurate effects modeling.
- Data often do not represent variable effects.
43

QbD-Based Design of Experiments A Case Study


Automated DOE is carried out in the following five steps:
1) User selects the experimental parameters (study factors).
2) Software generates a statistical experimental design and exports it to
the chromatography data system (automated).
3) LC system runs the various conditions defined in the experimental
design on the instrument (automated).
4) Software imports results from the CDS (automated).
5) Software generates and applies mathematical models to predict the
optimum method (automated).
The following slides present a case study of an Automated DOE experiment carried out at a major pharmaceutical company.
44

Case Study Software / Data Flow


[Diagram: File-less data exchanges between Fusion AE and the CDS.]

Steps 1 and 2: Experiment design exported as ready-to-run methods & sequences.
Step 3: Experiment run on the HPLC in walk-away mode.
Step 4: CDS generates chromatogram results.
Step 5: Automated analysis, graphing, and reporting. Report output formats: RTF, DOC, HTML, PDF.
45

Case Study Hardware Framework - UPLC System

[Diagram: Fusion AE + CDS controlling a Waters Acquity UPLC with internal column switching and external solvent switching.]
46

Original Traditional HPLC Method


14 Compounds, including 2 APIs, 11 related impurities, and 1 process impurity
4.6 x 150 mm, 5 µm YMC Basic column
40/40/20 ACN/Methanol/2% Ammonium acetate, pH 4.5
1.5 mL/min flow
Column temperature 25 °C
10 µL injection

[Figure: Chromatogram of the original method - run time approximately 20 minutes.]

47

Step 1 - Experiment Setup

Experiment Variable           Range or Level Settings
Gradient Time (min)           5.0 - 10.0
Gradient Slope (% Organic)    20.0 - 95.0
pH                            2.00, 4.5, 7.0
Column Type                   Three columns: BEH C18 2.1 x 100, BEH Phenyl 2.1 x 100, Shield RP 2.1 x 100
Organic Solvents              Mobile Phase A1-1: 0.05% TFA Buffer, pH 2.00
                              Mobile Phase A1-2: 20 mM Ammonium Acetate Buffer, pH 4.45
                              Mobile Phase A1-3: 10 mM Sodium Phosphate Dibasic Buffer, pH 6.80
                              Mobile Phase B1: 50% Acetonitrile, 50% Methanol (solubility considerations)

48

Experimental Design Region Qualification

Center point used for the design region qualification trial.

[Figure: OFAT coverage and the full experimental design region (pH 2.0 to 7.0, Final % Organic 50 to 100), with the center point marked for the qualification trial.]

49

Experimental Design Region Qualification

Qualification trial conditions:
Gradient: Slope = 5% - 95%, Time = 7.5 min
pH = 4.5
Flow Rate = 0.3 mL/min
Column Temperature = 70 °C
Detection at 250 nm

Results define that the % Organic starting point should be increased - new slope: 50% - 95%.

[Figure: Qualification trial chromatogram, 0 to 10 minutes.]

50

Step 2 - Generate Design and Export it to the CDS

Software maps the experimental design to the study factors. Important design region sampling of numeric factors is repeated for each level of each categorical factor.

51
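A minimal sketch (not the Fusion AE design engine) of the mapping just described: the numeric design-region sampling is repeated for each level of the categorical column-type factor. The 3 x 3 numeric grid is an illustrative simplification of the actual statistical design.

```python
# Illustrative sketch: crossing a numeric sampling plan with a categorical factor,
# so that the same design-region points are run on every column type.
from itertools import product

# Numeric factors: a simple 3 x 3 grid over the study ranges (illustrative only)
gradient_time = [5.0, 7.5, 10.0]          # min
ph = [2.0, 4.5, 7.0]
numeric_points = list(product(gradient_time, ph))

# Categorical factor: column type (three columns, as in the case study)
columns = ["BEH C18", "BEH Phenyl", "Shield RP"]

# Full design: every numeric sampling point is run on every column
design = [
    {"run": i + 1, "gradient_time_min": gt, "ph": p, "column": col}
    for i, ((gt, p), col) in enumerate(product(numeric_points, columns))
]

print(len(design), "runs")          # 9 numeric points x 3 columns = 27 runs
print(design[0])
```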

Step 2 Generate Design and Export it to the CDS


Automatically reconstruct experimental design within the Empower chromatography
data software (CDS) as instrument methods and sample sets (sequences).

52

Step 3 Run the experiment trials on the LC


[Diagram: Fusion AE + CDS connected over the network (LAN, LAC/E card) to the UPLC, with a solvent valve assembly and 4-relay panel for automated switching.]
53

Step 3 Integrate chromatograms

54

Step 4 Import Wizard. Auto-computed Trend Responses

55

Step 4 Import Results from CDS


Software automatically imports results from the CDS.
Import
From CDS

56

Step 5 - Model Results

No. Peaks = b0 + b1(x1) + b11(x1)² + b12(x1*x2) + b2(x2)

Model inputs: pH (X1 = 6.5), Gradient Time (X2 = 9.0). Output: predicted result.
57

Step 5 Model Results

Software generates and applies mathematical models to predict the optimum analytical method. At right is a graphical representation of a model - in this case, the model of study factor effects on a given compound's Tailing Factor response.

Note: the response is highly interactive and non-linear.


58

Step 5 Graphically Represent Results from CDS


Resolution Maps: note that the region of good resolution (red) changes for different pairs of compounds in the contour plots below.

59

Step 5 Automated Numerical Search for Optimum Method

60

Step 5 Numerical Solution Search Best Conditions

61

Step 5 Graphical Solution Search Best Region Overall

An Overlay graph of all responses for which method performance


goals are defined is automatically generated.
For this presentation we will build the overlay graph one response
at a time.

62

Overlay Graphics
Fusion AE Overlay Graph.
Each color on the graph
corresponds to a response for
which goals have been defined.

Note: the shaded regions indicate Gradient Time and pH combinations that do NOT meet performance requirements.

A region shaded with a given color shows the study variable level setting combinations that will NOT meet the goals for the corresponding response.

Note: the un-shaded region corresponds to level setting combinations that meet all response goals.

63

Overlay Graphics

2 response goals

64

Overlay Graphics
Unshaded Region With
Predicted Best Settings:
Gradient Time = 9.5 min
pH = 7.0
Column = C18

65
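A minimal sketch of how an overlay region can be computed: hypothetical fitted models are evaluated on a grid of Gradient Time and pH, and the grid points that meet every response goal form the unshaded "all goals met" region. The models, goals, and ranges below are illustrative assumptions, not the case-study values.

```python
# Illustrative sketch: overlaying response goals on a (Gradient Time, pH) grid.
import numpy as np

def rs_critical_pair(gt, ph):        # hypothetical fitted model for a critical-pair Rs
    return 0.2 + 0.12 * gt + 0.15 * ph + 0.01 * gt * ph

def tailing_api(gt, ph):             # hypothetical fitted model for API tailing factor
    return 2.4 - 0.05 * gt - 0.12 * ph

goals = [
    (rs_critical_pair, lambda v: v >= 2.0),   # goal: Rs >= 2.0
    (tailing_api,      lambda v: v <= 1.5),   # goal: tailing <= 1.5
]

gt_grid, ph_grid = np.meshgrid(np.linspace(5.0, 10.0, 51), np.linspace(2.0, 7.0, 51))
meets_all = np.ones_like(gt_grid, dtype=bool)
for model, goal in goals:
    meets_all &= goal(model(gt_grid, ph_grid))   # shade out points failing this goal

print(f"{meets_all.mean():.0%} of the design region meets every response goal")
```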

Analyze Results to Define Variable Levels for Optimization.

Predicted best conditions chromatogram. Results define center-point conditions for the optimization experiment.

[Figure: Chromatogram at the predicted best screening conditions; all peaks elute within approximately 6.5 minutes.]
66

Optimization Experiment Design

Experiment Variable                          Range or Level Settings
Pump Flow Rate (mL/min)                      0.1 - 0.5  (constant level in the Screening Design was 0.25 mL/min)
Gradient Slope (Starting Point % Organic)    50.0 - 80.0  (endpoint = 95%)
Gradient Time (min)                          5.0 - 10.0
pH                                           6.50, 7.10
Column Type                                  One column: BEH C18 2.1 x 100
Organic Solvents                             Mobile Phase A1: 10 mM Sodium Phosphate Dibasic Buffer, pH 6.50
                                             Mobile Phase A2: 10 mM Sodium Phosphate Dibasic Buffer, pH 7.10
                                             Mobile Phase B: 50% Acetonitrile, 50% Methanol

67

Automated Numerical Search for Optimum Method

68

DOE Models are Mean Response Predictors


Rs = b0 + b1(x1) + b11(x1)² + b12(x1*x2) + b2(x2)

Single level settings: Pump Flow Rate (X1 = 1.0), % Organic (X2 = 80).
Single prediction = mean result: predicted mean Rs = 2.87.
69

Overlay Graphics Mean Performance

Edges of Failure

70

ICH Q2A Robustness


The robustness of an analytical procedure is a measure of its capacity to remain unaffected by small variations in method parameters.

[Figure: Distribution of "Actual % Organic achieved in one assay" around the "% Organic Setpoint", over 76.0 to 83.0 % Organic.]

Setpoint Error: distribution of variation in % Organic around the setpoint in normal use, due to statistically random error.
71

Mean Performance Versus Robustness


[Figure: Two response distributions over API Concentration (%) (96.0 to 104.0) with lower and upper acceptance limits (LAL, UAL). Methods A and B have the same mean result; Method A shows large response variation, Method B shows small response variation.]

72

Mean Performance Versus Robustness - Surrogate

73

The LC System as a Process in a Box


[Diagram: The LC system as a process - Raw Material (Mobile Phase Pumps), Heating Chamber (Column Oven), Separation (Column), and At-Line Measurement (Detector) along the process flow, with the process output (API Amount) distributed between LSL and USL.]

6σ = variation around the setpoint.
74

Process Capability - Quantified


Process Capability (Cp)
This is a direct, quantitative measure of process robustness used routinely in Statistical
Process Control (SPC) applications. The classical SPC definition of Inherent Process
Capability (Cp) is

Cp = (USL - LSL) / (6σ variation)

USL and LSL = specification limits (tolerance width).
6σ variation = ±3σ of process output variation.

Traditional Goal: Cp = 1.33 (standard goal, based on setting the USL and LSL at ±4σ of process output variation).

75

Step 2 - Predict Variation at Run 1 Level Settings

Inputs: Pump Flow Rate (X1) and % Organic (X2), each with variation around its setpoint.

Rs = b0 + b1(x1) + b11(x1)² + b12(x1*x2) + b2(x2)

Monte Carlo simulation = predicted variation in Rs.
76

Step 3 - Compute Robustness Cp for Run 1

Cp = (USL - LSL) / (6σ C.I.)

Tolerance width delta (± distance around the predicted mean) = 1.00.

e.g., Peak Resolution with predicted mean Rs = 2.87 and a ±3σ confidence interval of 1.50 (2.12 to 3.62):

Cp = (3.87 - 1.87) / (3.62 - 2.12) = 2.00 / 1.50 = 1.33

[Figure: Predicted Rs distribution on the USP Resolution axis - mean 2.87, confidence interval 2.12 to 3.62, tolerance limits 1.87 and 3.87.]

77
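A minimal sketch of the robustness calculation outlined in Steps 2 and 3: setpoint error is propagated through a fitted mean-response model by Monte Carlo simulation, and a Cp-style ratio compares the tolerance width with the predicted 6σ variation. The coefficients, setpoints, and error sizes below are hypothetical.

```python
# Illustrative sketch: Monte Carlo propagation of setpoint error through a fitted
# mean-response model, followed by Cp = (USL - LSL) / (6-sigma predicted variation).
import numpy as np

rng = np.random.default_rng(1)

def rs_model(x1, x2):
    """Hypothetical fitted model: Rs = b0 + b1*x1 + b11*x1^2 + b12*x1*x2 + b2*x2."""
    b0, b1, b11, b12, b2 = 1.2, 2.0, -1.5, 0.01, 0.008
    return b0 + b1 * x1 + b11 * x1**2 + b12 * x1 * x2 + b2 * x2

# Run 1 level settings and assumed random setpoint errors (standard deviations)
flow_setpoint, organic_setpoint = 0.30, 80.0        # mL/min, % organic
flow_sd, organic_sd = 0.01, 1.0

n = 100_000
flow = rng.normal(flow_setpoint, flow_sd, n)
organic = rng.normal(organic_setpoint, organic_sd, n)
rs = rs_model(flow, organic)                        # predicted Rs distribution

usl_lsl_width = 2.0                                 # e.g., predicted mean Rs +/- 1.00
ci_width = 6.0 * rs.std()                           # +/- 3-sigma predicted variation
cp = usl_lsl_width / ci_width
print(f"mean Rs = {rs.mean():.2f}, 6-sigma width = {ci_width:.2f}, Cp = {cp:.2f}")
```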

Step 4 Repeat Steps 1 - 3 for all Experiment Runs


[Figures: Mean Resolution map and Resolution Robustness map.]

Robustness can be calculated for each key performance metric.


78

Step 5 Generate Prediction Model of Robustness Cp

Mean Performance and Robustness prediction models can be linked to a


numerical or graphical optimizer to identify one or more combinations of
variable level settings that meet or exceed mean performance and
performance robustness goals defined for each response.

79

Design Space Mean Performance

Edges of Failure

80

Design Space Mean Performance + Robustness

81

Experimentally Verify Predicted Optimum Conditions.


[Figure: Trial-and-Error - Final Chromatogram: run time approximately 20 minutes.]

[Figure: QbD Approach - Final Chromatogram: all peaks elute by about 5.7 minutes (API 1 at 4.705 min, API 2 at 5.198 min, Imp A at 4.274 min, Imp B at 4.458 min, last peak at 5.695 min).]
82

Are we doing QbD?

A Best Practices Checklist

1. Does my study address all critical parameters in combination?
2. Does my study address the entire experimental design region?
3. Does my study address interactions and complex parameter effects?
4. Do my study variable ranges ensure good signal-to-noise ratio?
5. Is my design free of aliasing of study factor effects?
6. Am I making effective use of automation to support my study?
7. Will the results data enable accurate quantitative effects estimation?
8. Am I able to characterize robustness performance?

83

Conclusions
The Fusion AE QbD-based Approach Presented Today:
Greatly facilitates QbD through:
- Automation
- Statistically valid experimentation
- Novel data treatments
Provides quantitative knowledge of all critical parameter effects
Enables establishing Design Space for both:
- Mean Performance (setpoint optimization)
- Process Robustness (operating space)
Required time for the work is dramatically reduced
Success promotes the use of QbD

84

End of Presentation

Thank You for inviting us.


And
Thanks for attending!

85

References
1. ICH Q8 - Guidance for Industry, Pharmaceutical Development, May 2006.
2. ICH Q8(R1), Pharmaceutical Development Revision 1, November 1, 2007.
3. Cornell, John A., Experiments With Mixtures, 2nd Edition, John Wiley and Sons, New York, 1990.
4. Robert, Christian P., and Casella, George, Monte Carlo Statistical Methods, 2nd Edition, Springer Science+Business Media, New York, 2004.
5. Dong, Michael W., Modern HPLC for Practicing Scientists, John Wiley and Sons, Hoboken, New Jersey, 2006.
6. Gavin, Peter F., and Olsen, Bernard A., J. Pharm. Biomed. Anal. 46 (2007), 431-441.
7. Juran, J.M., Juran on Quality by Design, Macmillan, Inc., 1992.
8. Montgomery, Douglas C., Design and Analysis of Experiments, 6th Edition, John Wiley and Sons, New York, 2005.
9. Myers, Raymond H., and Montgomery, Douglas C., Response Surface Methodology, John Wiley and Sons, New York, 1995.
10. Rathore, A.S., Branning, R., and Cecchini, D., Design Space for Biotech Products, BioPharm International, 20(5), April 2007.
11. Rathore, A.S., Saleki-Gerhardt, A., Montgomery, S.H., and Tyler, S.M., Quality by Design: Industrial Case Studies on Defining and Implementing Design Space for Pharmaceutical Processes, Parts 1 and 2, BioPharm International, 21(12), December 2008.
12. Snyder, Lloyd R., Kirkland, Joseph J., and Glajch, Joseph L., Practical HPLC Method Development, 2nd Edition, John Wiley and Sons, New York, 1997.
86
