Monday, 29 June 2009

Dealing with outliers in analytical method validation


Occasionally in validation studies, as in any analysis, outliers may be observed in a data set of results. An outlier is a value which differs considerably from the majority of the other results. For example, the following data set was obtained for a precision study:

25.4, 25.3, 27.5, 24.5, 24.7, 25.6

The value 27.5 is much higher than the nearest value, 25.6, and it is suspected that it may be an outlier. The mean and standard deviation calculated for the data set, both including and excluding the suspected outlier, are presented in Table 1. The mean does not appear to be affected substantially by the suspected outlier, but if the suspect value is included the %RSD does not comply with the acceptance criterion of %RSD ≤ 2% for precision. If it is excluded, the precision complies with the acceptance criterion.

Table 1
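The figures in Table 1 can be reproduced with a short Python snippet (a sketch using the sample standard deviation from the standard library):

```python
import statistics

data = [25.4, 25.3, 27.5, 24.5, 24.7, 25.6]

def summarise(values):
    """Return the mean and %RSD (relative standard deviation)."""
    mean = statistics.mean(values)
    rsd = 100 * statistics.stdev(values) / mean  # sample standard deviation
    return mean, rsd

mean_all, rsd_all = summarise(data)                              # suspect value included
mean_trim, rsd_trim = summarise([x for x in data if x != 27.5])  # suspect value excluded

print(f"Included: mean = {mean_all:.2f}, %RSD = {rsd_all:.1f}%")   # mean = 25.50, %RSD = 4.2%
print(f"Excluded: mean = {mean_trim:.2f}, %RSD = {rsd_trim:.1f}%") # mean = 25.10, %RSD = 1.9%
```

The means differ by only 0.4, while the %RSD more than doubles when the suspect value is included, which is why it fails the ≤ 2% criterion.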

Statistical tests can be performed which provide confidence in the characterisation of a data point as an outlier. However, a decision still has to be made about whether to exclude the data point from the results. Some statisticians object to the rejection of any data from a small sample unless it is known that something went wrong during the measurement of that data point. Rejection of data during validation studies must therefore be very carefully considered; statistical evidence alone may not be enough to justify it.

The most popular statistical test applied to detect suspect outliers in the results from chemical analysis is Dixon’s Q-test. One (and only one) observation from a small set of replicate observations (typically 3 to 10) can be examined. The test assumes a normal distribution of the data. The null hypothesis for this test is that there is no significant difference between the suspect value and the rest of the values; any differences are attributed to random errors.

The test is applied as follows:

1. The values comprising the data set are arranged in ascending order:
X1, X2, X3, ........ Xn

e.g. 24.5, 24.7, 25.3, 25.4, 25.6, 27.5

2. The experimental Q-value is calculated, defined as the ratio of the difference between the suspect value and the value nearest to it, to the range of the data.

If the suspected outlier is a low value:

Qexp = (X2 – X1)/(Xn – X1)

If the suspected outlier is a high value:

Qexp = (Xn – Xn-1)/(Xn – X1)

e.g. Qexp = (27.5 – 25.6)/(27.5 – 24.5) = 0.633

3. The value of Qexp is compared to a critical Q-value (Qcrit). Refer to Table 2 for critical values of Q for Dixon’s test, from work by Rorabacher[1]. The value of Qcrit corresponding to the confidence level required for the test is selected, usually 95%.

e.g. Qcrit = 0.625 (95% confidence level)

4. If Qexp > Qcrit, then the suspect value can be characterised as an outlier.

e.g. Qexp > Qcrit, therefore data point 27.5 can be characterised as an outlier.

Table 2
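The four steps above can be sketched in Python. The critical value of 0.625 for n = 6 at the 95% confidence level is taken from the worked example; for other sample sizes the appropriate Qcrit must be looked up in Table 2:

```python
def dixon_q(values):
    """Return Qexp for the more extreme end of the sorted data set."""
    x = sorted(values)                     # step 1: arrange in ascending order
    data_range = x[-1] - x[0]
    q_low = (x[1] - x[0]) / data_range     # suspect low value
    q_high = (x[-1] - x[-2]) / data_range  # suspect high value
    return max(q_low, q_high)              # step 2: experimental Q-value

data = [25.4, 25.3, 27.5, 24.5, 24.7, 25.6]
q_exp = dixon_q(data)
q_crit = 0.625  # n = 6, 95% confidence level (from Table 2)

print(f"Qexp = {q_exp:.3f}")                           # Qexp = 0.633
# steps 3 and 4: compare Qexp to Qcrit
print("Outlier" if q_exp > q_crit else "Not an outlier")  # Outlier
```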

The problems associated with the Q-test are that it may be misleading if more than one outlier is present, and that if an outlier is detected a decision must still be made about whether to exclude the data point from further statistical calculations. Methods which are more robust than the Q-test, such as the Huber method (described in AMC technical brief no. 6[2]), are becoming increasingly favoured for the treatment of outliers since they consider all of the data in the set, not only three data points as in the Q-test. Robust statistics use approaches such as the median and the median absolute deviation to estimate the mean and standard deviation respectively. In this way the outlying data has little effect and does not have to be rejected. Any approach used to deal with outliers must be justified fully in the validation report.

1. D. B. Rorabacher, ‘A statistical treatment for rejection of deviant values: critical values of Dixon’s “Q” parameter and related subrange ratios at the 95% confidence level’, Anal. Chem., 63, 139–146, 1991.
2. Analytical Methods Committee, AMC Technical Brief, No. 6, 2007, ‘Robust statistics: a method of coping with outliers’ (available on the RSC website).

This blog post is an excerpt from 'Validation of Analytical Methods for Pharmaceutical Analysis' by Oona McPolin, available to purchase through the MTS website.

Thursday, 11 June 2009

The June edition of 'Analyse This' is out now

Click on the logo to view:

Previous editions (click to view):

April 2009
May 2009

Any comments, questions, suggestions, contributions? Let us know, you could win an iPod Shuffle.

Wednesday, 10 June 2009

Free HPLC training video

This HPLC training video describes the different parts of an HPLC instrument. A picture of how the whole system operates is built up gradually by introducing each part and defining the role that it plays.

You are welcome to use this video in your HPLC training programmes. It is an example of the resources included in UTrain, an MTS product which enables you to easily deliver your training in-house. A series of videos are used to deliver the knowledge part of the course. These are accompanied by a workbook for each learner which details exercises and practical experiments that give learners an opportunity to apply what they have learned. Then e-Learning modules are used to review the learning and administer an assessment to test whether the learning has been absorbed. The videos and e-Learning modules are accessed through e-MTS, the virtual environment for learning provided by MTS.

The first UTrain course deals with basic HPLC and is due to be released later this year. It is based on the MTS course, An Introduction to HPLC for Pharmaceutical Analysis, recognised by the Royal Society of Chemistry for the purposes of continuing professional development. If you have any queries about UTrain then please let us know.

And the winner is...

Congratulations to James from Manchester in England who has won an iPod Shuffle for submitting this month's question for Ask Lab Tech Guy. James is a keen runner and says that his iPod shuffle will help to keep him motivated when he is out running.

James is mostly listening to Van Morrison at the moment. What will you listen to on your iPod Shuffle? Send any contributions for Analyse This to
You need to be subscribed to Analyse This to be eligible.

Monday, 8 June 2009

The training cycle explained

Training know-how applied to laboratory science

Previously, it was established that most people who work in a scientific laboratory environment have some responsibility for providing training (see blog post dated 8th May 2009, I’m a trainer? But that’s not in my job description. Is it?). Also the training cycle was introduced. In this article we will look at the training cycle in a bit more detail. The four steps are as shown.

Step 1: Identifying the learning needs
Identification of learning needs may also be referred to as training needs analysis, or TNA. The remit of this step can be rather broad since it can apply to all the training required by an individual or a group, even a whole company. If you are responsible for a particular type of training provision then typically it will already have been determined that the individuals who come to you for training have been identified as needing the training. Therefore the scope of this step narrows considerably. You need to consider what the learners need to be able to do at the end of the training. For example if you are providing training on an analytical instrument do they need to be able to:
  • Use the instrument
  • Clean the instrument
  • Calibrate the instrument
  • Service the instrument
  • Troubleshoot the instrument?
The complexity of the technique will determine whether all or some of these tasks need to be incorporated into the training that you are planning. You may need to develop training which can be delivered over time as the learner gains experience with the technique.
The outcome of step 1 is a set of learning objectives for the training that specify what the outcome should be.

Step 2: Design the training
When you come to design your training you need to consider how you are going to achieve the learning objectives that you have set. It may be that a practical session where the learner gets hands on experience of using an instrument or piece of equipment is most suitable. In some tasks a theory session where the concepts relating to the task can be fully understood is appropriate.
When designing the training you need to consider the way in which people learn and be aware of the differences in learning styles. In general adults learn well if the training is relevant to what they will do in the workplace so use case studies and realistic exercises in your training. Methods of training that you can consider are: lectures and presentations, demonstrations, exercises, case studies, practical sessions, question and answer sessions, discussion groups and e-learning.
Make sure to consider the time and other resources that you have available for the training. You may need to prepare visual aids such as PowerPoint presentations (much maligned but valuable if used well), and handouts to accompany the training. Try to keep it simple and stick to the point.

Step 3: Deliver the training
The delivery of the training is the step that some people can find daunting, particularly if presenting to groups of learners is required. Some key things to remember when delivering training are:
  • Speak clearly and make sure that all learners are able to understand what you are saying.
  • Ensure that the learners’ expectations are addressed early on in the training.
  • Explain the structure of the training at the start.
  • Review regularly to ensure that the material covered has been understood.
  • Try to deal with questions as they arise but if you don’t know the answer, don’t be afraid to say you will get back to the learner later.
  • Give useful and constructive feedback to learners.
  • Check your timing and have a contingency plan in case some parts of the training take longer than you expected.
  • Deliver the training consistently so that all learners receive the same training.
Step 4: Evaluate the training
When you have gone to the trouble of designing and delivering training then you will want to be sure that it is working. This is the purpose of training evaluation. A common way to evaluate training is to get the learners to complete a form at the end of the training where they give feedback on whether they thought it was useful, what they thought of the facilities and the trainer, etc. This information is very useful and may be used to improve the training in the future but it does not measure what was actually learned by the delegates. To do this some type of assessment is usually administered. This may be a written test, or the learner may have to analyse a sample by implementing the training that they have received. You need to decide what is most appropriate for your training.
To assess long term implementation of learning is more difficult. A number of different approaches are possible but all assess how successfully the learner is completing the task in question. The opinion of the learner, colleagues and managers may be canvassed to obtain a balanced view of how well the training has worked.

In summary, the training cycle provides a structured way for you to approach your training responsibilities. At the core of all successful training programmes are good learning objectives, sometimes known as learning outcomes. In the next instalment of Learning at Work the process of setting realistic and appropriate learning objectives will be discussed.

Thursday, 4 June 2009

Analytical method validation by phase of development


An analytical method which is used in pharmaceutical analysis should always be validated to ensure that the results generated are trustworthy. These results may be used to make critical decisions during the drug development process relating to issues such as the safety of the drug, the synthetic route and the manufacturing process. However, performing formal validation studies as per ICH guidelines requires considerable resources. In the early stages of the development of a drug the investment required for formal validation may not be desirable for a number of reasons, including:

  • The development and optimisation of the analytical methods is ongoing.
  • The development of the synthetic route for the drug substance is ongoing.
  • The development of the formulation of the drug product is ongoing.
  • The drug may not progress into later stages of development.

The ICH guidelines apply to ‘validation of the analytical procedures included as part of registration applications submitted within the EC, Japan and USA’ [1], and thus do not formally apply to the early phases of drug development. The guidelines from the FDA regarding INDs for phase 2 and phase 3 studies [2] require ‘appropriate’ validation data for methods which are not from a pharmacopoeia or official reference standard.

The consequence of this is that it is common for pharmaceutical companies to use a phased approach to analytical method validation studies. Validation performed for early phase drugs tends to be less extensive than that performed for late stage drugs. However the objective of validation still applies, i.e. to demonstrate that a method is suitable for its intended purpose.

The following list provides suggestions for validation of early phase drugs:

  • A formal validation protocol is not yet mandatory. Internal guidelines, or a standard operating procedure (SOP), may be used to summarise the general validation requirements. This may be referenced rather than producing time consuming documentation.
  • The extent of testing and the number of replications may be reduced.
  • Wider acceptance criteria may be adequate in early phases of development.
  • Specificity and evaluation of the quantitation limit are the primary characteristics to ensure that assay and impurity methods meet their intended purposes of potency and safety.
  • A second method to evaluate accuracy is unlikely to exist in the early phase of development, therefore accuracy may be inferred from the results of precision, linearity and specificity.
  • For precision testing synthetic mixtures of drug substance and placebo may be used, rather than authentic samples, thus enabling the combination of accuracy, precision and potentially linearity at the same time.
  • Formal intermediate precision experiments are not yet needed but if different laboratories need to operate the method then the handover will require suitable validation.
  • The evaluation of the detection limit for impurity methods may be delayed.
  • Formal robustness testing is not yet required. Robustness studies associated with method development are likely to be ongoing at this stage.
  • The validation report may be presented in a simplified tabular format, together with the conclusions. This type of summary report fulfils the expectations of the regulatory authorities, e.g. phase 2 and phase 3 INDs.

Further information regarding validation of analytical methods by phase of development is available from Bloch in ‘Method Validation in Pharmaceutical Analysis: A Guide to Best Practice’ [3] and also from Boudreau et al. in a paper developed from a PhRMA 2003 workshop [4].

1. International Conference on Harmonisation (ICH) of Technical Requirements for Registration of Pharmaceuticals for Human Use, Topic Q2 (R1): Validation of Analytical Procedures: Text and Methodology, 2005.
2. Guidance for Industry: INDs for Phase 2 and Phase 3 Studies, Chemistry, Manufacturing, and Controls information, US Department of Health and Human Services, Food and Drug Administration, Center for Drug Evaluation and Research (CDER), 2003.
3. M. Bloch, ‘Validation During Drug Product Development – Considerations as a Function of the Stage of Drug Development’, in ‘Method Validation in Pharmaceutical Analysis, a Guide to Best Practice’, Eds. J. Ermer, J. H. Miller, Wiley, 2005, p243-264.
4. S. P. Boudreau, J. S. McElvain, L. D. Martin, T. Dowling, S. M. Fields, Pharm. Technol., 28 (11), 54-66, 2004, ‘Method Validation by Phase of Development – An Acceptable Analytical Practice’.

This blog post is an excerpt from 'Validation of Analytical Methods for Pharmaceutical Analysis' by Oona McPolin, available to purchase through the MTS website.

Tuesday, 2 June 2009

Priming injections

A resource for chromatographers

For most HPLC analyses a priming injection is required prior to the analysis. This injection will usually result in a slightly different chromatogram to subsequent injections of the same solution.

The reason for this is not fully understood but is probably due to the presence of (at least) two types of active sites on the column for interaction with the analyte molecules. One of these sites equilibrates much more slowly than the other and is gradually saturated with the analyte in the first few injections thus stabilising the response for further injections.

For this reason it is common practice to perform one or more ‘test injections’ before starting the analysis injection sequence. The system suitability test (SST) solution or the calibration standard is commonly used for the test injection. The test injections may be programmed into the injection sequence if the run will be unattended, e.g. when it is started at the end of a working day.

This blog post is an excerpt from 'An Introduction to HPLC for Pharmaceutical Analysis' by Oona McPolin, available to purchase through the MTS website.