Two procedures are used to make sure that an HPLC equipped with a mass spectral detector (MS) is reporting proper mass information and that the system is properly set up to detect the analyte(s) of interest.
These are calibration and tuning – two processes that are sometimes confused. Calibration is the process used to ensure mass accuracy; tuning is used to make sure that the instrument is working well for a particular sample. Most commonly for quantitative analysis, a single- (LC-MS) or triple- (LC-MS/MS) quadrupole MS detector is used, although the principles discussed here also hold for other LC-MS detectors.
Calibration gives us confidence that when the detector reports a mass (or, more properly, a mass-to-charge ratio, m/z) for a compound, the reported value is correct. This means that we need to use a well-defined standard for the procedure. A good calibrant must cover the mass range of interest, be stable, be clean, and leave little residue behind. The most popular calibrant for LC-MS work is polypropylene glycol (PPG). It has distinct, well-defined peaks with masses up to >2000 Da and is stable for months, so you don’t need to keep a sealed ampoule of calibrant in the freezer prior to use. It is clean and leaves little residue behind. I’ve seen cases where new workers did not know how to do anything but calibrate the instrument, so while they were waiting for additional training, they calibrated over and over again, each day for a couple of weeks. They built up enough PPG residue that it had to be baked out of the system to lower the background. But this is the exception rather than the rule – residues from PPG usually are of little concern.
To calibrate the system, a dilute solution of the calibrant is infused into the MS with a syringe pump, usually in a configuration such as that shown in Figure 1. The calibration procedure for the instrument is followed and the software does the rest. A peak is generated when a particular mass-to-charge ratio of PPG passes through the system at a specific combination of instrument settings. The software correlates this peak and its position in the known PPG profile to a specific mass-to-charge ratio. In the future, it can reverse this process: a peak appearing at a specific instrument setting must correspond to the pre-defined mass-to-charge ratio. There is a trade-off between mass resolution and sensitivity, and because the LC-MS or LC-MS/MS system typically is used as a quantitative tool, not a qualitative one, sensitivity is favored over mass resolution. In practical terms, this means that the instruments usually are set up so that they can distinguish one nominal mass from the next (e.g., ±0.5 Da), but fractional masses of more than one decimal place are not available. This is sufficient for quantitative work. For qualitative work that requires accurate mass, a high-resolution instrument, such as a time-of-flight (TOF) MS, is used, and mass accuracy to several decimal places is achieved.
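As a rough illustration of the peak-matching idea (this is a sketch, not vendor software), the software can be thought of as fitting a mapping from the instrument's internal scale positions to the known m/z values of the calibrant peaks, then inverting that mapping for unknowns. The PPG reference masses and "observed positions" below are invented for illustration; real calibrant tables come from the instrument vendor.

```python
# Sketch of mass-axis calibration: fit a straight line from the internal
# scale positions where calibrant peaks appeared to their known m/z values.
# All numbers below are illustrative, not real PPG calibrant masses.

def fit_calibration(observed, known_mz):
    """Least-squares fit of m/z = a * position + b."""
    n = len(observed)
    sx, sy = sum(observed), sum(known_mz)
    sxx = sum(x * x for x in observed)
    sxy = sum(x * y for x, y in zip(observed, known_mz))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def position_to_mz(position, a, b):
    """Reverse the process: report an m/z for a peak at any scale position."""
    return a * position + b

# Hypothetical calibration run: scale positions paired with known m/z values.
positions = [102.0, 204.5, 410.2, 821.9]
known_mz = [100.1, 200.2, 400.4, 800.8]

a, b = fit_calibration(positions, known_mz)
# Future peaks can now be reported as m/z, good to roughly +/-0.5 Da
# on a unit-resolution quadrupole.
```

Real calibration functions are usually more elaborate than a straight line, but the principle of anchoring the mass axis to a known calibrant profile is the same.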
Once the system is calibrated, it should not require re-calibration until some change is made to the instrument. In our lab, we check the calibration once a calendar quarter, but often it does not need to be changed. If we open up the system for maintenance, such as removing the first quadrupole for cleaning, we check the calibration then, as well.
Tuning is a much more common process than calibration. The essence of tuning is that we want the instrument to give the maximum signal possible for our analyte. To do this, we use the same instrumental setup as for calibration (Figure 1), but with a solution of our analyte or internal standard instead of the calibrant. Because LC-MS relies on atmospheric-pressure ionization (most commonly electrospray or atmospheric-pressure chemical ionization), the chemical environment in the LC-MS interface influences how ions are generated. So not only are the interface settings important (voltages, flow rates, temperatures, vacuum, and so forth), but the chemical composition of the mobile phase from the HPLC also influences analyte ionization. For this reason, it is best to tune the instrument with the HPLC system pumping the mobile phase used for the desired method.
In the “good old days,” one had to adjust the settings for each part of the MS system manually to get the desired signal. This was tedious and took a certain level of skill. Today, all LC-MS instrumentation has an autotune function that simplifies the process. The infusion is started, then the autotune procedure is run. The instrument automatically adjusts all the voltages and settings from the MS inlet through to the detector. This process is repeated several times until the maximum signal is obtained at the detector. This group of instrument settings is saved in a tune file and is used for analysis.
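The autotune logic can be sketched as a simple hill climb: step each setting up and down, keep any change that raises the detector signal, and repeat the sweep until nothing improves. The signal function and setting names below are made-up stand-ins for a real detector response and real instrument parameters.

```python
# Toy autotune: coordinate-wise hill climbing over instrument settings.
# The response surface is invented; it peaks at capillary_v=3500, cone_v=35.

def simulated_signal(settings):
    return 1e6 - (settings["capillary_v"] - 3500) ** 2 \
               - 100 * (settings["cone_v"] - 35) ** 2

def autotune(settings, steps, signal_fn, passes=5):
    best = signal_fn(settings)
    for _ in range(passes):               # "repeated several times"
        improved = False
        for name, step in steps.items():  # adjust each setting in turn
            for delta in (step, -step):
                trial = dict(settings, **{name: settings[name] + delta})
                s = signal_fn(trial)
                if s > best:
                    settings, best, improved = trial, s, True
        if not improved:                  # maximum signal reached
            break
    return settings, best                 # the "tune file" to save

tune, signal = autotune({"capillary_v": 3000, "cone_v": 20},
                        {"capillary_v": 100, "cone_v": 5},
                        simulated_signal)
```

Real autotune routines are far more sophisticated (ramping voltages, scanning stepwise through the ion path), but the iterate-until-the-signal-stops-improving structure is the same.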
Once the instrument is tuned for a particular set of method conditions, it should not change, so tuning is not required each time the method is run. However, the tune may drift over time as the instrument becomes contaminated or for some other reason. Although one could check the tune on some scheduled basis or each time the method is set up, it is easier to let the system suitability test determine whether re-tuning is required. Part of the system suitability test should be a determination of sufficient signal-to-noise at the limit of detection (LOD) and lower limit of quantification (LLOQ, or LOQ) to generate data with acceptable precision. For example, you might determine that you need to see 600 area counts at the LLOQ with a signal-to-noise ratio of 25 for the method to work properly. When these conditions are no longer met, you might want to check the tune or check for other factors that might degrade performance, such as contamination of the interface or the first quadrupole.
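The suitability gate just described amounts to two threshold checks on each LLOQ injection. A minimal sketch, using the example thresholds from the text (600 area counts, signal-to-noise of 25); the injection values themselves are invented:

```python
# System-suitability gate at the LLOQ: the method is considered healthy
# only while the LLOQ standard gives enough area counts AND enough
# signal-to-noise. Thresholds follow the example in the text.

def suitability_ok(area_counts, signal, noise,
                   min_area=600.0, min_snr=25.0):
    """Return True if the LLOQ injection passes both criteria."""
    snr = signal / noise
    return area_counts >= min_area and snr >= min_snr

# Hypothetical LLOQ injections: a passing one, and a degraded one that
# would prompt a tune check or a look for interface contamination.
print(suitability_ok(area_counts=750, signal=5200, noise=180))  # passes
print(suitability_ok(area_counts=430, signal=2100, noise=150))  # fails
```

Running this check with every batch means the tune is re-examined only when performance actually degrades, rather than on an arbitrary schedule.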
So once again, calibration is the process that ensures that the instrument is reporting accurate masses, whereas tuning is used to make sure that adequate signal intensity is obtained for specific analytes.