
    Reference Standard Quality in NMR Spectroscopy

    Because many conclusions drawn from NMR spectroscopy require optimal performance, or at least a gauge of current spectrometer performance, spectroscopists have come to rely on the spectra of carefully prepared chemical mixtures to evaluate and compare performance characteristics. These mixtures, called Reference Standards or Test Samples, are prepared in a wide variety of recipes to cover different nuclei, concentration ranges, performance criteria, and sample or tube sizes, among other variables. Nearly 1,000 such combinations have been or are currently commercially available. Selecting the right standard from this range to measure or calibrate instrument performance isn't always easy. This report provides additional information about reference standards that can help in selecting the right one for your application.

    Commonly Measured Performance Criteria
    Since NMR is an inherently insensitive technique compared to many other analytical techniques, there has been a concerted effort in the NMR community to improve sensitivity. Higher magnetic fields, higher-sensitivity probes, and improved RF components have all contributed to a marked increase in the performance of NMR spectrometers over the last three decades, during which a 600-fold increase in sensitivity has been achieved. It isn't surprising, then, that the most common application of Reference Standards is the measurement of single-pulse Sensitivity, or Signal-to-Noise (S/N), in NMR spectrometers. Naturally, standards have changed, and continue to change, with improvements in instrument performance, usually toward lower concentrations of the critical components of the mixture.
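
    As an illustration of what such a measurement looks like in practice, the sketch below computes a single-pulse S/N figure from a digitized spectrum using the convention commonly quoted on spectrometer specification sheets, 2.5 times the peak height divided by the peak-to-peak noise in a signal-free region. The function name and index ranges are illustrative; the exact regions and convention should be taken from the instructions supplied with your standard and instrument.

        import numpy as np

        def single_pulse_sn(spectrum, signal_region, noise_region):
            """Illustrative S/N estimate for a single-pulse spectrum.
            spectrum: 1-D array of real, phased, baseline-corrected intensities.
            signal_region / noise_region: slices selecting the test peak and a
            signal-free noise window defined by the standard's instructions."""
            peak_height = spectrum[signal_region].max()
            noise = spectrum[noise_region]
            noise_pp = noise.max() - noise.min()      # peak-to-peak noise
            return 2.5 * peak_height / noise_pp       # common spec-sheet convention

        # e.g. sn = single_pulse_sn(spec, slice(51200, 51500), slice(1000, 21000))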

    In addition to sensitivity, the other commonly measured performance criterion is spectrometer resolution. Resolution determines your ability to discern closely spaced resonances. Poor resolution can also reduce apparent sensitivity, since peak broadening reduces signal height. Resolution is normally reported as the single-pulse peak width at half height, measured on a naturally narrow NMR line such as that of 1% Chloroform in proton NMR spectroscopy.
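
    As a concrete illustration of that definition, the short sketch below estimates the width of a single, baseline-corrected line at half its height by linear interpolation. The function name and the assumption of a frequency axis in Hz are mine, not part of any vendor procedure.

        import numpy as np

        def width_at_half_height(freq_hz, spectrum):
            """Peak width at half height (in Hz) for one well-resolved line.
            freq_hz: frequency axis in Hz; spectrum: real intensities."""
            i_max = int(np.argmax(spectrum))
            half = spectrum[i_max] / 2.0

            def half_height_crossing(step):
                # walk away from the maximum until the intensity drops below
                # half height, then interpolate between the bracketing points
                i = i_max
                while 0 < i < len(spectrum) - 1 and spectrum[i + step] > half:
                    i += step
                x0, x1 = freq_hz[i], freq_hz[i + step]
                y0, y1 = spectrum[i], spectrum[i + step]
                return x0 + (half - y0) * (x1 - x0) / (y1 - y0)

            return abs(half_height_crossing(+1) - half_height_crossing(-1))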


    Other Spectrometer Performance Criteria
    Because spectrometers must maintain stability through long-term accumulations, there are standards, usually consisting of small concentrations of biological molecules stabilized against microbial degradation, that are used to define resolution stability and multiple-pulse sensitivity. Examples include 0.002M Cholesterol Hydrochloride for 1H spectrometry and 0.1M Sucrose for 13C spectrometry. Since many nuclei are observed proton-decoupled, reference standards are also available for decoupled and, sometimes, undecoupled sensitivity, so you can measure instrument performance in the mode that most closely matches your experiments.
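
    For context on the multiple-pulse measurement, ideal signal averaging improves S/N as the square root of the number of co-added scans; a stable spectrometer should approach this scaling, and shortfalls point to drift or temperature instability. A minimal sketch of the ideal case, with purely illustrative numbers:

        import math

        def expected_multiscan_sn(single_scan_sn, n_scans):
            """Ideal multi-scan S/N: the coherent signal grows as N while random
            noise grows as sqrt(N), so S/N improves as sqrt(N)."""
            return single_scan_sn * math.sqrt(n_scans)

        # A hypothetical standard giving S/N = 30 in one scan should approach
        # 30 * sqrt(256) = 480 after 256 ideally co-added scans.
        print(expected_multiscan_sn(30, 256))   # 480.0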

    There are now sensitivity standards for indirect detection, too. Standards with 13C-enriched Iodomethane and 15N-enriched Benzamide are used for short-term or single-pulse sensitivity in inverse-detection experiments. Reference standards are also readily available for calibrating NMR probe variable-temperature controllers. These 'Chemical Shift Thermometers' exploit the temperature dependence of the chemical-shift separation between resonances in certain compounds, such as Methanol and Ethylene Glycol.
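
    To show how such a thermometer is used, the sketch below evaluates a quadratic calibration relating the shift separation between the two thermometer resonances to sample temperature, in the spirit of the published Van Geet-type equations for Methanol and Ethylene Glycol. The coefficients shown are placeholders only; the actual values must come from the calibration supplied with the standard.

        def temperature_from_shift(delta_ppm, coeffs):
            """Evaluate T(K) = c0 + c1*d + c2*d**2, where d is the chemical-shift
            separation (ppm) between the thermometer resonances (OH vs. CH3 for
            Methanol, OH vs. CH2 for Ethylene Glycol).
            coeffs: (c0, c1, c2) from the calibration sheet for your standard;
            the numbers used below are placeholders, not published constants."""
            c0, c1, c2 = coeffs
            return c0 + c1 * delta_ppm + c2 * delta_ppm ** 2

        PLACEHOLDER_COEFFS = (400.0, -30.0, -20.0)   # illustrative values only
        print(temperature_from_shift(1.50, PLACEHOLDER_COEFFS))   # 310.0 K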

    Reference Standard Quality
    The purity of the chemicals used to prepare Reference Standards is an important consideration in selecting the right standard for your application. The concentration of the signal-generating ingredient(s), or solute(s), has always been important, especially for sensitivity standards. As an example, consider a standard with a solute purity of only 95%. Sensitivity measured using this standard will be 5% lower than that measured with a standard whose solute purity is 100%. If impurities come only from the solvent, there is no reduction in measured S/N, but it may be impossible to tell whether the source of an impurity is the solute or the solvent. Thus, the presence of impurities in sensitivity standards has always been cause for concern. Impurities in resolution standards are a problem only if impurity resonances interfere with the resonances of the solute.
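
    Put as arithmetic, the measured S/N simply scales with the fraction of signal-generating solute actually present, as in this minimal sketch (the linear-scaling assumption and the example numbers are illustrative):

        def sn_with_impure_solute(nominal_sn, solute_purity):
            """S/N scales with the amount of signal-generating solute present,
            so a 95%-pure solute yields ~95% of the nominal S/N. Solvent-borne
            impurities do not reduce the signal, though they may add extra peaks."""
            return nominal_sn * solute_purity

        print(sn_with_impure_solute(300.0, 0.95))   # 285.0, a 5% shortfall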

    As spectrometer performance has improved, the concentrations of solutes in Reference Standards have been reduced steadily. This has placed greater importance on chemical purity in the lowest-concentration standards, particularly on the purity of the deuterated solvent. While it may be acceptable to use the best commercially available deuterated solvent for a standard containing 40% solute, extensive purification of solvents is essential for impurity-free sensitivity standards, such as 0.1% Ethylbenzene. Such purifications, which must be performed under a controlled atmosphere to preserve deuterium purity, also add to the cost of standards.


    Proper NMR Standard Preparation
    Since resolution and line shape can be affected by relaxation mechanisms, paramagnetic oxygen has been completely removed from most reference standards. When the tube is sealed off, the seal must be symmetrical, since the tip-off can affect spinning stability. Failure to perform this final step precisely ruins the standard and reduces the yield, increasing manufacturing costs and the consumer's price.

    NMR tube quality is also important. Modulation sidebands arising from poor tube quality make measurements unreliable. All Wilmad reference standards are prepared in 535-PP grade (>600 MHz) NMR tubes.