Categorization of the Types of Errors Affecting Accuracy and Precision

Now that we have examined many of the factors that affect accuracy and precision, we will categorize the types of errors that result from those factors and mention some practical implications.  In general, there are several sub-types of errors, and the individual factors we have already discussed can sometimes belong to more than one category.  The sub-types stem from the three main sources of error, namely instrumental issues, sample- and preparation-related sources, and user error.  The sub-types and examples of errors are discussed as follows:

Random Errors – Random errors cannot be predicted for any individual measurement, nor can they generally be corrected.  They are like lightning strikes, in that their timing, intensity and location cannot be predicted well.  Random errors in microanalysis include factors such as:

  • X-ray generation – There is no guarantee that a given electron-atom collision will result in the emission of a characteristic X-ray.  Only incident electrons that retain enough kinetic energy, and that interact closely enough with a target atom, can ionize an inner electron shell such that a characteristic X-ray may be produced.  Incident electrons that are decelerated in random, glancing interactions with target atoms instead produce non-characteristic ‘continuum’ (bremsstrahlung) X-rays, with energies extending almost up to that of the incident beam (i.e., the accelerating voltage) and intensities that decrease toward that limit.  Other phenomena also occur during random electron collisions, such as the generation of cathodoluminescent photons, phonons (lattice vibrations), Auger electrons, electron capture and specimen charging.  Because each of these events is an independent, random occurrence, the X-ray counts we accumulate are inherently statistical in nature (see the counting-statistics sketch at the end of this section).

  • Electronic instrumental instability – We have discussed this issue previously, but a few points bear repeating.  X-ray intensity peaks are actually 3-dimensional, and we sample a 2-dimensional slice of those 3-D objects/peaks.  The orientation of the slice we sample is governed in part by the orientation of the diffractor crystals and detectors in our spectrometers, which are in turn partly controlled by the alignment of the ‘baseplate’ in the spectrometer; in most JEOL microprobes, the baseplate is a large brass plate to which the moving parts of the spectrometer are attached.  If the alignment of the baseplate changes between the time the standards were last calibrated and the time the current samples are analysed, the 2-D X-ray ‘slice’ we sample may change shape and orientation, resulting in erroneous X-ray intensities and degrading both accuracy and precision.  The baseplate may shift alignment if the spectrometer is bumped, for example, resulting in the generation of random errors.

  • User error – This is one of the most subjective errors to judge, as it varies from individual to individual.  For example, an inexperienced user may select ‘poor’ points, e.g., pitted points, poorly polished points, chemically altered points, grains that are too small, or points that are too close to a grain margin.  The most common user error is mis-setting the optical focus of the sample; correct focus places the sample surface at the height for which the TOA is optimized and the Bragg diffraction condition is satisfied for the detection of in-phase X-rays (see the Bragg’s-law sketch at the end of this section).  Unfortunately, errors in focussing the optical microscope may arise because the user needs glasses, or because the user is overtired or even ill.  The human factor is the most difficult one to rectify, but it cannot be ignored, despite the assertions of a prideful user or analyst.

  • Surface roughness – We have discussed this issue previously, but it bears repeating here: surface roughness on the micron scale can compromise the ability to focus the optical microscope well, and may therefore introduce random error into the intensity data.

  • Two or more phases within the excitation volume – The secondary, backscattered and topographic electron images generated by the probe only see physical phenomena at the surface of a sample, and are unable to reveal sub-surface features such as inclusions, shallow grain boundaries or internal fractures and flaws.  It is therefore not uncommon for a given excitation volume to contain more than one compositional entity, resulting in a composite analysis (a first-order mixing sketch is given at the end of this section).  Also, internal flaws such as fractures, dislocations, coalesced voids and convoluted grain boundaries will deflect incident electrons anomalously and scatter or absorb the emitted X-rays anomalously, resulting in erroneous intensity data.

  • Secondary fluorescence – Secondary X-ray fluorescence caused by primary X-rays, including self-absorption and re-emission, is generally a randomly occurring phenomenon and is not readily corrected.  One example of false secondary fluorescence is that caused by radioactive elements in the sample – radiation from U, for example, may be energetic enough to induce secondary fluorescence in surrounding, lighter elements.

  • Counting errors – Counting errors can arise owing to random electronic phenomena in the signal-processing hardware, and even within the detector itself; for example, stray cosmic rays can be registered by the detector and reported as analytical data.  This may sound trivial, but it can be a common effect when operating a microprobe near the earth’s magnetic pole.  I have seen this effect personally: when I worked at the University of Manitoba, the lab was hit by a burst of cosmic rays while the aurora borealis was active.  Even in the absence of such events, the counts accumulated in any finite counting time are subject to Poisson counting statistics (see the sketch below).
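
Because both the generation of characteristic X-rays (first bullet above) and the events registered by the detector are independent, random occurrences, the accumulated counts in any measurement follow Poisson statistics, and the best relative precision obtainable from counting alone is 1/√N.  The short Python sketch below illustrates this; the count rate and counting time are hypothetical values chosen purely for illustration.

    import numpy as np

    rng = np.random.default_rng(seed=0)

    # Hypothetical peak count rate and dwell time for one WDS measurement.
    true_rate_cps = 500.0    # assumed characteristic-peak count rate (counts per second)
    count_time_s = 20.0      # assumed counting time on the peak (seconds)
    expected_counts = true_rate_cps * count_time_s

    # X-ray emission and detection are independent random events, so the
    # accumulated counts are Poisson-distributed.  Simulate many repeat
    # measurements of the same spot under identical conditions.
    counts = rng.poisson(expected_counts, size=10_000)

    # The relative precision attainable from counting statistics alone is 1/sqrt(N).
    predicted_rsd = 1.0 / np.sqrt(expected_counts)
    observed_rsd = counts.std() / counts.mean()
    print(f"expected counts          : {expected_counts:.0f}")
    print(f"predicted RSD (1/sqrt N) : {predicted_rsd:.4f}")
    print(f"observed RSD             : {observed_rsd:.4f}")

Note that doubling the counting time (or beam current) doubles N but improves the relative precision only by a factor of √2, which is why successive gains in precision become progressively more expensive.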
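The dependence of the detected wavelength on diffraction geometry, mentioned under ‘User error’, can be made concrete with Bragg’s law, nλ = 2d sinθ.  The sketch below uses illustrative values for Fe Kα on an LiF(200) diffractor to show how a small angular error, such as that introduced when a mis-focused sample sits off the spectrometer’s focusing (Rowland) circle, shifts the wavelength the spectrometer actually samples.

    import math

    # Illustrative values: Fe K-alpha analysed with an LiF(200) diffractor.
    d_spacing_nm = 0.2013    # LiF(200): 2d is approximately 0.4027 nm
    fe_ka_nm = 0.1937        # Fe K-alpha wavelength (nm)

    # First-order Bragg condition: lambda = 2 * d * sin(theta)
    theta = math.asin(fe_ka_nm / (2.0 * d_spacing_nm))
    print(f"Bragg angle for Fe K-alpha on LiF(200): {math.degrees(theta):.2f} degrees")

    # Sensitivity of the sampled wavelength to a small angular error d(theta):
    #   d(lambda) = 2 * d * cos(theta) * d(theta)
    dtheta_rad = 0.001       # assumed 1 milliradian angular error
    dlambda_nm = 2.0 * d_spacing_nm * math.cos(theta) * dtheta_rad
    print(f"Wavelength shift per 1 mrad of angular error: {dlambda_nm * 1000:.3f} pm")

Shifts of this size are not negligible compared with the width of a WDS peak, which is why consistent optical focusing of standards and unknowns matters.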
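The effect of a second phase hidden within the excitation volume can be illustrated with a first-order mass-balance mixture.  The compositions and mixing fraction in the Python sketch below are hypothetical, and real composite analyses are further complicated by matrix (ZAF) effects between the two phases, which are ignored here.

    # First-order sketch of a composite analysis when the excitation volume
    # straddles two phases.  Compositions (wt%) and the mixing fraction are
    # assumed values for illustration; matrix (ZAF) effects are ignored.
    olivine = {"SiO2": 40.8, "MgO": 49.4, "FeO": 9.8}                    # assumed host olivine
    chromite = {"Cr2O3": 55.0, "FeO": 25.0, "Al2O3": 12.0, "MgO": 8.0}   # assumed sub-surface inclusion

    def mix(phase_a, phase_b, mass_fraction_a):
        """Mass-weighted mixture of two phase compositions given as wt% dictionaries."""
        oxides = set(phase_a) | set(phase_b)
        return {ox: mass_fraction_a * phase_a.get(ox, 0.0)
                    + (1.0 - mass_fraction_a) * phase_b.get(ox, 0.0)
                for ox in oxides}

    # If 15% of the excited mass belongs to the hidden inclusion, the
    # reported "olivine" analysis is visibly contaminated:
    composite = mix(olivine, chromite, mass_fraction_a=0.85)
    for oxide, wtpct in sorted(composite.items()):
        print(f"{oxide:6s} {wtpct:5.1f} wt%")

Anomalous concentrations of elements that should be absent from the host phase (here, Cr and Al in olivine) are often the first hint that the excitation volume has sampled more than one compositional entity.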