Now that we have examined the causes of statistical errors, in both accuracy and precision, in some detail, we will now discuss counting statistics themselves: how they are derived from our intensity data, and how they allow us to attach significance to our analyses. In essence, we are going to look at how we use statistics to tell us how good or bad our data really are. The following sections assume that the reader already has a basic understanding of statistics, and do not spend time on derivations of basic terms.
For users whose data were processed with the JEOL software, the following section describes the algebraic and statistical treatment of intensity data used to yield standard deviations and detection limits.
- Algebraic definition of ‘Standard Deviation’ in the JEOL software (S.D. = σ), expressed as %:

\[
\mathrm{S.D.}(\%) = \frac{100}{I_{net}} \sqrt{ \frac{I_{peak}}{t_{peak}} + \left(\frac{L_{PBH}}{L}\right)^{2} \frac{I_{PBL}}{t_{PBL}} + \left(\frac{L_{PBL}}{L}\right)^{2} \frac{I_{PBH}}{t_{PBH}} }\,,
\qquad
I_{net} = I_{peak} - \left( I_{PBL}\,\frac{L_{PBH}}{L} + I_{PBH}\,\frac{L_{PBL}}{L} \right)
\]

- where Ipeak is the intensity of the X-rays measured at the peak position in question, expressed in counts/second; IPBL and IPBH are the intensities of the background at the lower and higher background positions, respectively, expressed in counts/second; LPBL and LPBH are the background offset positions (i.e., how far the backgrounds were measured away from the peak position), expressed in mm; Inet is the background-corrected intensity of the measured X-ray peak; tpeak is the counting time for the X-ray peak; tPBL and tPBH are the counting times for the lower and higher backgrounds, respectively; L is the total distance between the lower and higher background positions.
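The counting error defined by the terms above can be evaluated in a few lines of Python. This is a sketch of standard Poisson error propagation with a linearly interpolated background, not the JEOL implementation itself, and every numeric input below is invented for illustration.

```python
import math

def sd_percent(i_peak, t_peak, i_pbl, t_pbl, i_pbh, t_pbh, l_pbl, l_pbh):
    """One-sigma counting error of the net intensity, expressed as a
    percentage of Inet. Intensities in counts/s, times in s, offsets in mm.
    """
    l_total = l_pbl + l_pbh
    # Background under the peak, linearly interpolated between the two offsets.
    i_bkg = i_pbl * (l_pbh / l_total) + i_pbh * (l_pbl / l_total)
    i_net = i_peak - i_bkg
    # For Poisson counting, Var(rate) = rate / counting time.
    var_net = (i_peak / t_peak
               + (l_pbh / l_total) ** 2 * i_pbl / t_pbl
               + (l_pbl / l_total) ** 2 * i_pbh / t_pbh)
    return 100.0 * math.sqrt(var_net) / i_net

# Hypothetical measurement: 30 s on peak, 15 s on each background,
# symmetric 2.5 mm offsets.
print(round(sd_percent(520.0, 30, 22.0, 15, 18.0, 15, 2.5, 2.5), 2))  # → 0.85
```

Note how the peak term dominates the variance here; lengthening the background counting times helps little once the peak itself is the noisiest measurement.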
- Detection Limit – The detection limit is defined as the smallest signal that is distinguishable from the background noise; in general, the net signal must exceed the background noise by at least 3σ. The JEOL software calculates the Detection Limit (D.L.) as follows (quoted at 1σ by default):

\[
\mathrm{D.L.} = n \, \frac{mass(\%)_{STD}}{I_{netSTD}} \sqrt{\frac{I_{back}}{t_{back}}}\,, \qquad n = 1 \text{ by default}
\]
- where INetSTD is the net intensity of a given X-ray in the standard; mass(%)STD is the mass % of the appropriate element in the standard; Iback is the average intensity of the lower and higher backgrounds; tback is the total counting time of the lower and higher backgrounds.
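The D.L. calculation from the standard-based terms above can likewise be sketched in Python. The n-sigma multiplier is exposed as a parameter because the software default of 1σ is often not the level at which you want to quote a limit; the function name and all numbers are illustrative, not part of the JEOL software.

```python
import math

def detection_limit(mass_pct_std, i_net_std, i_back, t_back, n_sigma=1.0):
    """Detection limit in mass %, quoted at n_sigma (1 sigma by default).

    mass_pct_std : mass % of the element in the standard
    i_net_std    : net intensity of the X-ray line on the standard (counts/s)
    i_back       : mean of the lower and higher background intensities (counts/s)
    t_back       : total counting time of both backgrounds (s)
    """
    # Sensitivity (mass % per count/s) from the standard, times the
    # one-sigma noise of the background rate, scaled to n sigma.
    return n_sigma * (mass_pct_std / i_net_std) * math.sqrt(i_back / t_back)

# Hypothetical standard: 40 mass % giving 2000 counts/s net; 20 counts/s
# background counted for a total of 30 s.
dl_1s = detection_limit(40.0, 2000.0, 20.0, 30.0)             # 1-sigma default
dl_3s = detection_limit(40.0, 2000.0, 20.0, 30.0, n_sigma=3)  # 3-sigma limit
print(round(dl_1s, 4), round(dl_3s, 4))
```

The 3σ value is simply three times the 1σ value, which is why a software-default 1σ D.L. understates the concentration actually needed for a defensible detection.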
- Lower Limit of Determination (LLD) – The LLD is defined as the lowest amount of analyte in a sample that can be quantitatively determined with a stated, acceptable precision and accuracy under stated experimental conditions. The LLD is generally quoted at 6σ for most analytical methods. The JEOL software does not use this parameter; instead it calculates a D.L. at a software default of 1σ. This can be misleading: a user may attach undue significance to an element concentration that carries an overly optimistic D.L. based on single-digit net-intensity counts that may not even exceed 3σ of the background noise. The LLD is mentioned here mainly because most users do not take it into account, treating the D.L. as the be-all and end-all without assigning any statistical significance to it.
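The screening step implied above, checking a reported concentration against a properly scaled limit rather than the software's 1σ D.L., can be sketched as follows. The function and its inputs are hypothetical; the 1σ D.L. value is invented for illustration.

```python
def significant_at(concentration, dl_1sigma, n_sigma=6.0):
    """Screen a reported concentration (mass %) against an n-sigma limit
    built from the software's 1-sigma D.L. (6 sigma is the conventional
    LLD level; 3 sigma is the conventional detection threshold)."""
    return concentration >= n_sigma * dl_1sigma

# With a hypothetical 1-sigma D.L. of 0.016 mass %, a reported 0.05 mass %
# clears the 3-sigma detection threshold but not the 6-sigma LLD.
print(significant_at(0.05, 0.016, n_sigma=3))  # True
print(significant_at(0.05, 0.016, n_sigma=6))  # False
```

A concentration that fails the 6σ test may still be detected, but it should not be reported as quantitatively determined.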