Validating the unit correctness of spreadsheet programs

Unless otherwise stated, all work is to be carried out in accordance with organisation standards and procedures.

Validation may be strict (such as rejecting any address that does not have a valid postal code) or fuzzy (such as correcting records that partially match existing, known records).
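The two styles can be sketched as follows. This is a minimal illustration, not a production validator: the reference sets (`KNOWN_POSTAL_CODES`, `KNOWN_CITIES`) and the similarity cutoff are hypothetical, and the fuzzy match uses Python's standard-library `difflib` as a stand-in for a real record-matching engine.

```python
import difflib
from typing import Optional

# Hypothetical reference data for the example.
KNOWN_POSTAL_CODES = {"90210", "10001", "60601"}
KNOWN_CITIES = ["Los Angeles", "New York", "Chicago"]

def validate_strict(postal_code: str) -> bool:
    """Strict validation: reject any record whose postal code is not known."""
    return postal_code in KNOWN_POSTAL_CODES

def validate_fuzzy(city: str, cutoff: float = 0.8) -> Optional[str]:
    """Fuzzy validation: correct a value that closely matches a known record.

    Returns the corrected value, or None if nothing is similar enough.
    """
    matches = difflib.get_close_matches(city, KNOWN_CITIES, n=1, cutoff=cutoff)
    return matches[0] if matches else None

print(validate_strict("90210"))    # True: exact match against the reference set
print(validate_strict("99999"))    # False: strict validation simply rejects
print(validate_fuzzy("New Yrok"))  # "New York": fuzzy validation corrects the typo
```

The trade-off shown here is typical: strict validation never admits bad data but discards near-misses, while fuzzy validation recovers them at the risk of a wrong correction, so the cutoff must be tuned to the data.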

Data cleansing (or data cleaning) is the process of detecting and correcting (or removing) corrupt or inaccurate records from a record set, table, or database. It involves identifying incomplete, incorrect, inaccurate, or irrelevant parts of the data and then replacing, modifying, or deleting the dirty or coarse data. Some data cleansing solutions clean data by cross-checking it against a validated data set. Data cleansing may be performed interactively with data wrangling tools, or as batch processing through scripting.
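A batch cleansing pass of the kind described above can be sketched as follows. The validated reference set (`VALID_COUNTRIES`) and the record schema are assumptions made for illustration; a real pipeline would cross-check against an authoritative dataset.

```python
# Hypothetical reference set of validated values for cross-checking.
VALID_COUNTRIES = {"Germany", "France", "Spain"}

def cleanse(records):
    """Batch cleansing: drop incomplete rows, normalise the rest, and
    reject anything that fails the cross-check against the validated set."""
    cleaned, rejected = [], []
    for rec in records:
        # Detect incomplete records (missing required fields).
        if not rec.get("name") or not rec.get("country"):
            rejected.append(rec)
            continue
        # Normalise whitespace and case, then cross-check.
        country = rec["country"].strip().title()
        if country in VALID_COUNTRIES:
            cleaned.append({**rec, "country": country})
        else:
            rejected.append(rec)
    return cleaned, rejected

rows = [
    {"name": "Anna", "country": " germany "},  # correctable: normalises to "Germany"
    {"name": "", "country": "France"},         # incomplete: missing name
    {"name": "Luis", "country": "Atlantis"},   # fails the cross-check
]
good, bad = cleanse(rows)
print(good)  # one corrected record
print(bad)   # one incomplete record, one failing the cross-check
```

Splitting the output into cleaned and rejected records, rather than silently dropping failures, keeps the batch run auditable, which matters when the same script is re-run unattended.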