Abstract
A typical quality control method for a large-volume, record-processing application is to visually inspect records sampled from the process output, much like quality control methods used in a manufacturing environment. In a data processing environment, however, millions of products (output records) may be produced in a matter of minutes. Consequently, a randomly drawn sample small enough to permit timely manual inspection may not uncover enough defective records for an effective assessment of the quality or accuracy of the process, particularly when the expected defect rate is small. When the primary objective of quality control is to isolate and correct defective records (as opposed to merely measuring the defect rate statistically), a sample biased in favor of defective records is preferable to a purely random sample. On the other hand, a rule-based system capable of selecting only defects is the logical equivalent of constructing the original process so that it produces no defective output. This paper describes an experiment assessing the effectiveness of a neural network as a logically independent, heuristic tool for extracting defect-prone quality control samples from the output of an intelligent, rule-based data standardization system.
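To make the sampling argument concrete: at a 0.1% defect rate, a 200-record random sample contains on average only 0.2 defective records, while a sample biased toward records that a learned scorer ranks as defect-prone can surface far more for inspection. The sketch below is a minimal, hypothetical illustration of that idea, not the paper's method: the synthetic data, the simple logistic scorer standing in for the paper's neural network, and names such as `biased_sample` are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each record is a feature vector; a small fraction
# of records (0.1%) are defective.
n_records, n_features, defect_rate = 100_000, 8, 0.001
X = rng.normal(size=(n_records, n_features))
defective = rng.random(n_records) < defect_rate
X[defective] += 1.5  # make defects weakly separable so a scorer can learn

# A tiny logistic scorer standing in for the paper's neural network,
# fit here with a few gradient steps on a logistic loss.
w, b = np.zeros(n_features), 0.0
for _ in range(300):
    z = np.clip(X @ w + b, -30.0, 30.0)
    p = 1.0 / (1.0 + np.exp(-z))              # predicted defect likelihood
    w -= 1.0 * (X.T @ (p - defective)) / n_records
    b -= 1.0 * np.mean(p - defective)

# Biased sample: the k records the scorer ranks most defect-prone.
k = 200
biased_sample = np.argsort(X @ w + b)[-k:]

# Purely random sample of the same size, for comparison.
random_sample = rng.choice(n_records, size=k, replace=False)

print("defects in random sample:", int(defective[random_sample].sum()))
print("defects in biased sample:", int(defective[biased_sample].sum()))
```

Running this sketch, the random sample typically contains zero or one defective record, while the biased sample captures a large share of the roughly 100 defects in the population, which is the advantage the abstract claims for a defect-biased quality control sample.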
Original language | English |
---|---|
Pages | 263-266 |
Number of pages | 4 |
State | Published - 1995 |
Event | Proceedings of the 1995 ACM Symposium on Applied Computing - Nashville, TN, USA, Feb 26 1995 → Feb 28 1995 |