
Reduced Classifier Complexity

To determine whether the Zernike and Haralick features could be used successfully with less complex neural networks, performance was measured on networks with fewer than 20 hidden nodes. To expedite the testing of the various networks, the entire test set was used both to stop training and to evaluate classification performance, rather than splitting the test set into multiple stop/evaluate pairs as described in Materials and Methods. At no point, however, were test samples used to modify the network weights. Good correlation was previously found between the train/test and train/stop/evaluate approaches when they were used with the same training data and the same number of hidden nodes, so the two-set approach was used as a screening method when training networks under multiple conditions.

Whereas the classification performance using the Zernike moments dropped from $87\pm5.4\%$ with 20 hidden nodes to $83\pm6\%$ with 10 and to $78\pm6.5\%$ with 5, the Haralick features maintained essentially constant performance, dropping only from $88\pm5.1\%$ at 20 hidden nodes to $87\pm5.4\%$ with 5. The Haralick result was confirmed using the more rigorous three-set train/stop/evaluate method, which gave an average performance of $84\pm5.8\%$. The maintenance of the classification rate with fewer hidden nodes indicates that the classification problem is relatively `easier' with the Haralick features than with the Zernike moments. The reduction in the number of features, from 49 to 13, and the reduction in the number of required hidden units, from 20 to 5, both make the Haralick features the more desirable of the two feature sets studied here.
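The screening protocol described above (training single-hidden-layer networks of decreasing size and halting training when accuracy on the stop set ceases to improve) can be sketched roughly as follows. This is a minimal illustration rather than the original implementation: synthetic data stands in for the image feature matrices, scikit-learn's MLPClassifier stands in for the networks used in the thesis, and the epoch limit and patience values are arbitrary assumptions.

```python
# Minimal sketch of the hidden-node screening protocol: train networks of
# decreasing size, stopping each when accuracy on the stop set stops improving.
# Synthetic data (13 features, 5 classes) stands in for the real feature sets.
import warnings
from sklearn.datasets import make_classification
from sklearn.exceptions import ConvergenceWarning
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=600, n_features=13, n_informative=10,
                           n_classes=5, random_state=0)
X_train, X_stop, y_train, y_stop = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)

def train_with_stop_set(n_hidden, max_epochs=300, patience=10):
    """Train a single-hidden-layer network, stopping when accuracy on the
    stop set has not improved for `patience` epochs."""
    net = MLPClassifier(hidden_layer_sizes=(n_hidden,),
                        max_iter=1, warm_start=True, random_state=0)
    best_acc, best_epoch = 0.0, 0
    for epoch in range(max_epochs):
        with warnings.catch_warnings():
            warnings.simplefilter("ignore", ConvergenceWarning)
            net.fit(X_train, y_train)          # one more pass over the data
        acc = net.score(X_stop, y_stop)        # stop-set accuracy
        if acc > best_acc:
            best_acc, best_epoch = acc, epoch
        elif epoch - best_epoch >= patience:   # no improvement: stop early
            break
    return best_acc

# Screen decreasing network sizes, as in the text (20, 10, 5 hidden nodes).
for n_hidden in (20, 10, 5):
    acc = train_with_stop_set(n_hidden)
    print(f"{n_hidden:2d} hidden nodes: {100 * acc:.1f}% correct")
```

Because the same held-out set serves for both stopping and evaluation here, this two-set approach is only a screening tool; as noted above, the final Haralick result was confirmed with the three-set train/stop/evaluate procedure.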

