30 April 2007 Performance evaluation of evolutionary computational and conventionally trained support vector machines
The main objective of this paper is to validate a newly developed Evolutionary Programming (EP) derived Support Vector Machine (SVM) paradigm through a performance comparison with the accepted conventional iterative gradient method usually used to train these SVMs. The paper first reviews the background research associated with this problem and then describes the EP-developed family of SVMs. Both the mutation and selection methods used to formulate the family of SVMs are described, followed by the more familiar Lagrangian formulation of SVMs. Kernel-based learning methods are then discussed. The concepts described here are not limited to SVMs; the general principles also apply to other kernel-based classifiers. Results are depicted for two EP methods: the first a "crude" earlier method described in reference 7, and the second the more recent method described here. Iteratively derived SVM results are also developed for comparison with the EP-derived SVM approach. These results show that both methods produced essentially perfect classification A_z results, generally ranging from 0.926 to 0.931. Only the hyperbolic tangent kernel yielded the less accurate result of 0.87. These results were expected because all ambiguous findings were "scrubbed" from the features describing the screen-film data set.
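To make the comparison concrete, the following is a minimal illustrative sketch of the two training regimes the abstract contrasts: a conventionally trained SVM (the standard quadratic-programming/SMO solver) versus a simple evolutionary-programming loop that uses Gaussian mutation and truncation selection over SVM hyperparameters, with fitness measured as A_z (the area under the ROC curve). The dataset, parameter ranges, population sizes, and mutation scale here are all hypothetical choices for demonstration, not those used in the paper, and tuning hyperparameters is only one way EP can be applied to SVM construction.

```python
# Hedged sketch: (mu + lambda)-style EP tuning of RBF-SVM hyperparameters,
# scored by A_z (ROC area), versus a conventionally trained baseline SVM.
# All settings below are illustrative assumptions, not the paper's method.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=400, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def fitness(log_C, log_gamma):
    """A_z (ROC area) of an RBF SVM with the given (log C, log gamma)."""
    clf = SVC(C=np.exp(log_C), gamma=np.exp(log_gamma)).fit(X_tr, y_tr)
    return roc_auc_score(y_te, clf.decision_function(X_te))

# Conventional baseline: one SVM trained by the usual iterative QP/SMO solver
# with default-style hyperparameters (C = 1, gamma = 1/n_features).
baseline_az = fitness(np.log(1.0), np.log(1.0 / X.shape[1]))

# EP loop: Gaussian mutation of (log C, log gamma), then truncation selection
# keeping the best individuals from parents plus offspring.
pop = [rng.normal(0.0, 1.0, size=2) for _ in range(8)]
for _ in range(5):
    offspring = [p + rng.normal(0.0, 0.3, size=2) for p in pop]   # mutation
    ranked = sorted(pop + offspring, key=lambda p: -fitness(*p))
    pop = ranked[:8]                                              # selection

ep_az = fitness(*pop[0])
print(f"baseline A_z = {baseline_az:.3f}, EP-tuned A_z = {ep_az:.3f}")
```

Because selection keeps the best of parents and offspring, the EP-tuned A_z is monotonically non-decreasing across generations on the held-out fitness measure, which mirrors the paper's finding that the EP-derived and conventionally trained SVMs reach comparable A_z values.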
© (2007) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Walker H. Land Jr., John Heine, George Tomko, Alda Mizaku, Swati Gupta, and Robert Thomas "Performance evaluation of evolutionary computational and conventionally trained support vector machines", Proc. SPIE 6560, Intelligent Computing: Theory and Applications V, 65600W (30 April 2007);
