NAME
    Statistics::LTU - An implementation of Linear Threshold Units

SYNOPSIS
      use Statistics::LTU;

      my $acr_ltu = new Statistics::LTU::ACR(3, 1);  # 3 attributes, scaled

      $ltu->train([1,3,2],  $LTU_PLUS);
      $ltu->train([-1,3,0], $LTU_MINUS);
      ...
      print "LTU looks like this:\n";
      $ltu->print;

      print "[1,5,2] is in class ";
      if ($ltu->test([1,5,2]) > $LTU_THRESHOLD) { print "PLUS" }
      else                                      { print "MINUS" }

      $ltu->save("ACR.saved") or die "Save failed!";
      $ltu2 = restore Statistics::LTU("ACR.saved");

EXPORTS
    For readability, LTU.pm exports three scalar constants: $LTU_PLUS (+1),
    $LTU_MINUS (-1) and $LTU_THRESHOLD (0).

DESCRIPTION
    Statistics::LTU defines methods for creating, destroying, training and
    testing Linear Threshold Units. A linear threshold unit is a 1-layer
    neural network, also called a perceptron. LTUs are used to learn
    classifications from examples.

    An LTU learns to distinguish between two classes based on the data given
    to it. After training on a number of examples, the LTU can be used to
    classify new (unseen) examples. Technically, an LTU learns to
    distinguish two classes by fitting a hyperplane between examples; if the
    examples have n features, the hyperplane is fitted in n-dimensional
    feature space. In general, the LTU's weights converge to define the
    separating hyperplane.

    The LTU.pm file defines an uninstantiable base class, LTU, and four
    instantiable classes built on top of it. The four classes differ in the
    training rules they use.
    Each of these training rules behaves somewhat differently. Exact details
    of how they work are beyond the scope of this document; see the
    additional documentation file (ltu.doc) for discussion.

SCALARS
    $LTU_PLUS and $LTU_MINUS (+1 and -1, respectively) may be passed to the
    train method. $LTU_THRESHOLD (set to zero) may be used to compare values
    returned from the test method.

METHODS
    Each LTU has the following methods:
    "n_features" sets the number of attributes in the examples. If "scaling"
    is 1, the LTU will automatically scale the input features to the range
    (-1, +1). For example:

      $ACR_ltu = new Statistics::LTU::ACR(5, 1);

    creates an LTU that will train using the absolute correction rule. It
    will have 5 features and will scale them automatically.
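    The documentation does not specify which scaling formula the module
    applies internally; as a rough illustration of the idea, the sketch
    below maps a column of raw feature values onto (-1, +1) with a
    hypothetical min-max transform. The function name "scale_column" is
    made up for this example and is not part of Statistics::LTU.

      use strict;
      use warnings;
      use List::Util qw(min max);

      # Hypothetical min-max scaling of one feature column to (-1, +1).
      # Statistics::LTU does its scaling internally when the scaling flag
      # is 1; this only illustrates the general idea.
      sub scale_column {
          my @values = @_;
          my ($lo, $hi) = (min(@values), max(@values));
          return map { 0 } @values if $hi == $lo;   # degenerate column
          return map { 2 * ($_ - $lo) / ($hi - $lo) - 1 } @values;
      }

      my @scaled = scale_column(0, 5, 10);
      print join(", ", @scaled), "\n";   # -1, 0, 1

    Scaling to a symmetric range keeps any one feature from dominating the
    weight updates simply because its raw values are large.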
    In addition to the methods above, each of the four LTU classes defines a
    train method. The train method trains the LTU that an instance belongs
    in a particular class. For each train method, instance must be a
    reference to an array of numbers, and value must be a number. For
    convenience, two constants are defined: $LTU_PLUS and $LTU_MINUS, set to
    +1 and -1 respectively. These can be given as arguments to the train
    method. A typical train call looks like:

      $ltu->train([1,3,-5], $LTU_PLUS);

    which trains the LTU that the instance vector (1,3,-5) should be in the
    PLUS class.
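    To make the geometry concrete, here is a self-contained sketch of the
    classic fixed-increment perceptron rule. This is not the module's
    ACR/LMS code, and the weight layout (one weight per feature plus a
    trailing bias term) is an assumption of this example only; it simply
    shows how repeated train/test cycles nudge a separating hyperplane into
    place.

      use strict;
      use warnings;

      # One step of the fixed-increment perceptron rule (illustrative only).
      # $w: arrayref of weights, last element is the bias (layout assumed
      # for this sketch); $x: arrayref of features; $label: +1 or -1.
      sub train_step {
          my ($w, $x, $label) = @_;
          my $sum = $w->[-1];                              # bias term
          $sum += $w->[$_] * $x->[$_] for 0 .. $#$x;
          my $predicted = $sum > 0 ? 1 : -1;
          if ($predicted != $label) {                      # misclassified:
              $w->[$_] += $label * $x->[$_] for 0 .. $#$x; # nudge the
              $w->[-1] += $label;                          # hyperplane
          }
          return $predicted;
      }

      # A linearly separable toy problem (logical OR) on 2 features.
      my @data = ([[0,0], -1], [[0,1], 1], [[1,0], 1], [[1,1], 1]);
      my @w    = (0, 0, 0);
      for (1 .. 20) {
          train_step(\@w, @$_) for @data;
      }

      for my $ex (@data) {
          my ($x, $label) = @$ex;
          my $sum = $w[-1] + $w[0] * $x->[0] + $w[1] * $x->[1];
          printf "(%d,%d) -> %s\n", @$x, ($sum > 0 ? "PLUS" : "MINUS");
      }

    Because the classes are linearly separable here, the loop settles on
    weights that classify every training example correctly; the module's
    training rules refine this basic error-correction scheme in different
    ways.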
AUTHOR
    Tom Fawcett, fawcett@nynexst.com

    LTU.pm is based on a C implementation by James Callan at the University
    of Massachusetts. His version has been in use for a long time, is
    stable, and seems to be bug-free. This Perl module was created by Tom
    Fawcett, and any bugs you find were probably introduced in translation.
    Send bugs, comments and suggestions to fawcett@nynexst.com.

BUGS
    None known. This Perl module has been moderately exercised, but I don't
    guarantee anything.