Algorithm-SVM-0.13/0000755000077436411310000000000010745461175013661 5ustar lairdmwg-usersAlgorithm-SVM-0.13/MANIFEST0000644000077436411310000000025307560044207015004 0ustar lairdmwg-usersChanges MANIFEST Makefile.PL README SVM.xs TODO bindings.cpp bindings.h lib/Algorithm/SVM.pm lib/Algorithm/SVM/DataSet.pm sample.model libsvm.cpp libsvm.h test.pl typemap Algorithm-SVM-0.13/Changes0000644000077436411310000000351010745461020015140 0ustar lairdmwg-usersRevision history for Perl extension Algorithm::SVM. 0.01 Thu Jun 27 11:16:17 2002 - original version; created by h2xs 1.21 with options -n Algorithm::SVM svm.h 0.02 Mon Sep 23 13:12:51 EDT 2002 - compilation errors on Mac OS X fixed, other bug fixes and code cleanups. 0.05 Tue Nov 26 10:46:46 PST 2002 - fixed build error for perl-5.8.0 with ithreads. Thanks to Mike Castle for this report and patch. 0.06 Tue Jan 14 16:59:50 PST 2003 - fixed a build error for gcc version 3.x 0.07 Tue Jun 3 14:46:26 PDT 2003 - added a cygwin target to Makefile.PL - Algorithm::SVM should now compile under Windows. - Algorithm::SVM now has a new maintainer. As of version 0.07, all inquiries, patches and comments should be sent to Matthew Laird 0.08 Mon May 17 15:09:21 PDT 2004 - Upgraded libsvm to 2.6, added new bindings for most added functionality 0.09 Mon Oct 25 12:05:00 PDT 2004 - Fixed an uninitialized variable for probability. 0.10 Mon Nov 16 10:43:00 PDT 2004 - Added Solaris support, unfortunately not in an automated fashion. If anyone can suggest improvements... I'm very open to suggestions. 0.12 Sun Jan 15 11:39:00 GMT+1 2006 - Updated libsvm to 2.81 (i.e. added half a dozen lines of code) - added many testcases in test.pl - SVM: Added binding for predict_values, improved memory usage - DataSet: changed to use sparse format internally internal format similar to what libsvm uses added asArray function, improved memory usage - All changes (except libsvm update which slightly changes the learned models) are completely transparent to the user. No changes in programs depending on this should be necessary. 0.13 Tue Jan 22 13:38:00 PDT 2008 - Updated the underlaying libsvm version to 2.85 Algorithm-SVM-0.13/lib/0000755000077436411310000000000010146450223014412 5ustar lairdmwg-usersAlgorithm-SVM-0.13/lib/Algorithm/0000755000077436411310000000000010362422453016344 5ustar lairdmwg-usersAlgorithm-SVM-0.13/lib/Algorithm/SVM/0000755000077436411310000000000010362417031017005 5ustar lairdmwg-usersAlgorithm-SVM-0.13/lib/Algorithm/SVM/DataSet.pm0000644000077436411310000000753610362417031020703 0ustar lairdmwg-userspackage Algorithm::SVM::DataSet; use 5.006; use strict; use Carp; use Algorithm::SVM; =head1 NAME Algorithm::SVM::DataSet - A DataSet object for the Algorithm::SVM Support Vector Machine. =head1 SYNOPSIS use Algorithm::SVM::DataSet; # Create a new dataset. $ds = new Algorithm::SVM::DataSet(Label => 1, Data => [ 0.12, 0.25, 0.33, 0.98 ]); # Retrieve/set the label. $label = $ds->label(); $ds->label(1976); # Retrieve/set the attribute with an index of 0. $attr = $ds->attribute(0); $ds->attribute(0, 0.2621); =head1 DESCRIPTION Algorithm::SVM::DataSet is a representation of the datasets passed to Algorithm::SVM object for training or classification. Each dataset has an associated label, which classifies it as being part of a specific group. A dataset object also has one or more key/value pairs corresponding to the attributes that will be used for classification. Values equal to zero will not be stored, and are returned by default if no key/value pair exists. 
This sparse format saves memory, and is treated in exactly the same way
by libsvm.

=head1 CONSTRUCTORS

  $ds = new Algorithm::SVM::DataSet(Label => 1,
                                    Data  => [ 0.12, 0.25, 0.33, 0.98 ]);

The Algorithm::SVM::DataSet constructor accepts two named parameters:
Label and Data.  Label is required and sets the class to which the
dataset belongs; Data is optional and sets any initial attribute values.
Data should be an arrayref of numerical values.  Each value in the
arrayref is assumed to have a key corresponding to its index in the
array.  i.e. in the above example, 0.12 has a key of 0, 0.25 has a key
of 1, 0.33 has a key of 2, etc.

=head1 METHODS

  $label = $ds->label();
  $ds->label(1976);

The label method is used to set or retrieve the DataSet's label value.
Parameters and return values should be numeric values.

  $attr = $ds->attribute(0);
  $ds->attribute(0, 0.2621);

The attribute method is used to set or retrieve dataset attribute
values.  If a single value is provided, the method will return the value
of the attribute with that key.  If two values are provided, the method
will set the attribute named by the first parameter to the value of the
second.

  $ds->asArray();

The asArray method returns the contents of a DataSet object as an array
in an efficient way.  An optional parameter, $numAttr, can be used to
pad the array with zeros if the number of attributes is not known from
the beginning (e.g. when creating a word vector on the fly, since all
keys not given are automatically assumed to be zero).

=head1 MAINTAINER

Matthew Laird

Alexander K. Seewald

=head1 SEE ALSO

Algorithm::SVM

=cut

sub new {
  my ($class, %args) = @_;

  # Do some quick error checking on the values we've been passed.
  croak("No label specified for DataSet") if(! exists($args{Label}));

  my $self = _new_dataset($args{Label} + 0);

  if(exists($args{Data})) {
    croak("Data must be an array ref") if(ref($args{Data}) ne "ARRAY");
    for(my $i = 0; $i < @{$args{Data}}; $i++) {
      $self->attribute($i, (@{$args{Data}})[$i] + 0);
    }
  }

  return $self;
}

sub label {
  my ($self, $label) = @_;

  return (defined($label)) ? _setLabel($self, $label + 0) : _getLabel($self);
}

sub attribute {
  my ($self, $key, $val) = @_;

  croak("No key specified") if(!
defined($key)); croak("Negative key specified") if (int($key)<0); if (defined($val)) { return _setAttribute($self, int($key), $val + 0); } else { return _getAttribute($self, int($key)); } } sub asArray { my ($self,$numAttr) = @_; if (!defined($numAttr)) { $numAttr=_getMaxI($self)+1; } my @x=(); for (my $i=0; $i<$numAttr; $i++) { push @x,0; } my $i=0; my $k; my $v; $k=_getIndexAt($self,$i); $v=_getValueAt($self,$i); while ($k!=-1 && $k<$numAttr) { $x[$k]=$v; $i++; $k=_getIndexAt($self,$i); $v=_getValueAt($self,$i); }; return @x; } 1; __END__ Algorithm-SVM-0.13/lib/Algorithm/SVM.pm0000644000077436411310000003125410745461075017364 0ustar lairdmwg-userspackage Algorithm::SVM; use 5.006; use strict; use Carp; require DynaLoader; require Exporter; use AutoLoader; # SVM types my %SVM_TYPES = ('C-SVC' => 0, 'nu-SVC' => 1, 'one-class' => 2, 'epsilon-SVR' => 3, 'nu-SVR' => 4); my %SVM_TYPESR = (0 => 'C-SVC', 1 => 'nu-SVC', 2 => 'one-class', 3 => 'epsilon-SVR', 4 => 'nu-SVR'); # Kernel types my %KERNEL_TYPES = ('linear' => 0, 'polynomial' => 1, 'radial' => 2, 'sigmoid' => 3); my %KERNEL_TYPESR = (0 => 'linear', 1 => 'polynomial', 2 => 'radial', 3 => 'sigmoid'); use vars qw(@ISA %EXPORT_TAGS @EXPORT_OK @EXPORT $VERSION); @ISA = qw(Exporter DynaLoader); %EXPORT_TAGS = ( 'all' => [ qw( ) ] ); @EXPORT_OK = ( @{ $EXPORT_TAGS{'all'} } ); @EXPORT = qw( ); $VERSION = '0.13'; sub AUTOLOAD { my $constname; use vars qw($AUTOLOAD); ($constname = $AUTOLOAD) =~ s/.*:://; croak "& not defined" if $constname eq 'constant'; my $val = constant($constname, @_ ? $_[0] : 0); if ($! != 0) { if ($! =~ /Invalid/ || $!{EINVAL}) { $AutoLoader::AUTOLOAD = $AUTOLOAD; goto &AutoLoader::AUTOLOAD; } else { croak "Your vendor has not defined Algorithm::SVM macro $constname"; } } { no strict 'refs'; # Fixed between 5.005_53 and 5.005_61 if ($] >= 5.00561) { *$AUTOLOAD = sub () { $val }; } else { *$AUTOLOAD = sub { $val }; } } goto &$AUTOLOAD; } bootstrap Algorithm::SVM $VERSION; =head1 NAME Algorithm::SVM - Perl bindings for the libsvm Support Vector Machine library. =head1 SYNOPSIS use Algorithm::SVM; # Load the model stored in the file 'sample.model' $svm = new Algorithm::SVM(Model => 'sample.model'); # Classify a dataset. $ds1 = new Algorithm::SVM::DataSet(Label => 1, Data => [0.12, 0.25, 0.33, 0.98]); $res = $svm->predict($ds); # Train a new SVM on some new datasets. $svm->train(@tset); # Change some of the SVM parameters. $svm->gamma(64); $svm->C(8); # Retrain the SVM with the new parameters. $svm->retrain(); # Perform cross validation on the training set. $accuracy = $svm->validate(5); # Save the model to a file. $svm->save('new-sample.model'); # Load a saved model from a file. $svm->load('new-sample.model'); # Retreive the number of classes. $num = $svm->getNRClass(); # Retreive labels for dataset classes (@labels) = $svm->getLabels(); # Probabilty for regression models, see below for details $prob = $svm->getSVRProbability(); =head1 DESCRIPTION Algorithm::SVM implements a Support Vector Machine for Perl. Support Vector Machines provide a method for creating classifcation functions from a set of labeled training data, from which predictions can be made for subsequent data sets. =head1 CONSTRUCTOR # Load an existing SVM. $svm = new Algorithm::SVM(Model => 'sample.model'); # Create a new SVM with the specified parameters. 
  $svm = new Algorithm::SVM(Type   => 'C-SVC',
                            Kernel => 'radial',
                            Gamma  => 64,
                            C      => 8);

An Algorithm::SVM object can be created in one of two ways - an existing
SVM can be loaded from a file, or a new SVM can be created and trained on
a dataset.

An existing SVM is loaded from a file using the Model named parameter.
The model file should be of the format produced by the svm-train program
(distributed with the libsvm library) or by the $svm->save() method.

New SVMs can be created using the following parameters:

  Type    - The type of SVM that should be created.  Possible values
            are: 'C-SVC', 'nu-SVC', 'one-class', 'epsilon-SVR' and
            'nu-SVR'.  Default is 'C-SVC'.

  Kernel  - The type of kernel to be used in the SVM.  Possible values
            are: 'linear', 'polynomial', 'radial' and 'sigmoid'.
            Default is 'radial'.

  Degree  - Sets the degree in the kernel function.  Default is 3.

  Gamma   - Sets the gamma in the kernel function.  Default is 1/k,
            where k is the number of attributes in the training data.

  Coef0   - Sets the Coef0 in the kernel function.  Default is 0.

  Nu      - Sets the nu parameter for nu-SVC, one-class and nu-SVR SVMs.
            Default is 0.5.

  Epsilon - Sets the epsilon in the loss function of epsilon-SVR.
            Default is 0.1.

For a more detailed explanation of what the above parameters actually do,
refer to the documentation distributed with libsvm.

=head1 METHODS

  $svm->degree($degree);
  $svm->gamma($gamma);
  $svm->coef0($coef0);
  $svm->C($C);
  $svm->nu($nu);
  $svm->epsilon($epsilon);
  $svm->kernel_type($ktype);
  $svm->svm_type($svmtype);
  $svm->retrain();

The Algorithm::SVM object provides accessor methods for the various SVM
parameters.  When a value is provided to the method, the object will
attempt to set the corresponding SVM parameter.  If no value is provided,
the current value will be returned.  See the constructor documentation
for a description of appropriate values.

The retrain method should be called if any of the parameters are modified
from their initial values, so as to rebuild the model with the new
values.  Note that you can only retrain an SVM if you've previously
trained the SVM on a dataset.  (i.e. you can't currently retrain a model
loaded with the load method.)  The method will return a true value if the
retraining was successful and a false value otherwise.

  $res = $svm->predict($ds);

The predict method is used to classify a set of data according to the
loaded model.  The method accepts a single parameter, which should be an
Algorithm::SVM::DataSet object.  Returns a floating point number
corresponding to the predicted value.

  $res = $svm->predict_value($ds);

The predict_value method works similarly to predict, but returns a
floating point value corresponding to the output of the trained SVM.  For
a linear kernel, this can be used to reconstruct the weights for each
attribute as follows: the bias of the linear function is returned when
calling predict_value on an empty dataset (all zeros), and by setting
each variable in turn to one and all others to zero, you get one value
per attribute which corresponds to bias + weight_i.  By subtracting the
bias, the final linear model is obtained as the sum of (weight_i *
attr_i) plus the bias.  The sign of this value corresponds to the binary
prediction.

  $svm->save($filename);

Saves the currently loaded model to the specified filename.  Returns a
false value on failure, and a true value on success.

  $svm->load($filename);

Loads a model from the specified filename.  Returns a false value on
failure, and a true value on success.
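As an illustration, the following sketch ties the methods above together:
it trains a small C-SVC on made-up two-class data, saves the model,
reloads it into a fresh object and classifies an unseen point.  The data
values and the file name 'toy.model' are invented for the example.

  use Algorithm::SVM;
  use Algorithm::SVM::DataSet;

  # A toy training set: two classes, four attributes per dataset.
  my @tset = (
    new Algorithm::SVM::DataSet(Label => 1,  Data => [0.1, 0.9, 0.8, 0.2]),
    new Algorithm::SVM::DataSet(Label => 1,  Data => [0.2, 0.8, 0.7, 0.1]),
    new Algorithm::SVM::DataSet(Label => -1, Data => [0.9, 0.1, 0.2, 0.8]),
    new Algorithm::SVM::DataSet(Label => -1, Data => [0.8, 0.2, 0.1, 0.9]),
  );

  # Train a C-SVC with a radial kernel and save the model to disk.
  my $svm = new Algorithm::SVM(Type => 'C-SVC', Kernel => 'radial');
  $svm->train(@tset)      or die "training failed";
  $svm->save('toy.model') or die "could not save model";

  # Later: reload the saved model and classify a new dataset.
  my $loaded = new Algorithm::SVM(Model => 'toy.model');
  my $test   = new Algorithm::SVM::DataSet(Label => 0,
                                           Data  => [0.15, 0.85, 0.75, 0.2]);
  printf("predicted label: %g\n", $loaded->predict($test));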
$svm->train(@tset); Trains the SVM on a set of Algorithm::SVM::DataSet objects. @tset should be an array of Algorithm::SVM::DataSet objects. $accuracy = $svm->validate(5); Performs cross validation on the training set. If an argument is provided, the set is partioned into n subsets, and validated against one another. Returns a floating point number representing the accuracy of the validation. $num = $svm->getNRClass(); For a classification model, this function gives the number of classes. For a regression or a one-class model, 2 is returned. (@labels) = $svm->getLabels(); For a classification model, this function returns the name of the labels in an array. For regression and one-class models undef is returned. $prob = $svm->getSVRProbability(); For a regression model with probability information, this function outputs a value sigma > 0. For test data, we consider the probability model: target value = predicted value + z, z: Laplace distribution e^(-|z|/sigma)/2sigma) If the model is not for svr or does not contain required information, undef is returned. =head1 MAINTAINER Matthew Laird Alexander K. Seewald =head1 SEE ALSO Algorithm::SVM::DataSet and the libsvm homepage: http://www.csie.ntu.edu.tw/~cjlin/libsvm/ =head1 ACKNOWLEDGEMENTS Thanks go out to Fiona Brinkman and the other members of the Simon Fraser University Brinkman Laboratory for providing me the opportunity to develop this module. Additional thanks go to Chih-Jen Lin, one of the libsvm authors, for being particularly helpful during the development process. As well to Dr. Alexander K. Seewald of Seewald Solutions for many bug fixes, new test cases, and lowering the memory footprint by a factor of 20. Thank you very much! =cut sub new { my ($class, %args) = @_; my $self = bless({ }, $class); # Ensure we have a valid SVM type. $args{Type} = 'C-SVC' if(! exists($args{Type})); my $svmtype = $SVM_TYPES{$args{Type}}; croak("Invalid SVM type: $args{Type}") if(! defined($svmtype)); # Ensure we have a valid kernel type. $args{Kernel} = 'radial' if(! exists($args{Kernel})); my $kernel = $KERNEL_TYPES{$args{Kernel}}; croak("Invalid SVM kernel type: $args{Kernel}") if(! defined($svmtype)); # Set some defaults. my $degree = exists($args{Degree}) ? $args{Degree} + 0 : 3; my $gamma = exists($args{Gamma}) ? $args{Gamma} + 0 : 0; my $coef0 = exists($args{Coef0}) ? $args{Coef0} + 0 : 0; my $c = exists($args{C}) ? $args{C} + 0 : 1; my $nu = exists($args{Nu}) ? $args{Nu} + 0 : 0.5; my $epsilon = exists($args{Epsilon}) ? $args{Epsilon} + 0 : 0.1; $self->{svm} = _new_svm($svmtype, $kernel, $degree, $gamma, $coef0, $c, $nu, $epsilon); # Load the model if one was specified. if(my $model = $args{Model}) { croak("Model file not found or bad permissions: $model") if((! -r $model) || (! -f $model)); # Load the model. $self->load($model); # Ensure that the model loaded correctly. croak("Error loading model file: $model") if(! $self->{svm}); } return $self; } sub predict { my ($self, $x) = @_; # Check if we got a dataset object. croak("Not an Algorithm::DataSet") if(ref($x) ne "Algorithm::SVM::DataSet"); return _predict($self->{svm}, $x); } sub predict_value { my ($self, $x) = @_; # Check if we got a dataset object. croak("Not an Algorithm::DataSet") if(ref($x) ne "Algorithm::SVM::DataSet"); return _predict_value($self->{svm}, $x); } sub save { my ($self, $file) = @_; croak("Can't save model because no filename provided") if(! 
$file); return _saveModel($self->{svm}, $file); } sub load { my ($self, $file) = @_; croak("Can't load model because no filename provided") if(! $file); return _loadModel($self->{svm}, $file); } sub getNRClass { my ($self) = @_; return _getNRClass($self->{svm}); } sub getLabels { my ($self) = @_; my $class = $self->getNRClass(); if($class) { return _getLabels($self->{svm}, $class); } return 0; } sub getSVRProbability { my ($self) = @_; return _getSVRProbability($self->{svm}); } sub checkProbabilityModel { my ($self) = @_; return _checkProbabilityModel($self->{svm}); } sub train { my ($self, @tset) = @_; croak("No training data provided") if(! @tset); # Delete the old training data. _clearDataSet($self->{svm}); # Ensure we've got the right format for the training data. for(@tset) { croak("Not an Algorithm::SVM::DataSet object") if(ref($_) ne "Algorithm::SVM::DataSet"); } # Train a new model. _addDataSet($self->{svm}, $_) for(@tset); return _train($self->{svm}, 0); } sub retrain { my $self = shift; return _train($self->{svm}, 1); } sub validate { my ($self, $nfolds) = @_; $nfolds = 5 if(! defined($nfolds)); croak("NumFolds must be >= 2") if($nfolds < 2); return _crossValidate($self->{svm}, $nfolds + 0); } sub svm_type { my ($self, $type) = @_; if(defined($type)) { croak("Invalid SVM type: $type") if(! exists($SVM_TYPES{$type})); _setSVMType($self->{svm}, $SVM_TYPES{$type}); } else { $SVM_TYPESR{_getSVMType($self->{svm})}; } } sub kernel_type { my ($self, $type) = @_; if(defined($type)) { croak("Invalid kernel type: $type") if(! exists($KERNEL_TYPES{$type})); _setKernelType($self->{svm}, $KERNEL_TYPES{$type}); } else { $KERNEL_TYPESR{_getKernelType($self->{svm})}; } } sub degree { my $self = shift; (@_) ? _setDegree($self->{svm}, shift(@_) + 0) : _getDegree($self->{svm}); } sub gamma { my $self = shift; (@_) ? _setGamma($self->{svm}, shift(@_) + 0) : _getGamma($self->{svm}); } sub coef0 { my $self = shift; (@_) ? _setCoef0($self->{svm}, shift(@_) + 0) : _getCoef0($self->{svm}); } sub C { my $self = shift; (@_) ? _setC($self->{svm}, shift(@_) + 0) : _getC($self->{svm}); } sub nu { my $self = shift; (@_) ? _setNu($self->{svm}, shift(@_) + 0) : _getNu($self->{svm}); } sub epsilon { my $self = shift; (@_) ? _setEpsilon($self->{svm}, shift(@_) + 0) : _getEpsilon($self->{svm}); } sub display { my $self = shift; _dumpDataSet($self->{svm}); } 1; __END__ Algorithm-SVM-0.13/Makefile.PL0000644000077436411310000000131710353050334015617 0ustar lairdmwg-usersuse ExtUtils::MakeMaker; $CC = 'g++'; %args = ('CCFLAGS' => '-Wall'); if($^O eq 'cygwin') { $args{'LDDLFLAGS'} = '-shared -L/usr/local/lib'; } WriteMakefile('NAME' => 'Algorithm::SVM', 'VERSION_FROM' => 'lib/Algorithm/SVM.pm', 'PREREQ_PM' => {}, ($] >= 5.005 ? (ABSTRACT_FROM => 'lib/Algorithm/SVM.pm', AUTHOR => 'Matthew Laird ') : ()), 'OPTIMIZE' => '-O3', # segfaults with gcc 2.96 if lower (?) 'LIBS' => '-lm', 'CC' => $CC, 'LD' => '$(CC)', 'OBJECT' => 'SVM.o libsvm.o bindings.o', 'XSOPT' => '-C++ -noprototypes', %args); Algorithm-SVM-0.13/Makefile.PL.solaris0000644000077436411310000000145310146444631017302 0ustar lairdmwg-usersuse ExtUtils::MakeMaker; $CC = 'g++'; %args = (); if($^O eq 'cygwin') { $args{'LDDLFLAGS'} = '-shared -L/usr/local/lib'; } WriteMakefile('NAME' => 'Algorithm::SVM', 'VERSION_FROM' => 'lib/Algorithm/SVM.pm', 'PREREQ_PM' => {}, ($] >= 5.005 ? (ABSTRACT_FROM => 'lib/Algorithm/SVM.pm', AUTHOR => 'Matthew Laird ') : ()), 'OPTIMIZE' => '-O3', # segfaults with gcc 2.96 if lower (?) 
'LIBS' => '-lm', 'CC' => $CC, 'LD' => '$(CC)', 'CCCDLFLAGS' => '-fPIC', 'LDDLFLAGS' => '-shared', 'LIB_EXT' => '.so', 'OBJECT' => 'SVM.o libsvm.o bindings.o', 'XSOPT' => '-C++', %args); Algorithm-SVM-0.13/README0000644000077436411310000001053610745460411014536 0ustar lairdmwg-usersAlgorithm::SVM version 0.12 =========================== TABLE OF CONTENTS ----------------- 1) DESCRIPTION 2) INSTALLATION 3) DEPENDENCIES 4) BUGS 5) Algorithm::SVM COPYRIGHT AND LICENCE 6) libsvm COPYRIGHT AND LICENCE 7) AUTHOR INFORMATION 8) ACKNOWLEDGEMENTS 1) DESCRIPTION -------------- Algorithm::SVM was originally written by Cory Spencer of the Simon Fraser University Brinkman Laboratory and provides Perl bindings for a Support Vector Machine. It is currently maintained by Matthew Laird and all inquiries, patches and comments should be sent to him. Algorithm::SVM is based on the libsvm library written by Chih-Chung Chang and Chih-Jen Lin. To read about the latest features, see the Changes file. The author invites feedback on SVM. If you find a bug, please send the information described in the BUGS section below. 2) INSTALLATION --------------- For Solaris installation instructions please see README.solaris To install this module type the following: perl Makefile.PL make make test make install 3) DEPENDENCIES --------------- Algorithm::SVM does not require any additional packages or libraries to be installed. 4) BUGS ------- If you find a bug, please report it to the author along with the following information: * version of Perl (output of 'perl -V' is best) * version of Algorithm::SVM * operating system type and version * exact text of error message or description of problem * example model files/data being classified If we don't have access to a system similar to yours, you may be asked to insert some debugging lines and report back on the results. The more help and information you can provide, the better. 5) SVM COPYRIGHT AND LICENCE ---------------------------- The Perl Algorithm::SVM module is Copyright (C) 2002 Cory Spencer and Fiona Brinkman. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself. 6) libsvm COPYRIGHT AND LICENCE ------------------------------- libsvm is Copyright (c) 2000-2002 Chih-Chung Chang and Chih-Jen Lin. All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 3. Neither name of copyright holders nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE REGENTS OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 7) AUTHOR INFORMATION --------------------- The Perl Algorithm::SVM module was originally written by Cory Spencer but is currently maintained by Matthew Laird of the Brinkman Laboratory at Simon Fraser University, Burnaby, BC, Canada. http://www.pathogenomics.sfu.ca/brinkman Any questions related to the underlying library, libsvm, should be directed to the authors of the libsvm package (Email contact: lincj@ccms.ntu.edu.tw). The libsvm homepage is currently located at: http://www.csie.ntu.edu.tw/~cjlin/libsvm/ 8) ACKNOWLEDGEMENTS ------------------- Thanks go to Chih-Jen Lin, one of the libsvm authors, for being particularly helpful during the development process of this module. As well to Dr. Alexander K. Seewald of Seewald Solutions for many bug fixes, new test cases, and lowering the memory footprint by a factor of 20. Thank you very much! Algorithm-SVM-0.13/README.solaris0000644000077436411310000000244310146444631016210 0ustar lairdmwg-usersWe have been alerted to some differences in how different Perl releases build modules under Solaris depending if one uses the Sun distribution or builds it from source using GNU gcc. For this reason we've created some instructions that seem to work for both cases, if you experience any problems please email us and let us know the source of your perl distribution. Installation of the Algorithm::SVM module is then as follows: perl Makefile.PL.solaris You will then need to edit the resulting Makefile, search for the word 'static' and find the sections titled "MakeMaker static section" and "MakeMaker static_lib section." Comment out the lines which under each of these sections should look something like: # --- MakeMaker static section: ## $(INST_PM) has been moved to the all: target. 
## It remains here for awhile to allow for old usage: "make static" #static :: Makefile $(INST_STATIC) $(INST_PM) static :: Makefile $(INST_STATIC) @$(NOOP) # --- MakeMaker static_lib section: $(INST_STATIC): $(OBJECT) $(MYEXTLIB) $(INST_ARCHAUTODIR)/.exists $(RM_RF) $@ $(FULL_AR) $(AR_STATIC_ARGS) $@ $(OBJECT) && $(RANLIB) $@ $(CHMOD) $(PERM_RWX) $@ @echo "$(EXTRALIBS)" > $(INST_ARCHAUTODIR)/extralibs.ld Then continue with: make make test make install Algorithm-SVM-0.13/SVM.xs0000644000077436411310000001025110745457435014705 0ustar lairdmwg-users#include #include #ifdef __cplusplus extern "C" { #endif #include "EXTERN.h" #include "perl.h" #include "XSUB.h" #ifdef __cplusplus } #endif #include "bindings.h" #include "libsvm.h" DataSet *_new_dataset(double l) { return new DataSet(l); } SVM *_new_svm(int st, int kt, int d, double g, double c0, double C, double nu, double e) { return new SVM(st, kt, d, g, c0, C, nu, e); } MODULE = Algorithm::SVM::DataSet PACKAGE = Algorithm::SVM::DataSet DataSet * _new_dataset(l) double l double DataSet::_getLabel() CODE: RETVAL = THIS->getLabel(); OUTPUT: RETVAL void DataSet::_setLabel(l) double l CODE: THIS->setLabel(l); double DataSet::_getAttribute(k) int k CODE: RETVAL = THIS->getAttribute(k); OUTPUT: RETVAL void DataSet::_setAttribute(k,v) int k double v CODE: THIS->setAttribute(k,v); int DataSet::_getIndexAt(i) int i CODE: RETVAL = THIS->getIndexAt(i); OUTPUT: RETVAL double DataSet::_getValueAt(i) int i CODE: RETVAL = THIS->getValueAt(i); OUTPUT: RETVAL int DataSet::_getMaxI() CODE: RETVAL = THIS->getMaxI(); OUTPUT: RETVAL void DataSet::DESTROY() MODULE = Algorithm::SVM PACKAGE = Algorithm::SVM SVM * _new_svm(st,kt,d,g,c0,C,nu,e) int st int kt int d double g double c0 double C double nu double e void SVM::_addDataSet(ds) DataSet *ds CODE: THIS->addDataSet(ds); void SVM::_clearDataSet() CODE: THIS->clearDataSet(); int SVM::_train(retrain) int retrain CODE: RETVAL = THIS->train(retrain); OUTPUT: RETVAL double SVM::_crossValidate(nfolds) int nfolds CODE: RETVAL = THIS->crossValidate(nfolds); OUTPUT: RETVAL double SVM::_predict_value(ds) DataSet *ds CODE: RETVAL = THIS->predict_value(ds); OUTPUT: RETVAL double SVM::_predict(ds) DataSet *ds CODE: RETVAL = THIS->predict(ds); OUTPUT: RETVAL int SVM::_saveModel(filename) char *filename CODE: RETVAL = THIS->saveModel(filename); OUTPUT: RETVAL int SVM::_loadModel(filename) char *filename CODE: RETVAL = THIS->loadModel(filename); OUTPUT: RETVAL int SVM::_getNRClass() CODE: RETVAL = THIS->getNRClass(); OUTPUT: RETVAL void SVM::_getLabels(classes) int classes PPCODE: int i; int *labels; labels = new int[classes]; if(THIS->getLabels(labels)) { for (i=0;i < classes; i++) { XPUSHs(sv_2mortal(newSViv(labels[i]))); } } else { XSRETURN_UNDEF; } double SVM::_getSVRProbability() CODE: RETVAL = THIS->getSVRProbability(); OUTPUT: RETVAL int SVM::_checkProbabilityModel() CODE: RETVAL = THIS->checkProbabilityModel(); OUTPUT: RETVAL void SVM::_setSVMType(st) int st CODE: THIS->setSVMType(st); int SVM::_getSVMType() CODE: RETVAL = THIS->getSVMType(); OUTPUT: RETVAL void SVM::_setKernelType(kt) int kt CODE: THIS->setKernelType(kt); int SVM::_getKernelType() CODE: RETVAL = THIS->getKernelType(); OUTPUT: RETVAL void SVM::_setGamma(g) double g CODE: THIS->setGamma(g); double SVM::_getGamma() CODE: RETVAL = THIS->getGamma(); OUTPUT: RETVAL void SVM::_setDegree(d) int d CODE: THIS->setDegree(d); double SVM::_getDegree() CODE: RETVAL = THIS->getDegree(); OUTPUT: RETVAL void SVM::_setCoef0(c) double c CODE: THIS->setCoef0(c); double 
SVM::_getCoef0() CODE: RETVAL = THIS->getCoef0(); OUTPUT: RETVAL void SVM::_setC(c) double c CODE: THIS->setC(c); double SVM::_getC() CODE: RETVAL = THIS->getC(); OUTPUT: RETVAL void SVM::_setNu(n) double n CODE: THIS->setNu(n); double SVM::_getNu() CODE: RETVAL = THIS->getNu(); OUTPUT: RETVAL void SVM::_setEpsilon(e) double e CODE: THIS->setEpsilon(e); double SVM::_getEpsilon() CODE: RETVAL = THIS->getEpsilon(); OUTPUT: RETVAL void SVM::DESTROY() Algorithm-SVM-0.13/TODO0000644000077436411310000000000007560044207014331 0ustar lairdmwg-usersAlgorithm-SVM-0.13/bindings.cpp0000644000077436411310000002302110745456272016162 0ustar lairdmwg-users#include "bindings.h" #include #ifdef DEBUG #include void printf_dbg(const char *a, ...) { va_list alist; va_start(alist,a); vfprintf(stdout,a,alist); va_end(alist); fflush(NULL); } #else void printf_dbg(const char *a, ...) {} #endif DataSet::DataSet(double l) { label = l; realigned=false; n=0; max_n=16; attributes = (struct svm_node *)malloc(sizeof(struct svm_node) * max_n); assert(attributes!=NULL); attributes[0].index=-1; // insert end-of-data marker max_i=-1; } DataSet::~DataSet() { printf_dbg("destructor DS called\n"); if (realigned) { attributes[n].value=-1; // notify svm that dataset is destroyed } else { free(attributes); } } void DataSet::realign(struct svm_node *address) { assert(address!=NULL); memcpy(address,attributes,sizeof(struct svm_node)*(n+1)); free(attributes); attributes=address; max_n=n+1; realigned=true; attributes[n].value=0; } void DataSet::setAttribute(int k, double v) { if (realigned) { printf_dbg("set Attr with realigned k=%d, v=%lf\n",k,v); max_n=n+2; attributes[n].value=-1; // notify svm to not care about allocating memory for this dataset struct svm_node *address=(struct svm_node *)malloc(sizeof(struct svm_node)*max_n); assert(address!=NULL); memcpy(address,attributes,sizeof(struct svm_node)*(n+1)); attributes=address; realigned=false; if (k==-1) { return; } } else { printf_dbg("set Attr without realigned k=%d, v=%lf\n",k,v); } if (k>max_i) { max_i=k; if (v!=0) { attributes[n].index=k; attributes[n].value=v; n++; attributes[n].index=-1; } } else { // assume sorted array - check where it belongs int upper = n-1; int lower=0; int midpos=0; int midk=-1; while (lower<=upper) { midpos = (upper+lower)/2; midk=attributes[midpos].index; if (k>midk) { lower=midpos+1; } else if (klower; i--) { attributes[i].index=attributes[i-1].index; attributes[i].value=attributes[i-1].value; } attributes[lower].index=k; attributes[lower].value=v; n++; attributes[n].index=-1; } } } if (n>=max_n-1) { max_n*=2; attributes = (struct svm_node *)realloc(attributes,sizeof(struct svm_node)*max_n); assert(attributes!=NULL); } } double DataSet::getAttribute(int k) { int upper = n-1; int lower=0; int midpos=0; int midk=-1; while (upper>=lower) { midpos = (upper+lower)/2; midk=attributes[midpos].index; if (k>midk) { lower=midpos+1; } else if (k=0; i--) { assert(x_space[idx-1].index==-1); if (x_space[idx-1].value!=-1) { printf_dbg((dataset[i]->realigned ? "+" : "-")); printf_dbg("%lf\n",x_space[idx-1].value); idx-=((dataset[i]->n)+1); dataset[i]->setAttribute(-1,0); } else { printf_dbg("%d already destroyed or changed.\n",i); idx-=2; while (idx >= 0 && x_space[idx].index!=-1) { idx--; } idx++; } } assert(idx==0); free(x_space); x_space=NULL; } } int SVM::train(int retrain) { const char *error; // Free any old model we have. 
if(model != NULL) { svm_destroy_model(model); model = NULL; } if(retrain) { if(prob == NULL) return 0; model = svm_train(prob, ¶m); return 1; } if (x_space != NULL) free_x_space(); if(prob != NULL) free(prob); model = NULL; prob = NULL; // Allocate memory for the problem struct. if((prob = (struct svm_problem *)malloc(sizeof(struct svm_problem))) == NULL) return 0; prob->l = dataset.size(); // Allocate memory for the labels/nodes. prob->y = (double *)malloc(sizeof(double) * prob->l); prob->x = (struct svm_node **)malloc(sizeof(struct svm_node *) * prob->l); if((prob->y == NULL) || (prob->x == NULL)) { if(prob->y != NULL) free(prob->y); if(prob->x != NULL) free(prob->x); free(prob); return 0; } // Check for errors with the parameters. error = svm_check_parameter(prob, ¶m); if(error) { free(prob->x); free (prob->y); free(prob); return 0; } // Allocate x_space and successively release dataset memory // (realigning the dataset memory to x_space) nelem=0; for (unsigned int i=0; in+1; } x_space = (struct svm_node *)malloc(sizeof(struct svm_node)*nelem); long idx=0; for (unsigned int i=0; irealign(x_space+idx); idx+=(dataset[i]->n)+1; } if (x_space==NULL) { free(prob->y); free(prob->x); free(prob); nelem=0; return 0; } // Munge the datasets into the format that libsvm expects. int maxi = 0; long n=0; for(int i = 0; i < prob->l; i++) { prob->x[i] = &x_space[n]; //dataset[i]->attributes; assert((dataset[i]->attributes)==(&x_space[n])); n+=dataset[i]->n+1; prob->y[i] = dataset[i]->getLabel(); if( dataset[i]->max_i > maxi) maxi = dataset[i]->max_i; } printf_dbg("\nnelem=%ld\n",n); if(param.gamma == 0) param.gamma = 1.0/maxi; model = svm_train(prob, ¶m); return 1; } double SVM::predict_value(DataSet *ds) { double pred[100]; if(ds == NULL) return 0; svm_predict_values(model, ds->attributes, pred); return pred[0]; } double SVM::predict(DataSet *ds) { double pred; if(ds == NULL) return 0; pred = svm_predict(model, ds->attributes); return pred; } int SVM::saveModel(char *filename) { if((model == NULL) || (filename == NULL)) { return 0; } else { return ! svm_save_model(filename, model); } } int SVM::loadModel(char *filename) { struct svm_model *tmodel; if(filename == NULL) return 0; if(x_space != NULL) { free_x_space(); } if(model != NULL) { svm_destroy_model(model); model = NULL; } if((tmodel = svm_load_model(filename)) != NULL) { model = tmodel; return 1; } return 0; } double SVM::crossValidate(int nfolds) { double sumv = 0, sumy = 0, sumvv = 0, sumyy = 0, sumvy = 0; double total_error = 0; int total_correct = 0; int i; if(! prob) return 0; if(! 
randomized) { // random shuffle for(i=0;il;i++) { int j = i+rand()%(prob->l-i); struct svm_node *tx; double ty; tx = prob->x[i]; prob->x[i] = prob->x[j]; prob->x[j] = tx; ty = prob->y[i]; prob->y[i] = prob->y[j]; prob->y[j] = ty; } randomized = 1; } for(i=0;il/nfolds; int end = (i+1)*prob->l/nfolds; int j,k; struct svm_problem subprob; subprob.l = prob->l-(end-begin); subprob.x = (struct svm_node**)malloc(sizeof(struct svm_node)*subprob.l); subprob.y = (double *)malloc(sizeof(double)*subprob.l); k=0; for(j=0;jx[j]; subprob.y[k] = prob->y[j]; ++k; } for(j=end;jl;j++) { subprob.x[k] = prob->x[j]; subprob.y[k] = prob->y[j]; ++k; } if(param.svm_type == EPSILON_SVR || param.svm_type == NU_SVR) { struct svm_model *submodel = svm_train(&subprob,¶m); double error = 0; for(j=begin;jx[j]); double y = prob->y[j]; error += (v-y)*(v-y); sumv += v; sumy += y; sumvv += v*v; sumyy += y*y; sumvy += v*y; } svm_destroy_model(submodel); // cout << "Mean squared error = %g\n", error/(end-begin)); total_error += error; } else { struct svm_model *submodel = svm_train(&subprob,¶m); int correct = 0; for(j=begin;jx[j]); if(v == prob->y[j]) ++correct; } svm_destroy_model(submodel); //cout << "Accuracy = " << 100.0*correct/(end-begin) << " (" << //correct << "/" << (end-begin) << endl; total_correct += correct; } free(subprob.x); free(subprob.y); } if(param.svm_type == EPSILON_SVR || param.svm_type == NU_SVR) { return ((prob->l*sumvy-sumv*sumy)*(prob->l*sumvy-sumv*sumy))/ ((prob->l*sumvv-sumv*sumv)*(prob->l*sumyy-sumy*sumy)); } else { return 100.0*total_correct/prob->l; } } int SVM::getNRClass() { if(model == NULL) { return 0; } else { return svm_get_nr_class(model); } } int SVM::getLabels(int* label) { if(model == NULL) { return 0; } else { svm_get_labels(model, label); return 1; } } double SVM::getSVRProbability() { if((model == NULL) || (svm_check_probability_model(model))) { return 0; } else { return svm_get_svr_probability(model); } } int SVM::checkProbabilityModel() { if(model == NULL) { return 0; } else { return svm_check_probability_model(model); } } SVM::~SVM() { if(x_space!=NULL) { free_x_space(); } if(model != NULL) { svm_destroy_model(model); model=NULL; } if(prob != NULL) { free(prob); prob=NULL; } } Algorithm-SVM-0.13/bindings.h0000644000077436411310000000423610745456225015634 0ustar lairdmwg-users#ifndef __BINDINGS_H__ #define __BINDINGS_H__ using namespace std; #include #include #include #include "libsvm.h" class DataSet { friend class SVM; private: double label; struct svm_node *attributes; int n; int max_n; int max_i; bool realigned; public: DataSet(double l); void setLabel(double l) { label = l; } double getLabel() { return label; } int getMaxI() { return max_i; } void setAttribute(int k, double v); double getAttribute(int k); int getIndexAt(int i) { if (i<=n) { return attributes[i].index; } else { return -1; }} double getValueAt(int i) { if (i<=n) { return attributes[i].value; } else { return 0; }} void realign(struct svm_node *address); ~DataSet(); }; class SVM { public: SVM(int st, int kt, int d, double g, double c0, double C, double nu, double e); void addDataSet(DataSet *ds); int saveModel(char *filename); int loadModel(char *filename); void clearDataSet(); int train(int retrain); double predict_value(DataSet *ds); double predict(DataSet *ds); void free_x_space(); void setSVMType(int st) { param.svm_type = st; } int getSVMType() { return param.svm_type; } void setKernelType(int kt) { param.kernel_type = kt; } int getKernelType() { return param.kernel_type; } void setGamma(double g) { 
param.gamma = g; } double getGamma() { return param.gamma; } void setDegree(int d) { param.degree = d; } double getDegree() { return param.degree; } void setCoef0(double c) { param.coef0 = c; } double getCoef0() { return param.coef0; } void setC(double c) { param.C = c; } double getC() { return param.C; } void setNu(double n) { param.nu = n; } double getNu() { return param.nu; } void setEpsilon(double e) { param.p = e; } double getEpsilon() { return param.p; } double crossValidate(int nfolds); int getNRClass(); int getLabels(int* label); double getSVRProbability(); int checkProbabilityModel(); ~SVM(); private: long nelem; struct svm_parameter param; vector dataset; struct svm_problem *prob; struct svm_model *model; struct svm_node *x_space; int randomized; }; #endif Algorithm-SVM-0.13/libsvm.cpp0000644000077436411310000017110310745440721015656 0ustar lairdmwg-users#include #include #include #include #include #include #include #include "libsvm.h" typedef float Qfloat; typedef signed char schar; #ifndef min template inline T min(T x,T y) { return (x inline T max(T x,T y) { return (x>y)?x:y; } #endif template inline void swap(T& x, T& y) { T t=x; x=y; y=t; } template inline void clone(T*& dst, S* src, int n) { dst = new T[n]; memcpy((void *)dst,(void *)src,sizeof(T)*n); } inline double powi(double base, int times) { double tmp = base, ret = 1.0; for(int t=times; t>0; t/=2) { if(t%2==1) ret*=tmp; tmp = tmp * tmp; } return ret; } #define INF HUGE_VAL #define TAU 1e-12 #define Malloc(type,n) (type *)malloc((n)*sizeof(type)) #if 1 void info(const char *fmt,...) { va_list ap; va_start(ap,fmt); vprintf(fmt,ap); va_end(ap); } void info_flush() { fflush(stdout); } #else void info(char *fmt,...) {} void info_flush() {} #endif // // Kernel Cache // // l is the number of total data items // size is the cache size limit in bytes // class Cache { public: Cache(int l,long int size); ~Cache(); // request data [0,len) // return some position p where [p,len) need to be filled // (p >= len if nothing needs to be filled) int get_data(const int index, Qfloat **data, int len); void swap_index(int i, int j); // future_option private: int l; long int size; struct head_t { head_t *prev, *next; // a cicular list Qfloat *data; int len; // data[0,len) is cached in this entry }; head_t *head; head_t lru_head; void lru_delete(head_t *h); void lru_insert(head_t *h); }; Cache::Cache(int l_,long int size_):l(l_),size(size_) { head = (head_t *)calloc(l,sizeof(head_t)); // initialized to 0 size /= sizeof(Qfloat); size -= l * sizeof(head_t) / sizeof(Qfloat); size = max(size, 2 * (long int) l); // cache must be large enough for two columns lru_head.next = lru_head.prev = &lru_head; } Cache::~Cache() { for(head_t *h = lru_head.next; h != &lru_head; h=h->next) free(h->data); free(head); } void Cache::lru_delete(head_t *h) { // delete from current location h->prev->next = h->next; h->next->prev = h->prev; } void Cache::lru_insert(head_t *h) { // insert to last position h->next = &lru_head; h->prev = lru_head.prev; h->prev->next = h; h->next->prev = h; } int Cache::get_data(const int index, Qfloat **data, int len) { head_t *h = &head[index]; if(h->len) lru_delete(h); int more = len - h->len; if(more > 0) { // free old space while(size < more) { head_t *old = lru_head.next; lru_delete(old); free(old->data); size += old->len; old->data = 0; old->len = 0; } // allocate new space h->data = (Qfloat *)realloc(h->data,sizeof(Qfloat)*len); size -= more; swap(h->len,len); } lru_insert(h); *data = h->data; return len; } void 
Cache::swap_index(int i, int j) { if(i==j) return; if(head[i].len) lru_delete(&head[i]); if(head[j].len) lru_delete(&head[j]); swap(head[i].data,head[j].data); swap(head[i].len,head[j].len); if(head[i].len) lru_insert(&head[i]); if(head[j].len) lru_insert(&head[j]); if(i>j) swap(i,j); for(head_t *h = lru_head.next; h!=&lru_head; h=h->next) { if(h->len > i) { if(h->len > j) swap(h->data[i],h->data[j]); else { // give up lru_delete(h); free(h->data); size += h->len; h->data = 0; h->len = 0; } } } } // // Kernel evaluation // // the static method k_function is for doing single kernel evaluation // the constructor of Kernel prepares to calculate the l*l kernel matrix // the member function get_Q is for getting one column from the Q Matrix // class QMatrix { public: virtual Qfloat *get_Q(int column, int len) const = 0; virtual Qfloat *get_QD() const = 0; virtual void swap_index(int i, int j) const = 0; virtual ~QMatrix() {} }; class Kernel: public QMatrix { public: Kernel(int l, svm_node * const * x, const svm_parameter& param); virtual ~Kernel(); static double k_function(const svm_node *x, const svm_node *y, const svm_parameter& param); virtual Qfloat *get_Q(int column, int len) const = 0; virtual Qfloat *get_QD() const = 0; virtual void swap_index(int i, int j) const // no so const... { swap(x[i],x[j]); if(x_square) swap(x_square[i],x_square[j]); } protected: double (Kernel::*kernel_function)(int i, int j) const; private: const svm_node **x; double *x_square; // svm_parameter const int kernel_type; const int degree; const double gamma; const double coef0; static double dot(const svm_node *px, const svm_node *py); double kernel_linear(int i, int j) const { return dot(x[i],x[j]); } double kernel_poly(int i, int j) const { return powi(gamma*dot(x[i],x[j])+coef0,degree); } double kernel_rbf(int i, int j) const { return exp(-gamma*(x_square[i]+x_square[j]-2*dot(x[i],x[j]))); } double kernel_sigmoid(int i, int j) const { return tanh(gamma*dot(x[i],x[j])+coef0); } double kernel_precomputed(int i, int j) const { return x[i][(int)(x[j][0].value)].value; } }; Kernel::Kernel(int l, svm_node * const * x_, const svm_parameter& param) :kernel_type(param.kernel_type), degree(param.degree), gamma(param.gamma), coef0(param.coef0) { switch(kernel_type) { case LINEAR: kernel_function = &Kernel::kernel_linear; break; case POLY: kernel_function = &Kernel::kernel_poly; break; case RBF: kernel_function = &Kernel::kernel_rbf; break; case SIGMOID: kernel_function = &Kernel::kernel_sigmoid; break; case PRECOMPUTED: kernel_function = &Kernel::kernel_precomputed; break; } clone(x,x_,l); if(kernel_type == RBF) { x_square = new double[l]; for(int i=0;iindex != -1 && py->index != -1) { if(px->index == py->index) { sum += px->value * py->value; ++px; ++py; } else { if(px->index > py->index) ++py; else ++px; } } return sum; } double Kernel::k_function(const svm_node *x, const svm_node *y, const svm_parameter& param) { switch(param.kernel_type) { case LINEAR: return dot(x,y); case POLY: return powi(param.gamma*dot(x,y)+param.coef0,param.degree); case RBF: { double sum = 0; while(x->index != -1 && y->index !=-1) { if(x->index == y->index) { double d = x->value - y->value; sum += d*d; ++x; ++y; } else { if(x->index > y->index) { sum += y->value * y->value; ++y; } else { sum += x->value * x->value; ++x; } } } while(x->index != -1) { sum += x->value * x->value; ++x; } while(y->index != -1) { sum += y->value * y->value; ++y; } return exp(-param.gamma*sum); } case SIGMOID: return tanh(param.gamma*dot(x,y)+param.coef0); case 
PRECOMPUTED: //x: test (validation), y: SV return x[(int)(y->value)].value; default: return 0; // Unreachable } } // An SMO algorithm in Fan et al., JMLR 6(2005), p. 1889--1918 // Solves: // // min 0.5(\alpha^T Q \alpha) + p^T \alpha // // y^T \alpha = \delta // y_i = +1 or -1 // 0 <= alpha_i <= Cp for y_i = 1 // 0 <= alpha_i <= Cn for y_i = -1 // // Given: // // Q, p, y, Cp, Cn, and an initial feasible point \alpha // l is the size of vectors and matrices // eps is the stopping tolerance // // solution will be put in \alpha, objective value will be put in obj // class Solver { public: Solver() {}; virtual ~Solver() {}; struct SolutionInfo { double obj; double rho; double upper_bound_p; double upper_bound_n; double r; // for Solver_NU }; void Solve(int l, const QMatrix& Q, const double *p_, const schar *y_, double *alpha_, double Cp, double Cn, double eps, SolutionInfo* si, int shrinking); protected: int active_size; schar *y; double *G; // gradient of objective function enum { LOWER_BOUND, UPPER_BOUND, FREE }; char *alpha_status; // LOWER_BOUND, UPPER_BOUND, FREE double *alpha; const QMatrix *Q; const Qfloat *QD; double eps; double Cp,Cn; double *p; int *active_set; double *G_bar; // gradient, if we treat free variables as 0 int l; bool unshrinked; // XXX double get_C(int i) { return (y[i] > 0)? Cp : Cn; } void update_alpha_status(int i) { if(alpha[i] >= get_C(i)) alpha_status[i] = UPPER_BOUND; else if(alpha[i] <= 0) alpha_status[i] = LOWER_BOUND; else alpha_status[i] = FREE; } bool is_upper_bound(int i) { return alpha_status[i] == UPPER_BOUND; } bool is_lower_bound(int i) { return alpha_status[i] == LOWER_BOUND; } bool is_free(int i) { return alpha_status[i] == FREE; } void swap_index(int i, int j); void reconstruct_gradient(); virtual int select_working_set(int &i, int &j); virtual double calculate_rho(); virtual void do_shrinking(); private: bool be_shrunken(int i, double Gmax1, double Gmax2); }; void Solver::swap_index(int i, int j) { Q->swap_index(i,j); swap(y[i],y[j]); swap(G[i],G[j]); swap(alpha_status[i],alpha_status[j]); swap(alpha[i],alpha[j]); swap(p[i],p[j]); swap(active_set[i],active_set[j]); swap(G_bar[i],G_bar[j]); } void Solver::reconstruct_gradient() { // reconstruct inactive elements of G from G_bar and free variables if(active_size == l) return; int i; for(i=active_size;iget_Q(i,l); double alpha_i = alpha[i]; for(int j=active_size;jl = l; this->Q = &Q; QD=Q.get_QD(); clone(p, p_,l); clone(y, y_,l); clone(alpha,alpha_,l); this->Cp = Cp; this->Cn = Cn; this->eps = eps; unshrinked = false; // initialize alpha_status { alpha_status = new char[l]; for(int i=0;i 0) { if(alpha[j] < 0) { alpha[j] = 0; alpha[i] = diff; } } else { if(alpha[i] < 0) { alpha[i] = 0; alpha[j] = -diff; } } if(diff > C_i - C_j) { if(alpha[i] > C_i) { alpha[i] = C_i; alpha[j] = C_i - diff; } } else { if(alpha[j] > C_j) { alpha[j] = C_j; alpha[i] = C_j + diff; } } } else { double quad_coef = Q_i[i]+Q_j[j]-2*Q_i[j]; if (quad_coef <= 0) quad_coef = TAU; double delta = (G[i]-G[j])/quad_coef; double sum = alpha[i] + alpha[j]; alpha[i] -= delta; alpha[j] += delta; if(sum > C_i) { if(alpha[i] > C_i) { alpha[i] = C_i; alpha[j] = sum - C_i; } } else { if(alpha[j] < 0) { alpha[j] = 0; alpha[i] = sum; } } if(sum > C_j) { if(alpha[j] > C_j) { alpha[j] = C_j; alpha[i] = sum - C_j; } } else { if(alpha[i] < 0) { alpha[i] = 0; alpha[j] = sum; } } } // update G double delta_alpha_i = alpha[i] - old_alpha_i; double delta_alpha_j = alpha[j] - old_alpha_j; for(int k=0;krho = calculate_rho(); // calculate objective value { 
double v = 0; int i; for(i=0;iobj = v/2; } // put back the solution { for(int i=0;iupper_bound_p = Cp; si->upper_bound_n = Cn; info("\noptimization finished, #iter = %d\n",iter); delete[] p; delete[] y; delete[] alpha; delete[] alpha_status; delete[] active_set; delete[] G; delete[] G_bar; } // return 1 if already optimal, return 0 otherwise int Solver::select_working_set(int &out_i, int &out_j) { // return i,j such that // i: maximizes -y_i * grad(f)_i, i in I_up(\alpha) // j: minimizes the decrease of obj value // (if quadratic coefficeint <= 0, replace it with tau) // -y_j*grad(f)_j < -y_i*grad(f)_i, j in I_low(\alpha) double Gmax = -INF; double Gmax2 = -INF; int Gmax_idx = -1; int Gmin_idx = -1; double obj_diff_min = INF; for(int t=0;t= Gmax) { Gmax = -G[t]; Gmax_idx = t; } } else { if(!is_lower_bound(t)) if(G[t] >= Gmax) { Gmax = G[t]; Gmax_idx = t; } } int i = Gmax_idx; const Qfloat *Q_i = NULL; if(i != -1) // NULL Q_i not accessed: Gmax=-INF if i=-1 Q_i = Q->get_Q(i,active_size); for(int j=0;j= Gmax2) Gmax2 = G[j]; if (grad_diff > 0) { double obj_diff; double quad_coef=Q_i[i]+QD[j]-2*y[i]*Q_i[j]; if (quad_coef > 0) obj_diff = -(grad_diff*grad_diff)/quad_coef; else obj_diff = -(grad_diff*grad_diff)/TAU; if (obj_diff <= obj_diff_min) { Gmin_idx=j; obj_diff_min = obj_diff; } } } } else { if (!is_upper_bound(j)) { double grad_diff= Gmax-G[j]; if (-G[j] >= Gmax2) Gmax2 = -G[j]; if (grad_diff > 0) { double obj_diff; double quad_coef=Q_i[i]+QD[j]+2*y[i]*Q_i[j]; if (quad_coef > 0) obj_diff = -(grad_diff*grad_diff)/quad_coef; else obj_diff = -(grad_diff*grad_diff)/TAU; if (obj_diff <= obj_diff_min) { Gmin_idx=j; obj_diff_min = obj_diff; } } } } } if(Gmax+Gmax2 < eps) return 1; out_i = Gmax_idx; out_j = Gmin_idx; return 0; } bool Solver::be_shrunken(int i, double Gmax1, double Gmax2) { if(is_upper_bound(i)) { if(y[i]==+1) return(-G[i] > Gmax1); else return(-G[i] > Gmax2); } else if(is_lower_bound(i)) { if(y[i]==+1) return(G[i] > Gmax2); else return(G[i] > Gmax1); } else return(false); } void Solver::do_shrinking() { int i; double Gmax1 = -INF; // max { -y_i * grad(f)_i | i in I_up(\alpha) } double Gmax2 = -INF; // max { y_i * grad(f)_i | i in I_low(\alpha) } // find maximal violating pair first for(i=0;i= Gmax1) Gmax1 = -G[i]; } if(!is_lower_bound(i)) { if(G[i] >= Gmax2) Gmax2 = G[i]; } } else { if(!is_upper_bound(i)) { if(-G[i] >= Gmax2) Gmax2 = -G[i]; } if(!is_lower_bound(i)) { if(G[i] >= Gmax1) Gmax1 = G[i]; } } } // shrink for(i=0;i i) { if (!be_shrunken(active_size, Gmax1, Gmax2)) { swap_index(i,active_size); break; } active_size--; } } // unshrink, check all variables again before final iterations if(unshrinked || Gmax1 + Gmax2 > eps*10) return; unshrinked = true; reconstruct_gradient(); for(i=l-1;i>=active_size;i--) if (!be_shrunken(i, Gmax1, Gmax2)) { while (active_size < i) { if (be_shrunken(active_size, Gmax1, Gmax2)) { swap_index(i,active_size); break; } active_size++; } active_size++; } } double Solver::calculate_rho() { double r; int nr_free = 0; double ub = INF, lb = -INF, sum_free = 0; for(int i=0;i0) r = sum_free/nr_free; else r = (ub+lb)/2; return r; } // // Solver for nu-svm classification and regression // // additional constraint: e^T \alpha = constant // class Solver_NU : public Solver { public: Solver_NU() {} void Solve(int l, const QMatrix& Q, const double *p, const schar *y, double *alpha, double Cp, double Cn, double eps, SolutionInfo* si, int shrinking) { this->si = si; Solver::Solve(l,Q,p,y,alpha,Cp,Cn,eps,si,shrinking); } private: SolutionInfo *si; int 
select_working_set(int &i, int &j); double calculate_rho(); bool be_shrunken(int i, double Gmax1, double Gmax2, double Gmax3, double Gmax4); void do_shrinking(); }; // return 1 if already optimal, return 0 otherwise int Solver_NU::select_working_set(int &out_i, int &out_j) { // return i,j such that y_i = y_j and // i: maximizes -y_i * grad(f)_i, i in I_up(\alpha) // j: minimizes the decrease of obj value // (if quadratic coefficeint <= 0, replace it with tau) // -y_j*grad(f)_j < -y_i*grad(f)_i, j in I_low(\alpha) double Gmaxp = -INF; double Gmaxp2 = -INF; int Gmaxp_idx = -1; double Gmaxn = -INF; double Gmaxn2 = -INF; int Gmaxn_idx = -1; int Gmin_idx = -1; double obj_diff_min = INF; for(int t=0;t= Gmaxp) { Gmaxp = -G[t]; Gmaxp_idx = t; } } else { if(!is_lower_bound(t)) if(G[t] >= Gmaxn) { Gmaxn = G[t]; Gmaxn_idx = t; } } int ip = Gmaxp_idx; int in = Gmaxn_idx; const Qfloat *Q_ip = NULL; const Qfloat *Q_in = NULL; if(ip != -1) // NULL Q_ip not accessed: Gmaxp=-INF if ip=-1 Q_ip = Q->get_Q(ip,active_size); if(in != -1) Q_in = Q->get_Q(in,active_size); for(int j=0;j= Gmaxp2) Gmaxp2 = G[j]; if (grad_diff > 0) { double obj_diff; double quad_coef = Q_ip[ip]+QD[j]-2*Q_ip[j]; if (quad_coef > 0) obj_diff = -(grad_diff*grad_diff)/quad_coef; else obj_diff = -(grad_diff*grad_diff)/TAU; if (obj_diff <= obj_diff_min) { Gmin_idx=j; obj_diff_min = obj_diff; } } } } else { if (!is_upper_bound(j)) { double grad_diff=Gmaxn-G[j]; if (-G[j] >= Gmaxn2) Gmaxn2 = -G[j]; if (grad_diff > 0) { double obj_diff; double quad_coef = Q_in[in]+QD[j]-2*Q_in[j]; if (quad_coef > 0) obj_diff = -(grad_diff*grad_diff)/quad_coef; else obj_diff = -(grad_diff*grad_diff)/TAU; if (obj_diff <= obj_diff_min) { Gmin_idx=j; obj_diff_min = obj_diff; } } } } } if(max(Gmaxp+Gmaxp2,Gmaxn+Gmaxn2) < eps) return 1; if (y[Gmin_idx] == +1) out_i = Gmaxp_idx; else out_i = Gmaxn_idx; out_j = Gmin_idx; return 0; } bool Solver_NU::be_shrunken(int i, double Gmax1, double Gmax2, double Gmax3, double Gmax4) { if(is_upper_bound(i)) { if(y[i]==+1) return(-G[i] > Gmax1); else return(-G[i] > Gmax4); } else if(is_lower_bound(i)) { if(y[i]==+1) return(G[i] > Gmax2); else return(G[i] > Gmax3); } else return(false); } void Solver_NU::do_shrinking() { double Gmax1 = -INF; // max { -y_i * grad(f)_i | y_i = +1, i in I_up(\alpha) } double Gmax2 = -INF; // max { y_i * grad(f)_i | y_i = +1, i in I_low(\alpha) } double Gmax3 = -INF; // max { -y_i * grad(f)_i | y_i = -1, i in I_up(\alpha) } double Gmax4 = -INF; // max { y_i * grad(f)_i | y_i = -1, i in I_low(\alpha) } // find maximal violating pair first int i; for(i=0;i Gmax1) Gmax1 = -G[i]; } else if(-G[i] > Gmax4) Gmax4 = -G[i]; } if(!is_lower_bound(i)) { if(y[i]==+1) { if(G[i] > Gmax2) Gmax2 = G[i]; } else if(G[i] > Gmax3) Gmax3 = G[i]; } } // shrinking for(i=0;i i) { if (!be_shrunken(active_size, Gmax1, Gmax2, Gmax3, Gmax4)) { swap_index(i,active_size); break; } active_size--; } } // unshrink, check all variables again before final iterations if(unshrinked || max(Gmax1+Gmax2,Gmax3+Gmax4) > eps*10) return; unshrinked = true; reconstruct_gradient(); for(i=l-1;i>=active_size;i--) if (!be_shrunken(i, Gmax1, Gmax2, Gmax3, Gmax4)) { while (active_size < i) { if (be_shrunken(active_size, Gmax1, Gmax2, Gmax3, Gmax4)) { swap_index(i,active_size); break; } active_size++; } active_size++; } } double Solver_NU::calculate_rho() { int nr_free1 = 0,nr_free2 = 0; double ub1 = INF, ub2 = INF; double lb1 = -INF, lb2 = -INF; double sum_free1 = 0, sum_free2 = 0; for(int i=0;i 0) r1 = sum_free1/nr_free1; else r1 = (ub1+lb1)/2; 
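// r1 above is estimated from the y = +1 examples and r2 below from the
// y = -1 examples; their average is stored in si->r and half their
// difference is returned as rho.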
if(nr_free2 > 0) r2 = sum_free2/nr_free2; else r2 = (ub2+lb2)/2; si->r = (r1+r2)/2; return (r1-r2)/2; } // // Q matrices for various formulations // class SVC_Q: public Kernel { public: SVC_Q(const svm_problem& prob, const svm_parameter& param, const schar *y_) :Kernel(prob.l, prob.x, param) { clone(y,y_,prob.l); cache = new Cache(prob.l,(long int)(param.cache_size*(1<<20))); QD = new Qfloat[prob.l]; for(int i=0;i*kernel_function)(i,i); } Qfloat *get_Q(int i, int len) const { Qfloat *data; int start; if((start = cache->get_data(i,&data,len)) < len) { for(int j=start;j*kernel_function)(i,j)); } return data; } Qfloat *get_QD() const { return QD; } void swap_index(int i, int j) const { cache->swap_index(i,j); Kernel::swap_index(i,j); swap(y[i],y[j]); swap(QD[i],QD[j]); } ~SVC_Q() { delete[] y; delete cache; delete[] QD; } private: schar *y; Cache *cache; Qfloat *QD; }; class ONE_CLASS_Q: public Kernel { public: ONE_CLASS_Q(const svm_problem& prob, const svm_parameter& param) :Kernel(prob.l, prob.x, param) { cache = new Cache(prob.l,(long int)(param.cache_size*(1<<20))); QD = new Qfloat[prob.l]; for(int i=0;i*kernel_function)(i,i); } Qfloat *get_Q(int i, int len) const { Qfloat *data; int start; if((start = cache->get_data(i,&data,len)) < len) { for(int j=start;j*kernel_function)(i,j); } return data; } Qfloat *get_QD() const { return QD; } void swap_index(int i, int j) const { cache->swap_index(i,j); Kernel::swap_index(i,j); swap(QD[i],QD[j]); } ~ONE_CLASS_Q() { delete cache; delete[] QD; } private: Cache *cache; Qfloat *QD; }; class SVR_Q: public Kernel { public: SVR_Q(const svm_problem& prob, const svm_parameter& param) :Kernel(prob.l, prob.x, param) { l = prob.l; cache = new Cache(l,(long int)(param.cache_size*(1<<20))); QD = new Qfloat[2*l]; sign = new schar[2*l]; index = new int[2*l]; for(int k=0;k*kernel_function)(k,k); QD[k+l]=QD[k]; } buffer[0] = new Qfloat[2*l]; buffer[1] = new Qfloat[2*l]; next_buffer = 0; } void swap_index(int i, int j) const { swap(sign[i],sign[j]); swap(index[i],index[j]); swap(QD[i],QD[j]); } Qfloat *get_Q(int i, int len) const { Qfloat *data; int real_i = index[i]; if(cache->get_data(real_i,&data,l) < l) { for(int j=0;j*kernel_function)(real_i,j); } // reorder and copy Qfloat *buf = buffer[next_buffer]; next_buffer = 1 - next_buffer; schar si = sign[i]; for(int j=0;jl; double *minus_ones = new double[l]; schar *y = new schar[l]; int i; for(i=0;iy[i] > 0) y[i] = +1; else y[i]=-1; } Solver s; s.Solve(l, SVC_Q(*prob,*param,y), minus_ones, y, alpha, Cp, Cn, param->eps, si, param->shrinking); double sum_alpha=0; for(i=0;il)); for(i=0;il; double nu = param->nu; schar *y = new schar[l]; for(i=0;iy[i]>0) y[i] = +1; else y[i] = -1; double sum_pos = nu*l/2; double sum_neg = nu*l/2; for(i=0;ieps, si, param->shrinking); double r = si->r; info("C = %f\n",1/r); for(i=0;irho /= r; si->obj /= (r*r); si->upper_bound_p = 1/r; si->upper_bound_n = 1/r; delete[] y; delete[] zeros; } static void solve_one_class( const svm_problem *prob, const svm_parameter *param, double *alpha, Solver::SolutionInfo* si) { int l = prob->l; double *zeros = new double[l]; schar *ones = new schar[l]; int i; int n = (int)(param->nu*prob->l); // # of alpha's at upper bound for(i=0;il) alpha[n] = param->nu * prob->l - n; for(i=n+1;ieps, si, param->shrinking); delete[] zeros; delete[] ones; } static void solve_epsilon_svr( const svm_problem *prob, const svm_parameter *param, double *alpha, Solver::SolutionInfo* si) { int l = prob->l; double *alpha2 = new double[2*l]; double *linear_term = new double[2*l]; 
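// The epsilon-SVR dual is solved as a problem in 2*l variables:
// alpha2[0..l-1] carry the "positive" multipliers (linear term p - y_i,
// sign +1) and alpha2[l..2*l-1] the "negative" ones (linear term
// p + y_i, sign -1); the final per-example coefficient is
// alpha[i] = alpha2[i] - alpha2[i+l].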
schar *y = new schar[2*l]; int i; for(i=0;ip - prob->y[i]; y[i] = 1; alpha2[i+l] = 0; linear_term[i+l] = param->p + prob->y[i]; y[i+l] = -1; } Solver s; s.Solve(2*l, SVR_Q(*prob,*param), linear_term, y, alpha2, param->C, param->C, param->eps, si, param->shrinking); double sum_alpha = 0; for(i=0;iC*l)); delete[] alpha2; delete[] linear_term; delete[] y; } static void solve_nu_svr( const svm_problem *prob, const svm_parameter *param, double *alpha, Solver::SolutionInfo* si) { int l = prob->l; double C = param->C; double *alpha2 = new double[2*l]; double *linear_term = new double[2*l]; schar *y = new schar[2*l]; int i; double sum = C * param->nu * l / 2; for(i=0;iy[i]; y[i] = 1; linear_term[i+l] = prob->y[i]; y[i+l] = -1; } Solver_NU s; s.Solve(2*l, SVR_Q(*prob,*param), linear_term, y, alpha2, C, C, param->eps, si, param->shrinking); info("epsilon = %f\n",-si->r); for(i=0;il); Solver::SolutionInfo si; switch(param->svm_type) { case C_SVC: solve_c_svc(prob,param,alpha,&si,Cp,Cn); break; case NU_SVC: solve_nu_svc(prob,param,alpha,&si); break; case ONE_CLASS: solve_one_class(prob,param,alpha,&si); break; case EPSILON_SVR: solve_epsilon_svr(prob,param,alpha,&si); break; case NU_SVR: solve_nu_svr(prob,param,alpha,&si); break; } info("obj = %f, rho = %f\n",si.obj,si.rho); // output SVs int nSV = 0; int nBSV = 0; for(int i=0;il;i++) { if(fabs(alpha[i]) > 0) { ++nSV; if(prob->y[i] > 0) { if(fabs(alpha[i]) >= si.upper_bound_p) ++nBSV; } else { if(fabs(alpha[i]) >= si.upper_bound_n) ++nBSV; } } } info("nSV = %d, nBSV = %d\n",nSV,nBSV); decision_function f; f.alpha = alpha; f.rho = si.rho; return f; } // // svm_model // struct svm_model { svm_parameter param; // parameter int nr_class; // number of classes, = 2 in regression/one class svm int l; // total #SV svm_node **SV; // SVs (SV[l]) double **sv_coef; // coefficients for SVs in decision functions (sv_coef[k-1][l]) double *rho; // constants in decision functions (rho[k*(k-1)/2]) double *probA; // pariwise probability information double *probB; // for classification only int *label; // label of each class (label[k]) int *nSV; // number of SVs for each class (nSV[k]) // nSV[0] + nSV[1] + ... + nSV[k-1] = l // XXX int free_sv; // 1 if svm_model is created by svm_load_model // 0 if svm_model is created by svm_train }; // Platt's binary SVM Probablistic Output: an improvement from Lin et al. 
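// Note (added comment): sigmoid_train() below fits A and B in
// P(y=+1|f) = 1/(1+exp(A*f+B)) by regularized maximum likelihood (Newton's
// method with a backtracking line search), using the smoothed targets
// hiTarget/loTarget instead of raw 0/1 labels.  Given the fitted pair,
// sigmoid_predict() maps a decision value to a probability; for example,
// with A=-2 and B=0 a decision value of 1.5 gives 1/(1+exp(-3)) ~= 0.95.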
void sigmoid_train( int l, const double *dec_values, const double *labels, double& A, double& B) { double prior1=0, prior0 = 0; int i; for (i=0;i 0) prior1+=1; else prior0+=1; int max_iter=100; // Maximal number of iterations double min_step=1e-10; // Minimal step taken in line search double sigma=1e-12; // For numerically strict PD of Hessian double eps=1e-5; double hiTarget=(prior1+1.0)/(prior1+2.0); double loTarget=1/(prior0+2.0); double *t=Malloc(double,l); double fApB,p,q,h11,h22,h21,g1,g2,det,dA,dB,gd,stepsize; double newA,newB,newf,d1,d2; int iter; // Initial Point and Initial Fun Value A=0.0; B=log((prior0+1.0)/(prior1+1.0)); double fval = 0.0; for (i=0;i0) t[i]=hiTarget; else t[i]=loTarget; fApB = dec_values[i]*A+B; if (fApB>=0) fval += t[i]*fApB + log(1+exp(-fApB)); else fval += (t[i] - 1)*fApB +log(1+exp(fApB)); } for (iter=0;iter= 0) { p=exp(-fApB)/(1.0+exp(-fApB)); q=1.0/(1.0+exp(-fApB)); } else { p=1.0/(1.0+exp(fApB)); q=exp(fApB)/(1.0+exp(fApB)); } d2=p*q; h11+=dec_values[i]*dec_values[i]*d2; h22+=d2; h21+=dec_values[i]*d2; d1=t[i]-p; g1+=dec_values[i]*d1; g2+=d1; } // Stopping Criteria if (fabs(g1)= min_step) { newA = A + stepsize * dA; newB = B + stepsize * dB; // New function value newf = 0.0; for (i=0;i= 0) newf += t[i]*fApB + log(1+exp(-fApB)); else newf += (t[i] - 1)*fApB +log(1+exp(fApB)); } // Check sufficient decrease if (newf=max_iter) info("Reaching maximal iterations in two-class probability estimates\n"); free(t); } double sigmoid_predict(double decision_value, double A, double B) { double fApB = decision_value*A+B; if (fApB >= 0) return exp(-fApB)/(1.0+exp(-fApB)); else return 1.0/(1+exp(fApB)) ; } // Method 2 from the multiclass_prob paper by Wu, Lin, and Weng void multiclass_probability(int k, double **r, double *p) { int t,j; int iter = 0, max_iter=max(100,k); double **Q=Malloc(double *,k); double *Qp=Malloc(double,k); double pQp, eps=0.005/k; for (t=0;tmax_error) max_error=error; } if (max_error=max_iter) info("Exceeds max_iter in multiclass_prob\n"); for(t=0;tl); double *dec_values = Malloc(double,prob->l); // random shuffle for(i=0;il;i++) perm[i]=i; for(i=0;il;i++) { int j = i+rand()%(prob->l-i); swap(perm[i],perm[j]); } for(i=0;il/nr_fold; int end = (i+1)*prob->l/nr_fold; int j,k; struct svm_problem subprob; subprob.l = prob->l-(end-begin); subprob.x = Malloc(struct svm_node*,subprob.l); subprob.y = Malloc(double,subprob.l); k=0; for(j=0;jx[perm[j]]; subprob.y[k] = prob->y[perm[j]]; ++k; } for(j=end;jl;j++) { subprob.x[k] = prob->x[perm[j]]; subprob.y[k] = prob->y[perm[j]]; ++k; } int p_count=0,n_count=0; for(j=0;j0) p_count++; else n_count++; if(p_count==0 && n_count==0) for(j=begin;j 0 && n_count == 0) for(j=begin;j 0) for(j=begin;jx[perm[j]],&(dec_values[perm[j]])); // ensure +1 -1 order; reason not using CV subroutine dec_values[perm[j]] *= submodel->label[0]; } svm_destroy_model(submodel); svm_destroy_param(&subparam); } free(subprob.x); free(subprob.y); } sigmoid_train(prob->l,dec_values,prob->y,probA,probB); free(dec_values); free(perm); } // Return parameter of a Laplace distribution double svm_svr_probability( const svm_problem *prob, const svm_parameter *param) { int i; int nr_fold = 5; double *ymv = Malloc(double,prob->l); double mae = 0; svm_parameter newparam = *param; newparam.probability = 0; svm_cross_validation(prob,&newparam,nr_fold,ymv); for(i=0;il;i++) { ymv[i]=prob->y[i]-ymv[i]; mae += fabs(ymv[i]); } mae /= prob->l; double std=sqrt(2*mae*mae); int count=0; mae=0; for(i=0;il;i++) if (fabs(ymv[i]) > 5*std) count=count+1; else 
mae+=fabs(ymv[i]); mae /= (prob->l-count); info("Prob. model for test data: target value = predicted value + z,\nz: Laplace distribution e^(-|z|/sigma)/(2sigma),sigma= %g\n",mae); free(ymv); return mae; } // label: label name, start: begin of each class, count: #data of classes, perm: indices to the original data // perm, length l, must be allocated before calling this subroutine void svm_group_classes(const svm_problem *prob, int *nr_class_ret, int **label_ret, int **start_ret, int **count_ret, int *perm) { int l = prob->l; int max_nr_class = 16; int nr_class = 0; int *label = Malloc(int,max_nr_class); int *count = Malloc(int,max_nr_class); int *data_label = Malloc(int,l); int i; for(i=0;iy[i]; int j; for(j=0;jparam = *param; model->free_sv = 0; // XXX if(param->svm_type == ONE_CLASS || param->svm_type == EPSILON_SVR || param->svm_type == NU_SVR) { // regression or one-class-svm model->nr_class = 2; model->label = NULL; model->nSV = NULL; model->probA = NULL; model->probB = NULL; model->sv_coef = Malloc(double *,1); if(param->probability && (param->svm_type == EPSILON_SVR || param->svm_type == NU_SVR)) { model->probA = Malloc(double,1); model->probA[0] = svm_svr_probability(prob,param); } decision_function f = svm_train_one(prob,param,0,0); model->rho = Malloc(double,1); model->rho[0] = f.rho; int nSV = 0; int i; for(i=0;il;i++) if(fabs(f.alpha[i]) > 0) ++nSV; model->l = nSV; model->SV = Malloc(svm_node *,nSV); model->sv_coef[0] = Malloc(double,nSV); int j = 0; for(i=0;il;i++) if(fabs(f.alpha[i]) > 0) { model->SV[j] = prob->x[i]; model->sv_coef[0][j] = f.alpha[i]; ++j; } free(f.alpha); } else { // classification int l = prob->l; int nr_class; int *label = NULL; int *start = NULL; int *count = NULL; int *perm = Malloc(int,l); // group training data of the same class svm_group_classes(prob,&nr_class,&label,&start,&count,perm); svm_node **x = Malloc(svm_node *,l); int i; for(i=0;ix[perm[i]]; // calculate weighted C double *weighted_C = Malloc(double, nr_class); for(i=0;iC; for(i=0;inr_weight;i++) { int j; for(j=0;jweight_label[i] == label[j]) break; if(j == nr_class) fprintf(stderr,"warning: class label %d specified in weight is not found\n", param->weight_label[i]); else weighted_C[j] *= param->weight[i]; } // train k*(k-1)/2 models bool *nonzero = Malloc(bool,l); for(i=0;iprobability) { probA=Malloc(double,nr_class*(nr_class-1)/2); probB=Malloc(double,nr_class*(nr_class-1)/2); } int p = 0; for(i=0;iprobability) svm_binary_svc_probability(&sub_prob,param,weighted_C[i],weighted_C[j],probA[p],probB[p]); f[p] = svm_train_one(&sub_prob,param,weighted_C[i],weighted_C[j]); for(k=0;k 0) nonzero[si+k] = true; for(k=0;k 0) nonzero[sj+k] = true; free(sub_prob.x); free(sub_prob.y); ++p; } // build output model->nr_class = nr_class; model->label = Malloc(int,nr_class); for(i=0;ilabel[i] = label[i]; model->rho = Malloc(double,nr_class*(nr_class-1)/2); for(i=0;irho[i] = f[i].rho; if(param->probability) { model->probA = Malloc(double,nr_class*(nr_class-1)/2); model->probB = Malloc(double,nr_class*(nr_class-1)/2); for(i=0;iprobA[i] = probA[i]; model->probB[i] = probB[i]; } } else { model->probA=NULL; model->probB=NULL; } int total_sv = 0; int *nz_count = Malloc(int,nr_class); model->nSV = Malloc(int,nr_class); for(i=0;inSV[i] = nSV; nz_count[i] = nSV; } info("Total nSV = %d\n",total_sv); model->l = total_sv; model->SV = Malloc(svm_node *,total_sv); p = 0; for(i=0;iSV[p++] = x[i]; int *nz_start = Malloc(int,nr_class); nz_start[0] = 0; for(i=1;isv_coef = Malloc(double *,nr_class-1); for(i=0;isv_coef[i] = 
Malloc(double,total_sv); p = 0; for(i=0;isv_coef[j-1][q++] = f[p].alpha[k]; q = nz_start[j]; for(k=0;ksv_coef[i][q++] = f[p].alpha[ci+k]; ++p; } free(label); free(probA); free(probB); free(count); free(perm); free(start); free(x); free(weighted_C); free(nonzero); for(i=0;il; int *perm = Malloc(int,l); int nr_class; // stratified cv may not give leave-one-out rate // Each class to l folds -> some folds may have zero elements if((param->svm_type == C_SVC || param->svm_type == NU_SVC) && nr_fold < l) { int *start = NULL; int *label = NULL; int *count = NULL; svm_group_classes(prob,&nr_class,&label,&start,&count,perm); // random shuffle and then data grouped by fold using the array perm int *fold_count = Malloc(int,nr_fold); int c; int *index = Malloc(int,l); for(i=0;ix[perm[j]]; subprob.y[k] = prob->y[perm[j]]; ++k; } for(j=end;jx[perm[j]]; subprob.y[k] = prob->y[perm[j]]; ++k; } struct svm_model *submodel = svm_train(&subprob,param); if(param->probability && (param->svm_type == C_SVC || param->svm_type == NU_SVC)) { double *prob_estimates=Malloc(double,svm_get_nr_class(submodel)); for(j=begin;jx[perm[j]],prob_estimates); free(prob_estimates); } else for(j=begin;jx[perm[j]]); svm_destroy_model(submodel); free(subprob.x); free(subprob.y); } free(fold_start); free(perm); } int svm_get_svm_type(const svm_model *model) { return model->param.svm_type; } int svm_get_nr_class(const svm_model *model) { return model->nr_class; } void svm_get_labels(const svm_model *model, int* label) { if (model->label != NULL) for(int i=0;inr_class;i++) label[i] = model->label[i]; } double svm_get_svr_probability(const svm_model *model) { if ((model->param.svm_type == EPSILON_SVR || model->param.svm_type == NU_SVR) && model->probA!=NULL) return model->probA[0]; else { info("Model doesn't contain information for SVR probability inference\n"); return 0; } } void svm_predict_values(const svm_model *model, const svm_node *x, double* dec_values) { if(model->param.svm_type == ONE_CLASS || model->param.svm_type == EPSILON_SVR || model->param.svm_type == NU_SVR) { double *sv_coef = model->sv_coef[0]; double sum = 0; for(int i=0;il;i++) sum += sv_coef[i] * Kernel::k_function(x,model->SV[i],model->param); sum -= model->rho[0]; *dec_values = sum; } else { int i; int nr_class = model->nr_class; int l = model->l; double *kvalue = Malloc(double,l); for(i=0;iSV[i],model->param); int *start = Malloc(int,nr_class); start[0] = 0; for(i=1;inSV[i-1]; int p=0; for(i=0;inSV[i]; int cj = model->nSV[j]; int k; double *coef1 = model->sv_coef[j-1]; double *coef2 = model->sv_coef[i]; for(k=0;krho[p]; dec_values[p] = sum; p++; } free(kvalue); free(start); } } double svm_predict(const svm_model *model, const svm_node *x) { if(model->param.svm_type == ONE_CLASS || model->param.svm_type == EPSILON_SVR || model->param.svm_type == NU_SVR) { double res; svm_predict_values(model, x, &res); if(model->param.svm_type == ONE_CLASS) return (res>0)?1:-1; else return res; } else { int i; int nr_class = model->nr_class; double *dec_values = Malloc(double, nr_class*(nr_class-1)/2); svm_predict_values(model, x, dec_values); int *vote = Malloc(int,nr_class); for(i=0;i 0) ++vote[i]; else ++vote[j]; } int vote_max_idx = 0; for(i=1;i vote[vote_max_idx]) vote_max_idx = i; free(vote); free(dec_values); return model->label[vote_max_idx]; } } double svm_predict_probability( const svm_model *model, const svm_node *x, double *prob_estimates) { if ((model->param.svm_type == C_SVC || model->param.svm_type == NU_SVC) && model->probA!=NULL && model->probB!=NULL) { int i; int 
nr_class = model->nr_class; double *dec_values = Malloc(double, nr_class*(nr_class-1)/2); svm_predict_values(model, x, dec_values); double min_prob=1e-7; double **pairwise_prob=Malloc(double *,nr_class); for(i=0;iprobA[k],model->probB[k]),min_prob),1-min_prob); pairwise_prob[j][i]=1-pairwise_prob[i][j]; k++; } multiclass_probability(nr_class,pairwise_prob,prob_estimates); int prob_max_idx = 0; for(i=1;i prob_estimates[prob_max_idx]) prob_max_idx = i; for(i=0;ilabel[prob_max_idx]; } else return svm_predict(model, x); } const char *svm_type_table[] = { "c_svc","nu_svc","one_class","epsilon_svr","nu_svr",NULL }; const char *kernel_type_table[]= { "linear","polynomial","rbf","sigmoid","precomputed",NULL }; int svm_save_model(const char *model_file_name, const svm_model *model) { FILE *fp = fopen(model_file_name,"w"); if(fp==NULL) return -1; const svm_parameter& param = model->param; fprintf(fp,"svm_type %s\n", svm_type_table[param.svm_type]); fprintf(fp,"kernel_type %s\n", kernel_type_table[param.kernel_type]); if(param.kernel_type == POLY) fprintf(fp,"degree %d\n", param.degree); if(param.kernel_type == POLY || param.kernel_type == RBF || param.kernel_type == SIGMOID) fprintf(fp,"gamma %g\n", param.gamma); if(param.kernel_type == POLY || param.kernel_type == SIGMOID) fprintf(fp,"coef0 %g\n", param.coef0); int nr_class = model->nr_class; int l = model->l; fprintf(fp, "nr_class %d\n", nr_class); fprintf(fp, "total_sv %d\n",l); { fprintf(fp, "rho"); for(int i=0;irho[i]); fprintf(fp, "\n"); } if(model->label) { fprintf(fp, "label"); for(int i=0;ilabel[i]); fprintf(fp, "\n"); } if(model->probA) // regression has probA only { fprintf(fp, "probA"); for(int i=0;iprobA[i]); fprintf(fp, "\n"); } if(model->probB) { fprintf(fp, "probB"); for(int i=0;iprobB[i]); fprintf(fp, "\n"); } if(model->nSV) { fprintf(fp, "nr_sv"); for(int i=0;inSV[i]); fprintf(fp, "\n"); } fprintf(fp, "SV\n"); const double * const *sv_coef = model->sv_coef; const svm_node * const *SV = model->SV; for(int i=0;ivalue)); else while(p->index != -1) { fprintf(fp,"%d:%.8g ",p->index,p->value); p++; } fprintf(fp, "\n"); } if (ferror(fp) != 0 || fclose(fp) != 0) return -1; else return 0; } svm_model *svm_load_model(const char *model_file_name) { FILE *fp = fopen(model_file_name,"r"); if(fp==NULL) return NULL; // read parameters svm_model *model = Malloc(svm_model,1); svm_parameter& param = model->param; model->rho = NULL; model->probA = NULL; model->probB = NULL; model->label = NULL; model->nSV = NULL; char cmd[81]; while(1) { fscanf(fp,"%80s",cmd); if(strcmp(cmd,"svm_type")==0) { fscanf(fp,"%80s",cmd); int i; for(i=0;svm_type_table[i];i++) { if(strcmp(svm_type_table[i],cmd)==0) { param.svm_type=i; break; } } if(svm_type_table[i] == NULL) { fprintf(stderr,"unknown svm type.\n"); free(model->rho); free(model->label); free(model->nSV); free(model); return NULL; } } else if(strcmp(cmd,"kernel_type")==0) { fscanf(fp,"%80s",cmd); int i; for(i=0;kernel_type_table[i];i++) { if(strcmp(kernel_type_table[i],cmd)==0) { param.kernel_type=i; break; } } if(kernel_type_table[i] == NULL) { fprintf(stderr,"unknown kernel function.\n"); free(model->rho); free(model->label); free(model->nSV); free(model); return NULL; } } else if(strcmp(cmd,"degree")==0) fscanf(fp,"%d",¶m.degree); else if(strcmp(cmd,"gamma")==0) fscanf(fp,"%lf",¶m.gamma); else if(strcmp(cmd,"coef0")==0) fscanf(fp,"%lf",¶m.coef0); else if(strcmp(cmd,"nr_class")==0) fscanf(fp,"%d",&model->nr_class); else if(strcmp(cmd,"total_sv")==0) fscanf(fp,"%d",&model->l); else if(strcmp(cmd,"rho")==0) { 
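			// Note (added comment): rho (and probA/probB below) hold one value per
			// pairwise decision function, i.e. nr_class*(nr_class-1)/2 entries.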
int n = model->nr_class * (model->nr_class-1)/2; model->rho = Malloc(double,n); for(int i=0;irho[i]); } else if(strcmp(cmd,"label")==0) { int n = model->nr_class; model->label = Malloc(int,n); for(int i=0;ilabel[i]); } else if(strcmp(cmd,"probA")==0) { int n = model->nr_class * (model->nr_class-1)/2; model->probA = Malloc(double,n); for(int i=0;iprobA[i]); } else if(strcmp(cmd,"probB")==0) { int n = model->nr_class * (model->nr_class-1)/2; model->probB = Malloc(double,n); for(int i=0;iprobB[i]); } else if(strcmp(cmd,"nr_sv")==0) { int n = model->nr_class; model->nSV = Malloc(int,n); for(int i=0;inSV[i]); } else if(strcmp(cmd,"SV")==0) { while(1) { int c = getc(fp); if(c==EOF || c=='\n') break; } break; } else { fprintf(stderr,"unknown text in model file: [%s]\n",cmd); free(model->rho); free(model->label); free(model->nSV); free(model); return NULL; } } // read sv_coef and SV int elements = 0; long pos = ftell(fp); while(1) { int c = fgetc(fp); switch(c) { case '\n': // count the '-1' element case ':': ++elements; break; case EOF: goto out; default: ; } } out: fseek(fp,pos,SEEK_SET); int m = model->nr_class - 1; int l = model->l; model->sv_coef = Malloc(double *,m); int i; for(i=0;isv_coef[i] = Malloc(double,l); model->SV = Malloc(svm_node*,l); svm_node *x_space=NULL; if(l>0) x_space = Malloc(svm_node,elements); int j=0; for(i=0;iSV[i] = &x_space[j]; for(int k=0;ksv_coef[k][i]); while(1) { int c; do { c = getc(fp); if(c=='\n') goto out2; } while(isspace(c)); ungetc(c,fp); fscanf(fp,"%d:%lf",&(x_space[j].index),&(x_space[j].value)); ++j; } out2: x_space[j++].index = -1; } if (ferror(fp) != 0 || fclose(fp) != 0) return NULL; model->free_sv = 1; // XXX return model; } void svm_destroy_model(svm_model* model) { if(model->free_sv && model->l > 0) free((void *)(model->SV[0])); for(int i=0;inr_class-1;i++) free(model->sv_coef[i]); free(model->SV); free(model->sv_coef); free(model->rho); free(model->label); free(model->probA); free(model->probB); free(model->nSV); free(model); } void svm_destroy_param(svm_parameter* param) { free(param->weight_label); free(param->weight); } const char *svm_check_parameter(const svm_problem *prob, const svm_parameter *param) { // svm_type int svm_type = param->svm_type; if(svm_type != C_SVC && svm_type != NU_SVC && svm_type != ONE_CLASS && svm_type != EPSILON_SVR && svm_type != NU_SVR) return "unknown svm type"; // kernel_type, degree int kernel_type = param->kernel_type; if(kernel_type != LINEAR && kernel_type != POLY && kernel_type != RBF && kernel_type != SIGMOID && kernel_type != PRECOMPUTED) return "unknown kernel type"; if(param->degree < 0) return "degree of polynomial kernel < 0"; // cache_size,eps,C,nu,p,shrinking if(param->cache_size <= 0) return "cache_size <= 0"; if(param->eps <= 0) return "eps <= 0"; if(svm_type == C_SVC || svm_type == EPSILON_SVR || svm_type == NU_SVR) if(param->C <= 0) return "C <= 0"; if(svm_type == NU_SVC || svm_type == ONE_CLASS || svm_type == NU_SVR) if(param->nu <= 0 || param->nu > 1) return "nu <= 0 or nu > 1"; if(svm_type == EPSILON_SVR) if(param->p < 0) return "p < 0"; if(param->shrinking != 0 && param->shrinking != 1) return "shrinking != 0 and shrinking != 1"; if(param->probability != 0 && param->probability != 1) return "probability != 0 and probability != 1"; if(param->probability == 1 && svm_type == ONE_CLASS) return "one-class SVM probability output not supported yet"; // check whether nu-svc is feasible if(svm_type == NU_SVC) { int l = prob->l; int max_nr_class = 16; int nr_class = 0; int *label = 
Malloc(int,max_nr_class); int *count = Malloc(int,max_nr_class); int i; for(i=0;iy[i]; int j; for(j=0;jnu*(n1+n2)/2 > min(n1,n2)) { free(label); free(count); return "specified nu is infeasible"; } } } free(label); free(count); } return NULL; } int svm_check_probability_model(const svm_model *model) { return ((model->param.svm_type == C_SVC || model->param.svm_type == NU_SVC) && model->probA!=NULL && model->probB!=NULL) || ((model->param.svm_type == EPSILON_SVR || model->param.svm_type == NU_SVR) && model->probA!=NULL); } Algorithm-SVM-0.13/libsvm.h0000644000077436411310000000415410745440535015327 0ustar lairdmwg-users#ifndef _LIBSVM_H #define _LIBSVM_H #ifdef __cplusplus extern "C" { #endif struct svm_node { int index; double value; }; struct svm_problem { int l; double *y; struct svm_node **x; }; enum { C_SVC, NU_SVC, ONE_CLASS, EPSILON_SVR, NU_SVR }; /* svm_type */ enum { LINEAR, POLY, RBF, SIGMOID, PRECOMPUTED }; /* kernel_type */ struct svm_parameter { int svm_type; int kernel_type; int degree; /* for poly */ double gamma; /* for poly/rbf/sigmoid */ double coef0; /* for poly/sigmoid */ /* these are for training only */ double cache_size; /* in MB */ double eps; /* stopping criteria */ double C; /* for C_SVC, EPSILON_SVR and NU_SVR */ int nr_weight; /* for C_SVC */ int *weight_label; /* for C_SVC */ double* weight; /* for C_SVC */ double nu; /* for NU_SVC, ONE_CLASS, and NU_SVR */ double p; /* for EPSILON_SVR */ int shrinking; /* use the shrinking heuristics */ int probability; /* do probability estimates */ }; struct svm_model *svm_train(const struct svm_problem *prob, const struct svm_parameter *param); void svm_cross_validation(const struct svm_problem *prob, const struct svm_parameter *param, int nr_fold, double *target); int svm_save_model(const char *model_file_name, const struct svm_model *model); struct svm_model *svm_load_model(const char *model_file_name); int svm_get_svm_type(const struct svm_model *model); int svm_get_nr_class(const struct svm_model *model); void svm_get_labels(const struct svm_model *model, int *label); double svm_get_svr_probability(const struct svm_model *model); void svm_predict_values(const struct svm_model *model, const struct svm_node *x, double* dec_values); double svm_predict(const struct svm_model *model, const struct svm_node *x); double svm_predict_probability(const struct svm_model *model, const struct svm_node *x, double* prob_estimates); void svm_destroy_model(struct svm_model *model); void svm_destroy_param(struct svm_parameter *param); const char *svm_check_parameter(const struct svm_problem *prob, const struct svm_parameter *param); int svm_check_probability_model(const struct svm_model *model); #ifdef __cplusplus } #endif #endif /* _LIBSVM_H */ Algorithm-SVM-0.13/test.pl0000644000077436411310000001140310362420353015160 0ustar lairdmwg-users# Before `make install' is performed this script should be runnable with # `make test'. After `make install' it should work as `perl test.pl' ######################### # change 'tests => 1' to 'tests => last_test_to_print'; use Test; BEGIN { plan tests => 1 }; use Algorithm::SVM::DataSet; use Algorithm::SVM; ok(1); # If we made it this far, we're ok. ######################### # Insert your test code below, the Test module is use()ed here so read # its man page ( perldoc Test ) for help writing this test script. 
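# Overview (added comment): the tests below walk through the full
# Algorithm::SVM workflow: load the bundled sample.model, build three
# DataSet objects, check predictions, save and reload the model, query
# getNRClass()/getLabels(), train and retrain on the same data, and finally
# exercise attribute updates and asArray().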
print("Creating new Algorithm::SVM\n"); my $svm = new Algorithm::SVM(Model => 'sample.model'); ok(ref($svm) ne "", 1); print("Creating new Algorithm::SVM::DataSet objects\n"); my $ds1 = new Algorithm::SVM::DataSet(Label => 1); my $ds2 = new Algorithm::SVM::DataSet(Label => 2); my $ds3 = new Algorithm::SVM::DataSet(Label => 3); ok(ref($ds1) ne "", 1); ok(ref($ds2) ne "", 1); ok(ref($ds3) ne "", 1); print("Adding attributes to Algorithm::SVM::DataSet objects\n"); my @d1 = (0.0424107142857143, 0.0915178571428571, 0.0401785714285714, 0.0156250000000000, 0.0156250000000000, 0.0223214285714286, 0.0223214285714286, 0.0825892857142857, 0.1205357142857140, 0.0736607142857143, 0.0535714285714286, 0.0535714285714286, 0.0178571428571429, 0.0357142857142857, 0.1116071428571430, 0.0334821428571429, 0.0223214285714286, 0.0602678571428571, 0.0200892857142857, 0.0647321428571429); my @d2 = (0.0673076923076923, 0.11538461538461500, 0.0480769230769231, 0.0480769230769231, 0.00961538461538462, 0.0192307692307692, 0.0000000000000000, 0.08653846153846150, 0.1634615384615380, 0.0865384615384615, 0.03846153846153850, 0.0288461538461538, 0.0192307692307692, 0.01923076923076920, 0.0000000000000000, 0.0961538461538462, 0.02884615384615380, 0.0673076923076923, 0.0288461538461538, 0.02884615384615380); my @d3 = (0.0756756756756757, 0.0594594594594595, 0.0378378378378378, 0.0216216216216216, 0.0432432432432432, 0.0000000000000000, 0.0162162162162162, 0.0648648648648649, 0.1729729729729730, 0.0432432432432432, 0.0864864864864865, 0.1297297297297300, 0.0108108108108108, 0.0108108108108108, 0.0162162162162162, 0.0486486486486487, 0.0324324324324324, 0.0216216216216216, 0.0594594594594595, 0.0486486486486487); $ds1->attribute($_, $d1[$_ - 1]) for(1..scalar(@d1)); $ds2->attribute($_, $d2[$_ - 1]) for(1..scalar(@d2)); $ds3->attribute($_, $d3[$_ - 1]) for(1..scalar(@d3)); ok(1); print("Checking predictions on loaded model\n"); ok($svm->predict($ds1) == 10,1); ok($svm->predict($ds2) == 0,1); ok($svm->predict($ds3) == -10,1); print("Saving model\n"); ok($svm->save('sample.model.1'), 1); print("Loading saved model\n"); ok($svm->load('sample.model.1'), 1); print("Checking NRClass\n"); ok($svm->getNRClass(), 3); print("Checking model labels\n"); ok($svm->getLabels(), (10, 0, -10)); my $cnt=0; for (my $i=1; $i<=@d1; $i++) { if ($ds1->attribute($i) == $d1[$i-1]) { $cnt++; } } ok($cnt,20); print("Checking train\n"); my @tset=($ds1,$ds2,$ds3); ok($svm->train(@tset)); $cnt=0; for (my $i=1; $i<=@d1; $i++) { if ($ds1->attribute($i) == $d1[$i-1]) { $cnt++; } } ok($cnt,20); print("Checking retrain\n"); my $p1 = $svm->predict($ds1); my $p2 = $svm->predict($ds2); my $p3 = $svm->predict($ds3); ok($svm->retrain()); ok($svm->predict($ds1),$p1); ok($svm->predict($ds2),$p2); ok($svm->predict($ds3),$p3); print("Checking retrain after DataSet changes\n"); # this tests whether reallocating memory after realign # works ok. 
$ds1->attribute(2,$ds1->attribute(2)); $ds2->attribute(2,$ds2->attribute(2)); $ds3->attribute(2,$ds3->attribute(2)); ok($svm->retrain()); ok($svm->predict($ds1),$p1); ok($svm->predict($ds2),$p2); ok($svm->predict($ds3),$p3); print("Checking svm destructor\n"); $svm=undef; # destroy svm object (test destructor) ok(1); print("Checking attribute value changes\n"); $ds1->attribute($_, 1) for(1..scalar(@d1)); $cnt=0; for ($i=1;$i<=scalar(@d1);$i++) { if ($ds1->attribute($i)==1) { $cnt++; } else { print $ds1->attribute($i),"::\n"; } } ok($cnt,20); $ds2->attribute(3, -1.5); $ds2->attribute(5, -1.5); $ds2->attribute(4, -1.5); $ds2->attribute(2, -1.5); $ds2->attribute(1, -1.5); $cnt=0; for ($i=1;$i<=5;$i++) { if ($ds2->attribute($i)==-1.5) { $cnt++; } } for ($i=6;$i<=scalar(@d2);$i++) { if ($ds2->attribute($i)==$d2[$i-1]) { $cnt++; } } ok($cnt,20); $ds3->attribute($_, 0) for(1..scalar(@d3)); $cnt=0; for ($i=1;$i<=scalar(@d3);$i++) { if ($ds3->attribute($i)==0) { $cnt++; } } ok($cnt,20); print("Checking asArray\n"); my @x = $ds2->asArray(); # note that this takes attr. 0 as first value, which has never # been set and thus is equal to zero $cnt=0; if ($x[0]==0.0) { $cnt++; } for ($i=1;$i<=5;$i++) { if ($x[$i]==-1.5) { $cnt++; } } for ($i=6;$i<=scalar(@d2);$i++) { if ($x[$i]==$d2[$i-1]) { $cnt++; } } ok($cnt,21); Algorithm-SVM-0.13/typemap0000644000077436411310000000133710362151677015265 0ustar lairdmwg-usersTYPEMAP SVM * T_SVM DataSet* T_DATASET OUTPUT T_SVM if( $var == NULL ) XSRETURN_UNDEF; sv_setref_pv( $arg, "Algorithm::SVM", (void*)$var ); T_DATASET if( $var == NULL ) XSRETURN_UNDEF; sv_setref_pv( $arg, "Algorithm::SVM::DataSet", (void*)$var ); INPUT T_SVM if( sv_isobject($arg) && sv_isa($arg, \"Algorithm::SVM\") ) { $var = ($type)SvIV((SV*)SvRV( $arg )); } else { warn( \"${Package}::$func_name() -- $var is not an Algorithm::SVM object\" ); XSRETURN_UNDEF; } T_DATASET if( sv_isobject($arg) && sv_isa($arg, \"Algorithm::SVM::DataSet\") ) { $var = ($type)SvIV((SV*)SvRV( $arg )); } else { warn( \"${Package}::$func_name() -- $var is not an Algorithm::SVM::DataSet object\" ); XSRETURN_UNDEF; } Algorithm-SVM-0.13/sample.model0000644000077436411310000017013210353070115016151 0ustar lairdmwg-userssvm_type c_svc kernel_type rbf gamma 64 nr_class 3 total_sv 199 rho 0.439702 0.336872 0.444534 label 10 0 -10 nr_sv 87 72 40 SV 7.711958119123536 7.365495639545627 1:0.042410714 2:0.091517857 3:0.040178571 4:0.015625 5:0.015625 6:0.022321429 7:0.022321429 8:0.082589286 9:0.12053571 10:0.073660714 11:0.053571429 12:0.053571429 13:0.017857143 14:0.035714286 15:0.11160714 16:0.033482143 17:0.022321429 18:0.060267857 19:0.020089286 20:0.064732143 0.9576898148768235 0.02039180431218783 1:0.11340206 2:0.10309278 3:0.06185567 4:0.030927835 5:0.010309278 6:0.010309278 7:0 8:0.06185567 9:0.15463918 10:0.020618557 11:0.041237113 12:0.030927835 13:0.010309278 14:0 15:0.051546392 16:0.041237113 17:0.13402062 18:0.06185567 19:0.010309278 20:0.051546392 0 0.8314817940045978 1:0.11309524 2:0.071428571 3:0.11309524 4:0.011904762 5:0.047619048 6:0 7:0 8:0.095238095 9:0.041666667 10:0.047619048 11:0.05952381 12:0.05952381 13:0 14:0.011904762 15:0.017857143 16:0.10119048 17:0.005952381 18:0.10119048 19:0.023809524 20:0.077380952 1.22386885617493 0 1:0.080229226 2:0.091690544 3:0.068767908 4:0.06017192 5:0.020057307 6:0.0028653295 7:0.011461318 8:0.077363897 9:0.11461318 10:0.06017192 11:0.068767908 12:0.042979943 13:0.005730659 14:0.028653295 15:0.06017192 16:0.045845272 17:0.034383954 18:0.054441261 19:0.025787966 20:0.045845272 
8 8 1:0.054711246 2:0.060790274 3:0.036474164 4:0.03343465 5:0.054711246 6:0.021276596 7:0.051671733 8:0.10334347 9:0.079027356 10:0.05775076 11:0.048632219 12:0.051671733 13:0.012158055 14:0.030395137 15:0.045592705 16:0.063829787 17:0.027355623 18:0.060790274 19:0.042553191 20:0.063829787 8 8 1:0.059907834 2:0.076036866 3:0.046082949 4:0.023041475 5:0.043778802 6:0.01843318 7:0.039170507 8:0.080645161 9:0.12903226 10:0.039170507 11:0.057603687 12:0.062211982 13:0.011520737 14:0.01843318 15:0.036866359 16:0.048387097 17:0.066820276 18:0.062211982 19:0.02764977 20:0.052995392 0 0.9048219239798362 1:0.13402062 2:0.072164948 3:0.10309278 4:0.020618557 5:0.010309278 6:0 7:0.010309278 8:0.10309278 9:0.072164948 10:0.020618557 11:0.06185567 12:0.030927835 13:0 14:0.010309278 15:0.051546392 16:0.082474227 17:0 18:0.10309278 19:0.051546392 20:0.06185567 1.401142770750075 0 1:0.078125 2:0.1171875 3:0.046875 4:0.046875 5:0.046875 6:0.0078125 7:0.015625 8:0.078125 9:0.125 10:0.0234375 11:0.03125 12:0.0390625 13:0 14:0 15:0.03125 16:0.0859375 17:0.015625 18:0.0859375 19:0.0625 20:0.0625 0.3876175023992005 1.828284414673054 1:0.069053708 2:0.066496164 3:0.061381074 4:0.040920716 5:0.025575448 6:0.012787724 7:0.0076726343 8:0.12531969 9:0.16368286 10:0.035805627 11:0.051150895 12:0.046035806 13:0.012787724 14:0.015345269 15:0.046035806 16:0.046035806 17:0.035805627 18:0.051150895 19:0.035805627 20:0.051150895 1.188129274745781 5.414923507660803 1:0.10837438 2:0.044334975 3:0.083743842 4:0.034482759 5:0.039408867 6:0.0098522167 7:0.044334975 8:0.078817734 9:0.039408867 10:0.0591133 11:0.0591133 12:0.0591133 13:0.0098522167 14:0.039408867 15:0.049261084 16:0.049261084 17:0 18:0.068965517 19:0.078817734 20:0.044334975 1.807209070689883 0 1:0.04516129 2:0.10645161 3:0.058064516 4:0.025806452 5:0.05483871 6:0.022580645 7:0.035483871 8:0.064516129 9:0.058064516 10:0.064516129 11:0.074193548 12:0.032258065 13:0.012903226 14:0.025806452 15:0.093548387 16:0.025806452 17:0.048387097 18:0.061290323 19:0.038709677 20:0.051612903 8 0 1:0.056653491 2:0.08168643 3:0.057971014 4:0.040843215 5:0.039525692 6:0.0092226614 7:0.044795784 8:0.075098814 9:0.084321476 10:0.039525692 11:0.04743083 12:0.063241107 13:0.0065876153 14:0.018445323 15:0.050065876 16:0.065876153 17:0.039525692 18:0.063241107 19:0.046113307 20:0.069828722 0 6.431874337263299 1:0.13061224 2:0.10204082 3:0.057142857 4:0.0040816327 5:0.016326531 6:0.012244898 7:0.024489796 8:0.11428571 9:0.097959184 10:0.036734694 11:0.040816327 12:0.044897959 13:0.012244898 14:0.016326531 15:0.057142857 16:0.040816327 17:0.057142857 18:0.044897959 19:0.020408163 20:0.069387755 2.242011635231154 0 1:0.048582996 2:0.12955466 3:0.048582996 4:0.032388664 5:0.020242915 6:0.032388664 7:0.016194332 8:0.085020243 9:0.10121457 10:0.052631579 11:0.04048583 12:0.044534413 13:0.016194332 14:0.024291498 15:0.060728745 16:0.052631579 17:0.028340081 18:0.064777328 19:0.04048583 20:0.060728745 2.09133957835712 0 1:0.082236842 2:0.098684211 3:0.055921053 4:0.0065789474 5:0.039473684 6:0.023026316 7:0.016447368 8:0.072368421 9:0.11842105 10:0.075657895 11:0.055921053 12:0.032894737 13:0.023026316 14:0.023026316 15:0.049342105 16:0.032894737 17:0.0625 18:0.049342105 19:0.023026316 20:0.059210526 7.760173779100334 0 1:0.048275862 2:0.12068966 3:0.048275862 4:0.034482759 5:0.044827586 6:0.017241379 7:0.024137931 8:0.05862069 9:0.089655172 10:0.055172414 11:0.086206897 12:0.068965517 13:0.013793103 14:0.034482759 15:0.068965517 16:0.024137931 17:0.048275862 18:0.044827586 19:0.024137931 
20:0.044827586 4.787234112788998 0 1:0.058823529 2:0.10160428 3:0.050802139 4:0.018716578 5:0.032085561 6:0.053475936 7:0.018716578 8:0.07486631 9:0.082887701 10:0.069518717 11:0.058823529 12:0.042780749 13:0.013368984 14:0.029411765 15:0.06684492 16:0.029411765 17:0.069518717 18:0.061497326 19:0.018716578 20:0.048128342 1.381004952256783 0 1:0.077803204 2:0.12814645 3:0.032036613 4:0.027459954 5:0.029748284 6:0.0068649886 7:0.020594966 8:0.091533181 9:0.11899314 10:0.038901602 11:0.04805492 12:0.054919908 13:0.016018307 14:0.022883295 15:0.050343249 16:0.034324943 17:0.034324943 18:0.061784897 19:0.04805492 20:0.057208238 0 0.7294772331011425 1:0.073891626 2:0.13793103 3:0.054187192 4:0.0049261084 5:0.034482759 6:0.0098522167 7:0.044334975 8:0.044334975 9:0.064039409 10:0.0591133 11:0.073891626 12:0.049261084 13:0.014778325 14:0.034482759 15:0.034482759 16:0.073891626 17:0.039408867 18:0.049261084 19:0.034482759 20:0.068965517 8 2.463563922537672 1:0.057636888 2:0.10662824 3:0.031700288 4:0.031700288 5:0.023054755 6:0.031700288 7:0.04610951 8:0.069164265 9:0.097982709 10:0.054755043 11:0.050432277 12:0.047550432 13:0.015850144 14:0.024495677 15:0.057636888 16:0.051873199 17:0.050432277 18:0.05907781 19:0.030259366 20:0.061959654 2.622733230440654 0 1:0.081081081 2:0.081081081 3:0.064864865 4:0.027027027 5:0.016216216 6:0 7:0.010810811 8:0.032432432 9:0.1027027 10:0.043243243 11:0.07027027 12:0.048648649 13:0.0054054054 14:0 15:0.091891892 16:0.086486486 17:0.032432432 18:0.075675676 19:0.027027027 20:0.1027027 1.844389048445678 0.22770244517299 1:0.082352941 2:0.10588235 3:0.023529412 4:0.023529412 5:0.047058824 6:0 7:0 8:0.070588235 9:0.10588235 10:0.023529412 11:0.070588235 12:0.11764706 13:0 14:0.023529412 15:0.011764706 16:0.082352941 17:0.070588235 18:0.10588235 19:0.023529412 20:0.011764706 4.315625030679095 0 1:0.07712766 2:0.125 3:0.039893617 4:0.021276596 5:0.02393617 6:0.0053191489 7:0.0053191489 8:0.090425532 9:0.14361702 10:0.018617021 11:0.050531915 12:0.066489362 13:0 14:0.018617021 15:0.039893617 16:0.042553191 17:0.061170213 18:0.061170213 19:0.045212766 20:0.063829787 1.919751876365703 0 1:0.094339623 2:0.12264151 3:0.066037736 4:0.037735849 5:0.018867925 6:0 7:0.037735849 8:0.066037736 9:0.13207547 10:0.056603774 11:0.047169811 12:0.018867925 13:0.0094339623 14:0.0094339623 15:0.018867925 16:0.094339623 17:0.037735849 18:0.075471698 19:0.028301887 20:0.028301887 1.115630872619839 0 1:0.081871345 2:0.11695906 3:0.035087719 4:0.040935673 5:0.058479532 6:0.01754386 7:0.029239766 8:0.14035088 9:0.058479532 10:0.040935673 11:0.023391813 12:0.046783626 13:0.011695906 14:0.011695906 15:0.052631579 16:0.064327485 17:0.023391813 18:0.052631579 19:0.023391813 20:0.070175439 1.368730552986554 0 1:0.072649573 2:0.061965812 3:0.055555556 4:0.040598291 5:0.055555556 6:0.0042735043 7:0.034188034 8:0.072649573 9:0.10042735 10:0.051282051 11:0.068376068 12:0.044871795 13:0.0064102564 14:0.027777778 15:0.040598291 16:0.053418803 17:0.019230769 18:0.072649573 19:0.036324786 20:0.081196581 8 2.502004762108732 1:0.062809917 2:0.094214876 3:0.034710744 4:0.026446281 5:0.042975207 6:0.036363636 7:0.034710744 8:0.074380165 9:0.084297521 10:0.061157025 11:0.042975207 12:0.038016529 13:0.011570248 14:0.036363636 15:0.066115702 16:0.031404959 17:0.067768595 18:0.044628099 19:0.034710744 20:0.074380165 0.444740694146401 0 1:0.056603774 2:0.094339623 3:0.08490566 4:0.018867925 5:0.047169811 6:0.018867925 7:0.047169811 8:0.028301887 9:0.047169811 10:0.028301887 11:0.056603774 12:0.066037736 13:0 
14:0.037735849 15:0.12264151 16:0.037735849 17:0.08490566 18:0.028301887 19:0.0094339623 20:0.08490566 1.918995998595875 3.569199389027621 1:0.037617555 2:0.12225705 3:0.037617555 4:0.031347962 5:0.043887147 6:0.015673981 7:0.056426332 8:0.065830721 9:0.10031348 10:0.059561129 11:0.040752351 12:0.059561129 13:0.02507837 14:0.018808777 15:0.056426332 16:0.02507837 17:0.059561129 18:0.053291536 19:0.02507837 20:0.065830721 0 0.3172007235944944 1:0.068181818 2:0.11363636 3:0.015151515 4:0.0075757576 5:0.075757576 6:0.03030303 7:0.022727273 8:0.045454545 9:0.053030303 10:0.060606061 11:0.037878788 12:0.060606061 13:0.022727273 14:0.022727273 15:0.068181818 16:0.083333333 17:0.015151515 18:0.11363636 19:0.022727273 20:0.060606061 0.9003790663523249 1.874660213433047 1:0.036585366 2:0.19512195 3:0.036585366 4:0.012195122 5:0.030487805 6:0.024390244 7:0.024390244 8:0.067073171 9:0.048780488 10:0.079268293 11:0.067073171 12:0.067073171 13:0.012195122 14:0.006097561 15:0.085365854 16:0.042682927 17:0.036585366 18:0.085365854 19:0.006097561 20:0.036585366 5.767297152102819 0 1:0.058823529 2:0.10441176 3:0.029411765 4:0.026470588 5:0.052941176 6:0.022058824 7:0.035294118 8:0.058823529 9:0.11764706 10:0.041176471 11:0.045588235 12:0.047058824 13:0.0058823529 14:0.027941176 15:0.072058824 16:0.025 17:0.060294118 18:0.070588235 19:0.035294118 20:0.063235294 8 0 1:0.06147541 2:0.094262295 3:0.049180328 4:0.030737705 5:0.032786885 6:0.0040983607 7:0.014344262 8:0.079918033 9:0.12704918 10:0.047131148 11:0.06147541 12:0.067622951 13:0.0081967213 14:0.026639344 15:0.069672131 16:0.055327869 17:0.032786885 18:0.055327869 19:0.043032787 20:0.038934426 0 4.453396603315925 1:0.050816697 2:0.08892922 3:0.038112523 4:0.029038113 5:0.043557169 6:0.041742287 7:0.043557169 8:0.070780399 9:0.070780399 10:0.052631579 11:0.047186933 12:0.052631579 13:0.0090744102 14:0.034482759 15:0.061705989 16:0.032667877 17:0.052631579 18:0.059891107 19:0.052631579 20:0.067150635 0.5869925052851459 3.285963127756351 1:0.054347826 2:0.09057971 3:0.10869565 4:0.032608696 5:0.06884058 6:0.014492754 7:0.047101449 8:0.036231884 9:0.028985507 10:0.0036231884 11:0.12681159 12:0.039855072 13:0.014492754 14:0.0036231884 15:0.043478261 16:0.086956522 17:0.014492754 18:0.057971014 19:0.072463768 20:0.054347826 0.4124593304177421 0 1:0.070422535 2:0.056338028 3:0.0657277 4:0.046948357 5:0.042253521 6:0.0093896714 7:0.014084507 8:0.079812207 9:0.10798122 10:0.018779343 11:0.018779343 12:0.061032864 13:0.0093896714 14:0.03286385 15:0.056338028 16:0.098591549 17:0.023474178 18:0.10798122 19:0.018779343 20:0.061032864 0.909956697515001 0.3428599350149554 1:0.096774194 2:0.08797654 3:0.046920821 4:0.032258065 5:0.049853372 6:0.011730205 7:0.017595308 8:0.061583578 9:0.12903226 10:0.064516129 11:0.017595308 12:0.038123167 13:0 14:0.023460411 15:0.13196481 16:0.0058651026 17:0.014662757 18:0.090909091 19:0.017595308 20:0.061583578 0.5087018076244285 0.9252971932269028 1:0.12755102 2:0.071428571 3:0.015306122 4:0.020408163 5:0.020408163 6:0 7:0.020408163 8:0.14795918 9:0.071428571 10:0.025510204 11:0.025510204 12:0.025510204 13:0.030612245 14:0.10204082 15:0.020408163 16:0.030612245 17:0.030612245 18:0.086734694 19:0.030612245 20:0.096938776 0.4977585030127463 0 1:0.04691358 2:0.081481481 3:0.071604938 4:0.041975309 5:0.041975309 6:0.017283951 7:0.041975309 8:0.088888889 9:0.091358025 10:0.059259259 11:0.034567901 12:0.054320988 13:0.017283951 14:0.022222222 15:0.037037037 16:0.049382716 17:0.039506173 18:0.079012346 19:0.039506173 20:0.044444444 8 
6.804891923489974 1:0.079545455 2:0.095454545 3:0.056818182 4:0.018181818 5:0.038636364 6:0.013636364 7:0.034090909 8:0.068181818 9:0.090909091 10:0.059090909 11:0.084090909 12:0.052272727 13:0.013636364 14:0.015909091 15:0.070454545 16:0.045454545 17:0.031818182 18:0.054545455 19:0.029545455 20:0.047727273 0 2.419125707605434 1:0.09469697 2:0.056818182 3:0.11363636 4:0.022727273 5:0.037878788 6:0 7:0.034090909 8:0.11742424 9:0.068181818 10:0.026515152 11:0.075757576 12:0.03030303 13:0.011363636 14:0.022727273 15:0.034090909 16:0.053030303 17:0.03030303 18:0.060606061 19:0.083333333 20:0.026515152 3.236273204294736 3.115870946018938 1:0.11464968 2:0.057324841 3:0.031847134 4:0.031847134 5:0.044585987 6:0.031847134 7:0.063694268 8:0.063694268 9:0.063694268 10:0.063694268 11:0.025477707 12:0.050955414 13:0.012738854 14:0.01910828 15:0.01910828 16:0.063694268 17:0.044585987 18:0.095541401 19:0.044585987 20:0.057324841 0.9603188152753798 1.36269535657466 1:0.076502732 2:0.076502732 3:0.071038251 4:0.038251366 5:0.027322404 6:0 7:0.016393443 8:0.071038251 9:0.054644809 10:0.054644809 11:0.027322404 12:0.027322404 13:0.010928962 14:0.0054644809 15:0.076502732 16:0.10382514 17:0.06010929 18:0.10928962 19:0.032786885 20:0.06010929 8 2.305676669762596 1:0.088095238 2:0.071428571 3:0.054761905 4:0.023809524 5:0.028571429 6:0.0071428571 7:0.05 8:0.09047619 9:0.10952381 10:0.042857143 11:0.047619048 12:0.038095238 13:0.014285714 14:0.030952381 15:0.030952381 16:0.071428571 17:0.038095238 18:0.061904762 19:0.047619048 20:0.052380952 1.220179697991229 1.07014928758646 1:0.069868996 2:0.12227074 3:0.061135371 4:0.013100437 5:0.034934498 6:0.021834061 7:0.0087336245 8:0.096069869 9:0.10917031 10:0.065502183 11:0.026200873 12:0.065502183 13:0.0043668122 14:0.021834061 15:0.052401747 16:0.030567686 17:0.030567686 18:0.074235808 19:0.056768559 20:0.034934498 2.277354265541319 0 1:0.11585366 2:0.091463415 3:0.067073171 4:0.024390244 5:0.030487805 6:0.012195122 7:0.018292683 8:0.042682927 9:0.079268293 10:0.042682927 11:0.030487805 12:0.06097561 13:0.012195122 14:0.018292683 15:0.06097561 16:0.054878049 17:0.067073171 18:0.036585366 19:0.054878049 20:0.079268293 8 4.382176816175112 1:0.071428571 2:0.071428571 3:0.066502463 4:0.024630542 5:0.027093596 6:0.0024630542 7:0.022167488 8:0.071428571 9:0.16256158 10:0.036945813 11:0.044334975 12:0.083743842 13:0.014778325 14:0.019704433 15:0.041871921 16:0.04679803 17:0.041871921 18:0.044334975 19:0.036945813 20:0.068965517 3.360535755615756 0 1:0.066193853 2:0.10874704 3:0.030732861 4:0.01891253 5:0.047281324 6:0.0094562648 7:0.028368794 8:0.061465721 9:0.082742317 10:0.04964539 11:0.044917258 12:0.035460993 13:0.011820331 14:0.023640662 15:0.066193853 16:0.056737589 17:0.073286052 18:0.087470449 19:0.042553191 20:0.054373522 0.5677176650782586 0 1:0.078313253 2:0.13253012 3:0.060240964 4:0.024096386 5:0.024096386 6:0.012048193 7:0.024096386 8:0.036144578 9:0.060240964 10:0.036144578 11:0.072289157 12:0.036144578 13:0 14:0.018072289 15:0.12048193 16:0.030120482 17:0.030120482 18:0.090361446 19:0.054216867 20:0.060240964 1.642300475912365 0 1:0.053719008 2:0.064049587 3:0.041322314 4:0.02892562 5:0.066115702 6:0.01446281 7:0.037190083 8:0.070247934 9:0.07231405 10:0.059917355 11:0.049586777 12:0.041322314 13:0.0041322314 14:0.039256198 15:0.066115702 16:0.059917355 17:0.037190083 18:0.070247934 19:0.049586777 20:0.074380165 0 1.803031079534803 1:0.047945205 2:0.12328767 3:0.089041096 4:0.01369863 5:0.054794521 6:0.01369863 7:0.047945205 8:0.075342466 9:0.054794521 
10:0.02739726 11:0.04109589 12:0.034246575 13:0.02739726 14:0.0068493151 15:0.04109589 16:0.061643836 17:0.061643836 18:0.089041096 19:0.068493151 20:0.020547945 0 0.4172196602537238 1:0.077005348 2:0.075935829 3:0.065240642 4:0.011764706 5:0.027807487 6:0.0042780749 7:0.018181818 8:0.04171123 9:0.068449198 10:0.054545455 11:0.073796791 12:0.045989305 13:0.0042780749 14:0.022459893 15:0.097326203 16:0.056684492 17:0.044919786 18:0.10909091 19:0.052406417 20:0.048128342 2.463005397571117 0.5766041171231547 1:0.067307692 2:0.067307692 3:0.057692308 4:0.019230769 5:0.057692308 6:0.019230769 7:0.048076923 8:0.096153846 9:0.038461538 10:0.067307692 11:0.057692308 12:0.028846154 13:0.028846154 14:0.0096153846 15:0.019230769 16:0.11538462 17:0.019230769 18:0.086538462 19:0.019230769 20:0.076923077 0 0.2707682096198543 1:0.096069869 2:0.087336245 3:0.048034934 4:0.052401747 5:0.03930131 6:0.013100437 7:0.021834061 8:0.065502183 9:0.043668122 10:0.061135371 11:0.03930131 12:0.056768559 13:0.0043668122 14:0.030567686 15:0.096069869 16:0.030567686 17:0.030567686 18:0.10480349 19:0.021834061 20:0.056768559 0.4980663106856051 0 1:0.0859375 2:0.0859375 3:0.05078125 4:0.02734375 5:0.0390625 6:0.015625 7:0.01171875 8:0.109375 9:0.12109375 10:0.03125 11:0.06640625 12:0.0546875 13:0.0078125 14:0.01171875 15:0.06640625 16:0.03125 17:0.03125 18:0.0546875 19:0.0390625 20:0.05859375 8 7.086863049516101 1:0.093939394 2:0.078787879 3:0.057575758 4:0.027272727 5:0.024242424 6:0.003030303 7:0.021212121 8:0.075757576 9:0.12121212 10:0.036363636 11:0.078787879 12:0.084848485 13:0 14:0.015151515 15:0.027272727 16:0.072727273 17:0.033333333 18:0.057575758 19:0.051515152 20:0.039393939 0.16978315166816 0.6262833546527321 1:0.085714286 2:0.1047619 3:0.057142857 4:0.028571429 5:0.038095238 6:0.019047619 7:0.023809524 8:0.1047619 9:0.095238095 10:0.071428571 11:0.057142857 12:0.033333333 13:0 14:0.023809524 15:0.09047619 16:0.014285714 17:0.028571429 18:0.066666667 19:0.019047619 20:0.038095238 0 0.6225616587846976 1:0.047904192 2:0.10179641 3:0.077844311 4:0.023952096 5:0.047904192 6:0.011976048 7:0.041916168 8:0.041916168 9:0.05988024 10:0.011976048 11:0.083832335 12:0.023952096 13:0.005988024 14:0.05988024 15:0.005988024 16:0.083832335 17:0.047904192 18:0.10179641 19:0.071856287 20:0.047904192 3.539512469518375 0 1:0.084985836 2:0.067988669 3:0.056657224 4:0.036827195 5:0.050991501 6:0.0084985836 7:0.033994334 8:0.076487252 9:0.09631728 10:0.050991501 11:0.042492918 12:0.045325779 13:0.016997167 14:0.011331445 15:0.042492918 16:0.07082153 17:0.031161473 18:0.076487252 19:0.033994334 20:0.065155807 5.053320086622517 0 1:0.060526316 2:0.12631579 3:0.026315789 4:0.023684211 5:0.034210526 6:0.0026315789 7:0.026315789 8:0.076315789 9:0.086842105 10:0.084210526 11:0.068421053 12:0.039473684 13:0.0026315789 14:0.028947368 15:0.081578947 16:0.023684211 17:0.021052632 18:0.086842105 19:0.010526316 20:0.089473684 4.680405584726584 3.1607782751567 1:0.098671727 2:0.055028463 3:0.043643264 4:0.019924099 5:0.018026565 6:0.0066413662 7:0.009487666 8:0.090132827 9:0.13282732 10:0.068311195 11:0.065464896 12:0.05597723 13:0.0056925996 14:0.016129032 15:0.067362429 16:0.068311195 17:0.033206831 18:0.072106262 19:0.022770398 20:0.05028463 0.9161839814504772 0 1:0.067605634 2:0.095774648 3:0.076056338 4:0.025352113 5:0.036619718 6:0.0056338028 7:0.014084507 8:0.047887324 9:0.10704225 10:0.028169014 11:0.03943662 12:0.033802817 13:0.0084507042 14:0.022535211 15:0.030985915 16:0.12394366 17:0.028169014 18:0.098591549 19:0.042253521 
20:0.067605634 0 3.413622919574543 1:0.057591623 2:0.12958115 3:0.083769634 4:0.0065445026 5:0.039267016 6:0.0013089005 7:0.02486911 8:0.045811518 9:0.060209424 10:0.041884817 11:0.082460733 12:0.044502618 13:0.018324607 14:0.028795812 15:0.023560209 16:0.090314136 17:0.030104712 18:0.078534031 19:0.069371728 20:0.043193717 0 1.623216404693624 1:0.080519481 2:0.07012987 3:0.064935065 4:0.015584416 5:0.041558442 6:0.0025974026 7:0.044155844 8:0.080519481 9:0.077922078 10:0.041558442 11:0.072727273 12:0.057142857 13:0.015584416 14:0.023376623 15:0.033766234 16:0.083116883 17:0.033766234 18:0.072727273 19:0.031168831 20:0.057142857 0 3.064899378088907 1:0.072186837 2:0.12951168 3:0.033970276 4:0.014861996 5:0.027600849 6:0.0042462845 7:0.016985138 8:0.10615711 9:0.17834395 10:0.053078556 11:0.042462845 12:0.038216561 13:0.0042462845 14:0.033970276 15:0.089171975 16:0.014861996 17:0.031847134 18:0.048832272 19:0.016985138 20:0.042462845 0.07321979021271291 0 1:0.098404255 2:0.058510638 3:0.07712766 4:0.037234043 5:0.037234043 6:0.010638298 7:0.015957447 8:0.12765957 9:0.079787234 10:0.053191489 11:0.037234043 12:0.066489362 13:0.026595745 14:0.02393617 15:0.037234043 16:0.04787234 17:0.029255319 18:0.058510638 19:0.029255319 20:0.04787234 1.414559832063233 1.713459987670003 1:0.072463768 2:0.028985507 3:0.057971014 4:0.028985507 5:0.10144928 6:0.014492754 7:0 8:0.11594203 9:0.043478261 10:0.014492754 11:0.072463768 12:0.057971014 13:0 14:0.014492754 15:0.028985507 16:0.086956522 17:0.11594203 18:0.057971014 19:0.028985507 20:0.057971014 0 3.61484382511828 1:0.047021944 2:0.094043887 3:0.10658307 4:0.018808777 5:0.040752351 6:0.0031347962 7:0.043887147 8:0.084639498 9:0.059561129 10:0.02507837 11:0.053291536 12:0.059561129 13:0.012539185 14:0.02507837 15:0.02507837 16:0.11912226 17:0.012539185 18:0.05015674 19:0.081504702 20:0.037617555 2.538096484875596 8 1:0.060240964 2:0.10240964 3:0.048192771 4:0.022088353 5:0.032128514 6:0.012048193 7:0.016064257 8:0.088353414 9:0.13453815 10:0.06626506 11:0.042168675 12:0.072289157 13:0.010040161 14:0.034136546 15:0.080321285 16:0.0080321285 17:0.03815261 18:0.044176707 19:0.014056225 20:0.074297189 2.939830810426161 0.750245447458261 1:0.050373134 2:0.11380597 3:0.029850746 4:0.018656716 5:0.039179104 6:0.024253731 7:0.039179104 8:0.089552239 9:0.10820896 10:0.067164179 11:0.044776119 12:0.050373134 13:0.0018656716 14:0.027985075 15:0.080223881 16:0.029850746 17:0.024253731 18:0.083955224 19:0.020522388 20:0.055970149 8 1.923322991472697 1:0.081632653 2:0.10714286 3:0.051020408 4:0.015306122 5:0.038265306 6:0.017857143 7:0.025510204 8:0.10714286 9:0.068877551 10:0.058673469 11:0.06122449 12:0.033163265 13:0.0025510204 14:0.025510204 15:0.084183673 16:0.025510204 17:0.058673469 18:0.033163265 19:0.015306122 20:0.089285714 4.148494620480828 0 1:0.06557377 2:0.15409836 3:0.029508197 4:0.013114754 5:0.029508197 6:0.0032786885 7:0.016393443 8:0.06557377 9:0.095081967 10:0.075409836 11:0.045901639 12:0.039344262 13:0.019672131 14:0.036065574 15:0.1147541 16:0.029508197 17:0.036065574 18:0.049180328 19:0.013114754 20:0.068852459 1.970909504420514 1.469543137993742 1:0.079625293 2:0.086651054 3:0.042154567 4:0.039812646 5:0.056206089 6:0.0023419204 7:0.028103044 8:0.11943794 9:0.10772834 10:0.053864169 11:0.058548009 12:0.056206089 13:0.011709602 14:0.028103044 15:0.044496487 16:0.035128806 17:0.021077283 18:0.056206089 19:0.021077283 20:0.051522248 2.343027791980051 0 1:0.060810811 2:0.14864865 3:0.023648649 4:0.02027027 5:0.050675676 6:0.02027027 7:0.013513514 
8:0.070945946 9:0.14189189 10:0.047297297 11:0.057432432 12:0.054054054 13:0.013513514 14:0.037162162 15:0.091216216 16:0.0067567568 17:0.030405405 18:0.067567568 19:0.010135135 20:0.033783784 0 0.7046264071953378 1:0.12686567 2:0.097014925 3:0.029850746 4:0.044776119 5:0.037313433 6:0 7:0.02238806 8:0.067164179 9:0.082089552 10:0.0074626866 11:0.029850746 12:0.029850746 13:0.0074626866 14:0.052238806 15:0.067164179 16:0.067164179 17:0.029850746 18:0.1119403 19:0.02238806 20:0.067164179 0 1.115535580119849 1:0.082969432 2:0.10917031 3:0.069868996 4:0.030567686 5:0.017467249 6:0.0087336245 7:0.030567686 8:0.082969432 9:0.069868996 10:0.048034934 11:0.026200873 12:0.065502183 13:0.0043668122 14:0.017467249 15:0.082969432 16:0.043668122 17:0.034934498 18:0.091703057 19:0.017467249 20:0.065502183 7.142210537651729 1.47650739255786 1:0.10731707 2:0.10731707 3:0.043902439 4:0.034146341 5:0.03902439 6:0.0048780488 7:0.0048780488 8:0.087804878 9:0.16585366 10:0.073170732 11:0.03902439 12:0.03902439 13:0.019512195 14:0.0048780488 15:0.029268293 16:0.063414634 17:0.019512195 18:0.048780488 19:0.024390244 20:0.043902439 5.084297507612087 0 1:0.062686567 2:0.08358209 3:0.028358209 4:0.019402985 5:0.020895522 6:0.028358209 7:0.02238806 8:0.074626866 9:0.1358209 10:0.03880597 11:0.07761194 12:0.065671642 13:0.013432836 14:0.037313433 15:0.06119403 16:0.028358209 17:0.026865672 18:0.071641791 19:0.023880597 20:0.079104478 8 0 1:0.071691176 2:0.10294118 3:0.055147059 4:0.018382353 5:0.036764706 6:0.0091911765 7:0.033088235 8:0.091911765 9:0.11213235 10:0.033088235 11:0.047794118 12:0.064338235 13:0.020220588 14:0.023897059 15:0.025735294 16:0.075367647 17:0.023897059 18:0.049632353 19:0.040441176 20:0.064338235 8 3.501133365624939 1:0.051546392 2:0.10309278 3:0.056701031 4:0.036082474 5:0.054123711 6:0.012886598 7:0.015463918 8:0.072164948 9:0.056701031 10:0.043814433 11:0.043814433 12:0.056701031 13:0.0051546392 14:0.015463918 15:0.036082474 16:0.072164948 17:0.048969072 18:0.067010309 19:0.085051546 20:0.067010309 1.042280192697824 6.599311555715316 1:0.084337349 2:0.087349398 3:0.090361446 4:0.018072289 5:0.03313253 6:0.0090361446 7:0.030120482 8:0.075301205 9:0.069277108 10:0.024096386 11:0.063253012 12:0.069277108 13:0.0090361446 14:0.012048193 15:0.030120482 16:0.10843373 17:0.012048193 18:0.048192771 19:0.063253012 20:0.063253012 1.40090583605685 0 1:0.065822785 2:0.13164557 3:0.027848101 4:0.02278481 5:0.035443038 6:0.015189873 7:0.025316456 8:0.086075949 9:0.14177215 10:0.043037975 11:0.06835443 12:0.048101266 13:0.010126582 14:0.037974684 15:0.060759494 16:0.017721519 17:0.040506329 18:0.043037975 19:0.020253165 20:0.058227848 0.2972838251979828 2.368139077529406 1:0.1147541 2:0.081967213 3:0.040983607 4:0.024590164 5:0.049180328 6:0.016393443 7:0.0081967213 8:0.1147541 9:0.10655738 10:0.040983607 11:0.032786885 12:0.073770492 13:0 14:0.016393443 15:0.073770492 16:0.040983607 17:0.016393443 18:0.06557377 19:0.049180328 20:0.032786885 0.1078766884786763 0 1:0.083700441 2:0.10572687 3:0.039647577 4:0.030837004 5:0.022026432 6:0.0088105727 7:0.017621145 8:0.057268722 9:0.083700441 10:0.017621145 11:0.066079295 12:0.030837004 13:0 14:0.030837004 15:0.074889868 16:0.035242291 17:0.092511013 18:0.088105727 19:0.026431718 20:0.088105727 3.638849737588541 0 1:0.095846645 2:0.092651757 3:0.067092652 4:0.022364217 5:0.03514377 6:0.0095846645 7:0.025559105 8:0.067092652 9:0.11821086 10:0.047923323 11:0.028753994 12:0.057507987 13:0.0063897764 14:0.012779553 15:0.038338658 16:0.067092652 17:0.038338658 
18:0.057507987 19:0.041533546 20:0.07028754 8 0 1:0.10447761 2:0.076492537 3:0.054104478 4:0.02238806 5:0.02238806 6:0.0018656716 7:0.0093283582 8:0.087686567 9:0.10820896 10:0.048507463 11:0.069029851 12:0.054104478 13:0 14:0.0074626866 15:0.046641791 16:0.054104478 17:0.065298507 18:0.072761194 19:0.037313433 20:0.057835821 4.063652998746698 0 1:0.09921671 2:0.075718016 3:0.057441253 4:0.041775457 5:0.033942559 6:0 7:0.0078328982 8:0.10966057 9:0.1227154 10:0.039164491 11:0.031331593 12:0.062663185 13:0 14:0.0078328982 15:0.046997389 16:0.044386423 17:0.036553525 18:0.067885117 19:0.041775457 20:0.07310705 -0.4367016930900829 0 1:0.067307692 2:0.11538462 3:0.048076923 4:0.048076923 5:0.0096153846 6:0.019230769 7:0 8:0.086538462 9:0.16346154 10:0.086538462 11:0.038461538 12:0.028846154 13:0.019230769 14:0.019230769 15:0 16:0.096153846 17:0.028846154 18:0.067307692 19:0.028846154 20:0.028846154 -0 0.4206934877608815 1:0.049180328 2:0.057377049 3:0.040983607 4:0.032786885 5:0.024590164 6:0.024590164 7:0.0081967213 8:0.073770492 9:0.20491803 10:0.057377049 11:0.049180328 12:0.049180328 13:0.032786885 14:0.0081967213 15:0.040983607 16:0.06557377 17:0.040983607 18:0.032786885 19:0.049180328 20:0.057377049 -0.9130831471015111 0 1:0.057803468 2:0.10982659 3:0.028901734 4:0.028901734 5:0.034682081 6:0.0057803468 7:0.011560694 8:0.13294798 9:0.10404624 10:0.052023121 11:0.040462428 12:0.069364162 13:0.011560694 14:0.063583815 15:0.011560694 16:0.069364162 17:0.034682081 18:0.028901734 19:0.046242775 20:0.057803468 -0 1.209413711785985 1:0.10632184 2:0.074712644 3:0.037356322 4:0.022988506 5:0.031609195 6:0.0028735632 7:0.034482759 8:0.086206897 9:0.1091954 10:0.037356322 11:0.048850575 12:0.10057471 13:0.0057471264 14:0.0086206897 15:0.022988506 16:0.068965517 17:0.037356322 18:0.020114943 19:0.066091954 20:0.077586207 -3.060966473203146 0 1:0.053846154 2:0.088461538 3:0.057692308 4:0.015384615 5:0.053846154 6:0.0038461538 7:0.030769231 8:0.080769231 9:0.11538462 10:0.034615385 11:0.057692308 12:0.053846154 13:0.0076923077 14:0.0038461538 15:0.034615385 16:0.096153846 17:0.046153846 18:0.065384615 19:0.030769231 20:0.069230769 -2.471114907185362 0.3729539677605425 1:0.10472973 2:0.094594595 3:0.047297297 4:0.023648649 5:0.023648649 6:0 7:0.010135135 8:0.081081081 9:0.15202703 10:0.030405405 11:0.037162162 12:0.050675676 13:0 14:0.010135135 15:0.02027027 16:0.094594595 17:0.057432432 18:0.033783784 19:0.054054054 20:0.074324324 -0 6.121243703135696 1:0.086614173 2:0.086614173 3:0.044619423 4:0.01312336 5:0.041994751 6:0.0052493438 7:0.041994751 8:0.091863517 9:0.062992126 10:0.047244094 11:0.062992126 12:0.073490814 13:0 14:0.015748031 15:0.023622047 16:0.070866142 17:0.062992126 18:0.026246719 19:0.089238845 20:0.052493438 -0 4.354142937599794 1:0.081031308 2:0.086556169 3:0.049723757 4:0.0073664825 5:0.02946593 6:0.023941068 7:0.051565378 8:0.042357274 9:0.08839779 10:0.057090239 11:0.057090239 12:0.060773481 13:0.0036832413 14:0.020257827 15:0.031307551 16:0.082872928 17:0.033149171 18:0.053406998 19:0.068139963 20:0.071823204 -8 0 1:0.072727273 2:0.08 3:0.050909091 4:0.029090909 5:0.034545455 6:0.012727273 7:0.038181818 8:0.090909091 9:0.087272727 10:0.047272727 11:0.038181818 12:0.052727273 13:0.0036363636 14:0.018181818 15:0.036363636 16:0.074545455 17:0.050909091 18:0.063636364 19:0.049090909 20:0.069090909 -7.499980569165563 0.9190188020359428 1:0.061452514 2:0.081005587 3:0.047486034 4:0.041899441 5:0.039106145 6:0.011173184 7:0.055865922 8:0.067039106 9:0.072625698 10:0.044692737 
11:0.033519553 12:0.053072626 13:0.0027932961 14:0.027932961 15:0.027932961 16:0.089385475 17:0.044692737 18:0.06424581 19:0.047486034 20:0.086592179 -1.710706180607596 0.3886185262012503 1:0.041284404 2:0.087155963 3:0.027522936 4:0.055045872 5:0.032110092 6:0.050458716 7:0.013761468 8:0.032110092 9:0.11009174 10:0.041284404 11:0.068807339 12:0.041284404 13:0.013761468 14:0.027522936 15:0.12844037 16:0.02293578 17:0.077981651 18:0.077981651 19:0.018348624 20:0.032110092 -1.21797913877259 1.610509838124154 1:0.072249589 2:0.075533662 3:0.031198686 4:0.070607553 5:0.032840722 6:0.019704433 7:0.02955665 8:0.10344828 9:0.072249589 10:0.055829228 11:0.062397373 12:0.060755337 13:0.0016420361 14:0.032840722 15:0.0591133 16:0.036124795 17:0.024630542 18:0.05090312 19:0.032840722 20:0.075533662 -8 4.364215761118517 1:0.079365079 2:0.079365079 3:0.050793651 4:0.025396825 5:0.041269841 6:0.015873016 7:0.025396825 8:0.092063492 9:0.088888889 10:0.066666667 11:0.050793651 12:0.044444444 13:0.041269841 14:0.038095238 15:0.028571429 16:0.053968254 17:0.0095238095 18:0.079365079 19:0.041269841 20:0.047619048 -0 0.7903144566295344 1:0.072580645 2:0.064516129 3:0.040322581 4:0.048387097 5:0.032258065 6:0.0080645161 7:0.048387097 8:0.064516129 9:0.12903226 10:0.040322581 11:0.032258065 12:0.040322581 13:0.0080645161 14:0.024193548 15:0.064516129 16:0.088709677 17:0.056451613 18:0.048387097 19:0.040322581 20:0.048387097 -1.337708316361119 0 1:0.054744526 2:0.10948905 3:0.047445255 4:0.025547445 5:0.032846715 6:0.02189781 7:0.0072992701 8:0.072992701 9:0.076642336 10:0.087591241 11:0.072992701 12:0.054744526 13:0.02189781 14:0.032846715 15:0.054744526 16:0.047445255 17:0.069343066 18:0.02919708 19:0.018248175 20:0.062043796 -8 1.747312334365741 1:0.081911263 2:0.12286689 3:0.034129693 4:0.030716724 5:0.023890785 6:0.020477816 7:0.0034129693 8:0.085324232 9:0.12627986 10:0.040955631 11:0.071672355 12:0.071672355 13:0.0068259386 14:0.013651877 15:0.071672355 16:0.020477816 17:0.010238908 18:0.078498294 19:0.023890785 20:0.061433447 -2.061188296452615 0 1:0.037735849 2:0.14150943 3:0.056603774 4:0.012578616 5:0.022012579 6:0.012578616 7:0.028301887 8:0.066037736 9:0.14465409 10:0.047169811 11:0.062893082 12:0.062893082 13:0 14:0.01572327 15:0.040880503 16:0.040880503 17:0.081761006 18:0.040880503 19:0.031446541 20:0.053459119 -7.058688234670353 5.085596729897322 1:0.062761506 2:0.083682008 3:0.079497908 4:0.020920502 5:0.029288703 6:0.0083682008 7:0.041841004 8:0.054393305 9:0.066945607 10:0.071129707 11:0.079497908 12:0.054393305 13:0.0083682008 14:0 15:0.058577406 16:0.071129707 17:0.046025105 18:0.071129707 19:0.054393305 20:0.037656904 -8 3.390389852295181 1:0.051813472 2:0.13471503 3:0.031088083 4:0.025906736 5:0.020725389 6:0.0051813472 7:0.025906736 8:0.10362694 9:0.098445596 10:0.046632124 11:0.088082902 12:0.056994819 13:0.025906736 14:0.025906736 15:0.088082902 16:0.025906736 17:0.015544041 18:0.051813472 19:0.025906736 20:0.051813472 -0 3.862258679267308 1:0.068493151 2:0.058708415 3:0.029354207 4:0.0097847358 5:0.019569472 6:0.02739726 7:0.035225049 8:0.11741683 9:0.090019569 10:0.050880626 11:0.19765166 12:0.062622309 13:0.045009785 14:0.0019569472 15:0.033268102 16:0.01369863 17:0.037181996 18:0.017612524 19:0.056751468 20:0.02739726 -0.573565310708917 0.7339700752836049 1:0.048484848 2:0.084848485 3:0.057575758 4:0.042424242 5:0.048484848 6:0.033333333 7:0.033333333 8:0.072727273 9:0.078787879 10:0.081818182 11:0.075757576 12:0.048484848 13:0.033333333 14:0.015151515 15:0.042424242 16:0.045454545 
17:0.03030303 18:0.036363636 19:0.054545455 20:0.036363636 -8 0 1:0.080952381 2:0.09047619 3:0.057142857 4:0.038095238 5:0.038095238 6:0.019047619 7:0.019047619 8:0.080952381 9:0.071428571 10:0.071428571 11:0.047619048 12:0.033333333 13:0.0095238095 14:0 15:0.09047619 16:0.042857143 17:0.061904762 18:0.057142857 19:0.028571429 20:0.061904762 -2.724824853153582 0 1:0.070754717 2:0.12264151 3:0.028301887 4:0.033018868 5:0.033018868 6:0.0047169811 7:0.014150943 8:0.080188679 9:0.08490566 10:0.037735849 11:0.056603774 12:0.04245283 13:0.0094339623 14:0.018867925 15:0.061320755 16:0.066037736 17:0.075471698 18:0.056603774 19:0.047169811 20:0.056603774 -4.643805842222945 8 1:0.068807339 2:0.084862385 3:0.055045872 4:0.02293578 5:0.03440367 6:0.016055046 7:0.036697248 8:0.098623853 9:0.064220183 10:0.038990826 11:0.080275229 12:0.052752294 13:0.004587156 14:0.018348624 15:0.064220183 16:0.020642202 17:0.059633028 18:0.050458716 19:0.068807339 20:0.059633028 -0 8 1:0.093023256 2:0.072093023 3:0.034883721 4:0.01627907 5:0.041860465 6:0.018604651 7:0.030232558 8:0.10232558 9:0.1 10:0.065116279 11:0.090697674 12:0.048837209 13:0 14:0.011627907 15:0.046511628 16:0.03255814 17:0.069767442 18:0.023255814 19:0.046511628 20:0.055813953 -7.96254902798491 0.2735360816971864 1:0.075067024 2:0.061662198 3:0.058981233 4:0.026809651 5:0.045576408 6:0.0026809651 7:0.029490617 8:0.10187668 9:0.13672922 10:0.040214477 11:0.034852547 12:0.056300268 13:0.0053619303 14:0.010723861 15:0.029490617 16:0.08310992 17:0.040214477 18:0.061662198 19:0.034852547 20:0.064343164 -4.934569424726362 0 1:0.058411215 2:0.086448598 3:0.072429907 4:0.037383178 5:0.023364486 6:0.011682243 7:0.011682243 8:0.060747664 9:0.11214953 10:0.03271028 11:0.063084112 12:0.039719626 13:0 14:0.018691589 15:0.058411215 16:0.053738318 17:0.079439252 18:0.051401869 19:0.063084112 20:0.065420561 -0 8 1:0.048728814 2:0.091101695 3:0.044491525 4:0.016949153 5:0.014830508 6:0.0063559322 7:0.025423729 8:0.097457627 9:0.14194915 10:0.042372881 11:0.055084746 12:0.088983051 13:0.01059322 14:0.021186441 15:0.029661017 16:0.055084746 17:0.055084746 18:0.050847458 19:0.046610169 20:0.05720339 -6.063893005663595 0.5849274082695956 1:0.099236641 2:0.053435115 3:0.045801527 4:0.045801527 5:0.038167939 6:0.0076335878 7:0.030534351 8:0.06870229 9:0.16030534 10:0.053435115 11:0.038167939 12:0.076335878 13:0.022900763 14:0.038167939 15:0.022900763 16:0.06870229 17:0.0076335878 18:0.06870229 19:0.015267176 20:0.038167939 -8 5.043692551696436 1:0.056338028 2:0.079812207 3:0.049295775 4:0.014084507 5:0.046948357 6:0.011737089 7:0.028169014 8:0.068075117 9:0.14553991 10:0.061032864 11:0.042253521 12:0.075117371 13:0.0070422535 14:0.023474178 15:0.058685446 16:0.018779343 17:0.044600939 18:0.072769953 19:0.021126761 20:0.075117371 -1.802455065877923 0 1:0.098765432 2:0.092592593 3:0.037037037 4:0.030864198 5:0.024691358 6:0.018518519 7:0.049382716 8:0.055555556 9:0.086419753 10:0.055555556 11:0.043209877 12:0.061728395 13:0.012345679 14:0.0061728395 15:0.030864198 16:0.098765432 17:0.037037037 18:0.074074074 19:0.030864198 20:0.055555556 -0.1057876322374004 0 1:0.053211009 2:0.051376147 3:0.036697248 4:0.023853211 5:0.025688073 6:0.0073394495 7:0.0091743119 8:0.080733945 9:0.12110092 10:0.060550459 11:0.04587156 12:0.051376147 13:0.058715596 14:0.056880734 15:0.047706422 16:0.099082569 17:0.034862385 18:0.047706422 19:0.025688073 20:0.062385321 -2.600446698857399 3.718844441911946 1:0.092592593 2:0.076719577 3:0.031746032 4:0.031746032 5:0.034391534 6:0.010582011 
7:0.034391534 8:0.097883598 9:0.12169312 10:0.063492063 11:0.015873016 12:0.084656085 13:0.0026455026 14:0.044973545 15:0.037037037 16:0.044973545 17:0.026455026 18:0.058201058 19:0.037037037 20:0.052910053 -4.291451927899248 0 1:0.044378698 2:0.088757396 3:0.032544379 4:0.023668639 5:0.045857988 6:0.036982249 7:0.036982249 8:0.087278107 9:0.091715976 10:0.047337278 11:0.057692308 12:0.068047337 13:0.0073964497 14:0.028106509 15:0.039940828 16:0.047337278 17:0.062130178 18:0.036982249 19:0.035502959 20:0.081360947 -0 8 1:0.062411348 2:0.065248227 3:0.04964539 4:0.021276596 5:0.04964539 6:0.012765957 7:0.060992908 8:0.083687943 9:0.072340426 10:0.036879433 11:0.070921986 12:0.068085106 13:0.0028368794 14:0.011347518 15:0.022695035 16:0.1035461 17:0.035460993 18:0.04964539 19:0.053900709 20:0.066666667 -2.331832844561124 5.047647929412054 1:0.12935323 2:0.054726368 3:0.034825871 4:0.024875622 5:0.0099502488 6:0 7:0.0099502488 8:0.059701493 9:0.11940299 10:0.0049751244 11:0.10447761 12:0.094527363 13:0 14:0.014925373 15:0.019900498 16:0.10945274 17:0.039800995 18:0.049751244 19:0.034825871 20:0.084577114 -8 0 1:0.063981043 2:0.085308057 3:0.052132701 4:0.021327014 5:0.045023697 6:0.021327014 7:0.035545024 8:0.085308057 9:0.12559242 10:0.030805687 11:0.049763033 12:0.071090047 13:0.0047393365 14:0.0047393365 15:0.028436019 16:0.056872038 17:0.047393365 18:0.056872038 19:0.04028436 20:0.073459716 -0 0.6490092597370897 1:0.072289157 2:0.084337349 3:0.018072289 4:0.060240964 5:0.012048193 6:0.012048193 7:0.0060240964 8:0.024096386 9:0.090361446 10:0.024096386 11:0.10240964 12:0.054216867 13:0 14:0.042168675 15:0.078313253 16:0.030120482 17:0.12048193 18:0.090361446 19:0.048192771 20:0.030120482 -1.603861821668237 0.281884423767496 1:0.07028754 2:0.10223642 3:0.044728435 4:0.051118211 5:0.031948882 6:0.0063897764 7:0.028753994 8:0.038338658 9:0.13099042 10:0.051118211 11:0.073482428 12:0.076677316 13:0.0031948882 14:0.022364217 15:0.047923323 16:0.067092652 17:0.031948882 18:0.025559105 19:0.057507987 20:0.038338658 -7.113435918425009 0 1:0.10540541 2:0.083783784 3:0.059459459 4:0.037837838 5:0.021621622 6:0 7:0.0054054054 8:0.10540541 9:0.11351351 10:0.054054054 11:0.064864865 12:0.078378378 13:0.0054054054 14:0 15:0.043243243 16:0.035135135 17:0.054054054 18:0.037837838 19:0.051351351 20:0.043243243 -3.435273220847336 1.779914070169598 1:0.093385214 2:0.081712062 3:0.038910506 4:0.011673152 5:0.042801556 6:0.015564202 7:0.027237354 8:0.070038911 9:0.16342412 10:0.038910506 11:0.062256809 12:0.054474708 13:0 14:0.011673152 15:0.027237354 16:0.085603113 17:0.031128405 18:0.046692607 19:0.035019455 20:0.062256809 -8 0 1:0.078078078 2:0.099099099 3:0.033033033 4:0.039039039 5:0.039039039 6:0.009009009 7:0.027027027 8:0.072072072 9:0.13513514 10:0.045045045 11:0.039039039 12:0.06006006 13:0 14:0.018018018 15:0.018018018 16:0.072072072 17:0.039039039 18:0.096096096 19:0.036036036 20:0.045045045 -1.077200073487154 5.534677316710479 1:0.041353383 2:0.060150376 3:0.045112782 4:0.018796992 5:0.030075188 6:0.030075188 7:0.045112782 8:0.078947368 9:0.12406015 10:0.045112782 11:0.030075188 12:0.052631579 13:0.030075188 14:0.022556391 15:0.067669173 16:0.045112782 17:0.082706767 18:0.037593985 19:0.060150376 20:0.052631579 -8 0 1:0.078947368 2:0.11695906 3:0.043859649 4:0.029239766 5:0.029239766 6:0.020467836 7:0.032163743 8:0.073099415 9:0.10526316 10:0.055555556 11:0.067251462 12:0.029239766 13:0.0058479532 14:0.020467836 15:0.058479532 16:0.043859649 17:0.073099415 18:0.043859649 19:0.029239766 
20:0.043859649 -3.291409616476488 4.494360245712103 1:0.065934066 2:0.11208791 3:0.081318681 4:0.015384615 5:0.030769231 6:0 7:0.0043956044 8:0.11208791 9:0.10769231 10:0.043956044 11:0.083516484 12:0.043956044 13:0 14:0.0021978022 15:0.035164835 16:0.059340659 17:0.054945055 18:0.046153846 19:0.048351648 20:0.052747253 -0 0.6876120407810307 1:0.068965517 2:0.068965517 3:0.034482759 4:0.014778325 5:0.054187192 6:0.02955665 7:0.014778325 8:0.0591133 9:0.088669951 10:0.034482759 11:0.093596059 12:0.073891626 13:0.0049261084 14:0.014778325 15:0.02955665 16:0.068965517 17:0.088669951 18:0.019704433 19:0.054187192 20:0.083743842 -2.599729924503841 0 1:0.066298343 2:0.12154696 3:0.033149171 4:0.044198895 5:0.049723757 6:0.0055248619 7:0.016574586 8:0.027624309 9:0.08839779 10:0.038674033 11:0.12707182 12:0.044198895 13:0.0055248619 14:0.049723757 15:0.049723757 16:0.027624309 17:0.082872928 18:0.049723757 19:0.027624309 20:0.044198895 -0 8 1:0.083123426 2:0.050377834 3:0.040302267 4:0.017632242 5:0.0302267 6:0.027707809 7:0.040302267 8:0.062972292 9:0.073047859 10:0.050377834 11:0.083123426 12:0.085642317 13:0.010075567 14:0.020151134 15:0.032745592 16:0.085642317 17:0.047858942 18:0.040302267 19:0.080604534 20:0.037783375 -0.7612188050216705 0 1:0.069277108 2:0.10240964 3:0.042168675 4:0.015060241 5:0.054216867 6:0.021084337 7:0.030120482 8:0.045180723 9:0.087349398 10:0.048192771 11:0.057228916 12:0.081325301 13:0.0060240964 14:0.021084337 15:0.018072289 16:0.081325301 17:0.063253012 18:0.048192771 19:0.051204819 20:0.057228916 -8 2.287819468348321 1:0.072727273 2:0.1030303 3:0.06969697 4:0.015151515 5:0.024242424 6:0.0090909091 7:0.027272727 8:0.081818182 9:0.12727273 10:0.081818182 11:0.033333333 12:0.060606061 13:0 14:0.018181818 15:0.048484848 16:0.051515152 17:0.027272727 18:0.054545455 19:0.036363636 20:0.054545455 -3.511681677426128 0 1:0.079225352 2:0.077464789 3:0.052816901 4:0.021126761 5:0.026408451 6:0.019366197 7:0.035211268 8:0.079225352 9:0.082746479 10:0.063380282 11:0.063380282 12:0.059859155 13:0.0035211268 14:0.029929577 15:0.040492958 16:0.075704225 17:0.040492958 18:0.047535211 19:0.035211268 20:0.066901408 -5.098580449232711 0 1:0.088607595 2:0.12025316 3:0.012658228 4:0.015822785 5:0.034810127 6:0.018987342 7:0.012658228 8:0.085443038 9:0.10443038 10:0.053797468 11:0.060126582 12:0.060126582 13:0.0094936709 14:0.015822785 15:0.085443038 16:0.022151899 17:0.063291139 18:0.056962025 19:0.015822785 20:0.063291139 -0 8 1:0.078651685 2:0.060995185 3:0.025682183 4:0.016051364 5:0.043338684 6:0.038523274 7:0.033707865 8:0.11717496 9:0.085072231 10:0.057784912 11:0.062600321 12:0.065810594 13:0.0048154093 14:0.025682183 15:0.041733547 16:0.064205457 17:0.027287319 18:0.036918138 19:0.040128411 20:0.073836276 -2.874160890661235 0.4870663927273422 1:0.073710074 2:0.11056511 3:0.017199017 4:0.017199017 5:0.049140049 6:0.029484029 7:0.029484029 8:0.081081081 9:0.073710074 10:0.073710074 11:0.051597052 12:0.036855037 13:0.014742015 14:0.017199017 15:0.12039312 16:0.027027027 17:0.036855037 18:0.036855037 19:0.022113022 20:0.081081081 -0 3.658676738324247 1:0.084398977 2:0.084398977 3:0.040920716 4:0.025575448 5:0.012787724 6:0.010230179 7:0.025575448 8:0.089514066 9:0.1202046 10:0.043478261 11:0.076726343 12:0.061381074 13:0.0025575448 14:0.0076726343 15:0.081841432 16:0.023017903 17:0.097186701 18:0.043478261 19:0.017902813 20:0.051150895 -8 2.000415432812609 1:0.058823529 2:0.11213235 3:0.022058824 4:0.016544118 5:0.045955882 6:0.040441176 7:0.022058824 8:0.077205882 9:0.086397059 
10:0.071691176 11:0.060661765 12:0.027573529 13:0.0036764706 14:0.038602941 15:0.10294118 16:0.020220588 17:0.03125 18:0.0625 19:0.022058824 20:0.077205882 -8 6.528862915696394 1:0.08490566 2:0.06918239 3:0.050314465 4:0.029874214 5:0.040880503 6:0.014150943 7:0.02672956 8:0.086477987 9:0.072327044 10:0.047169811 11:0.053459119 12:0.055031447 13:0.017295597 14:0.033018868 15:0.059748428 16:0.061320755 17:0.018867925 18:0.058176101 19:0.040880503 20:0.080188679 -0 7.784284344284279 1:0.04534005 2:0.1209068 3:0.032745592 4:0.01511335 5:0.032745592 6:0.012594458 7:0.04534005 8:0.090680101 9:0.12342569 10:0.073047859 11:0.047858942 12:0.050377834 13:0.0050377834 14:0.01511335 15:0.068010076 16:0.037783375 17:0.062972292 18:0.037783375 19:0.022670025 20:0.060453401 -8 0 1:0.052238806 2:0.10820896 3:0.048507463 4:0.026119403 5:0.044776119 6:0.014925373 7:0.029850746 8:0.052238806 9:0.10447761 10:0.052238806 11:0.052238806 12:0.044776119 13:0.01119403 14:0.026119403 15:0.078358209 16:0.041044776 17:0.063432836 18:0.063432836 19:0.02238806 20:0.063432836 -0 2.209302762949475 1:0.073825503 2:0.053691275 3:0.026845638 4:0.013422819 5:0.040268456 6:0.013422819 7:0.0067114094 8:0.12751678 9:0.14765101 10:0.073825503 11:0.040268456 12:0.087248322 13:0.026845638 14:0.0067114094 15:0.053691275 16:0.040268456 17:0.040268456 18:0.040268456 19:0.060402685 20:0.026845638 -4.802117854980023 2.952197835101933 1:0.059945504 2:0.089918256 3:0.019073569 4:0.029972752 5:0.035422343 6:0.024523161 7:0.032697548 8:0.068119891 9:0.1253406 10:0.04359673 11:0.076294278 12:0.04359673 13:0.010899183 14:0.016348774 15:0.073569482 16:0.057220708 17:0.038147139 18:0.0626703 19:0.035422343 20:0.057220708 -4.671869755057796 8 1:0.06833713 2:0.072892938 3:0.056947608 4:0.041002278 5:0.022779043 6:0.006833713 7:0.043280182 8:0.082004556 9:0.10933941 10:0.050113895 11:0.072892938 12:0.047835991 13:0.011389522 14:0.018223235 15:0.061503417 16:0.043280182 17:0.038724374 18:0.036446469 19:0.054669704 20:0.061503417 -0 1.590432446410517 1:0.022058824 2:0.051470588 3:0.051470588 4:0.036764706 5:0.044117647 6:0.014705882 7:0.022058824 8:0.088235294 9:0.15441176 10:0.029411765 11:0.066176471 12:0.080882353 13:0.014705882 14:0.0073529412 15:0.0073529412 16:0.125 17:0.022058824 18:0.044117647 19:0.066176471 20:0.051470588 -0 0.07509135564985965 1:0.053763441 2:0.069892473 3:0.032258065 4:0.016129032 5:0.032258065 6:0.016129032 7:0.032258065 8:0.11827957 9:0.053763441 10:0.059139785 11:0.075268817 12:0.053763441 13:0.075268817 14:0.016129032 15:0.059139785 16:0.064516129 17:0.021505376 18:0.037634409 19:0.048387097 20:0.064516129 -5.706609966159265 1.029850408913825 1:0.045614035 2:0.075438596 3:0.029824561 4:0.036842105 5:0.015789474 6:0.029824561 7:0.033333333 8:0.080701754 9:0.080701754 10:0.047368421 11:0.042105263 12:0.071929825 13:0.031578947 14:0.035087719 15:0.052631579 16:0.078947368 17:0.022807018 18:0.084210526 19:0.050877193 20:0.054385965 -2.616082120107126 2.724995447829363 1:0.025210084 2:0.17647059 3:0.016806723 4:0.025210084 5:0.016806723 6:0.025210084 7:0 8:0.092436975 9:0.13445378 10:0.1092437 11:0.033613445 12:0.042016807 13:0.025210084 14:0.033613445 15:0.084033613 16:0.0084033613 17:0.033613445 18:0.050420168 19:0.0084033613 20:0.058823529 -5.163252430602697 1.288803732558049 1:0.049095607 2:0.095607235 3:0.018087855 4:0.020671835 5:0.046511628 6:0.007751938 7:0.028423773 8:0.082687339 9:0.14470284 10:0.064599483 11:0.033591731 12:0.054263566 13:0.012919897 14:0.020671835 15:0.07751938 16:0.031007752 17:0.036175711 
18:0.067183463 19:0.033591731 20:0.074935401 -0 1.466971585059213 1:0.070381232 2:0.082111437 3:0.052785924 4:0.008797654 5:0.04398827 6:0.017595308 7:0.032258065 8:0.093841642 9:0.1085044 10:0.04398827 11:0.052785924 12:0.076246334 13:0.011730205 14:0.0029325513 15:0.035190616 16:0.055718475 17:0.032258065 18:0.04398827 19:0.073313783 20:0.061583578 -2.205410344860439 0.8602506511196986 1:0.064935065 2:0.090909091 3:0.032467532 4:0.038961039 5:0.025974026 6:0 7:0.025974026 8:0.045454545 9:0.12337662 10:0.097402597 11:0.051948052 12:0.064935065 13:0.025974026 14:0.032467532 15:0.084415584 16:0 17:0.051948052 18:0.058441558 19:0.038961039 20:0.045454545 -3.064840795915249 0 1:0.090196078 2:0.082352941 3:0.041830065 4:0.035294118 5:0.032679739 6:0.010457516 7:0.033986928 8:0.081045752 9:0.095424837 10:0.050980392 11:0.069281046 12:0.05751634 13:0.0013071895 14:0.013071895 15:0.045751634 16:0.062745098 17:0.037908497 18:0.050980392 19:0.039215686 20:0.067973856 -1.46394059759023 1.836861862499438 1:0.024539877 2:0.098159509 3:0.030674847 4:0.0061349693 5:0.030674847 6:0.012269939 7:0 8:0.11042945 9:0.17177914 10:0.079754601 11:0.061349693 12:0.036809816 13:0.098159509 14:0.018404908 15:0.09202454 16:0.012269939 17:0.024539877 18:0.055214724 19:0.0061349693 20:0.030674847 -8 8 1:0.072599532 2:0.10538642 3:0.046838407 4:0.011709602 5:0.046838407 6:0.014051522 7:0.025761124 8:0.077283372 9:0.091334895 10:0.051522248 11:0.051522248 12:0.049180328 13:0.0046838407 14:0.039812646 15:0.084309133 16:0.035128806 17:0.049180328 18:0.067915691 19:0.018735363 20:0.056206089 -2.172395958982213 -2.56375769703356 1:0.075675676 2:0.059459459 3:0.037837838 4:0.021621622 5:0.043243243 6:0 7:0.016216216 8:0.064864865 9:0.17297297 10:0.043243243 11:0.086486486 12:0.12972973 13:0.010810811 14:0.010810811 15:0.016216216 16:0.048648649 17:0.032432432 18:0.021621622 19:0.059459459 20:0.048648649 -0.7292580839583707 -4.151835645620884 1:0.067368421 2:0.090526316 3:0.050526316 4:0.016842105 5:0.023157895 6:0 7:0.014736842 8:0.077894737 9:0.12210526 10:0.016842105 11:0.090526316 12:0.11157895 13:0 14:0 15:0.027368421 16:0.082105263 17:0.037894737 18:0.035789474 19:0.058947368 20:0.073684211 -1.02101596169549 -2.279076880306903 1:0.061946903 2:0.088495575 3:0.044247788 4:0.03539823 5:0.0088495575 6:0 7:0.0088495575 8:0.10619469 9:0.15929204 10:0 11:0.088495575 12:0.088495575 13:0 14:0 15:0 16:0.097345133 17:0.061946903 18:0.026548673 19:0.079646018 20:0.044247788 -0.3749553116417264 -1.340898479234559 1:0.043103448 2:0.038793103 3:0.017241379 4:0.0043103448 5:0.021551724 6:0.017241379 7:0.025862069 8:0.051724138 9:0.20258621 10:0.056034483 11:0.090517241 12:0.056034483 13:0 14:0.012931034 15:0.0043103448 16:0.025862069 17:0.16810345 18:0.038793103 19:0.073275862 20:0.051724138 -7.903896326000703 -3.033654125152884 1:0.075581395 2:0.11627907 3:0.069767442 4:0.029069767 5:0.040697674 6:0.01744186 7:0.040697674 8:0.075581395 9:0.058139535 10:0.040697674 11:0.098837209 12:0.069767442 13:0.011627907 14:0.029069767 15:0.0058139535 16:0.075581395 17:0.023255814 18:0.040697674 19:0.040697674 20:0.040697674 -0.5933541013592225 -0.03518610399786863 1:0.06557377 2:0.032786885 3:0.032786885 4:0.024590164 5:0.086065574 6:0.0040983607 7:0.028688525 8:0.06557377 9:0.032786885 10:0.036885246 11:0.10245902 12:0.06557377 13:0.0040983607 14:0.012295082 15:0.090163934 16:0.069672131 17:0.016393443 18:0.06557377 19:0.086065574 20:0.077868852 -0 -8 1:0.056521739 2:0.055072464 3:0.044927536 4:0.01884058 5:0.026086957 6:0.010144928 
7:0.034782609 8:0.075362319 9:0.072463768 10:0.047826087 11:0.072463768 12:0.07826087 13:0 14:0.036231884 15:0.037681159 16:0.08115942 17:0.069565217 18:0.046376812 19:0.075362319 20:0.060869565 -2.675406161354421 -3.719116124931764 1:0.096045198 2:0.056497175 3:0.056497175 4:0.02259887 5:0.039548023 6:0.011299435 7:0.04519774 8:0.062146893 9:0.050847458 10:0.033898305 11:0.039548023 12:0.056497175 13:0 14:0.016949153 15:0.039548023 16:0.1299435 17:0.056497175 18:0.056497175 19:0.079096045 20:0.050847458 -3.125766399948067 -0 1:0.087866109 2:0.025104603 3:0.11297071 4:0.012552301 5:0.046025105 6:0 7:0.037656904 8:0.092050209 9:0.066945607 10:0.037656904 11:0.054393305 12:0.050209205 13:0 14:0.012552301 15:0.012552301 16:0.09623431 17:0.041841004 18:0.054393305 19:0.09623431 20:0.062761506 -1.359109647986245 -0 1:0.085106383 2:0.042553191 3:0.072340426 4:0.017021277 5:0.046808511 6:0 7:0.038297872 8:0.09787234 9:0.063829787 10:0.029787234 11:0.076595745 12:0.068085106 13:0 14:0.034042553 15:0.0085106383 16:0.10638298 17:0.021276596 18:0.068085106 19:0.085106383 20:0.038297872 -1.079995285795496 -1.035494453063716 1:0.085889571 2:0.098159509 3:0.049079755 4:0.012269939 5:0.061349693 6:0.0061349693 7:0.055214724 8:0.055214724 9:0.036809816 10:0.049079755 11:0.079754601 12:0.079754601 13:0 14:0.012269939 15:0.012269939 16:0.12883436 17:0 18:0.09202454 19:0.036809816 20:0.049079755 -2.538209334104792 -7.009508402606112 1:0.076620825 2:0.05697446 3:0.043222004 4:0.013752456 5:0.035363458 6:0.0058939096 7:0.049115914 8:0.070726916 9:0.084479371 10:0.023575639 11:0.062868369 12:0.066797642 13:0.0019646365 14:0.019646365 15:0.015717092 16:0.10609037 17:0.039292731 18:0.064833006 19:0.068762279 20:0.094302554 -8 -8 1:0.042857143 2:0.14285714 3:0.057142857 4:0.028571429 5:0.042857143 6:0.014285714 7:0.014285714 8:0.071428571 9:0.092857143 10:0.1 11:0.014285714 12:0.028571429 13:0.028571429 14:0.021428571 15:0.078571429 16:0.028571429 17:0.035714286 18:0.085714286 19:0.021428571 20:0.05 -8 -5.104856729491351 1:0.057692308 2:0.073717949 3:0.051282051 4:0.019230769 5:0.032051282 6:0.016025641 7:0.044871795 8:0.08974359 9:0.12179487 10:0.057692308 11:0.044871795 12:0.086538462 13:0.022435897 14:0.03525641 15:0.076923077 16:0.012820513 17:0.025641026 18:0.044871795 19:0.0096153846 20:0.076923077 -8 -0.93498397085957 1:0.11607143 2:0.10714286 3:0.075892857 4:0.022321429 5:0.017857143 6:0.0089285714 7:0.026785714 8:0.071428571 9:0.049107143 10:0.044642857 11:0.044642857 12:0.049107143 13:0.0089285714 14:0.017857143 15:0.080357143 16:0.075892857 17:0.0044642857 18:0.098214286 19:0.040178571 20:0.040178571 -8 -8 1:0.082887701 2:0.088235294 3:0.072192513 4:0.029411765 5:0.026737968 6:0 7:0.021390374 8:0.10695187 9:0.096256684 10:0.042780749 11:0.056149733 12:0.064171123 13:0.0053475936 14:0.021390374 15:0.061497326 16:0.061497326 17:0.029411765 18:0.064171123 19:0.016042781 20:0.053475936 -3.213619212951361 -8 1:0.055803571 2:0.095982143 3:0.035714286 4:0.015625 5:0.035714286 6:0.022321429 7:0.024553571 8:0.11383929 9:0.14285714 10:0.071428571 11:0.058035714 12:0.03125 13:0.0044642857 14:0.017857143 15:0.064732143 16:0.026785714 17:0.0625 18:0.040178571 19:0.03125 20:0.049107143 -8 -1.43567174015868 1:0.051162791 2:0.093023256 3:0.093023256 4:0.027906977 5:0.060465116 6:0.0046511628 7:0.03255814 8:0.041860465 9:0.055813953 10:0.046511628 11:0.074418605 12:0.046511628 13:0.0093023256 14:0.018604651 15:0.013953488 16:0.08372093 17:0.013953488 18:0.069767442 19:0.10232558 20:0.060465116 -8 -7.148916663194218 
1:0.13864307 2:0.073746313 3:0.050147493 4:0.02359882 5:0.026548673 6:0.0088495575 7:0.017699115 8:0.08259587 9:0.097345133 10:0.044247788 11:0.03539823 12:0.04719764 13:0.020648968 14:0.038348083 15:0.041297935 16:0.05899705 17:0.041297935 18:0.07079646 19:0.03539823 20:0.04719764 -8 -4.942619420808389 1:0.059602649 2:0.079470199 3:0.066225166 4:0.01986755 5:0.052980132 6:0.01986755 7:0.059602649 8:0.10596026 9:0.072847682 10:0.046357616 11:0.059602649 12:0.052980132 13:0.026490066 14:0.0066225166 15:0.01986755 16:0.059602649 17:0.046357616 18:0.079470199 19:0.039735099 20:0.026490066 -8 -8 1:0.081300813 2:0.097560976 3:0.024390244 4:0.026422764 5:0.028455285 6:0.010162602 7:0.022357724 8:0.069105691 9:0.1504065 10:0.071138211 11:0.034552846 12:0.056910569 13:0.0081300813 14:0.026422764 15:0.075203252 16:0.046747967 17:0.040650407 18:0.042682927 19:0.020325203 20:0.067073171 -8 -8 1:0.076142132 2:0.098984772 3:0.040609137 4:0.020304569 5:0.017766497 6:0.0025380711 7:0.02284264 8:0.10406091 9:0.14467005 10:0.045685279 11:0.043147208 12:0.060913706 13:0.0025380711 14:0.017766497 15:0.058375635 16:0.055837563 17:0.038071066 18:0.058375635 19:0.030456853 20:0.060913706 -7.092132096024378 -8 1:0.041025641 2:0.092307692 3:0.058974359 4:0.015384615 5:0.023076923 6:0.025641026 7:0.043589744 8:0.084615385 9:0.13076923 10:0.043589744 11:0.056410256 12:0.056410256 13:0.0025641026 14:0.0076923077 15:0.066666667 16:0.046153846 17:0.069230769 18:0.035897436 19:0.017948718 20:0.082051282 -0.3264968739392036 -0 1:0.071428571 2:0.078947368 3:0.052631579 4:0.033834586 5:0.052631579 6:0.0037593985 7:0.082706767 8:0.033834586 9:0.026315789 10:0.026315789 11:0.060150376 12:0.052631579 13:0.007518797 14:0.022556391 15:0.022556391 16:0.12781955 17:0.030075188 18:0.045112782 19:0.078947368 20:0.090225564 -3.952463768060047 -5.675303638711491 1:0.060606061 2:0.051948052 3:0.038961039 4:0.03030303 5:0.047619048 6:0.03030303 7:0.034632035 8:0.064935065 9:0.069264069 10:0.047619048 11:0.064935065 12:0.038961039 13:0.034632035 14:0.021645022 15:0.077922078 16:0.038961039 17:0.082251082 18:0.073593074 19:0.064935065 20:0.025974026 -0 -0.1087288973751009 1:0.062695925 2:0.059561129 3:0.059561129 4:0.028213166 5:0.031347962 6:0.02507837 7:0.043887147 8:0.07523511 9:0.043887147 10:0.031347962 11:0.078369906 12:0.097178683 13:0 14:0.012539185 15:0.034482759 16:0.090909091 17:0.031347962 18:0.034482759 19:0.078369906 20:0.081504702 -0 -1.465525992175706 1:0.081818182 2:0.063636364 3:0.045454545 4:0.012121212 5:0.03030303 6:0.012121212 7:0.063636364 8:0.063636364 9:0.051515152 10:0.042424242 11:0.078787879 12:0.048484848 13:0.0060606061 14:0.027272727 15:0.015151515 16:0.12424242 17:0.03030303 18:0.045454545 19:0.078787879 20:0.078787879 -0 -2.253325695101351 1:0.072595281 2:0.056261343 3:0.032667877 4:0.014519056 5:0.038112523 6:0.041742287 7:0.036297641 8:0.12885662 9:0.092558984 10:0.045372051 11:0.085299456 12:0.030852995 13:0.010889292 14:0.021778584 15:0.056261343 16:0.021778584 17:0.050816697 18:0.02722323 19:0.059891107 20:0.076225045 -0 -8 1:0.07073955 2:0.11897106 3:0.038585209 4:0.01607717 5:0.032154341 6:0.0064308682 7:0.035369775 8:0.10610932 9:0.090032154 10:0.051446945 11:0.1221865 12:0.051446945 13:0.0064308682 14:0.022508039 15:0.035369775 16:0.032154341 17:0.041800643 18:0.035369775 19:0.048231511 20:0.038585209 -0 -5.224677139207556 1:0.064896755 2:0.088495575 3:0.038348083 4:0.020648968 5:0.053097345 6:0.005899705 7:0.053097345 8:0.056047198 9:0.10324484 10:0.050147493 11:0.085545723 12:0.061946903 
13:0.005899705 14:0.01179941 15:0.029498525 16:0.061946903 17:0.029498525 18:0.02359882 19:0.10324484 20:0.053097345 -0 -2.073821569223395 1:0.058479532 2:0.080409357 3:0.020467836 4:0.0058479532 5:0.01754386 6:0.019005848 7:0.019005848 8:0.10818713 9:0.14327485 10:0.070175439 11:0.078947368 12:0.05994152 13:0.029239766 14:0.029239766 15:0.084795322 16:0.029239766 17:0.042397661 18:0.029239766 19:0.019005848 20:0.055555556 -0 -1.029158507138896 1:0.051918736 2:0.051918736 3:0.027088036 4:0.033860045 5:0.045146727 6:0.031602709 7:0.042889391 8:0.10609481 9:0.11286682 10:0.0496614 11:0.074492099 12:0.072234763 13:0.013544018 14:0.022573363 15:0.0496614 16:0.033860045 17:0.038374718 18:0.036117381 19:0.063205418 20:0.042889391 -1.706604602749902 -5.116835954684479 1:0.095798319 2:0.038655462 3:0.063865546 4:0.023529412 5:0.021848739 6:0.01512605 7:0.043697479 8:0.094117647 9:0.099159664 10:0.048739496 11:0.070588235 12:0.063865546 13:0.0084033613 14:0.020168067 15:0.048739496 16:0.043697479 17:0.031932773 18:0.031932773 19:0.068907563 20:0.067226891 -2.816067234872035 -0 1:0.065732759 2:0.072198276 3:0.078663793 4:0.010775862 5:0.036637931 6:0.0032327586 7:0.023706897 8:0.090517241 9:0.059267241 10:0.011853448 11:0.074353448 12:0.078663793 13:0.0032327586 14:0.010775862 15:0.03125 16:0.089439655 17:0.033405172 18:0.061422414 19:0.10021552 20:0.064655172 -7.786735555513285 -8 1:0.068493151 2:0.083561644 3:0.024657534 4:0.015068493 5:0.036986301 6:0.032876712 7:0.053424658 8:0.082191781 9:0.080821918 10:0.061643836 11:0.065753425 12:0.052054795 13:0.0082191781 14:0.02739726 15:0.069863014 16:0.028767123 17:0.047945205 18:0.054794521 19:0.049315068 20:0.056164384 -8 -8 1:0.062138728 2:0.076589595 3:0.033236994 4:0.020231214 5:0.047687861 6:0.024566474 7:0.031791908 8:0.086705202 9:0.085260116 10:0.075144509 11:0.066473988 12:0.056358382 13:0.0043352601 14:0.027456647 15:0.076589595 16:0.037572254 17:0.040462428 18:0.039017341 19:0.046242775 20:0.062138728 -0 -7.390551885030089 1:0.075247525 2:0.063366337 3:0.053465347 4:0.015841584 5:0.027722772 6:0.031683168 7:0.035643564 8:0.099009901 9:0.085148515 10:0.045544554 11:0.079207921 12:0.075247525 13:0 14:0.017821782 15:0.041584158 16:0.035643564 17:0.037623762 18:0.035643564 19:0.075247525 20:0.069306931 -3.480141175219598 -0 1:0.077419355 2:0.085630499 3:0.039882698 4:0.013489736 5:0.024046921 6:0.008797654 7:0.024046921 8:0.14134897 9:0.13372434 10:0.02228739 11:0.058651026 12:0.045747801 13:0 14:0.02111437 15:0.055718475 16:0.026392962 17:0.039882698 18:0.049853372 19:0.038709677 20:0.093255132 -0 -8 1:0.071661238 2:0.068403909 3:0.03257329 4:0.022801303 5:0.03257329 6:0.013029316 7:0.042345277 8:0.09771987 9:0.084690554 10:0.045602606 11:0.081433225 12:0.078175896 13:0.003257329 14:0.022801303 15:0.048859935 16:0.068403909 17:0.042345277 18:0.048859935 19:0.039087948 20:0.055374593 -2.904079460240392 -3.175136298302513 1:0.04784689 2:0.074162679 3:0.038277512 4:0.028708134 5:0.055023923 6:0.026315789 7:0.057416268 8:0.040669856 9:0.11004785 10:0.035885167 11:0.074162679 12:0.081339713 13:0.009569378 14:0.016746411 15:0.031100478 16:0.050239234 17:0.031100478 18:0.062200957 19:0.076555024 20:0.052631579 Algorithm-SVM-0.13/sample.model.10000644000077436411310000017013210745457457016337 0ustar lairdmwg-userssvm_type c_svc kernel_type rbf gamma 64 nr_class 3 total_sv 199 rho 0.439702 0.336872 0.444534 label 10 0 -10 nr_sv 87 72 40 SV 7.711958119123536 7.365495639545627 1:0.042410714 2:0.091517857 3:0.040178571 4:0.015625 5:0.015625 6:0.022321429 
7:0.022321429 8:0.082589286 9:0.12053571 10:0.073660714 11:0.053571429 12:0.053571429 13:0.017857143 14:0.035714286 15:0.11160714 16:0.033482143 17:0.022321429 18:0.060267857 19:0.020089286 20:0.064732143 0.9576898148768235 0.02039180431218783 1:0.11340206 2:0.10309278 3:0.06185567 4:0.030927835 5:0.010309278 6:0.010309278 7:0 8:0.06185567 9:0.15463918 10:0.020618557 11:0.041237113 12:0.030927835 13:0.010309278 14:0 15:0.051546392 16:0.041237113 17:0.13402062 18:0.06185567 19:0.010309278 20:0.051546392 0 0.8314817940045978 1:0.11309524 2:0.071428571 3:0.11309524 4:0.011904762 5:0.047619048 6:0 7:0 8:0.095238095 9:0.041666667 10:0.047619048 11:0.05952381 12:0.05952381 13:0 14:0.011904762 15:0.017857143 16:0.10119048 17:0.005952381 18:0.10119048 19:0.023809524 20:0.077380952 1.22386885617493 0 1:0.080229226 2:0.091690544 3:0.068767908 4:0.06017192 5:0.020057307 6:0.0028653295 7:0.011461318 8:0.077363897 9:0.11461318 10:0.06017192 11:0.068767908 12:0.042979943 13:0.005730659 14:0.028653295 15:0.06017192 16:0.045845272 17:0.034383954 18:0.054441261 19:0.025787966 20:0.045845272 8 8 1:0.054711246 2:0.060790274 3:0.036474164 4:0.03343465 5:0.054711246 6:0.021276596 7:0.051671733 8:0.10334347 9:0.079027356 10:0.05775076 11:0.048632219 12:0.051671733 13:0.012158055 14:0.030395137 15:0.045592705 16:0.063829787 17:0.027355623 18:0.060790274 19:0.042553191 20:0.063829787 8 8 1:0.059907834 2:0.076036866 3:0.046082949 4:0.023041475 5:0.043778802 6:0.01843318 7:0.039170507 8:0.080645161 9:0.12903226 10:0.039170507 11:0.057603687 12:0.062211982 13:0.011520737 14:0.01843318 15:0.036866359 16:0.048387097 17:0.066820276 18:0.062211982 19:0.02764977 20:0.052995392 0 0.9048219239798362 1:0.13402062 2:0.072164948 3:0.10309278 4:0.020618557 5:0.010309278 6:0 7:0.010309278 8:0.10309278 9:0.072164948 10:0.020618557 11:0.06185567 12:0.030927835 13:0 14:0.010309278 15:0.051546392 16:0.082474227 17:0 18:0.10309278 19:0.051546392 20:0.06185567 1.401142770750075 0 1:0.078125 2:0.1171875 3:0.046875 4:0.046875 5:0.046875 6:0.0078125 7:0.015625 8:0.078125 9:0.125 10:0.0234375 11:0.03125 12:0.0390625 13:0 14:0 15:0.03125 16:0.0859375 17:0.015625 18:0.0859375 19:0.0625 20:0.0625 0.3876175023992005 1.828284414673054 1:0.069053708 2:0.066496164 3:0.061381074 4:0.040920716 5:0.025575448 6:0.012787724 7:0.0076726343 8:0.12531969 9:0.16368286 10:0.035805627 11:0.051150895 12:0.046035806 13:0.012787724 14:0.015345269 15:0.046035806 16:0.046035806 17:0.035805627 18:0.051150895 19:0.035805627 20:0.051150895 1.188129274745781 5.414923507660803 1:0.10837438 2:0.044334975 3:0.083743842 4:0.034482759 5:0.039408867 6:0.0098522167 7:0.044334975 8:0.078817734 9:0.039408867 10:0.0591133 11:0.0591133 12:0.0591133 13:0.0098522167 14:0.039408867 15:0.049261084 16:0.049261084 17:0 18:0.068965517 19:0.078817734 20:0.044334975 1.807209070689883 0 1:0.04516129 2:0.10645161 3:0.058064516 4:0.025806452 5:0.05483871 6:0.022580645 7:0.035483871 8:0.064516129 9:0.058064516 10:0.064516129 11:0.074193548 12:0.032258065 13:0.012903226 14:0.025806452 15:0.093548387 16:0.025806452 17:0.048387097 18:0.061290323 19:0.038709677 20:0.051612903 8 0 1:0.056653491 2:0.08168643 3:0.057971014 4:0.040843215 5:0.039525692 6:0.0092226614 7:0.044795784 8:0.075098814 9:0.084321476 10:0.039525692 11:0.04743083 12:0.063241107 13:0.0065876153 14:0.018445323 15:0.050065876 16:0.065876153 17:0.039525692 18:0.063241107 19:0.046113307 20:0.069828722 0 6.431874337263299 1:0.13061224 2:0.10204082 3:0.057142857 4:0.0040816327 5:0.016326531 6:0.012244898 7:0.024489796 
8:0.11428571 9:0.097959184 10:0.036734694 11:0.040816327 12:0.044897959 13:0.012244898 14:0.016326531 15:0.057142857 16:0.040816327 17:0.057142857 18:0.044897959 19:0.020408163 20:0.069387755 2.242011635231154 0 1:0.048582996 2:0.12955466 3:0.048582996 4:0.032388664 5:0.020242915 6:0.032388664 7:0.016194332 8:0.085020243 9:0.10121457 10:0.052631579 11:0.04048583 12:0.044534413 13:0.016194332 14:0.024291498 15:0.060728745 16:0.052631579 17:0.028340081 18:0.064777328 19:0.04048583 20:0.060728745 2.09133957835712 0 1:0.082236842 2:0.098684211 3:0.055921053 4:0.0065789474 5:0.039473684 6:0.023026316 7:0.016447368 8:0.072368421 9:0.11842105 10:0.075657895 11:0.055921053 12:0.032894737 13:0.023026316 14:0.023026316 15:0.049342105 16:0.032894737 17:0.0625 18:0.049342105 19:0.023026316 20:0.059210526 7.760173779100334 0 1:0.048275862 2:0.12068966 3:0.048275862 4:0.034482759 5:0.044827586 6:0.017241379 7:0.024137931 8:0.05862069 9:0.089655172 10:0.055172414 11:0.086206897 12:0.068965517 13:0.013793103 14:0.034482759 15:0.068965517 16:0.024137931 17:0.048275862 18:0.044827586 19:0.024137931 20:0.044827586 4.787234112788998 0 1:0.058823529 2:0.10160428 3:0.050802139 4:0.018716578 5:0.032085561 6:0.053475936 7:0.018716578 8:0.07486631 9:0.082887701 10:0.069518717 11:0.058823529 12:0.042780749 13:0.013368984 14:0.029411765 15:0.06684492 16:0.029411765 17:0.069518717 18:0.061497326 19:0.018716578 20:0.048128342 1.381004952256783 0 1:0.077803204 2:0.12814645 3:0.032036613 4:0.027459954 5:0.029748284 6:0.0068649886 7:0.020594966 8:0.091533181 9:0.11899314 10:0.038901602 11:0.04805492 12:0.054919908 13:0.016018307 14:0.022883295 15:0.050343249 16:0.034324943 17:0.034324943 18:0.061784897 19:0.04805492 20:0.057208238 0 0.7294772331011425 1:0.073891626 2:0.13793103 3:0.054187192 4:0.0049261084 5:0.034482759 6:0.0098522167 7:0.044334975 8:0.044334975 9:0.064039409 10:0.0591133 11:0.073891626 12:0.049261084 13:0.014778325 14:0.034482759 15:0.034482759 16:0.073891626 17:0.039408867 18:0.049261084 19:0.034482759 20:0.068965517 8 2.463563922537672 1:0.057636888 2:0.10662824 3:0.031700288 4:0.031700288 5:0.023054755 6:0.031700288 7:0.04610951 8:0.069164265 9:0.097982709 10:0.054755043 11:0.050432277 12:0.047550432 13:0.015850144 14:0.024495677 15:0.057636888 16:0.051873199 17:0.050432277 18:0.05907781 19:0.030259366 20:0.061959654 2.622733230440654 0 1:0.081081081 2:0.081081081 3:0.064864865 4:0.027027027 5:0.016216216 6:0 7:0.010810811 8:0.032432432 9:0.1027027 10:0.043243243 11:0.07027027 12:0.048648649 13:0.0054054054 14:0 15:0.091891892 16:0.086486486 17:0.032432432 18:0.075675676 19:0.027027027 20:0.1027027 1.844389048445678 0.22770244517299 1:0.082352941 2:0.10588235 3:0.023529412 4:0.023529412 5:0.047058824 6:0 7:0 8:0.070588235 9:0.10588235 10:0.023529412 11:0.070588235 12:0.11764706 13:0 14:0.023529412 15:0.011764706 16:0.082352941 17:0.070588235 18:0.10588235 19:0.023529412 20:0.011764706 4.315625030679095 0 1:0.07712766 2:0.125 3:0.039893617 4:0.021276596 5:0.02393617 6:0.0053191489 7:0.0053191489 8:0.090425532 9:0.14361702 10:0.018617021 11:0.050531915 12:0.066489362 13:0 14:0.018617021 15:0.039893617 16:0.042553191 17:0.061170213 18:0.061170213 19:0.045212766 20:0.063829787 1.919751876365703 0 1:0.094339623 2:0.12264151 3:0.066037736 4:0.037735849 5:0.018867925 6:0 7:0.037735849 8:0.066037736 9:0.13207547 10:0.056603774 11:0.047169811 12:0.018867925 13:0.0094339623 14:0.0094339623 15:0.018867925 16:0.094339623 17:0.037735849 18:0.075471698 19:0.028301887 20:0.028301887 1.115630872619839 0 
1:0.081871345 2:0.11695906 3:0.035087719 4:0.040935673 5:0.058479532 6:0.01754386 7:0.029239766 8:0.14035088 9:0.058479532 10:0.040935673 11:0.023391813 12:0.046783626 13:0.011695906 14:0.011695906 15:0.052631579 16:0.064327485 17:0.023391813 18:0.052631579 19:0.023391813 20:0.070175439 1.368730552986554 0 1:0.072649573 2:0.061965812 3:0.055555556 4:0.040598291 5:0.055555556 6:0.0042735043 7:0.034188034 8:0.072649573 9:0.10042735 10:0.051282051 11:0.068376068 12:0.044871795 13:0.0064102564 14:0.027777778 15:0.040598291 16:0.053418803 17:0.019230769 18:0.072649573 19:0.036324786 20:0.081196581 8 2.502004762108732 1:0.062809917 2:0.094214876 3:0.034710744 4:0.026446281 5:0.042975207 6:0.036363636 7:0.034710744 8:0.074380165 9:0.084297521 10:0.061157025 11:0.042975207 12:0.038016529 13:0.011570248 14:0.036363636 15:0.066115702 16:0.031404959 17:0.067768595 18:0.044628099 19:0.034710744 20:0.074380165 0.444740694146401 0 1:0.056603774 2:0.094339623 3:0.08490566 4:0.018867925 5:0.047169811 6:0.018867925 7:0.047169811 8:0.028301887 9:0.047169811 10:0.028301887 11:0.056603774 12:0.066037736 13:0 14:0.037735849 15:0.12264151 16:0.037735849 17:0.08490566 18:0.028301887 19:0.0094339623 20:0.08490566 1.918995998595875 3.569199389027621 1:0.037617555 2:0.12225705 3:0.037617555 4:0.031347962 5:0.043887147 6:0.015673981 7:0.056426332 8:0.065830721 9:0.10031348 10:0.059561129 11:0.040752351 12:0.059561129 13:0.02507837 14:0.018808777 15:0.056426332 16:0.02507837 17:0.059561129 18:0.053291536 19:0.02507837 20:0.065830721 0 0.3172007235944944 1:0.068181818 2:0.11363636 3:0.015151515 4:0.0075757576 5:0.075757576 6:0.03030303 7:0.022727273 8:0.045454545 9:0.053030303 10:0.060606061 11:0.037878788 12:0.060606061 13:0.022727273 14:0.022727273 15:0.068181818 16:0.083333333 17:0.015151515 18:0.11363636 19:0.022727273 20:0.060606061 0.9003790663523249 1.874660213433047 1:0.036585366 2:0.19512195 3:0.036585366 4:0.012195122 5:0.030487805 6:0.024390244 7:0.024390244 8:0.067073171 9:0.048780488 10:0.079268293 11:0.067073171 12:0.067073171 13:0.012195122 14:0.006097561 15:0.085365854 16:0.042682927 17:0.036585366 18:0.085365854 19:0.006097561 20:0.036585366 5.767297152102819 0 1:0.058823529 2:0.10441176 3:0.029411765 4:0.026470588 5:0.052941176 6:0.022058824 7:0.035294118 8:0.058823529 9:0.11764706 10:0.041176471 11:0.045588235 12:0.047058824 13:0.0058823529 14:0.027941176 15:0.072058824 16:0.025 17:0.060294118 18:0.070588235 19:0.035294118 20:0.063235294 8 0 1:0.06147541 2:0.094262295 3:0.049180328 4:0.030737705 5:0.032786885 6:0.0040983607 7:0.014344262 8:0.079918033 9:0.12704918 10:0.047131148 11:0.06147541 12:0.067622951 13:0.0081967213 14:0.026639344 15:0.069672131 16:0.055327869 17:0.032786885 18:0.055327869 19:0.043032787 20:0.038934426 0 4.453396603315925 1:0.050816697 2:0.08892922 3:0.038112523 4:0.029038113 5:0.043557169 6:0.041742287 7:0.043557169 8:0.070780399 9:0.070780399 10:0.052631579 11:0.047186933 12:0.052631579 13:0.0090744102 14:0.034482759 15:0.061705989 16:0.032667877 17:0.052631579 18:0.059891107 19:0.052631579 20:0.067150635 0.5869925052851459 3.285963127756351 1:0.054347826 2:0.09057971 3:0.10869565 4:0.032608696 5:0.06884058 6:0.014492754 7:0.047101449 8:0.036231884 9:0.028985507 10:0.0036231884 11:0.12681159 12:0.039855072 13:0.014492754 14:0.0036231884 15:0.043478261 16:0.086956522 17:0.014492754 18:0.057971014 19:0.072463768 20:0.054347826 0.4124593304177421 0 1:0.070422535 2:0.056338028 3:0.0657277 4:0.046948357 5:0.042253521 6:0.0093896714 7:0.014084507 8:0.079812207 9:0.10798122 
10:0.018779343 11:0.018779343 12:0.061032864 13:0.0093896714 14:0.03286385 15:0.056338028 16:0.098591549 17:0.023474178 18:0.10798122 19:0.018779343 20:0.061032864 0.909956697515001 0.3428599350149554 1:0.096774194 2:0.08797654 3:0.046920821 4:0.032258065 5:0.049853372 6:0.011730205 7:0.017595308 8:0.061583578 9:0.12903226 10:0.064516129 11:0.017595308 12:0.038123167 13:0 14:0.023460411 15:0.13196481 16:0.0058651026 17:0.014662757 18:0.090909091 19:0.017595308 20:0.061583578 0.5087018076244285 0.9252971932269028 1:0.12755102 2:0.071428571 3:0.015306122 4:0.020408163 5:0.020408163 6:0 7:0.020408163 8:0.14795918 9:0.071428571 10:0.025510204 11:0.025510204 12:0.025510204 13:0.030612245 14:0.10204082 15:0.020408163 16:0.030612245 17:0.030612245 18:0.086734694 19:0.030612245 20:0.096938776 0.4977585030127463 0 1:0.04691358 2:0.081481481 3:0.071604938 4:0.041975309 5:0.041975309 6:0.017283951 7:0.041975309 8:0.088888889 9:0.091358025 10:0.059259259 11:0.034567901 12:0.054320988 13:0.017283951 14:0.022222222 15:0.037037037 16:0.049382716 17:0.039506173 18:0.079012346 19:0.039506173 20:0.044444444 8 6.804891923489974 1:0.079545455 2:0.095454545 3:0.056818182 4:0.018181818 5:0.038636364 6:0.013636364 7:0.034090909 8:0.068181818 9:0.090909091 10:0.059090909 11:0.084090909 12:0.052272727 13:0.013636364 14:0.015909091 15:0.070454545 16:0.045454545 17:0.031818182 18:0.054545455 19:0.029545455 20:0.047727273 0 2.419125707605434 1:0.09469697 2:0.056818182 3:0.11363636 4:0.022727273 5:0.037878788 6:0 7:0.034090909 8:0.11742424 9:0.068181818 10:0.026515152 11:0.075757576 12:0.03030303 13:0.011363636 14:0.022727273 15:0.034090909 16:0.053030303 17:0.03030303 18:0.060606061 19:0.083333333 20:0.026515152 3.236273204294736 3.115870946018938 1:0.11464968 2:0.057324841 3:0.031847134 4:0.031847134 5:0.044585987 6:0.031847134 7:0.063694268 8:0.063694268 9:0.063694268 10:0.063694268 11:0.025477707 12:0.050955414 13:0.012738854 14:0.01910828 15:0.01910828 16:0.063694268 17:0.044585987 18:0.095541401 19:0.044585987 20:0.057324841 0.9603188152753798 1.36269535657466 1:0.076502732 2:0.076502732 3:0.071038251 4:0.038251366 5:0.027322404 6:0 7:0.016393443 8:0.071038251 9:0.054644809 10:0.054644809 11:0.027322404 12:0.027322404 13:0.010928962 14:0.0054644809 15:0.076502732 16:0.10382514 17:0.06010929 18:0.10928962 19:0.032786885 20:0.06010929 8 2.305676669762596 1:0.088095238 2:0.071428571 3:0.054761905 4:0.023809524 5:0.028571429 6:0.0071428571 7:0.05 8:0.09047619 9:0.10952381 10:0.042857143 11:0.047619048 12:0.038095238 13:0.014285714 14:0.030952381 15:0.030952381 16:0.071428571 17:0.038095238 18:0.061904762 19:0.047619048 20:0.052380952 1.220179697991229 1.07014928758646 1:0.069868996 2:0.12227074 3:0.061135371 4:0.013100437 5:0.034934498 6:0.021834061 7:0.0087336245 8:0.096069869 9:0.10917031 10:0.065502183 11:0.026200873 12:0.065502183 13:0.0043668122 14:0.021834061 15:0.052401747 16:0.030567686 17:0.030567686 18:0.074235808 19:0.056768559 20:0.034934498 2.277354265541319 0 1:0.11585366 2:0.091463415 3:0.067073171 4:0.024390244 5:0.030487805 6:0.012195122 7:0.018292683 8:0.042682927 9:0.079268293 10:0.042682927 11:0.030487805 12:0.06097561 13:0.012195122 14:0.018292683 15:0.06097561 16:0.054878049 17:0.067073171 18:0.036585366 19:0.054878049 20:0.079268293 8 4.382176816175112 1:0.071428571 2:0.071428571 3:0.066502463 4:0.024630542 5:0.027093596 6:0.0024630542 7:0.022167488 8:0.071428571 9:0.16256158 10:0.036945813 11:0.044334975 12:0.083743842 13:0.014778325 14:0.019704433 15:0.041871921 16:0.04679803 17:0.041871921 
18:0.044334975 19:0.036945813 20:0.068965517 3.360535755615756 0 1:0.066193853 2:0.10874704 3:0.030732861 4:0.01891253 5:0.047281324 6:0.0094562648 7:0.028368794 8:0.061465721 9:0.082742317 10:0.04964539 11:0.044917258 12:0.035460993 13:0.011820331 14:0.023640662 15:0.066193853 16:0.056737589 17:0.073286052 18:0.087470449 19:0.042553191 20:0.054373522 0.5677176650782586 0 1:0.078313253 2:0.13253012 3:0.060240964 4:0.024096386 5:0.024096386 6:0.012048193 7:0.024096386 8:0.036144578 9:0.060240964 10:0.036144578 11:0.072289157 12:0.036144578 13:0 14:0.018072289 15:0.12048193 16:0.030120482 17:0.030120482 18:0.090361446 19:0.054216867 20:0.060240964 1.642300475912365 0 1:0.053719008 2:0.064049587 3:0.041322314 4:0.02892562 5:0.066115702 6:0.01446281 7:0.037190083 8:0.070247934 9:0.07231405 10:0.059917355 11:0.049586777 12:0.041322314 13:0.0041322314 14:0.039256198 15:0.066115702 16:0.059917355 17:0.037190083 18:0.070247934 19:0.049586777 20:0.074380165 0 1.803031079534803 1:0.047945205 2:0.12328767 3:0.089041096 4:0.01369863 5:0.054794521 6:0.01369863 7:0.047945205 8:0.075342466 9:0.054794521 10:0.02739726 11:0.04109589 12:0.034246575 13:0.02739726 14:0.0068493151 15:0.04109589 16:0.061643836 17:0.061643836 18:0.089041096 19:0.068493151 20:0.020547945 0 0.4172196602537238 1:0.077005348 2:0.075935829 3:0.065240642 4:0.011764706 5:0.027807487 6:0.0042780749 7:0.018181818 8:0.04171123 9:0.068449198 10:0.054545455 11:0.073796791 12:0.045989305 13:0.0042780749 14:0.022459893 15:0.097326203 16:0.056684492 17:0.044919786 18:0.10909091 19:0.052406417 20:0.048128342 2.463005397571117 0.5766041171231547 1:0.067307692 2:0.067307692 3:0.057692308 4:0.019230769 5:0.057692308 6:0.019230769 7:0.048076923 8:0.096153846 9:0.038461538 10:0.067307692 11:0.057692308 12:0.028846154 13:0.028846154 14:0.0096153846 15:0.019230769 16:0.11538462 17:0.019230769 18:0.086538462 19:0.019230769 20:0.076923077 0 0.2707682096198543 1:0.096069869 2:0.087336245 3:0.048034934 4:0.052401747 5:0.03930131 6:0.013100437 7:0.021834061 8:0.065502183 9:0.043668122 10:0.061135371 11:0.03930131 12:0.056768559 13:0.0043668122 14:0.030567686 15:0.096069869 16:0.030567686 17:0.030567686 18:0.10480349 19:0.021834061 20:0.056768559 0.4980663106856051 0 1:0.0859375 2:0.0859375 3:0.05078125 4:0.02734375 5:0.0390625 6:0.015625 7:0.01171875 8:0.109375 9:0.12109375 10:0.03125 11:0.06640625 12:0.0546875 13:0.0078125 14:0.01171875 15:0.06640625 16:0.03125 17:0.03125 18:0.0546875 19:0.0390625 20:0.05859375 8 7.086863049516101 1:0.093939394 2:0.078787879 3:0.057575758 4:0.027272727 5:0.024242424 6:0.003030303 7:0.021212121 8:0.075757576 9:0.12121212 10:0.036363636 11:0.078787879 12:0.084848485 13:0 14:0.015151515 15:0.027272727 16:0.072727273 17:0.033333333 18:0.057575758 19:0.051515152 20:0.039393939 0.16978315166816 0.6262833546527321 1:0.085714286 2:0.1047619 3:0.057142857 4:0.028571429 5:0.038095238 6:0.019047619 7:0.023809524 8:0.1047619 9:0.095238095 10:0.071428571 11:0.057142857 12:0.033333333 13:0 14:0.023809524 15:0.09047619 16:0.014285714 17:0.028571429 18:0.066666667 19:0.019047619 20:0.038095238 0 0.6225616587846976 1:0.047904192 2:0.10179641 3:0.077844311 4:0.023952096 5:0.047904192 6:0.011976048 7:0.041916168 8:0.041916168 9:0.05988024 10:0.011976048 11:0.083832335 12:0.023952096 13:0.005988024 14:0.05988024 15:0.005988024 16:0.083832335 17:0.047904192 18:0.10179641 19:0.071856287 20:0.047904192 3.539512469518375 0 1:0.084985836 2:0.067988669 3:0.056657224 4:0.036827195 5:0.050991501 6:0.0084985836 7:0.033994334 8:0.076487252 
9:0.09631728 10:0.050991501 11:0.042492918 12:0.045325779 13:0.016997167 14:0.011331445 15:0.042492918 16:0.07082153 17:0.031161473 18:0.076487252 19:0.033994334 20:0.065155807 5.053320086622517 0 1:0.060526316 2:0.12631579 3:0.026315789 4:0.023684211 5:0.034210526 6:0.0026315789 7:0.026315789 8:0.076315789 9:0.086842105 10:0.084210526 11:0.068421053 12:0.039473684 13:0.0026315789 14:0.028947368 15:0.081578947 16:0.023684211 17:0.021052632 18:0.086842105 19:0.010526316 20:0.089473684 4.680405584726584 3.1607782751567 1:0.098671727 2:0.055028463 3:0.043643264 4:0.019924099 5:0.018026565 6:0.0066413662 7:0.009487666 8:0.090132827 9:0.13282732 10:0.068311195 11:0.065464896 12:0.05597723 13:0.0056925996 14:0.016129032 15:0.067362429 16:0.068311195 17:0.033206831 18:0.072106262 19:0.022770398 20:0.05028463 0.9161839814504772 0 1:0.067605634 2:0.095774648 3:0.076056338 4:0.025352113 5:0.036619718 6:0.0056338028 7:0.014084507 8:0.047887324 9:0.10704225 10:0.028169014 11:0.03943662 12:0.033802817 13:0.0084507042 14:0.022535211 15:0.030985915 16:0.12394366 17:0.028169014 18:0.098591549 19:0.042253521 20:0.067605634 0 3.413622919574543 1:0.057591623 2:0.12958115 3:0.083769634 4:0.0065445026 5:0.039267016 6:0.0013089005 7:0.02486911 8:0.045811518 9:0.060209424 10:0.041884817 11:0.082460733 12:0.044502618 13:0.018324607 14:0.028795812 15:0.023560209 16:0.090314136 17:0.030104712 18:0.078534031 19:0.069371728 20:0.043193717 0 1.623216404693624 1:0.080519481 2:0.07012987 3:0.064935065 4:0.015584416 5:0.041558442 6:0.0025974026 7:0.044155844 8:0.080519481 9:0.077922078 10:0.041558442 11:0.072727273 12:0.057142857 13:0.015584416 14:0.023376623 15:0.033766234 16:0.083116883 17:0.033766234 18:0.072727273 19:0.031168831 20:0.057142857 0 3.064899378088907 1:0.072186837 2:0.12951168 3:0.033970276 4:0.014861996 5:0.027600849 6:0.0042462845 7:0.016985138 8:0.10615711 9:0.17834395 10:0.053078556 11:0.042462845 12:0.038216561 13:0.0042462845 14:0.033970276 15:0.089171975 16:0.014861996 17:0.031847134 18:0.048832272 19:0.016985138 20:0.042462845 0.07321979021271291 0 1:0.098404255 2:0.058510638 3:0.07712766 4:0.037234043 5:0.037234043 6:0.010638298 7:0.015957447 8:0.12765957 9:0.079787234 10:0.053191489 11:0.037234043 12:0.066489362 13:0.026595745 14:0.02393617 15:0.037234043 16:0.04787234 17:0.029255319 18:0.058510638 19:0.029255319 20:0.04787234 1.414559832063233 1.713459987670003 1:0.072463768 2:0.028985507 3:0.057971014 4:0.028985507 5:0.10144928 6:0.014492754 7:0 8:0.11594203 9:0.043478261 10:0.014492754 11:0.072463768 12:0.057971014 13:0 14:0.014492754 15:0.028985507 16:0.086956522 17:0.11594203 18:0.057971014 19:0.028985507 20:0.057971014 0 3.61484382511828 1:0.047021944 2:0.094043887 3:0.10658307 4:0.018808777 5:0.040752351 6:0.0031347962 7:0.043887147 8:0.084639498 9:0.059561129 10:0.02507837 11:0.053291536 12:0.059561129 13:0.012539185 14:0.02507837 15:0.02507837 16:0.11912226 17:0.012539185 18:0.05015674 19:0.081504702 20:0.037617555 2.538096484875596 8 1:0.060240964 2:0.10240964 3:0.048192771 4:0.022088353 5:0.032128514 6:0.012048193 7:0.016064257 8:0.088353414 9:0.13453815 10:0.06626506 11:0.042168675 12:0.072289157 13:0.010040161 14:0.034136546 15:0.080321285 16:0.0080321285 17:0.03815261 18:0.044176707 19:0.014056225 20:0.074297189 2.939830810426161 0.750245447458261 1:0.050373134 2:0.11380597 3:0.029850746 4:0.018656716 5:0.039179104 6:0.024253731 7:0.039179104 8:0.089552239 9:0.10820896 10:0.067164179 11:0.044776119 12:0.050373134 13:0.0018656716 14:0.027985075 15:0.080223881 16:0.029850746 
17:0.024253731 18:0.083955224 19:0.020522388 20:0.055970149 8 1.923322991472697 1:0.081632653 2:0.10714286 3:0.051020408 4:0.015306122 5:0.038265306 6:0.017857143 7:0.025510204 8:0.10714286 9:0.068877551 10:0.058673469 11:0.06122449 12:0.033163265 13:0.0025510204 14:0.025510204 15:0.084183673 16:0.025510204 17:0.058673469 18:0.033163265 19:0.015306122 20:0.089285714 4.148494620480828 0 1:0.06557377 2:0.15409836 3:0.029508197 4:0.013114754 5:0.029508197 6:0.0032786885 7:0.016393443 8:0.06557377 9:0.095081967 10:0.075409836 11:0.045901639 12:0.039344262 13:0.019672131 14:0.036065574 15:0.1147541 16:0.029508197 17:0.036065574 18:0.049180328 19:0.013114754 20:0.068852459 1.970909504420514 1.469543137993742 1:0.079625293 2:0.086651054 3:0.042154567 4:0.039812646 5:0.056206089 6:0.0023419204 7:0.028103044 8:0.11943794 9:0.10772834 10:0.053864169 11:0.058548009 12:0.056206089 13:0.011709602 14:0.028103044 15:0.044496487 16:0.035128806 17:0.021077283 18:0.056206089 19:0.021077283 20:0.051522248 2.343027791980051 0 1:0.060810811 2:0.14864865 3:0.023648649 4:0.02027027 5:0.050675676 6:0.02027027 7:0.013513514 8:0.070945946 9:0.14189189 10:0.047297297 11:0.057432432 12:0.054054054 13:0.013513514 14:0.037162162 15:0.091216216 16:0.0067567568 17:0.030405405 18:0.067567568 19:0.010135135 20:0.033783784 0 0.7046264071953378 1:0.12686567 2:0.097014925 3:0.029850746 4:0.044776119 5:0.037313433 6:0 7:0.02238806 8:0.067164179 9:0.082089552 10:0.0074626866 11:0.029850746 12:0.029850746 13:0.0074626866 14:0.052238806 15:0.067164179 16:0.067164179 17:0.029850746 18:0.1119403 19:0.02238806 20:0.067164179 0 1.115535580119849 1:0.082969432 2:0.10917031 3:0.069868996 4:0.030567686 5:0.017467249 6:0.0087336245 7:0.030567686 8:0.082969432 9:0.069868996 10:0.048034934 11:0.026200873 12:0.065502183 13:0.0043668122 14:0.017467249 15:0.082969432 16:0.043668122 17:0.034934498 18:0.091703057 19:0.017467249 20:0.065502183 7.142210537651729 1.47650739255786 1:0.10731707 2:0.10731707 3:0.043902439 4:0.034146341 5:0.03902439 6:0.0048780488 7:0.0048780488 8:0.087804878 9:0.16585366 10:0.073170732 11:0.03902439 12:0.03902439 13:0.019512195 14:0.0048780488 15:0.029268293 16:0.063414634 17:0.019512195 18:0.048780488 19:0.024390244 20:0.043902439 5.084297507612087 0 1:0.062686567 2:0.08358209 3:0.028358209 4:0.019402985 5:0.020895522 6:0.028358209 7:0.02238806 8:0.074626866 9:0.1358209 10:0.03880597 11:0.07761194 12:0.065671642 13:0.013432836 14:0.037313433 15:0.06119403 16:0.028358209 17:0.026865672 18:0.071641791 19:0.023880597 20:0.079104478 8 0 1:0.071691176 2:0.10294118 3:0.055147059 4:0.018382353 5:0.036764706 6:0.0091911765 7:0.033088235 8:0.091911765 9:0.11213235 10:0.033088235 11:0.047794118 12:0.064338235 13:0.020220588 14:0.023897059 15:0.025735294 16:0.075367647 17:0.023897059 18:0.049632353 19:0.040441176 20:0.064338235 8 3.501133365624939 1:0.051546392 2:0.10309278 3:0.056701031 4:0.036082474 5:0.054123711 6:0.012886598 7:0.015463918 8:0.072164948 9:0.056701031 10:0.043814433 11:0.043814433 12:0.056701031 13:0.0051546392 14:0.015463918 15:0.036082474 16:0.072164948 17:0.048969072 18:0.067010309 19:0.085051546 20:0.067010309 1.042280192697824 6.599311555715316 1:0.084337349 2:0.087349398 3:0.090361446 4:0.018072289 5:0.03313253 6:0.0090361446 7:0.030120482 8:0.075301205 9:0.069277108 10:0.024096386 11:0.063253012 12:0.069277108 13:0.0090361446 14:0.012048193 15:0.030120482 16:0.10843373 17:0.012048193 18:0.048192771 19:0.063253012 20:0.063253012 1.40090583605685 0 1:0.065822785 2:0.13164557 3:0.027848101 4:0.02278481 
5:0.035443038 6:0.015189873 7:0.025316456 8:0.086075949 9:0.14177215 10:0.043037975 11:0.06835443 12:0.048101266 13:0.010126582 14:0.037974684 15:0.060759494 16:0.017721519 17:0.040506329 18:0.043037975 19:0.020253165 20:0.058227848 0.2972838251979828 2.368139077529406 1:0.1147541 2:0.081967213 3:0.040983607 4:0.024590164 5:0.049180328 6:0.016393443 7:0.0081967213 8:0.1147541 9:0.10655738 10:0.040983607 11:0.032786885 12:0.073770492 13:0 14:0.016393443 15:0.073770492 16:0.040983607 17:0.016393443 18:0.06557377 19:0.049180328 20:0.032786885 0.1078766884786763 0 1:0.083700441 2:0.10572687 3:0.039647577 4:0.030837004 5:0.022026432 6:0.0088105727 7:0.017621145 8:0.057268722 9:0.083700441 10:0.017621145 11:0.066079295 12:0.030837004 13:0 14:0.030837004 15:0.074889868 16:0.035242291 17:0.092511013 18:0.088105727 19:0.026431718 20:0.088105727 3.638849737588541 0 1:0.095846645 2:0.092651757 3:0.067092652 4:0.022364217 5:0.03514377 6:0.0095846645 7:0.025559105 8:0.067092652 9:0.11821086 10:0.047923323 11:0.028753994 12:0.057507987 13:0.0063897764 14:0.012779553 15:0.038338658 16:0.067092652 17:0.038338658 18:0.057507987 19:0.041533546 20:0.07028754 8 0 1:0.10447761 2:0.076492537 3:0.054104478 4:0.02238806 5:0.02238806 6:0.0018656716 7:0.0093283582 8:0.087686567 9:0.10820896 10:0.048507463 11:0.069029851 12:0.054104478 13:0 14:0.0074626866 15:0.046641791 16:0.054104478 17:0.065298507 18:0.072761194 19:0.037313433 20:0.057835821 4.063652998746698 0 1:0.09921671 2:0.075718016 3:0.057441253 4:0.041775457 5:0.033942559 6:0 7:0.0078328982 8:0.10966057 9:0.1227154 10:0.039164491 11:0.031331593 12:0.062663185 13:0 14:0.0078328982 15:0.046997389 16:0.044386423 17:0.036553525 18:0.067885117 19:0.041775457 20:0.07310705 -0.4367016930900829 0 1:0.067307692 2:0.11538462 3:0.048076923 4:0.048076923 5:0.0096153846 6:0.019230769 7:0 8:0.086538462 9:0.16346154 10:0.086538462 11:0.038461538 12:0.028846154 13:0.019230769 14:0.019230769 15:0 16:0.096153846 17:0.028846154 18:0.067307692 19:0.028846154 20:0.028846154 -0 0.4206934877608815 1:0.049180328 2:0.057377049 3:0.040983607 4:0.032786885 5:0.024590164 6:0.024590164 7:0.0081967213 8:0.073770492 9:0.20491803 10:0.057377049 11:0.049180328 12:0.049180328 13:0.032786885 14:0.0081967213 15:0.040983607 16:0.06557377 17:0.040983607 18:0.032786885 19:0.049180328 20:0.057377049 -0.9130831471015111 0 1:0.057803468 2:0.10982659 3:0.028901734 4:0.028901734 5:0.034682081 6:0.0057803468 7:0.011560694 8:0.13294798 9:0.10404624 10:0.052023121 11:0.040462428 12:0.069364162 13:0.011560694 14:0.063583815 15:0.011560694 16:0.069364162 17:0.034682081 18:0.028901734 19:0.046242775 20:0.057803468 -0 1.209413711785985 1:0.10632184 2:0.074712644 3:0.037356322 4:0.022988506 5:0.031609195 6:0.0028735632 7:0.034482759 8:0.086206897 9:0.1091954 10:0.037356322 11:0.048850575 12:0.10057471 13:0.0057471264 14:0.0086206897 15:0.022988506 16:0.068965517 17:0.037356322 18:0.020114943 19:0.066091954 20:0.077586207 -3.060966473203146 0 1:0.053846154 2:0.088461538 3:0.057692308 4:0.015384615 5:0.053846154 6:0.0038461538 7:0.030769231 8:0.080769231 9:0.11538462 10:0.034615385 11:0.057692308 12:0.053846154 13:0.0076923077 14:0.0038461538 15:0.034615385 16:0.096153846 17:0.046153846 18:0.065384615 19:0.030769231 20:0.069230769 -2.471114907185362 0.3729539677605425 1:0.10472973 2:0.094594595 3:0.047297297 4:0.023648649 5:0.023648649 6:0 7:0.010135135 8:0.081081081 9:0.15202703 10:0.030405405 11:0.037162162 12:0.050675676 13:0 14:0.010135135 15:0.02027027 16:0.094594595 17:0.057432432 18:0.033783784 
19:0.054054054 20:0.074324324 -0 6.121243703135696 1:0.086614173 2:0.086614173 3:0.044619423 4:0.01312336 5:0.041994751 6:0.0052493438 7:0.041994751 8:0.091863517 9:0.062992126 10:0.047244094 11:0.062992126 12:0.073490814 13:0 14:0.015748031 15:0.023622047 16:0.070866142 17:0.062992126 18:0.026246719 19:0.089238845 20:0.052493438 -0 4.354142937599794 1:0.081031308 2:0.086556169 3:0.049723757 4:0.0073664825 5:0.02946593 6:0.023941068 7:0.051565378 8:0.042357274 9:0.08839779 10:0.057090239 11:0.057090239 12:0.060773481 13:0.0036832413 14:0.020257827 15:0.031307551 16:0.082872928 17:0.033149171 18:0.053406998 19:0.068139963 20:0.071823204 -8 0 1:0.072727273 2:0.08 3:0.050909091 4:0.029090909 5:0.034545455 6:0.012727273 7:0.038181818 8:0.090909091 9:0.087272727 10:0.047272727 11:0.038181818 12:0.052727273 13:0.0036363636 14:0.018181818 15:0.036363636 16:0.074545455 17:0.050909091 18:0.063636364 19:0.049090909 20:0.069090909 -7.499980569165563 0.9190188020359428 1:0.061452514 2:0.081005587 3:0.047486034 4:0.041899441 5:0.039106145 6:0.011173184 7:0.055865922 8:0.067039106 9:0.072625698 10:0.044692737 11:0.033519553 12:0.053072626 13:0.0027932961 14:0.027932961 15:0.027932961 16:0.089385475 17:0.044692737 18:0.06424581 19:0.047486034 20:0.086592179 -1.710706180607596 0.3886185262012503 1:0.041284404 2:0.087155963 3:0.027522936 4:0.055045872 5:0.032110092 6:0.050458716 7:0.013761468 8:0.032110092 9:0.11009174 10:0.041284404 11:0.068807339 12:0.041284404 13:0.013761468 14:0.027522936 15:0.12844037 16:0.02293578 17:0.077981651 18:0.077981651 19:0.018348624 20:0.032110092 -1.21797913877259 1.610509838124154 1:0.072249589 2:0.075533662 3:0.031198686 4:0.070607553 5:0.032840722 6:0.019704433 7:0.02955665 8:0.10344828 9:0.072249589 10:0.055829228 11:0.062397373 12:0.060755337 13:0.0016420361 14:0.032840722 15:0.0591133 16:0.036124795 17:0.024630542 18:0.05090312 19:0.032840722 20:0.075533662 -8 4.364215761118517 1:0.079365079 2:0.079365079 3:0.050793651 4:0.025396825 5:0.041269841 6:0.015873016 7:0.025396825 8:0.092063492 9:0.088888889 10:0.066666667 11:0.050793651 12:0.044444444 13:0.041269841 14:0.038095238 15:0.028571429 16:0.053968254 17:0.0095238095 18:0.079365079 19:0.041269841 20:0.047619048 -0 0.7903144566295344 1:0.072580645 2:0.064516129 3:0.040322581 4:0.048387097 5:0.032258065 6:0.0080645161 7:0.048387097 8:0.064516129 9:0.12903226 10:0.040322581 11:0.032258065 12:0.040322581 13:0.0080645161 14:0.024193548 15:0.064516129 16:0.088709677 17:0.056451613 18:0.048387097 19:0.040322581 20:0.048387097 -1.337708316361119 0 1:0.054744526 2:0.10948905 3:0.047445255 4:0.025547445 5:0.032846715 6:0.02189781 7:0.0072992701 8:0.072992701 9:0.076642336 10:0.087591241 11:0.072992701 12:0.054744526 13:0.02189781 14:0.032846715 15:0.054744526 16:0.047445255 17:0.069343066 18:0.02919708 19:0.018248175 20:0.062043796 -8 1.747312334365741 1:0.081911263 2:0.12286689 3:0.034129693 4:0.030716724 5:0.023890785 6:0.020477816 7:0.0034129693 8:0.085324232 9:0.12627986 10:0.040955631 11:0.071672355 12:0.071672355 13:0.0068259386 14:0.013651877 15:0.071672355 16:0.020477816 17:0.010238908 18:0.078498294 19:0.023890785 20:0.061433447 -2.061188296452615 0 1:0.037735849 2:0.14150943 3:0.056603774 4:0.012578616 5:0.022012579 6:0.012578616 7:0.028301887 8:0.066037736 9:0.14465409 10:0.047169811 11:0.062893082 12:0.062893082 13:0 14:0.01572327 15:0.040880503 16:0.040880503 17:0.081761006 18:0.040880503 19:0.031446541 20:0.053459119 -7.058688234670353 5.085596729897322 1:0.062761506 2:0.083682008 3:0.079497908 4:0.020920502 
5:0.029288703 6:0.0083682008 7:0.041841004 8:0.054393305 9:0.066945607 10:0.071129707 11:0.079497908 12:0.054393305 13:0.0083682008 14:0 15:0.058577406 16:0.071129707 17:0.046025105 18:0.071129707 19:0.054393305 20:0.037656904 -8 3.390389852295181 1:0.051813472 2:0.13471503 3:0.031088083 4:0.025906736 5:0.020725389 6:0.0051813472 7:0.025906736 8:0.10362694 9:0.098445596 10:0.046632124 11:0.088082902 12:0.056994819 13:0.025906736 14:0.025906736 15:0.088082902 16:0.025906736 17:0.015544041 18:0.051813472 19:0.025906736 20:0.051813472 -0 3.862258679267308 1:0.068493151 2:0.058708415 3:0.029354207 4:0.0097847358 5:0.019569472 6:0.02739726 7:0.035225049 8:0.11741683 9:0.090019569 10:0.050880626 11:0.19765166 12:0.062622309 13:0.045009785 14:0.0019569472 15:0.033268102 16:0.01369863 17:0.037181996 18:0.017612524 19:0.056751468 20:0.02739726 -0.573565310708917 0.7339700752836049 1:0.048484848 2:0.084848485 3:0.057575758 4:0.042424242 5:0.048484848 6:0.033333333 7:0.033333333 8:0.072727273 9:0.078787879 10:0.081818182 11:0.075757576 12:0.048484848 13:0.033333333 14:0.015151515 15:0.042424242 16:0.045454545 17:0.03030303 18:0.036363636 19:0.054545455 20:0.036363636 -8 0 1:0.080952381 2:0.09047619 3:0.057142857 4:0.038095238 5:0.038095238 6:0.019047619 7:0.019047619 8:0.080952381 9:0.071428571 10:0.071428571 11:0.047619048 12:0.033333333 13:0.0095238095 14:0 15:0.09047619 16:0.042857143 17:0.061904762 18:0.057142857 19:0.028571429 20:0.061904762 -2.724824853153582 0 1:0.070754717 2:0.12264151 3:0.028301887 4:0.033018868 5:0.033018868 6:0.0047169811 7:0.014150943 8:0.080188679 9:0.08490566 10:0.037735849 11:0.056603774 12:0.04245283 13:0.0094339623 14:0.018867925 15:0.061320755 16:0.066037736 17:0.075471698 18:0.056603774 19:0.047169811 20:0.056603774 -4.643805842222945 8 1:0.068807339 2:0.084862385 3:0.055045872 4:0.02293578 5:0.03440367 6:0.016055046 7:0.036697248 8:0.098623853 9:0.064220183 10:0.038990826 11:0.080275229 12:0.052752294 13:0.004587156 14:0.018348624 15:0.064220183 16:0.020642202 17:0.059633028 18:0.050458716 19:0.068807339 20:0.059633028 -0 8 1:0.093023256 2:0.072093023 3:0.034883721 4:0.01627907 5:0.041860465 6:0.018604651 7:0.030232558 8:0.10232558 9:0.1 10:0.065116279 11:0.090697674 12:0.048837209 13:0 14:0.011627907 15:0.046511628 16:0.03255814 17:0.069767442 18:0.023255814 19:0.046511628 20:0.055813953 -7.96254902798491 0.2735360816971864 1:0.075067024 2:0.061662198 3:0.058981233 4:0.026809651 5:0.045576408 6:0.0026809651 7:0.029490617 8:0.10187668 9:0.13672922 10:0.040214477 11:0.034852547 12:0.056300268 13:0.0053619303 14:0.010723861 15:0.029490617 16:0.08310992 17:0.040214477 18:0.061662198 19:0.034852547 20:0.064343164 -4.934569424726362 0 1:0.058411215 2:0.086448598 3:0.072429907 4:0.037383178 5:0.023364486 6:0.011682243 7:0.011682243 8:0.060747664 9:0.11214953 10:0.03271028 11:0.063084112 12:0.039719626 13:0 14:0.018691589 15:0.058411215 16:0.053738318 17:0.079439252 18:0.051401869 19:0.063084112 20:0.065420561 -0 8 1:0.048728814 2:0.091101695 3:0.044491525 4:0.016949153 5:0.014830508 6:0.0063559322 7:0.025423729 8:0.097457627 9:0.14194915 10:0.042372881 11:0.055084746 12:0.088983051 13:0.01059322 14:0.021186441 15:0.029661017 16:0.055084746 17:0.055084746 18:0.050847458 19:0.046610169 20:0.05720339 -6.063893005663595 0.5849274082695956 1:0.099236641 2:0.053435115 3:0.045801527 4:0.045801527 5:0.038167939 6:0.0076335878 7:0.030534351 8:0.06870229 9:0.16030534 10:0.053435115 11:0.038167939 12:0.076335878 13:0.022900763 14:0.038167939 15:0.022900763 16:0.06870229 
17:0.0076335878 18:0.06870229 19:0.015267176 20:0.038167939 -8 5.043692551696436 1:0.056338028 2:0.079812207 3:0.049295775 4:0.014084507 5:0.046948357 6:0.011737089 7:0.028169014 8:0.068075117 9:0.14553991 10:0.061032864 11:0.042253521 12:0.075117371 13:0.0070422535 14:0.023474178 15:0.058685446 16:0.018779343 17:0.044600939 18:0.072769953 19:0.021126761 20:0.075117371 -1.802455065877923 0 1:0.098765432 2:0.092592593 3:0.037037037 4:0.030864198 5:0.024691358 6:0.018518519 7:0.049382716 8:0.055555556 9:0.086419753 10:0.055555556 11:0.043209877 12:0.061728395 13:0.012345679 14:0.0061728395 15:0.030864198 16:0.098765432 17:0.037037037 18:0.074074074 19:0.030864198 20:0.055555556 -0.1057876322374004 0 1:0.053211009 2:0.051376147 3:0.036697248 4:0.023853211 5:0.025688073 6:0.0073394495 7:0.0091743119 8:0.080733945 9:0.12110092 10:0.060550459 11:0.04587156 12:0.051376147 13:0.058715596 14:0.056880734 15:0.047706422 16:0.099082569 17:0.034862385 18:0.047706422 19:0.025688073 20:0.062385321 -2.600446698857399 3.718844441911946 1:0.092592593 2:0.076719577 3:0.031746032 4:0.031746032 5:0.034391534 6:0.010582011 7:0.034391534 8:0.097883598 9:0.12169312 10:0.063492063 11:0.015873016 12:0.084656085 13:0.0026455026 14:0.044973545 15:0.037037037 16:0.044973545 17:0.026455026 18:0.058201058 19:0.037037037 20:0.052910053 -4.291451927899248 0 1:0.044378698 2:0.088757396 3:0.032544379 4:0.023668639 5:0.045857988 6:0.036982249 7:0.036982249 8:0.087278107 9:0.091715976 10:0.047337278 11:0.057692308 12:0.068047337 13:0.0073964497 14:0.028106509 15:0.039940828 16:0.047337278 17:0.062130178 18:0.036982249 19:0.035502959 20:0.081360947 -0 8 1:0.062411348 2:0.065248227 3:0.04964539 4:0.021276596 5:0.04964539 6:0.012765957 7:0.060992908 8:0.083687943 9:0.072340426 10:0.036879433 11:0.070921986 12:0.068085106 13:0.0028368794 14:0.011347518 15:0.022695035 16:0.1035461 17:0.035460993 18:0.04964539 19:0.053900709 20:0.066666667 -2.331832844561124 5.047647929412054 1:0.12935323 2:0.054726368 3:0.034825871 4:0.024875622 5:0.0099502488 6:0 7:0.0099502488 8:0.059701493 9:0.11940299 10:0.0049751244 11:0.10447761 12:0.094527363 13:0 14:0.014925373 15:0.019900498 16:0.10945274 17:0.039800995 18:0.049751244 19:0.034825871 20:0.084577114 -8 0 1:0.063981043 2:0.085308057 3:0.052132701 4:0.021327014 5:0.045023697 6:0.021327014 7:0.035545024 8:0.085308057 9:0.12559242 10:0.030805687 11:0.049763033 12:0.071090047 13:0.0047393365 14:0.0047393365 15:0.028436019 16:0.056872038 17:0.047393365 18:0.056872038 19:0.04028436 20:0.073459716 -0 0.6490092597370897 1:0.072289157 2:0.084337349 3:0.018072289 4:0.060240964 5:0.012048193 6:0.012048193 7:0.0060240964 8:0.024096386 9:0.090361446 10:0.024096386 11:0.10240964 12:0.054216867 13:0 14:0.042168675 15:0.078313253 16:0.030120482 17:0.12048193 18:0.090361446 19:0.048192771 20:0.030120482 -1.603861821668237 0.281884423767496 1:0.07028754 2:0.10223642 3:0.044728435 4:0.051118211 5:0.031948882 6:0.0063897764 7:0.028753994 8:0.038338658 9:0.13099042 10:0.051118211 11:0.073482428 12:0.076677316 13:0.0031948882 14:0.022364217 15:0.047923323 16:0.067092652 17:0.031948882 18:0.025559105 19:0.057507987 20:0.038338658 -7.113435918425009 0 1:0.10540541 2:0.083783784 3:0.059459459 4:0.037837838 5:0.021621622 6:0 7:0.0054054054 8:0.10540541 9:0.11351351 10:0.054054054 11:0.064864865 12:0.078378378 13:0.0054054054 14:0 15:0.043243243 16:0.035135135 17:0.054054054 18:0.037837838 19:0.051351351 20:0.043243243 -3.435273220847336 1.779914070169598 1:0.093385214 2:0.081712062 3:0.038910506 4:0.011673152 
5:0.042801556 6:0.015564202 7:0.027237354 8:0.070038911 9:0.16342412 10:0.038910506 11:0.062256809 12:0.054474708 13:0 14:0.011673152 15:0.027237354 16:0.085603113 17:0.031128405 18:0.046692607 19:0.035019455 20:0.062256809 -8 0 1:0.078078078 2:0.099099099 3:0.033033033 4:0.039039039 5:0.039039039 6:0.009009009 7:0.027027027 8:0.072072072 9:0.13513514 10:0.045045045 11:0.039039039 12:0.06006006 13:0 14:0.018018018 15:0.018018018 16:0.072072072 17:0.039039039 18:0.096096096 19:0.036036036 20:0.045045045 -1.077200073487154 5.534677316710479 1:0.041353383 2:0.060150376 3:0.045112782 4:0.018796992 5:0.030075188 6:0.030075188 7:0.045112782 8:0.078947368 9:0.12406015 10:0.045112782 11:0.030075188 12:0.052631579 13:0.030075188 14:0.022556391 15:0.067669173 16:0.045112782 17:0.082706767 18:0.037593985 19:0.060150376 20:0.052631579 -8 0 1:0.078947368 2:0.11695906 3:0.043859649 4:0.029239766 5:0.029239766 6:0.020467836 7:0.032163743 8:0.073099415 9:0.10526316 10:0.055555556 11:0.067251462 12:0.029239766 13:0.0058479532 14:0.020467836 15:0.058479532 16:0.043859649 17:0.073099415 18:0.043859649 19:0.029239766 20:0.043859649 -3.291409616476488 4.494360245712103 1:0.065934066 2:0.11208791 3:0.081318681 4:0.015384615 5:0.030769231 6:0 7:0.0043956044 8:0.11208791 9:0.10769231 10:0.043956044 11:0.083516484 12:0.043956044 13:0 14:0.0021978022 15:0.035164835 16:0.059340659 17:0.054945055 18:0.046153846 19:0.048351648 20:0.052747253 -0 0.6876120407810307 1:0.068965517 2:0.068965517 3:0.034482759 4:0.014778325 5:0.054187192 6:0.02955665 7:0.014778325 8:0.0591133 9:0.088669951 10:0.034482759 11:0.093596059 12:0.073891626 13:0.0049261084 14:0.014778325 15:0.02955665 16:0.068965517 17:0.088669951 18:0.019704433 19:0.054187192 20:0.083743842 -2.599729924503841 0 1:0.066298343 2:0.12154696 3:0.033149171 4:0.044198895 5:0.049723757 6:0.0055248619 7:0.016574586 8:0.027624309 9:0.08839779 10:0.038674033 11:0.12707182 12:0.044198895 13:0.0055248619 14:0.049723757 15:0.049723757 16:0.027624309 17:0.082872928 18:0.049723757 19:0.027624309 20:0.044198895 -0 8 1:0.083123426 2:0.050377834 3:0.040302267 4:0.017632242 5:0.0302267 6:0.027707809 7:0.040302267 8:0.062972292 9:0.073047859 10:0.050377834 11:0.083123426 12:0.085642317 13:0.010075567 14:0.020151134 15:0.032745592 16:0.085642317 17:0.047858942 18:0.040302267 19:0.080604534 20:0.037783375 -0.7612188050216705 0 1:0.069277108 2:0.10240964 3:0.042168675 4:0.015060241 5:0.054216867 6:0.021084337 7:0.030120482 8:0.045180723 9:0.087349398 10:0.048192771 11:0.057228916 12:0.081325301 13:0.0060240964 14:0.021084337 15:0.018072289 16:0.081325301 17:0.063253012 18:0.048192771 19:0.051204819 20:0.057228916 -8 2.287819468348321 1:0.072727273 2:0.1030303 3:0.06969697 4:0.015151515 5:0.024242424 6:0.0090909091 7:0.027272727 8:0.081818182 9:0.12727273 10:0.081818182 11:0.033333333 12:0.060606061 13:0 14:0.018181818 15:0.048484848 16:0.051515152 17:0.027272727 18:0.054545455 19:0.036363636 20:0.054545455 -3.511681677426128 0 1:0.079225352 2:0.077464789 3:0.052816901 4:0.021126761 5:0.026408451 6:0.019366197 7:0.035211268 8:0.079225352 9:0.082746479 10:0.063380282 11:0.063380282 12:0.059859155 13:0.0035211268 14:0.029929577 15:0.040492958 16:0.075704225 17:0.040492958 18:0.047535211 19:0.035211268 20:0.066901408 -5.098580449232711 0 1:0.088607595 2:0.12025316 3:0.012658228 4:0.015822785 5:0.034810127 6:0.018987342 7:0.012658228 8:0.085443038 9:0.10443038 10:0.053797468 11:0.060126582 12:0.060126582 13:0.0094936709 14:0.015822785 15:0.085443038 16:0.022151899 17:0.063291139 
18:0.056962025 19:0.015822785 20:0.063291139 -0 8 1:0.078651685 2:0.060995185 3:0.025682183 4:0.016051364 5:0.043338684 6:0.038523274 7:0.033707865 8:0.11717496 9:0.085072231 10:0.057784912 11:0.062600321 12:0.065810594 13:0.0048154093 14:0.025682183 15:0.041733547 16:0.064205457 17:0.027287319 18:0.036918138 19:0.040128411 20:0.073836276 -2.874160890661235 0.4870663927273422 1:0.073710074 2:0.11056511 3:0.017199017 4:0.017199017 5:0.049140049 6:0.029484029 7:0.029484029 8:0.081081081 9:0.073710074 10:0.073710074 11:0.051597052 12:0.036855037 13:0.014742015 14:0.017199017 15:0.12039312 16:0.027027027 17:0.036855037 18:0.036855037 19:0.022113022 20:0.081081081 -0 3.658676738324247 1:0.084398977 2:0.084398977 3:0.040920716 4:0.025575448 5:0.012787724 6:0.010230179 7:0.025575448 8:0.089514066 9:0.1202046 10:0.043478261 11:0.076726343 12:0.061381074 13:0.0025575448 14:0.0076726343 15:0.081841432 16:0.023017903 17:0.097186701 18:0.043478261 19:0.017902813 20:0.051150895 -8 2.000415432812609 1:0.058823529 2:0.11213235 3:0.022058824 4:0.016544118 5:0.045955882 6:0.040441176 7:0.022058824 8:0.077205882 9:0.086397059 10:0.071691176 11:0.060661765 12:0.027573529 13:0.0036764706 14:0.038602941 15:0.10294118 16:0.020220588 17:0.03125 18:0.0625 19:0.022058824 20:0.077205882 -8 6.528862915696394 1:0.08490566 2:0.06918239 3:0.050314465 4:0.029874214 5:0.040880503 6:0.014150943 7:0.02672956 8:0.086477987 9:0.072327044 10:0.047169811 11:0.053459119 12:0.055031447 13:0.017295597 14:0.033018868 15:0.059748428 16:0.061320755 17:0.018867925 18:0.058176101 19:0.040880503 20:0.080188679 -0 7.784284344284279 1:0.04534005 2:0.1209068 3:0.032745592 4:0.01511335 5:0.032745592 6:0.012594458 7:0.04534005 8:0.090680101 9:0.12342569 10:0.073047859 11:0.047858942 12:0.050377834 13:0.0050377834 14:0.01511335 15:0.068010076 16:0.037783375 17:0.062972292 18:0.037783375 19:0.022670025 20:0.060453401 -8 0 1:0.052238806 2:0.10820896 3:0.048507463 4:0.026119403 5:0.044776119 6:0.014925373 7:0.029850746 8:0.052238806 9:0.10447761 10:0.052238806 11:0.052238806 12:0.044776119 13:0.01119403 14:0.026119403 15:0.078358209 16:0.041044776 17:0.063432836 18:0.063432836 19:0.02238806 20:0.063432836 -0 2.209302762949475 1:0.073825503 2:0.053691275 3:0.026845638 4:0.013422819 5:0.040268456 6:0.013422819 7:0.0067114094 8:0.12751678 9:0.14765101 10:0.073825503 11:0.040268456 12:0.087248322 13:0.026845638 14:0.0067114094 15:0.053691275 16:0.040268456 17:0.040268456 18:0.040268456 19:0.060402685 20:0.026845638 -4.802117854980023 2.952197835101933 1:0.059945504 2:0.089918256 3:0.019073569 4:0.029972752 5:0.035422343 6:0.024523161 7:0.032697548 8:0.068119891 9:0.1253406 10:0.04359673 11:0.076294278 12:0.04359673 13:0.010899183 14:0.016348774 15:0.073569482 16:0.057220708 17:0.038147139 18:0.0626703 19:0.035422343 20:0.057220708 -4.671869755057796 8 1:0.06833713 2:0.072892938 3:0.056947608 4:0.041002278 5:0.022779043 6:0.006833713 7:0.043280182 8:0.082004556 9:0.10933941 10:0.050113895 11:0.072892938 12:0.047835991 13:0.011389522 14:0.018223235 15:0.061503417 16:0.043280182 17:0.038724374 18:0.036446469 19:0.054669704 20:0.061503417 -0 1.590432446410517 1:0.022058824 2:0.051470588 3:0.051470588 4:0.036764706 5:0.044117647 6:0.014705882 7:0.022058824 8:0.088235294 9:0.15441176 10:0.029411765 11:0.066176471 12:0.080882353 13:0.014705882 14:0.0073529412 15:0.0073529412 16:0.125 17:0.022058824 18:0.044117647 19:0.066176471 20:0.051470588 -0 0.07509135564985965 1:0.053763441 2:0.069892473 3:0.032258065 4:0.016129032 5:0.032258065 6:0.016129032 
7:0.032258065 8:0.11827957 9:0.053763441 10:0.059139785 11:0.075268817 12:0.053763441 13:0.075268817 14:0.016129032 15:0.059139785 16:0.064516129 17:0.021505376 18:0.037634409 19:0.048387097 20:0.064516129 -5.706609966159265 1.029850408913825 1:0.045614035 2:0.075438596 3:0.029824561 4:0.036842105 5:0.015789474 6:0.029824561 7:0.033333333 8:0.080701754 9:0.080701754 10:0.047368421 11:0.042105263 12:0.071929825 13:0.031578947 14:0.035087719 15:0.052631579 16:0.078947368 17:0.022807018 18:0.084210526 19:0.050877193 20:0.054385965 -2.616082120107126 2.724995447829363 1:0.025210084 2:0.17647059 3:0.016806723 4:0.025210084 5:0.016806723 6:0.025210084 7:0 8:0.092436975 9:0.13445378 10:0.1092437 11:0.033613445 12:0.042016807 13:0.025210084 14:0.033613445 15:0.084033613 16:0.0084033613 17:0.033613445 18:0.050420168 19:0.0084033613 20:0.058823529 -5.163252430602697 1.288803732558049 1:0.049095607 2:0.095607235 3:0.018087855 4:0.020671835 5:0.046511628 6:0.007751938 7:0.028423773 8:0.082687339 9:0.14470284 10:0.064599483 11:0.033591731 12:0.054263566 13:0.012919897 14:0.020671835 15:0.07751938 16:0.031007752 17:0.036175711 18:0.067183463 19:0.033591731 20:0.074935401 -0 1.466971585059213 1:0.070381232 2:0.082111437 3:0.052785924 4:0.008797654 5:0.04398827 6:0.017595308 7:0.032258065 8:0.093841642 9:0.1085044 10:0.04398827 11:0.052785924 12:0.076246334 13:0.011730205 14:0.0029325513 15:0.035190616 16:0.055718475 17:0.032258065 18:0.04398827 19:0.073313783 20:0.061583578 -2.205410344860439 0.8602506511196986 1:0.064935065 2:0.090909091 3:0.032467532 4:0.038961039 5:0.025974026 6:0 7:0.025974026 8:0.045454545 9:0.12337662 10:0.097402597 11:0.051948052 12:0.064935065 13:0.025974026 14:0.032467532 15:0.084415584 16:0 17:0.051948052 18:0.058441558 19:0.038961039 20:0.045454545 -3.064840795915249 0 1:0.090196078 2:0.082352941 3:0.041830065 4:0.035294118 5:0.032679739 6:0.010457516 7:0.033986928 8:0.081045752 9:0.095424837 10:0.050980392 11:0.069281046 12:0.05751634 13:0.0013071895 14:0.013071895 15:0.045751634 16:0.062745098 17:0.037908497 18:0.050980392 19:0.039215686 20:0.067973856 -1.46394059759023 1.836861862499438 1:0.024539877 2:0.098159509 3:0.030674847 4:0.0061349693 5:0.030674847 6:0.012269939 7:0 8:0.11042945 9:0.17177914 10:0.079754601 11:0.061349693 12:0.036809816 13:0.098159509 14:0.018404908 15:0.09202454 16:0.012269939 17:0.024539877 18:0.055214724 19:0.0061349693 20:0.030674847 -8 8 1:0.072599532 2:0.10538642 3:0.046838407 4:0.011709602 5:0.046838407 6:0.014051522 7:0.025761124 8:0.077283372 9:0.091334895 10:0.051522248 11:0.051522248 12:0.049180328 13:0.0046838407 14:0.039812646 15:0.084309133 16:0.035128806 17:0.049180328 18:0.067915691 19:0.018735363 20:0.056206089 -2.172395958982213 -2.56375769703356 1:0.075675676 2:0.059459459 3:0.037837838 4:0.021621622 5:0.043243243 6:0 7:0.016216216 8:0.064864865 9:0.17297297 10:0.043243243 11:0.086486486 12:0.12972973 13:0.010810811 14:0.010810811 15:0.016216216 16:0.048648649 17:0.032432432 18:0.021621622 19:0.059459459 20:0.048648649 -0.7292580839583707 -4.151835645620884 1:0.067368421 2:0.090526316 3:0.050526316 4:0.016842105 5:0.023157895 6:0 7:0.014736842 8:0.077894737 9:0.12210526 10:0.016842105 11:0.090526316 12:0.11157895 13:0 14:0 15:0.027368421 16:0.082105263 17:0.037894737 18:0.035789474 19:0.058947368 20:0.073684211 -1.02101596169549 -2.279076880306903 1:0.061946903 2:0.088495575 3:0.044247788 4:0.03539823 5:0.0088495575 6:0 7:0.0088495575 8:0.10619469 9:0.15929204 10:0 11:0.088495575 12:0.088495575 13:0 14:0 15:0 16:0.097345133 
17:0.061946903 18:0.026548673 19:0.079646018 20:0.044247788 -0.3749553116417264 -1.340898479234559 1:0.043103448 2:0.038793103 3:0.017241379 4:0.0043103448 5:0.021551724 6:0.017241379 7:0.025862069 8:0.051724138 9:0.20258621 10:0.056034483 11:0.090517241 12:0.056034483 13:0 14:0.012931034 15:0.0043103448 16:0.025862069 17:0.16810345 18:0.038793103 19:0.073275862 20:0.051724138 -7.903896326000703 -3.033654125152884 1:0.075581395 2:0.11627907 3:0.069767442 4:0.029069767 5:0.040697674 6:0.01744186 7:0.040697674 8:0.075581395 9:0.058139535 10:0.040697674 11:0.098837209 12:0.069767442 13:0.011627907 14:0.029069767 15:0.0058139535 16:0.075581395 17:0.023255814 18:0.040697674 19:0.040697674 20:0.040697674 -0.5933541013592225 -0.03518610399786863 1:0.06557377 2:0.032786885 3:0.032786885 4:0.024590164 5:0.086065574 6:0.0040983607 7:0.028688525 8:0.06557377 9:0.032786885 10:0.036885246 11:0.10245902 12:0.06557377 13:0.0040983607 14:0.012295082 15:0.090163934 16:0.069672131 17:0.016393443 18:0.06557377 19:0.086065574 20:0.077868852 -0 -8 1:0.056521739 2:0.055072464 3:0.044927536 4:0.01884058 5:0.026086957 6:0.010144928 7:0.034782609 8:0.075362319 9:0.072463768 10:0.047826087 11:0.072463768 12:0.07826087 13:0 14:0.036231884 15:0.037681159 16:0.08115942 17:0.069565217 18:0.046376812 19:0.075362319 20:0.060869565 -2.675406161354421 -3.719116124931764 1:0.096045198 2:0.056497175 3:0.056497175 4:0.02259887 5:0.039548023 6:0.011299435 7:0.04519774 8:0.062146893 9:0.050847458 10:0.033898305 11:0.039548023 12:0.056497175 13:0 14:0.016949153 15:0.039548023 16:0.1299435 17:0.056497175 18:0.056497175 19:0.079096045 20:0.050847458 -3.125766399948067 -0 1:0.087866109 2:0.025104603 3:0.11297071 4:0.012552301 5:0.046025105 6:0 7:0.037656904 8:0.092050209 9:0.066945607 10:0.037656904 11:0.054393305 12:0.050209205 13:0 14:0.012552301 15:0.012552301 16:0.09623431 17:0.041841004 18:0.054393305 19:0.09623431 20:0.062761506 -1.359109647986245 -0 1:0.085106383 2:0.042553191 3:0.072340426 4:0.017021277 5:0.046808511 6:0 7:0.038297872 8:0.09787234 9:0.063829787 10:0.029787234 11:0.076595745 12:0.068085106 13:0 14:0.034042553 15:0.0085106383 16:0.10638298 17:0.021276596 18:0.068085106 19:0.085106383 20:0.038297872 -1.079995285795496 -1.035494453063716 1:0.085889571 2:0.098159509 3:0.049079755 4:0.012269939 5:0.061349693 6:0.0061349693 7:0.055214724 8:0.055214724 9:0.036809816 10:0.049079755 11:0.079754601 12:0.079754601 13:0 14:0.012269939 15:0.012269939 16:0.12883436 17:0 18:0.09202454 19:0.036809816 20:0.049079755 -2.538209334104792 -7.009508402606112 1:0.076620825 2:0.05697446 3:0.043222004 4:0.013752456 5:0.035363458 6:0.0058939096 7:0.049115914 8:0.070726916 9:0.084479371 10:0.023575639 11:0.062868369 12:0.066797642 13:0.0019646365 14:0.019646365 15:0.015717092 16:0.10609037 17:0.039292731 18:0.064833006 19:0.068762279 20:0.094302554 -8 -8 1:0.042857143 2:0.14285714 3:0.057142857 4:0.028571429 5:0.042857143 6:0.014285714 7:0.014285714 8:0.071428571 9:0.092857143 10:0.1 11:0.014285714 12:0.028571429 13:0.028571429 14:0.021428571 15:0.078571429 16:0.028571429 17:0.035714286 18:0.085714286 19:0.021428571 20:0.05 -8 -5.104856729491351 1:0.057692308 2:0.073717949 3:0.051282051 4:0.019230769 5:0.032051282 6:0.016025641 7:0.044871795 8:0.08974359 9:0.12179487 10:0.057692308 11:0.044871795 12:0.086538462 13:0.022435897 14:0.03525641 15:0.076923077 16:0.012820513 17:0.025641026 18:0.044871795 19:0.0096153846 20:0.076923077 -8 -0.93498397085957 1:0.11607143 2:0.10714286 3:0.075892857 4:0.022321429 5:0.017857143 6:0.0089285714 
7:0.026785714 8:0.071428571 9:0.049107143 10:0.044642857 11:0.044642857 12:0.049107143 13:0.0089285714 14:0.017857143 15:0.080357143 16:0.075892857 17:0.0044642857 18:0.098214286 19:0.040178571 20:0.040178571 -8 -8 1:0.082887701 2:0.088235294 3:0.072192513 4:0.029411765 5:0.026737968 6:0 7:0.021390374 8:0.10695187 9:0.096256684 10:0.042780749 11:0.056149733 12:0.064171123 13:0.0053475936 14:0.021390374 15:0.061497326 16:0.061497326 17:0.029411765 18:0.064171123 19:0.016042781 20:0.053475936 -3.213619212951361 -8 1:0.055803571 2:0.095982143 3:0.035714286 4:0.015625 5:0.035714286 6:0.022321429 7:0.024553571 8:0.11383929 9:0.14285714 10:0.071428571 11:0.058035714 12:0.03125 13:0.0044642857 14:0.017857143 15:0.064732143 16:0.026785714 17:0.0625 18:0.040178571 19:0.03125 20:0.049107143 -8 -1.43567174015868 1:0.051162791 2:0.093023256 3:0.093023256 4:0.027906977 5:0.060465116 6:0.0046511628 7:0.03255814 8:0.041860465 9:0.055813953 10:0.046511628 11:0.074418605 12:0.046511628 13:0.0093023256 14:0.018604651 15:0.013953488 16:0.08372093 17:0.013953488 18:0.069767442 19:0.10232558 20:0.060465116 -8 -7.148916663194218 1:0.13864307 2:0.073746313 3:0.050147493 4:0.02359882 5:0.026548673 6:0.0088495575 7:0.017699115 8:0.08259587 9:0.097345133 10:0.044247788 11:0.03539823 12:0.04719764 13:0.020648968 14:0.038348083 15:0.041297935 16:0.05899705 17:0.041297935 18:0.07079646 19:0.03539823 20:0.04719764 -8 -4.942619420808389 1:0.059602649 2:0.079470199 3:0.066225166 4:0.01986755 5:0.052980132 6:0.01986755 7:0.059602649 8:0.10596026 9:0.072847682 10:0.046357616 11:0.059602649 12:0.052980132 13:0.026490066 14:0.0066225166 15:0.01986755 16:0.059602649 17:0.046357616 18:0.079470199 19:0.039735099 20:0.026490066 -8 -8 1:0.081300813 2:0.097560976 3:0.024390244 4:0.026422764 5:0.028455285 6:0.010162602 7:0.022357724 8:0.069105691 9:0.1504065 10:0.071138211 11:0.034552846 12:0.056910569 13:0.0081300813 14:0.026422764 15:0.075203252 16:0.046747967 17:0.040650407 18:0.042682927 19:0.020325203 20:0.067073171 -8 -8 1:0.076142132 2:0.098984772 3:0.040609137 4:0.020304569 5:0.017766497 6:0.0025380711 7:0.02284264 8:0.10406091 9:0.14467005 10:0.045685279 11:0.043147208 12:0.060913706 13:0.0025380711 14:0.017766497 15:0.058375635 16:0.055837563 17:0.038071066 18:0.058375635 19:0.030456853 20:0.060913706 -7.092132096024378 -8 1:0.041025641 2:0.092307692 3:0.058974359 4:0.015384615 5:0.023076923 6:0.025641026 7:0.043589744 8:0.084615385 9:0.13076923 10:0.043589744 11:0.056410256 12:0.056410256 13:0.0025641026 14:0.0076923077 15:0.066666667 16:0.046153846 17:0.069230769 18:0.035897436 19:0.017948718 20:0.082051282 -0.3264968739392036 -0 1:0.071428571 2:0.078947368 3:0.052631579 4:0.033834586 5:0.052631579 6:0.0037593985 7:0.082706767 8:0.033834586 9:0.026315789 10:0.026315789 11:0.060150376 12:0.052631579 13:0.007518797 14:0.022556391 15:0.022556391 16:0.12781955 17:0.030075188 18:0.045112782 19:0.078947368 20:0.090225564 -3.952463768060047 -5.675303638711491 1:0.060606061 2:0.051948052 3:0.038961039 4:0.03030303 5:0.047619048 6:0.03030303 7:0.034632035 8:0.064935065 9:0.069264069 10:0.047619048 11:0.064935065 12:0.038961039 13:0.034632035 14:0.021645022 15:0.077922078 16:0.038961039 17:0.082251082 18:0.073593074 19:0.064935065 20:0.025974026 -0 -0.1087288973751009 1:0.062695925 2:0.059561129 3:0.059561129 4:0.028213166 5:0.031347962 6:0.02507837 7:0.043887147 8:0.07523511 9:0.043887147 10:0.031347962 11:0.078369906 12:0.097178683 13:0 14:0.012539185 15:0.034482759 16:0.090909091 17:0.031347962 18:0.034482759 19:0.078369906 
20:0.081504702 -0 -1.465525992175706 1:0.081818182 2:0.063636364 3:0.045454545 4:0.012121212 5:0.03030303 6:0.012121212 7:0.063636364 8:0.063636364 9:0.051515152 10:0.042424242 11:0.078787879 12:0.048484848 13:0.0060606061 14:0.027272727 15:0.015151515 16:0.12424242 17:0.03030303 18:0.045454545 19:0.078787879 20:0.078787879 -0 -2.253325695101351 1:0.072595281 2:0.056261343 3:0.032667877 4:0.014519056 5:0.038112523 6:0.041742287 7:0.036297641 8:0.12885662 9:0.092558984 10:0.045372051 11:0.085299456 12:0.030852995 13:0.010889292 14:0.021778584 15:0.056261343 16:0.021778584 17:0.050816697 18:0.02722323 19:0.059891107 20:0.076225045 -0 -8 1:0.07073955 2:0.11897106 3:0.038585209 4:0.01607717 5:0.032154341 6:0.0064308682 7:0.035369775 8:0.10610932 9:0.090032154 10:0.051446945 11:0.1221865 12:0.051446945 13:0.0064308682 14:0.022508039 15:0.035369775 16:0.032154341 17:0.041800643 18:0.035369775 19:0.048231511 20:0.038585209 -0 -5.224677139207556 1:0.064896755 2:0.088495575 3:0.038348083 4:0.020648968 5:0.053097345 6:0.005899705 7:0.053097345 8:0.056047198 9:0.10324484 10:0.050147493 11:0.085545723 12:0.061946903 13:0.005899705 14:0.01179941 15:0.029498525 16:0.061946903 17:0.029498525 18:0.02359882 19:0.10324484 20:0.053097345 -0 -2.073821569223395 1:0.058479532 2:0.080409357 3:0.020467836 4:0.0058479532 5:0.01754386 6:0.019005848 7:0.019005848 8:0.10818713 9:0.14327485 10:0.070175439 11:0.078947368 12:0.05994152 13:0.029239766 14:0.029239766 15:0.084795322 16:0.029239766 17:0.042397661 18:0.029239766 19:0.019005848 20:0.055555556 -0 -1.029158507138896 1:0.051918736 2:0.051918736 3:0.027088036 4:0.033860045 5:0.045146727 6:0.031602709 7:0.042889391 8:0.10609481 9:0.11286682 10:0.0496614 11:0.074492099 12:0.072234763 13:0.013544018 14:0.022573363 15:0.0496614 16:0.033860045 17:0.038374718 18:0.036117381 19:0.063205418 20:0.042889391 -1.706604602749902 -5.116835954684479 1:0.095798319 2:0.038655462 3:0.063865546 4:0.023529412 5:0.021848739 6:0.01512605 7:0.043697479 8:0.094117647 9:0.099159664 10:0.048739496 11:0.070588235 12:0.063865546 13:0.0084033613 14:0.020168067 15:0.048739496 16:0.043697479 17:0.031932773 18:0.031932773 19:0.068907563 20:0.067226891 -2.816067234872035 -0 1:0.065732759 2:0.072198276 3:0.078663793 4:0.010775862 5:0.036637931 6:0.0032327586 7:0.023706897 8:0.090517241 9:0.059267241 10:0.011853448 11:0.074353448 12:0.078663793 13:0.0032327586 14:0.010775862 15:0.03125 16:0.089439655 17:0.033405172 18:0.061422414 19:0.10021552 20:0.064655172 -7.786735555513285 -8 1:0.068493151 2:0.083561644 3:0.024657534 4:0.015068493 5:0.036986301 6:0.032876712 7:0.053424658 8:0.082191781 9:0.080821918 10:0.061643836 11:0.065753425 12:0.052054795 13:0.0082191781 14:0.02739726 15:0.069863014 16:0.028767123 17:0.047945205 18:0.054794521 19:0.049315068 20:0.056164384 -8 -8 1:0.062138728 2:0.076589595 3:0.033236994 4:0.020231214 5:0.047687861 6:0.024566474 7:0.031791908 8:0.086705202 9:0.085260116 10:0.075144509 11:0.066473988 12:0.056358382 13:0.0043352601 14:0.027456647 15:0.076589595 16:0.037572254 17:0.040462428 18:0.039017341 19:0.046242775 20:0.062138728 -0 -7.390551885030089 1:0.075247525 2:0.063366337 3:0.053465347 4:0.015841584 5:0.027722772 6:0.031683168 7:0.035643564 8:0.099009901 9:0.085148515 10:0.045544554 11:0.079207921 12:0.075247525 13:0 14:0.017821782 15:0.041584158 16:0.035643564 17:0.037623762 18:0.035643564 19:0.075247525 20:0.069306931 -3.480141175219598 -0 1:0.077419355 2:0.085630499 3:0.039882698 4:0.013489736 5:0.024046921 6:0.008797654 7:0.024046921 8:0.14134897 9:0.13372434 
10:0.02228739 11:0.058651026 12:0.045747801 13:0 14:0.02111437 15:0.055718475 16:0.026392962 17:0.039882698 18:0.049853372 19:0.038709677 20:0.093255132 -0 -8 1:0.071661238 2:0.068403909 3:0.03257329 4:0.022801303 5:0.03257329 6:0.013029316 7:0.042345277 8:0.09771987 9:0.084690554 10:0.045602606 11:0.081433225 12:0.078175896 13:0.003257329 14:0.022801303 15:0.048859935 16:0.068403909 17:0.042345277 18:0.048859935 19:0.039087948 20:0.055374593 -2.904079460240392 -3.175136298302513 1:0.04784689 2:0.074162679 3:0.038277512 4:0.028708134 5:0.055023923 6:0.026315789 7:0.057416268 8:0.040669856 9:0.11004785 10:0.035885167 11:0.074162679 12:0.081339713 13:0.009569378 14:0.016746411 15:0.031100478 16:0.050239234 17:0.031100478 18:0.062200957 19:0.076555024 20:0.052631579