gbm/0000755000176200001440000000000014637453022011025 5ustar liggesusersgbm/tests/0000755000176200001440000000000014547111634012167 5ustar liggesusersgbm/tests/tinytest.R0000644000176200001440000000025314547111634014175 0ustar liggesusers if ( requireNamespace("tinytest", quietly = TRUE) ){ home <- length(unclass(packageVersion("gbm"))[[1L]]) == 4 tinytest::test_package("gbm", at_home = home) } gbm/.Rinstignore0000644000176200001440000000027114547111627013333 0ustar liggesusersinst/doc/gbm.tex inst/doc/srcltx.sty inst/doc/shrinkage-v-iterations.eps inst/doc/shrinkage-v-iterations.pdf inst/doc/oobperf2.eps inst/doc/oobperf2.pdf inst/doc/shrinkageplot.R gbm/CHANGES0000644000176200001440000003403214547111627012024 0ustar liggesusersChanges in version 2.1 - The cross-validation loop is now parallelized. The functions attempt to guess a sensible number of cores to use, or the user can specify how many through new argument n.cores. - A fair amount of code refactoring. - Added type='response' for predict when distribution='adaboost'. - Fixed a bug that caused offset not to be used if the first element of offset was 0. - Updated predict.gbm and plot.gbm to cope with objects created using gbm version 1.6. - Changed default value of verbose to 'CV'. gbm now defaults to letting the user know which block of CV folds it is running. If verbose=TRUE is specified, the final run of the model also prints its progress to screen as in earlier versions. - Fixed bug that caused predict to return wrong result when distribution == 'multinomial' and length(n.trees) > 1. - Fixed bug that caused n.trees to be wrong in relative.influence if no CV or validation set was used. - Relative influence was computed wrongly when distribution="multinomial". Fixed. - Cross-validation predictions now included in the output object. - Fixed bug in relative.influence that caused labels to be wrong when sort.=TRUE. 
- Modified interact.gbm to do an additional sanity check; updated help file
- Fixed bug in interact.gbm so that it now works for distribution="multinomial"
- Modified predict.gbm to improve performance on large datasets

Changes in version 2.0

Lots of new features added, so it warrants a change to the first digit of
the version number.

Major changes:

- Several new distributions are now available thanks to Harry Southworth and
  Daniel Edwards: multinomial and tdist.
- New distribution 'pairwise' for Learning to Rank applications (LambdaMART),
  including four different ranking measures, thanks to Stefan Schroedl.
- The gbm package is now managed on R-Forge by Greg Ridgeway and Harry
  Southworth. Visit http://r-forge.r-project.org/projects/gbm/ to get the
  latest or to contribute to the package.

Minor changes:

- The "quantile" distribution now handles weighted data
- relative.influence changed to give names to the returned vector
- Added print.gbm and show.gbm. These give basic summaries of the fitted model
- Added support function reconstructGBMdata() to facilitate reconstituting
  the data for certain plots and summaries
- gbm was not using the weights when using cross-validation due to a bug.
  That's been fixed (thanks to Trevor Hastie for catching this)
- predict.gbm now tries to guess the number of trees, and also defaults to
  using the training data if no newdata is given.
- relative.influence has 2 new arguments, scale. and sort., that default to
  FALSE. The returned vector now has names.
- gbm now tries to guess what distribution you meant if you didn't specify.
- gbm has a new argument, class.stratify.cv, to control whether
  cross-validation is stratified by class when distribution is "bernoulli"
  or "multinomial". Defaults to TRUE for multinomial, FALSE for bernoulli.
  The purpose is to avoid unusable training sets.
- gbm.perf now puts a vertical line at the best number of trees when
  method = "cv" or "test". Tries to guess what method you meant if you
  don't tell it.
- .First.lib had a bug that would crash gbm if gbm was installed as a local
  library. Fixed.
- plot.gbm has a new argument, type, defaulting to "link". For bernoulli,
  multinomial, poisson, "response" is allowed.
- Models with large interactions (>24) were using up all the terminal nodes
  in the stack. The stack has been increased to 101 nodes, allowing
  interaction.depth up to 49. A more graceful error is now issued if
  interaction.depth exceeds 49. (Thanks to Tom Dietterich for catching this.)
- gbm now uses the R macro R_NaN in the C++ code rather than NAN, which
  would not compile on Sun OS.
- If covariates marked missing values with NaN instead of NA, the model fit
  would not be consistent (thanks to JR Lockwood for noting this)

Changes in version 1.6

- Quantile regression is now available thanks to a contribution from Brian
  Kriegler. Use list(name="quantile",alpha=0.05) as the distribution
  parameter to construct a predictor of the 5th percentile of the
  conditional distribution
- gbm() now stores cv.folds in the returned gbm object
- Added a normalize parameter to summary.gbm that allows one to choose
  whether or not to normalize the variable influence to sum to 100
- Corrected a minor bug in plot.gbm that put the wrong variable label on the
  x axis when plotting a numeric variable and a factor variable
- The C function gbm_plot can now handle missing values. This does not
  affect the R function plot.gbm(), but it makes gbm_plot potentially more
  useful for computing partial dependence plots
- mgcv is no longer a required package, but the splines package is needed
  for calibrate.plot()
- Minor changes for compatibility with R 2.6.0 (thanks to Seth Falcon)
- Corrected a bug in the Cox model computation when all terminal nodes had
  exactly the minimum number of observations permitted, which caused gbm and
  R to crash ungracefully.
This was likely to occur with small datasets (thanks to Brian Ring)
- Corrected a bug in Laplace that always made the terminal node predictions
  slightly larger than the median. Corrected again in a minor release due to
  a bug caught by Jon McAuliffe
- Corrected a bug in interact.gbm that caused it to crash for factors.
  Caught by David Carslaw
- Added a plot of cross-validated error to the plots generated by gbm.perf

Changes in version 1.5

- gbm would fail if there was only one x. Now drop=FALSE is set in all
  data.frame subsetting (thanks to Gregg Keller for noticing this).
- Corrected gbm.perf() to check whether bag.fraction=1 and skip trying to
  create the OOB plots and estimates.
- Corrected a typo in the vignette specifying the gradient for the Cox model.
- Fixed the OOB-reps.R demo. For non-Gaussian cases it was maximizing the
  deviance rather than minimizing.
- Increased the largest factor variable allowed from 256 levels to 1024
  levels. gbm stops if any factor variable exceeds 1024 levels. Will try to
  make this cleaner in the future.
- predict.gbm now allows n.trees to be a vector and efficiently computes
  predictions for each indicated model. Avoids having to call predict.gbm
  several times for different choices of n.trees.
- Fixed a bug that occurred when using cross-validation for coxph. Was
  computing length(y) when y is a Surv object, which returns 2*N rather
  than N. This generated out-of-range indices for the training dataset.
- Changed the method for extracting the name of the outcome variable to work
  around a change in terms.formula() when using "." in formulas.

Changes in version 1.4

- The formula interface now allows "-x" to indicate not including certain
  variables in the model fit.
- Fixed the formula interface to allow offset(). The offset argument has now
  been removed from gbm().
- Added basehaz.gbm that computes the Breslow estimate of the baseline hazard.
At a later stage this will be substituted with a call to survfit, which is
much more general, handling more than just left-censored data.
- The OOB estimator is known to be conservative. A warning is now issued
  when using method="OOB" and there is no longer a default method for
  gbm.perf()
- cv.folds is now an option to gbm and method="cv" is an option for
  gbm.perf. Performs v-fold cross-validation for estimating the optimal
  number of iterations
- There is now a package vignette with details on the user options and the
  mathematics behind the gbm engine.

Changes in version 1.3

- All likelihood-based loss functions are now in terms of deviance
  (-2*log likelihood). As a result, gbm always minimizes the loss. Previous
  versions minimized losses for some choices of distribution and maximized a
  likelihood for other choices.
- Fixed the Poisson regression to avoid predicting +/- infinity, which
  occurs when a terminal node has only observations with y=0. The largest
  predicted value is now +/-19, similar to what glm predicts for these
  extreme cases for linear Poisson regression. The shrinkage factor will be
  applied to the -19 predictions, so it will take 1/shrinkage gbm iterations
  locating pure terminal nodes before gbm would actually return a predicted
  value of +/-19.
- Introduces shrink.gbm.pred() that does a lasso-style variable selection.
  Consider this function as still in an experimental phase.
- Bug fix in plot.gbm
- All calls to ISNAN now call ISNA (avoids using isnan)

Changes in version 1.2

- Fixed gbm.object help file and updated the function to check for missing
  values to the latest R standard.
- gbm.plot now allows i.var to be the names of the variables to plot or the
  index of the variables used
- gbm now requires the "stats" package, into which "modreg" has been merged
- Documentation for predict.gbm corrected

Changes in version 1.1

- All calculations of loss functions now compute averages rather than totals.
That is, all performance measures (text of progress, gbm.perf) now report
  average log-likelihood rather than total log-likelihood (e.g. mean squared
  error rather than sum of squared error). A slight exception applies to
  distribution="coxph". For these models the averaging pertains only to the
  uncensored observations. The denominator is sum(w[i]*delta[i]) rather than
  the usual sum(w[i]).
- summary.gbm now has an experimental "method" argument. The default
  computes the relative influence as before. The option
  "method=permutation.test.gbm" performs a permutation test for the relative
  influence. Give it a try and let me know how it works. It currently is not
  implemented for "distribution=coxph".
- Added gbm.fit, a function that avoids the model.frame call, which is
  tragically slow with lots of variables. gbm is now just a
  formula/model.frame wrapper for the gbm.fit function. (Based on a
  suggestion and code from Jim Garrett.)
- Corrected a bug in the use of offsets. Now the user must pass the offset
  vector with the offset argument rather than in the formula. Previously,
  offsets were being used once as offsets and a second time as a predictor.
- predict.gbm now has a single.tree option. When set to TRUE the function
  will return predictions from only that tree. The idea is that this may be
  useful for reweighting the trees using a post-model-fit adjustment.
- Corrected a bug in CPoisson::BagImprovement that incorrectly computed the
  bagged estimate of improvement
- Corrected a bug for distribution="coxph" in gbm() and gbm.more(). If there
  was a single predictor the functions would drop the unused array
  dimension, issuing an error.
- Corrected gbm() distribution="coxph" when train.fraction=1.0. The program
  would set two non-existent observations in the validation set and issue a
  warning.
- If a predictor variable has no variation, a warning (rather than an error)
  is now issued
- Updated the documentation for calibrate.plot to match the implementation
- Changed some of the default values in gbm(): bag.fraction=0.5,
  train.fraction=1.0, and shrinkage=0.001.
- Corrected a bug in predict.gbm. The C code producing the predictions would
  go into an infinite loop if predicting an observation with a level of a
  categorical variable not seen in the training dataset. Now the routine
  uses the missing value prediction. (Feng Zeng)
- Added a "type" parameter to predict.gbm. The default ("link") is the same
  as before: predictions are on the canonical scale (gradient scale). The
  new option ("response") converts back to the same scale as the outcome
  (probability for bernoulli, mean for gaussian, etc.).
- gbm and gbm.more now have verbose options which can be set to FALSE to
  suppress the progress and performance indicators. (Several users requested
  this nice feature.)
- gbm.perf no longer prints out verbose information about the best iteration
  estimate. It simply returns the estimate and creates the plots if
  requested.
- ISNAN: since R 1.8.0, R.h changed declarations for ISNAN(). These changes
  broke gbm 1.0. I added the following code to buildinfo.h to fix this:
      #ifdef IEEE_754
      #undef ISNAN
      #define ISNAN(x) R_IsNaNorNA(x)
      #endif
  Seems to work now but I'll look for a more elegant solution.

Changes in version 0.8

- Additional documentation about the loss functions, graphics, and methods
  is now available with the package
- Fixed the initial value for the adaboost exponential loss.
Prior to version 0.8 the initial value was 0.0; it is now half the baseline
  log-odds.
- Changes in some headers and #define's to compile under gcc 3.2 (Brian
  Ripley)

Changes in version 0.7

- In gbm.perf, the argument named best.iter.calc has been renamed "method"
  for greater simplicity
- All entries in the design matrix are now coerced to doubles (thanks to
  Bonnie Ghosh)
- Now checks that all predictors are either numeric, ordinal, or factor
- summary.gbm now reports the correct relative influence when some variables
  do not enter the model. (Thanks to Hugh Chipman.)
- Renamed several #define'd variables in buildinfo.h so they do not conflict
  with standard winerror.h names.

Planned future changes

1. Add weighted median functionality to Laplace
2. Automate the fitting process, i.e., selecting shrinkage and number of
   iterations
3. Add overlay factor*continuous predictor plot as an option rather than
   lattice plots
4. Add multinomial and ordered logistic regression procedures

Thanks to

RAND for sponsoring the development of this software through statistical
methods funding.
Kurt Hornik, Brian Ripley, and Jan De Leeuw for helping me get gbm up to the
R standard and into CRAN.
Dan McCaffrey for testing and evangelizing the utility of this program.
Bonnie Ghosh for finding bugs.
Arnab Mukherji for testing and suggesting new features.
Daniela Golinelli for finding bugs and marrying me.
Andrew Morral for suggesting improvements and finding new applications of
the method in the evaluation of drug treatment programs.
Katrin Hambarsoomians for finding bugs.
Hugh Chipman for finding bugs.
Jim Garrett for many suggestions and contributions.
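Several entries above (versions 1.1 and 2.1) describe the type="response" option of predict.gbm, which maps predictions from the link (canonical) scale back to the outcome scale. The distribution-specific conversions match those applied inside this package's predict.gbm; the helper name link_to_response below is purely illustrative, not part of the package API:

```r
# Sketch of the link -> response conversions predict.gbm applies when
# type = "response". The helper name is hypothetical; only the formulas
# come from the package source.
link_to_response <- function(f, distribution) {
  switch(distribution,
         bernoulli = 1 / (1 + exp(-f)),     # log odds -> probability
         pairwise  = 1 / (1 + exp(-f)),     # same logistic transform
         poisson   = exp(f),                # log mean -> expected count
         adaboost  = 1 / (1 + exp(-2 * f)), # adaboost scale -> probability
         f)                                 # gaussian etc.: link == response
}

f <- c(-1, 0, 1)                  # hypothetical link-scale predictions
link_to_response(f, "bernoulli")  # probabilities in (0, 1)
link_to_response(f, "poisson")    # positive expected counts
```

For distributions with no conversion (gaussian, laplace, quantile, ...), "link" and "response" return the same values, as noted in the predict.gbm documentation.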
gbm/MD50000644000176200001440000001410514637453022011336 0ustar liggesusers13d645ea5a474dec3c644512353ee958 *CHANGES e8a38246290de8fe95c46ac1ebc38f2e *DESCRIPTION 73ed5410da67876bbccbc45fda025419 *LICENSE 8a3be7d6e9633a3f4a89c866a98df466 *NAMESPACE 7df0f18225094712a3d207c4486a7db7 *NEWS.md 30015ca11f94eab451fcfa77bf7283b6 *R/basehaz.gbm.R 3f6808cb2e1ed79239d9c2b6722399ca *R/calibrate.plot.R 688bf34a31974b997020f6d3eae12cec *R/gbm-internals.R e100ef1e30f39323e8fb5b37f4d8e287 *R/gbm-package.R de5d551d0dcd8061a77cbd728bb915b3 *R/gbm.R f2310d9a5225ecacd1a28b13720646b6 *R/gbm.fit.R 6647c2a0cf4b52f74282d4a30a28c782 *R/gbm.more.R 0eb9492a4889dd77c7b93ff268e75873 *R/gbm.object.R 5ddc9dfe12ff465c75b11330bdf4551b *R/gbm.perf.R 2dc03e69665e4599c7289f2993eb3250 *R/gbmCrossVal.R b6d4e6c4e10836d532d2d5cf4e8d692b *R/interact.gbm.R 03f0e1b453cd47ce2da756d9c873627c *R/ir.measures.R ca11cea5656713e463f95791318b498f *R/plot.gbm.R b3400b409ef852a38f2c93eca941c42d *R/predict.gbm.R 5aea47b751e58bde1cd3e07623e68d1b *R/pretty.gbm.tree.R 30f9776164937c618296bf4bf32bb9cc *R/print.gbm.R 37f93983b3789b3549b825b052746538 *R/reconstructGBMdata.R 933e88de5ef87ab875028c5dba6e5829 *R/relative.influence.R 08f46c17919f160031d6a2f18cd6ef19 *R/test.gbm.R 8d5424719baba6f979cc347268487581 *R/utils.R b7c53e14f289dc6caa147cb80b381355 *R/zzz.R 9d96211a993278ccaa571850bf8e4209 *README.md 2a555c7321f4ebc815961fd106c7c578 *build/vignette.rds 34473f8b2b3d89242031c1af2af38155 *demo/00Index c8111e88c09d7a1b3729b41f67b871d1 *demo/OOB-reps.R 0b6bf4e6c515ffea8a137067f03724d7 *demo/bernoulli.R ee94468e32b128e2b7689a61853a998b *demo/coxph.R 3d39292d223c02e0dd77d9c6dd51ec55 *demo/gaussian.R 31906c0a7bce9676949413f0fbff2c6c *demo/multinomial.R 65fbbb93da8204c57435c7e42aef2f89 *demo/pairwise.R fe8f5f9946ae5cd80cf18c033488d5c5 *demo/printExamples.R ac91cb39bbb7b8072ba65cf5933bcf68 *demo/robustReg.R 94b9c4c04c246fd60085115c9f5a7cf8 *inst/doc/gbm.Rnw b0ff78e02298d83412611177b001f844 *inst/doc/gbm.pdf 
d7dfb5eccdb0251ad7b9bda805040cdf *inst/tinytest/test_bernoulli.R 0ecf23a69c73dc027f34b243a9476eae *inst/tinytest/test_coxph.R 08794176cae150c50cc7bd8e36dbdd23 *inst/tinytest/test_least_squares.R 0015c0bb9b92caefb3233ca592ef26b8 *inst/tinytest/test_relative_influence.R 055bd4822b0fa0192e8252df774749f7 *man/basehaz.gbm.Rd 63c9c494f46f5435ba5929fc6635557c *man/calibrate.plot.Rd c2ba2c010f3bf3b1b1d879a9a06fa263 *man/gbm-internals.Rd 710ff7b9b3914708202a6c31c4da448c *man/gbm-package.Rd 716427ed642502c7a633f6e691a02198 *man/gbm.Rd 2a033c4b1eb145278f354148c18183b7 *man/gbm.fit.Rd b4728cd03e5ca0807ce012514c397747 *man/gbm.more.Rd e40368d2c5bd45e17f520c6bf465b878 *man/gbm.object.Rd 537a379fbf804a4f708d059345859c2e *man/gbm.perf.Rd bd538d953cfba416e0a48e02f61fed5d *man/gbm.roc.area.Rd 82696b8c5305ac18217c9f1509bef563 *man/gbmCrossVal.Rd 8fde31aedb860ce6b49e1f807d5e5d04 *man/interact.gbm.Rd 3492ad829f6d213cd83ef0f396746c92 *man/plot.gbm.Rd 2c025c91a0ad58280a6dda58a76ae9fe *man/predict.gbm.Rd 0cc9c5bc1a0cfc0c7194b548bfff1238 *man/pretty.gbm.tree.Rd 8e0d84bc42a9be3c623fda10ec7e7eb2 *man/print.gbm.Rd 20f861788abbb0e65a7dc81b2cfd219d *man/quantile.rug.Rd df48c14dc7d0f50f60d3e69eb79ed4fe *man/reconstructGBMdata.Rd 069434ff43e5f47a2cf584d51d61fd3b *man/relative.influence.Rd eeb8e8f08f7d57a76c29169682128fd4 *man/summary.gbm.Rd abb04bba52be23fc28a479c5052bcbf2 *man/test.gbm.Rd 9a8e2ff37f4bbc272786d7fb9f018f84 *src/adaboost.cpp 8bc0e7e6170218dc764317a071ff7929 *src/adaboost.h 1725d231e272a6353bc694525c5da0f7 *src/bernoulli.cpp 1b6ea8a45b1a6b9820bf0dbba721acd6 *src/bernoulli.h 91e43b317576bdf297617babbc600840 *src/buildinfo.h e15f767c646f66e54eb5bb20ccd7cebd *src/coxph.cpp f1bea0a96e9bba97336f64aef316659d *src/coxph.h 3616890b5d7af2b3edd52dc5f29544b0 *src/dataset.cpp d30f46362b1915f76e5a328ce95c7136 *src/dataset.h b5824ccf353076bf59018429ae3ac6ac *src/distribution.cpp 91d88e455827695f63bf23df5dfb3108 *src/distribution.h 6d2bd44a11975c8f023640eb7a9036c3 *src/gaussian.cpp 
ca699b9ef743d6f4c44a469a12dfc7b2 *src/gaussian.h 27ffff3bcc49d50e130083ef8f2081e5 *src/gbm-init.c 884c1603e6c98b2ae795ed35b07556bd *src/gbm.cpp 37dfe9bd4a1d1d9e70b9d61c44949b91 *src/gbm.h c0c572eb464dae70700ffe8fdc3f6b9f *src/gbm_engine.cpp 87f344712c06f95ca9ada358aaa3f74a *src/gbm_engine.h 55715ab97df43bf3f1e62ed2a17e2bb7 *src/gbmentry.cpp 1fba83f37e9f092d8b005e0c8f32a97b *src/huberized.cpp e2e3e25937cde555f9099b3099aecf82 *src/huberized.h ad0c18190ee958063e3f68564e52b204 *src/laplace.cpp a109ba9c2fd1f34dbd04033081580adf *src/laplace.h 45ede1eab442e5165ecde06bb5e0a028 *src/locationm.cpp b79ca7b9c1ece1f8cad0f2958046f436 *src/locationm.h 5ce6e2bb396ca0b45b97270abae1bb36 *src/matrix.h 8036edaddcc69c2b8c3f130706f491c7 *src/multinomial.cpp 2ea82018296bb369e77f96ac9c31c483 *src/multinomial.h 75737afcbdd3162c62fcdd82b027e1d2 *src/node.cpp 11d20f2bb5fad44684e8ceef011f490d *src/node.h 2e99439ebd925c2fe54e609ad93affea *src/node_categorical.cpp 98afbdcf5bb70211102e58ed262fcec1 *src/node_categorical.h a04d827631d157efd47de6ef63633858 *src/node_continuous.cpp f09bd89f861430f58cb80ccf0de77c6a *src/node_continuous.h af2b9dd107d657344891521829c52243 *src/node_factory.cpp b39d639f336508ce5b86a4539b754366 *src/node_factory.h 56dc9a7a6309294654e641c14a32023d *src/node_nonterminal.cpp 062cbcf913ad61d33048c36ab0b76735 *src/node_nonterminal.h 3f255ece01a44a03fa19e2a1cd4f1c79 *src/node_search.cpp 4e7e0301f857dae21654e3f1b352a638 *src/node_search.h c6943942255ce8138259b6b47caa0c08 *src/node_terminal.cpp b9753b03cb6ac034a62ad24d7d86c82d *src/node_terminal.h 67575e1a6a961c63f3898e334bb5d46e *src/pairwise.cpp a15faf94a4de6b6e5b7b82834766cc3b *src/pairwise.h 756422dc1f3f394260fa4d77ec42d1ed *src/poisson.cpp f59ea02f9d947cbba7e37fdba58384b4 *src/poisson.h 6091d95aba012b027507708aebdaf899 *src/quantile.cpp 91e37941d9bcb59386adb5b2bc1ea350 *src/quantile.h 519b30584e7e752480750e86027aea7e *src/tdist.cpp 9ab15eb81fc9a18ee7d14a76f7aefd2a *src/tdist.h 276e36bf158250eb458a1cdabcf975b5 *src/tree.cpp 
6b2f1cd60e5d67638e110e1ac9552b27 *src/tree.h 9c6e92cd674234f443e27de76b68bb8e *tests/tinytest.R 94b9c4c04c246fd60085115c9f5a7cf8 *vignettes/gbm.Rnw 914be8d7b37847472ad8d58b3bd5f39a *vignettes/gbm.bib 7ba661d197d25537a69fc34d737b4d29 *vignettes/oobperf2.pdf 3fda19791155842b0e48565781441aa2 *vignettes/shrinkage-v-iterations.pdf gbm/R/0000755000176200001440000000000014547111627011230 5ustar liggesusersgbm/R/gbm-package.R0000644000176200001440000000451614562477375013533 0ustar liggesusers#' Generalized Boosted Regression Models (GBMs) #' #' This package implements extensions to Freund and Schapire's AdaBoost #' algorithm and J. Friedman's gradient boosting machine. Includes regression #' methods for least squares, absolute loss, logistic, Poisson, Cox #' proportional hazards partial likelihood, multinomial, t-distribution, #' AdaBoost exponential loss, Learning to Rank, and Huberized hinge loss. #' This gbm package is no longer under further development. Consider #' https://github.com/gbm-developers/gbm3 for the latest version. #' #' Further information is available in vignette: #' \code{browseVignettes(package = "gbm")} #' #' @import lattice #' #' @importFrom grDevices rainbow #' @importFrom graphics abline axis barplot lines mtext par plot polygon rug #' @importFrom graphics segments title #' @importFrom stats approx binomial delete.response gaussian glm loess #' @importFrom stats model.extract model.frame model.offset model.response #' @importFrom stats model.weights na.pass poisson predict quantile rbinom #' @importFrom stats reformulate reorder rexp rnorm runif sd supsmu terms var #' @importFrom stats weighted.mean #' @importFrom survival Surv #' #' @useDynLib gbm, .registration = TRUE #' #' #' @author Greg Ridgeway \email{gridge@@upenn.edu} with contributions by #' Daniel Edwards, Brian Kriegler, Stefan Schroedl, Harry Southworth, #' and Brandon Greenwell #' #' @references #' Y. Freund and R.E. 
Schapire (1997) \dQuote{A decision-theoretic #' generalization of on-line learning and an application to boosting,} #' \emph{Journal of Computer and System Sciences,} 55(1):119-139. #' #' G. Ridgeway (1999). \dQuote{The state of boosting,} \emph{Computing Science #' and Statistics} 31:172-181. #' #' J.H. Friedman, T. Hastie, R. Tibshirani (2000). \dQuote{Additive Logistic #' Regression: a Statistical View of Boosting,} \emph{Annals of Statistics} #' 28(2):337-374. #' #' J.H. Friedman (2001). \dQuote{Greedy Function Approximation: A Gradient #' Boosting Machine,} \emph{Annals of Statistics} 29(5):1189-1232. #' #' J.H. Friedman (2002). \dQuote{Stochastic Gradient Boosting,} #' \emph{Computational Statistics and Data Analysis} 38(4):367-378. #' #' The \href{https://jerryfriedman.su.domains/R-MART.html}{MART} website. #' #' @keywords package "_PACKAGE" NULLgbm/R/pretty.gbm.tree.R0000644000176200001440000000412514547111627014406 0ustar liggesusers#' Print gbm tree components #' #' \code{gbm} stores the collection of trees used to construct the model in a #' compact matrix structure. This function extracts the information from a #' single tree and displays it in a slightly more readable form. This function #' is mostly for debugging purposes and to satisfy some users' curiosity. #' #' #' @param object a \code{\link{gbm.object}} initially fit using #' \code{\link{gbm}} #' @param i.tree the index of the tree component to extract from \code{object} #' and display #' @return \code{pretty.gbm.tree} returns a data frame. Each row corresponds to #' a node in the tree. Columns indicate \item{SplitVar}{index of which variable #' is used to split. -1 indicates a terminal node.} \item{SplitCodePred}{if the #' split variable is continuous then this component is the split point. If the #' split variable is categorical then this component contains the index of #' \code{object$c.split} that describes the categorical split. 
If the node is a
#' terminal node then this is the prediction.} \item{LeftNode}{the index of the
#' row corresponding to the left node.} \item{RightNode}{the index of the row
#' corresponding to the right node.} \item{MissingNode}{the index of the row
#' corresponding to the node that receives observations with a missing value
#' of the split variable.} \item{ErrorReduction}{the reduction in the
#' loss function as a result of splitting this node.} \item{Weight}{the total
#' weight of observations in the node. If weights are all equal to 1 then this
#' is the number of observations in the node.} \item{Prediction}{the
#' prediction the node would make if it were a terminal node.}
#' @author Greg Ridgeway \email{gregridgeway@@gmail.com}
#' @seealso \code{\link{gbm}}, \code{\link{gbm.object}}
#' @keywords print
#' @export pretty.gbm.tree
pretty.gbm.tree <- function(object, i.tree = 1)
{
   if((i.tree < 1) || (i.tree > length(object$trees)))
   {
      stop("i.tree is out of range. Must be between 1 and ", length(object$trees))
   }
   else
   {
      temp <- data.frame(object$trees[[i.tree]])
      names(temp) <- c("SplitVar", "SplitCodePred", "LeftNode",
                       "RightNode", "MissingNode", "ErrorReduction",
                       "Weight", "Prediction")
      row.names(temp) <- 0:(nrow(temp) - 1)
   }
   return(temp)
}
gbm/R/predict.gbm.R0000644000176200001440000001420714547111627013555 0ustar liggesusers#' Predict method for GBM Model Fits
#'
#' Predicted values based on a generalized boosted model object
#'
#' \code{predict.gbm} produces predicted values for each observation in
#' \code{newdata} using the first \code{n.trees} iterations of the boosting
#' sequence. If \code{n.trees} is a vector then the result is a matrix with
#' each column representing the predictions from gbm models with
#' \code{n.trees[1]} iterations, \code{n.trees[2]} iterations, and so on.
#'
#' The predictions from \code{gbm} do not include the offset term. The user may
#' add the value of the offset to the predicted value if desired.
#'
#' If \code{object} was fit using \code{\link{gbm.fit}} there will be no
#' \code{Terms} component.
Therefore, the user has greater responsibility to #' make sure that \code{newdata} is of the same format (order and number of #' variables) as the one originally used to fit the model. #' #' @param object Object of class inheriting from (\code{\link{gbm.object}}) #' #' @param newdata Data frame of observations for which to make predictions #' #' @param n.trees Number of trees used in the prediction. \code{n.trees} may be #' a vector in which case predictions are returned for each iteration specified #' #' @param type The scale on which gbm makes the predictions #' #' @param single.tree If \code{single.tree=TRUE} then \code{predict.gbm} #' returns only the predictions from tree(s) \code{n.trees} #' #' @param \dots further arguments passed to or from other methods #' #' @return Returns a vector of predictions. By default the predictions are on #' the scale of f(x). For example, for the Bernoulli loss the returned value is #' on the log odds scale, poisson loss on the log scale, and coxph is on the #' log hazard scale. #' #' If \code{type="response"} then \code{gbm} converts back to the same scale as #' the outcome. Currently the only effect this will have is returning #' probabilities for bernoulli and expected counts for poisson. For the other #' distributions "response" and "link" return the same. #' #' @author Greg Ridgeway \email{gregridgeway@@gmail.com} #' #' @seealso \code{\link{gbm}}, \code{\link{gbm.object}} #' #' @keywords models regression #' #' @export predict.gbm #' @export predict.gbm <- function(object, newdata, n.trees, type = "link", single.tree = FALSE, ...) 
{ if ( missing( newdata ) ){ newdata <- reconstructGBMdata(object) } if ( missing(n.trees) || length(n.trees) < 1 ) { if ( object$train.fraction < 1 ) { n.trees <- gbm.perf( object, method = "test", plot.it = FALSE ) } else if (!is.null(object$cv.error)) { n.trees <- gbm.perf( object, method = "cv", plot.it = FALSE ) } else { n.trees <- length( object$train.error ) } message( paste( "Using", n.trees, "trees...\n" ) ) } if (!is.element(type, c("link", "response"))) { stop("type must be either 'link' or 'response'") } if (!is.null(object$Terms)) { x <- model.frame(terms(reformulate(object$var.names)), newdata, na.action = na.pass) } else { x <- newdata } cRows <- nrow(x) cCols <- ncol(x) for(i in 1:cCols) { if(is.factor(x[,i])) { if (length(levels(x[,i])) > length(object$var.levels[[i]])) { new.compare <- levels(x[,i])[1:length(object$var.levels[[i]])] } else { new.compare <- levels(x[,i]) } if (!identical(object$var.levels[[i]], new.compare)) { x[,i] <- factor(x[,i], union(object$var.levels[[i]], levels(x[,i]))) } x[,i] <- as.numeric(factor(x[,i], levels = object$var.levels[[i]]))-1 } } x <- as.vector(unlist(x, use.names=FALSE)) if (missing(n.trees) || any(n.trees > object$n.trees)) { n.trees[n.trees>object$n.trees] <- object$n.trees warning("Number of trees not specified or exceeded number fit so far. ", "Using ", paste(n.trees, collapse = " "), ".") } i.ntree.order <- order(n.trees) # Next if block for compatibility with objects created with version 1.6. 
if (is.null(object$num.classes)){ object$num.classes <- 1 } predF <- .Call("gbm_pred", X=as.double(x), cRows=as.integer(cRows), cCols=as.integer(cCols), cNumClasses = as.integer(object$num.classes), n.trees=as.integer(n.trees[i.ntree.order]), initF=object$initF, trees=object$trees, c.split=object$c.split, var.type=as.integer(object$var.type), single.tree = as.integer(single.tree), PACKAGE = "gbm") if ((length(n.trees) > 1) || (object$num.classes > 1)) { if (object$distribution$name=="multinomial") { predF <- array(predF, dim=c(cRows,object$num.classes,length(n.trees))) dimnames(predF) <- list(NULL, object$classes, n.trees) predF[,,i.ntree.order] <- predF } else { predF <- matrix(predF, ncol=length(n.trees), byrow=FALSE) colnames(predF) <- n.trees predF[,i.ntree.order] <- predF } } if (type=="response") { if (is.element(object$distribution$name, c("bernoulli", "pairwise"))) { predF <- 1 / (1 + exp(-predF)) } else if (object$distribution$name =="poisson") { predF <- exp(predF) } else if (object$distribution$name == "adaboost"){ predF <- 1 / (1 + exp(-2*predF)) } if (object$distribution$name == "multinomial") { pexp <- exp(predF) psum <- apply(pexp, c(1, 3), function(x) { x / sum(x) }) # Transpose each 2d array predF <- aperm(psum, c(2, 1, 3)) } if ((length(n.trees) == 1) && (object$distribution$name != "multinomial")) { predF <- as.vector(predF) } } if(!is.null(attr(object$Terms,"offset"))) { warning("predict.gbm does not add the offset to the predicted values.") } return(predF) } gbm/R/test.gbm.R0000644000176200001440000002643314547111627013106 0ustar liggesusers#' Test the \code{gbm} package. #' #' Run tests on \code{gbm} functions to perform logical checks and #' reproducibility. #' #' The function uses functionality in the \code{RUnit} package. 
A fairly small
#' validation suite is executed that checks to see that relative influence
#' identifies sensible variables from simulated data, and that predictions from
#' GBMs with Gaussian, Cox or binomial distributions are sensible.
#'
#' @aliases validate.gbm test.gbm test.relative.influence
#' @return An object of class \code{RUnitTestData}. See the help for
#' \code{RUnit} for details.
#' @note The test suite is not comprehensive.
#' @author Harry Southworth
#' @seealso \code{\link{gbm}}
#' @keywords models
#' @examples
#'
#' # Uncomment the following lines to run - commented out to make CRAN happy
#' #library(RUnit)
#' #val <- validate.gbm()
#' #printHTMLProtocol(val, "gbmReport.html")
#' @export
test.gbm <- function(){
   # Based on example in R package
   # Gaussian example

   ############################################################################
   ## test Gaussian distribution gbm model
   set.seed(123)

   cat("Running least squares regression example.\n")
   # create some data
   N <- 1000
   X1 <- runif(N)
   X2 <- 2*runif(N)
   X3 <- factor(sample(letters[1:4],N,replace=T))
   X4 <- ordered(sample(letters[1:6],N,replace=T))
   X5 <- factor(sample(letters[1:3],N,replace=T))
   X6 <- 3*runif(N)
   mu <- c(-1,0,1,2)[as.numeric(X3)]

   SNR <- 10 # signal-to-noise ratio
   Y <- X1**1.5 + 2 * (X2**.5) + mu
   sigma <- sqrt(var(Y)/SNR)
   Y <- Y + rnorm(N,0,sigma)

   # create a bunch of missing values
   X1[sample(1:N,size=100)] <- NA
   X3[sample(1:N,size=300)] <- NA

   w <- rep(1,N)
   data <- data.frame(Y=Y,X1=X1,X2=X2,X3=X3,X4=X4,X5=X5,X6=X6)

   # fit initial model
   gbm1 <- gbm(Y~X1+X2+X3+X4+X5+X6,         # formula
               data=data,                   # dataset
               var.monotone=c(0,0,0,0,0,0), # -1: monotone decrease, +1: monotone increase, 0: no monotone restrictions
               distribution="gaussian",     # bernoulli, adaboost, gaussian, poisson, coxph, or
                                            # list(name="quantile",alpha=0.05) for quantile regression
               n.trees=2000,                # number of trees
               shrinkage=0.005,             # shrinkage or learning rate, 0.001 to 0.1 usually work
               interaction.depth=3,         # 1: additive model, 2:
two-way interactions, etc bag.fraction = 0.5, # subsampling fraction, 0.5 is probably best train.fraction = 0.5, # fraction of data for training, first train.fraction*N used for training n.minobsinnode = 10, # minimum number of obs needed in each node keep.data=TRUE, cv.folds=10) # do 10-fold cross-validation # Get best model best.iter <- gbm.perf(gbm1,method="cv", plot.it=FALSE) # returns cv estimate of best number of trees set.seed(223) # make some new data N <- 1000 X1 <- runif(N) X2 <- 2*runif(N) X3 <- factor(sample(letters[1:4],N,replace=TRUE)) X4 <- ordered(sample(letters[1:6],N,replace=TRUE)) X5 <- factor(sample(letters[1:3],N,replace=TRUE)) X6 <- 3*runif(N) mu <- c(-1,0,1,2)[as.numeric(X3)] # Actual underlying signal Y <- X1**1.5 + 2 * (X2**.5) + mu # Want to see how close predictions are to the underlying signal; noise would just interfere with this # Y <- Y + rnorm(N,0,sigma) data2 <- data.frame(Y=Y,X1=X1,X2=X2,X3=X3,X4=X4,X5=X5,X6=X6) # predict on the new data using "best" number of trees f.predict <- predict(gbm1,data2,best.iter) # f.predict will be on the canonical scale (logit,log,etc.) 
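# (Illustrative sketch, not part of the original automated checks: because
# distribution="gaussian" uses the identity link, f.predict is directly
# comparable to the simulated signal. A quick graphical sanity check could be
#   plot(data2$Y, f.predict); abline(0, 1, col = "red")
# with a good fit hugging the 45-degree line.)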
# Base the validation tests on observed discrepancies RUnit::checkTrue(abs(mean(data2$Y-f.predict)) < 0.01, msg="Gaussian absolute error within tolerance") RUnit::checkTrue(sd(data2$Y-f.predict) < sigma, msg="Gaussian squared error within tolerance") ############################################################################ ## test coxph distribution gbm model ## COX PROPORTIONAL HAZARDS REGRESSION EXAMPLE cat("Running Cox proportional hazards regression example.\n") # create some data set.seed(2) N <- 3000 X1 <- runif(N) X2 <- runif(N) X3 <- factor(sample(letters[1:4],N,replace=T)) mu <- c(-1,0,1,2)[as.numeric(X3)] f <- 0.5*sin(3*X1 + 5*X2^2 + mu/10) tt.surv <- rexp(N,exp(f)) tt.cens <- rexp(N,0.5) delta <- as.numeric(tt.surv <= tt.cens) tt <- apply(cbind(tt.surv,tt.cens),1,min) # throw in some missing values X1[sample(1:N,size=100)] <- NA X3[sample(1:N,size=300)] <- NA # random weights if you want to experiment with them w <- rep(1,N) data <- data.frame(tt=tt,delta=delta,X1=X1,X2=X2,X3=X3) # fit initial model gbm1 <- gbm(Surv(tt,delta)~X1+X2+X3, # formula data=data, # dataset weights=w, var.monotone=c(0,0,0), # -1: monotone decrease, +1: monotone increase, 0: no monotone restrictions distribution="coxph", n.trees=3000, # number of trees shrinkage=0.001, # shrinkage or learning rate, 0.001 to 0.1 usually work interaction.depth=3, # 1: additive model, 2: two-way interactions, etc bag.fraction = 0.5, # subsampling fraction, 0.5 is probably best train.fraction = 0.5, # fraction of data for training, first train.fraction*N used for training cv.folds = 5, # do 5-fold cross-validation n.minobsinnode = 10, # minimum total weight needed in each node keep.data = TRUE) best.iter <- gbm.perf(gbm1,method="test", plot.it=FALSE) # returns test set estimate of best number of trees # make some new data set.seed(2) N <- 1000 X1 <- runif(N) X2 <- runif(N) X3 <- factor(sample(letters[1:4],N,replace=T)) mu <- c(-1,0,1,2)[as.numeric(X3)] f <- 0.5*sin(3*X1 + 5*X2^2 + mu/10) # -0.5
<= f <= 0.5 via sin fn. tt.surv <- rexp(N,exp(f)) tt.cens <- rexp(N,0.5) data2 <- data.frame(tt=apply(cbind(tt.surv,tt.cens),1,min), delta=as.numeric(tt.surv <= tt.cens), f=f, X1=X1,X2=X2,X3=X3) # predict on the new data using "best" number of trees # f.predict will be on the canonical scale (logit,log,etc.) f.predict <- predict(gbm1, newdata = data2, n.trees = best.iter) #plot(data2$f,f.predict) # Use observed sd RUnit::checkTrue(sd(data2$f - f.predict) < 0.4, msg="Coxph: squared error within tolerance") ############################################################################ ## Test bernoulli distribution gbm model set.seed(1) cat("Running logistic regression example.\n") # create some data N <- 1000 X1 <- runif(N) X2 <- runif(N) X3 <- factor(sample(letters[1:4],N,replace=T)) mu <- c(-1,0,1,2)[as.numeric(X3)] p <- 1/(1+exp(-(sin(3*X1) - 4*X2 + mu))) Y <- rbinom(N,1,p) # random weights if you want to experiment with them w <- rexp(N) w <- N*w/sum(w) data <- data.frame(Y=Y,X1=X1,X2=X2,X3=X3) # fit initial model gbm1 <- gbm(Y~X1+X2+X3, # formula data=data, # dataset weights=w, var.monotone=c(0,0,0), # -1: monotone decrease, +1: monotone increase, 0: no monotone restrictions distribution="bernoulli", n.trees=3000, # number of trees shrinkage=0.001, # shrinkage or learning rate, 0.001 to 0.1 usually work interaction.depth=3, # 1: additive model, 2: two-way interactions, etc bag.fraction = 0.5, # subsampling fraction, 0.5 is probably best train.fraction = 0.5, # fraction of data for training, first train.fraction*N used for training cv.folds=5, # do 5-fold cross-validation n.minobsinnode = 10) # minimum total weight needed in each node best.iter.test <- gbm.perf(gbm1,method="test", plot.it=FALSE) # returns test set estimate of best number of trees best.iter <- best.iter.test # make some new data set.seed(2) N <- 1000 X1 <- runif(N) X2 <- runif(N) X3 <- factor(sample(letters[1:4],N,replace=T)) mu <- c(-1,0,1,2)[as.numeric(X3)] p <- 1/(1+exp(-(sin(3*X1) - 4*X2 + 
mu))) Y <- rbinom(N,1,p) data2 <- data.frame(Y=Y,X1=X1,X2=X2,X3=X3) # predict on the new data using "best" number of trees # f.predict will be on the canonical scale (logit,log,etc.) f.1.predict <- predict(gbm1,data2, n.trees=best.iter.test) # compute quantity prior to transformation f.new <- sin(3*X1) - 4*X2 + mu # Base the validation tests on observed discrepancies RUnit::checkTrue(sd(f.new - f.1.predict) < 1.0, msg="Bernoulli error within tolerance") invisible() } ################################################################################ ########################### test.relative.influence() ########################## ########################### ########################## #' @export test.relative.influence <- function(){ # Test that relative.influence really does pick out the true predictors set.seed(1234) X1 <- matrix(nrow=1000, ncol=50) X1 <- apply(X1, 2, function(x) rnorm(1000)) # Random noise X2 <- matrix(nrow=1000, ncol=5) X2 <- apply(X2, 2, function(x) c(rnorm(500), rnorm(500, 3))) # Real predictors cls <- rep(c(0, 1), each=500) # Class X <- data.frame(cbind(X1, X2, cls)) mod <- gbm(cls ~ ., data = X, n.trees=1000, cv.folds=5, shrinkage=.01, interaction.depth=2) ri <- rev(sort(relative.influence(mod))) wh <- names(ri)[1:5] res <- sum(wh %in% paste("V", 51:55, sep = "")) RUnit::checkEqualsNumeric(res, 5, msg="Testing relative.influence identifies true predictors") } ################################################################################ ################################ validate.gbm() ################################ ################################ ################################ #' @export validate.gbm <- function () { wh <- (1:length(search()))[search() == "package:gbm"] tests <- objects(wh)[substring(objects(wh), 1, 5) == "test."] # Create temporary directory to put tests into sep <- if (.Platform$OS.type == "windows") "\\" else "/" dir <- file.path(tempdir(), "gbm.tests", fsep = sep) dir.create(dir) for (i in 1:length(tests)) { # write each test function's source to its own file for RUnit to pick up str <- paste(dir, sep, tests[i], ".R", sep = "")
dump(tests[i], file = str) } res <- RUnit::defineTestSuite("gbm", dirs = dir, testFuncRegexp = "^test.+", testFileRegexp = "*.R") cat("Running gbm test suite.\nThis will take some time...\n\n") RUnit::runTestSuite(res) } gbm/R/gbm.perf.R0000644000176200001440000000726114547111627013061 0ustar liggesusers#' GBM performance #' #' Estimates the optimal number of boosting iterations for a \code{gbm} object #' and optionally plots various performance measures #' #' @param object A \code{\link{gbm.object}} created from an initial call to #' \code{\link{gbm}}. #' #' @param plot.it An indicator of whether or not to plot the performance #' measures. Setting \code{plot.it = TRUE} creates two plots. The first plot #' plots \code{object$train.error} (in black) and \code{object$valid.error} #' (in red) versus the iteration number. The scale of the error measurement, #' shown on the left vertical axis, depends on the \code{distribution} #' argument used in the initial call to \code{\link{gbm}}. #' #' @param oobag.curve Indicates whether to plot the out-of-bag performance #' measures in a second plot. #' #' @param overlay If TRUE and oobag.curve=TRUE then a right y-axis is added to #' the training and test error plot and the estimated cumulative improvement #' in the loss function is plotted versus the iteration number. #' #' @param method Indicate the method used to estimate the optimal number of #' boosting iterations. \code{method = "OOB"} computes the out-of-bag estimate #' and \code{method = "test"} uses the test (or validation) dataset to compute #' an out-of-sample estimate. \code{method = "cv"} extracts the optimal number #' of iterations using cross-validation if \code{gbm} was called with #' \code{cv.folds} > 1. #' #' @return \code{gbm.perf} Returns the estimated optimal number of iterations. #' The method of computation depends on the \code{method} argument. 
#' #' @author Greg Ridgeway \email{gregridgeway@@gmail.com} #' #' @seealso \code{\link{gbm}}, \code{\link{gbm.object}} #' #' @keywords nonlinear survival nonparametric tree #' #' @export gbm.perf <- function(object, plot.it = TRUE, oobag.curve = FALSE, overlay = TRUE, method) { # Determine method, if missing if (missing(method)) { method <- guess_error_method(object) } # Determine "optimal" number of iterations best.iter <- best_iter(object, method = method) # Plot results if (plot.it) { # Determine an appropriate y-axis label ylab <- get_ylab(object) # Determine an appropriate range for the y-axis ylim <- get_ylim(object, method = method) # Plot results plot(object$train.error, ylim = ylim, type = "l", xlab = "Iteration", ylab = ylab) if (object$train.fraction != 1) { lines(object$valid.error, col = "red") } if (method=="cv") { lines(object$cv.error, col = "green") } if (!is.na(best.iter)) { abline(v = best.iter, col = "blue", lwd = 2, lty = 2) } if (oobag.curve) { smoother <- attr(best.iter, "smoother") if (overlay) { # smoother <- attr(best.iter, "smoother") par(new = TRUE) plot(smoother$x, cumsum(smoother$y), col = "blue", type = "l", xlab = "", ylab = "", axes = FALSE) axis(4, srt = 0) at <- mean(range(smoother$y)) mtext(paste("OOB improvement in", ylab), side = 4, srt = 270, line = 2) abline(h = 0, col = "blue", lwd = 2) } plot(object$oobag.improve, type = "l", xlab = "Iteration", ylab = paste("OOB change in", ylab)) lines(smoother, col = "red", lwd = 2) abline(h = 0, col = "blue", lwd = 1) abline(v =best.iter, col = "blue", lwd = 1) } } # Return "best" number of iterations (i.e., number of boosted trees) best.iter } gbm/R/print.gbm.R0000644000176200001440000001650114547111627013256 0ustar liggesusers#' Print model summary #' #' Display basic information about a \code{gbm} object. #' #' Prints some information about the model object. 
In particular, this method #' prints the call to \code{gbm()}, the type of loss function that was used, #' and the total number of iterations. #' #' If cross-validation was performed, the 'best' number of trees as estimated #' by cross-validation error is displayed. If a test set was used, the 'best' #' number of trees as estimated by the test set error is displayed. #' #' The number of available predictors, and the number of those having non-zero #' influence on predictions is given (which might be interesting in data mining #' applications). #' #' If multinomial, bernoulli or adaboost was used, the confusion matrix and #' prediction accuracy are printed (objects being allocated to the class with #' highest probability for multinomial and bernoulli). These classifications #' are performed on the entire training data using the model with the 'best' #' number of trees as described above, or the maximum number of trees if the #' 'best' cannot be computed. #' #' If the 'distribution' was specified as gaussian, laplace, quantile or #' t-distribution, a summary of the residuals is displayed. The residuals are #' for the training data with the model at the 'best' number of trees, as #' described above, or the maximum number of trees if the 'best' cannot be #' computed. #' #' @aliases print.gbm show.gbm #' @param x an object of class \code{gbm}. #' @param \dots arguments passed to \code{print.default}. #' @author Harry Southworth, Daniel Edwards #' @seealso \code{\link{gbm}} #' @keywords models nonlinear survival nonparametric #' @examples #' #' data(iris) #' iris.mod <- gbm(Species ~ ., distribution="multinomial", data=iris, #' n.trees=2000, shrinkage=0.01, cv.folds=5, #' verbose=FALSE, n.cores=1) #' iris.mod #' #data(lung) #' #lung.mod <- gbm(Surv(time, status) ~ ., distribution="coxph", data=lung, #' # n.trees=2000, shrinkage=0.01, cv.folds=5,verbose =FALSE) #' #lung.mod #' @rdname print.gbm #' @export print.gbm <- function(x, ... 
) { if (!is.null(x$call)){ print(x$call) } dist.name <- x$distribution$name if (dist.name == "pairwise") { if (!is.null(x$distribution$max.rank) && x$distribution$max.rank > 0) { dist.name <- sprintf("pairwise (metric=%s, max.rank=%d)", x$distribution$metric, x$distribution$max.rank) } else { dist.name <- sprintf("pairwise (metric=%s)", x$distribution$metric) } } cat( paste( "A gradient boosted model with", dist.name, "loss function.\n" )) cat( paste( length( x$train.error ), "iterations were performed.\n" ) ) best <- length( x$train.error ) if ( !is.null( x$cv.error ) ) { best <- gbm.perf( x, plot.it = FALSE, method="cv" ) cat( paste("The best cross-validation iteration was ", best, ".\n", sep = "" ) ) } if ( x$train.fraction < 1 ) { best <- gbm.perf( x, plot.it = FALSE, method="test" ) cat( paste("The best test-set iteration was ", best, ".\n", sep = "" ) ) } if ( is.null( best ) ) { best <- length( x$train.error ) } ri <- relative.influence( x, n.trees=best ) cat( "There were", length( x$var.names ), "predictors of which", sum( ri > 0 ), "had non-zero influence.\n" ) invisible() } #' @rdname print.gbm #' #' @export show.gbm <- print.gbm #' Summary of a gbm object #' #' Computes the relative influence of each variable in the gbm object. #' #' For \code{distribution="gaussian"} this returns exactly the reduction of #' squared error attributable to each variable. For other loss functions this #' returns the reduction attributable to each variable in sum of squared error #' in predicting the gradient on each iteration. It describes the relative #' influence of each variable in reducing the loss function. See the references #' below for exact details on the computation. #' #' @param object a \code{gbm} object created from an initial call to #' \code{\link{gbm}}. #' @param cBars the number of bars to plot. If \code{order=TRUE} then only the #' variables with the \code{cBars} largest relative influence will appear in #' the barplot.
If \code{order=FALSE} then the first \code{cBars} variables #' will appear in the plot. In either case, the function will return the #' relative influence of all of the variables. #' @param n.trees the number of trees used to generate the plot. Only the first #' \code{n.trees} trees will be used. #' @param plotit an indicator as to whether the plot is generated. #' @param order an indicator as to whether the plotted and/or returned relative #' influences are sorted. #' @param method The function used to compute the relative influence. #' \code{\link{relative.influence}} is the default and is the same as that #' described in Friedman (2001). The other current (and experimental) choice is #' \code{\link{permutation.test.gbm}}. This method randomly permutes each #' predictor variable at a time and computes the associated reduction in #' predictive performance. This is similar to the variable importance measures #' Breiman uses for random forests, but \code{gbm} currently computes using the #' entire training dataset (not the out-of-bag observations). #' @param normalize if \code{FALSE} then \code{summary.gbm} returns the #' unnormalized influence. #' @param ... other arguments passed to the plot function. #' @return Returns a data frame where the first component is the variable name #' and the second is the computed relative influence, normalized to sum to 100. #' @author Greg Ridgeway \email{gregridgeway@@gmail.com} #' @seealso \code{\link{gbm}} #' @references J.H. Friedman (2001). "Greedy Function Approximation: A Gradient #' Boosting Machine," Annals of Statistics 29(5):1189-1232. #' #' L. Breiman #' (2001).\url{https://www.stat.berkeley.edu/users/breiman/randomforest2001.pdf}. #' @keywords hplot #' #' @export summary.gbm #' @export summary.gbm <- function(object, cBars=length(object$var.names), n.trees=object$n.trees, plotit=TRUE, order=TRUE, method=relative.influence, normalize=TRUE, ...) 
{ if(n.trees < 1) { stop("n.trees must be greater than 0.") } if(n.trees > object$n.trees) { warning("Exceeded total number of GBM terms. Results use n.trees=",object$n.trees," terms.\n") n.trees <- object$n.trees } rel.inf <- method(object,n.trees) rel.inf[rel.inf<0] <- 0 if(order) { i <- order(-rel.inf) } else { i <- 1:length(rel.inf) } if(cBars==0) cBars <- min(10,length(object$var.names)) if(cBars>length(object$var.names)) cBars <- length(object$var.names) if(normalize) rel.inf <- 100*rel.inf/sum(rel.inf) if(plotit) { barplot(rel.inf[i[cBars:1]], horiz=TRUE, col=rainbow(cBars,start=3/6,end=4/6), names=object$var.names[i[cBars:1]], xlab="Relative influence",...) } return(data.frame(var=object$var.names[i], rel.inf=rel.inf[i])) } gbm/R/basehaz.gbm.R0000644000176200001440000000547514547111627013543 0ustar liggesusers# rd2rox <- function(path = file.choose()) { # info <- Rd2roxygen::parse_file(path) # cat(Rd2roxygen::create_roxygen(info), sep = "\n") # } #' Baseline hazard function #' #' Computes the Breslow estimator of the baseline hazard function for a #' proportional hazard regression model. #' #' The proportional hazard model assumes h(t|x)=lambda(t)*exp(f(x)). #' \code{\link{gbm}} can estimate the f(x) component via partial likelihood. #' After estimating f(x), \code{basehaz.gbm} can compute a nonparametric #' estimate of lambda(t). #' #' @param t The survival times. #' @param delta The censoring indicator. #' @param f.x The predicted values of the regression model on the log hazard #' scale. #' @param t.eval Values at which the baseline hazard will be evaluated. #' @param smooth If \code{TRUE} \code{basehaz.gbm} will smooth the estimated #' baseline hazard using Friedman's super smoother \code{\link{supsmu}}. #' @param cumulative If \code{TRUE} the cumulative hazard function will be #' computed.
#' @return A vector of length equal to the length of t (or of length #' \code{t.eval} if \code{t.eval} is not \code{NULL}) containing the baseline #' hazard evaluated at t (or at \code{t.eval} if \code{t.eval} is not #' \code{NULL}). If \code{cumulative} is set to \code{TRUE} then the returned #' vector evaluates the cumulative hazard function at those values. #' @author Greg Ridgeway \email{gregridgeway@@gmail.com} #' @seealso \code{\link[survival]{survfit}}, \code{\link{gbm}} #' @references #' N. Breslow (1972). "Discussion of `Regression Models and #' Life-Tables' by D.R. Cox," Journal of the Royal Statistical Society, Series #' B, 34(2):216-217. #' #' N. Breslow (1974). "Covariance analysis of censored survival data," #' Biometrics 30:89-99. #' @keywords methods survival #' @export basehaz.gbm <- function(t,delta, f.x, t.eval = NULL, smooth = FALSE, cumulative = TRUE) { t.unique <- sort(unique(t[delta==1])) alpha <- length(t.unique) for(i in 1:length(t.unique)) { alpha[i] <- sum(t[delta==1]==t.unique[i])/ sum(exp(f.x[t>=t.unique[i]])) } if(!smooth && !cumulative) { if(!is.null(t.eval)) { stop("Cannot evaluate unsmoothed baseline hazard at t.eval.") } } else { if(smooth && !cumulative) { lambda.smooth <- supsmu(t.unique,alpha) } else { if(smooth && cumulative) { lambda.smooth <- supsmu(t.unique, cumsum(alpha)) } else { # (!smooth && cumulative) - THE DEFAULT lambda.smooth <- list(x = t.unique, y = cumsum(alpha)) } } } obj <- if(!is.null(t.eval)) { approx(lambda.smooth$x, lambda.smooth$y, xout = t.eval)$y } else { approx(lambda.smooth$x, lambda.smooth$y, xout = t)$y } return(obj) } gbm/R/gbm.R0000644000176200001440000005421414547340475012133 0ustar liggesusers#' Generalized Boosted Regression Modeling (GBM) #' #' Fits generalized boosted regression models. For technical details, see the #' vignette: \code{utils::browseVignettes("gbm")}. #' #' \code{gbm.fit} provides the link between R and the C++ gbm engine. 
#' \code{gbm} is a front-end to \code{gbm.fit} that uses the familiar R #' modeling formulas. However, \code{\link[stats]{model.frame}} is very slow if #' there are many predictor variables. For power-users with many variables use #' \code{gbm.fit}. For general practice \code{gbm} is preferable. #' #' @param formula A symbolic description of the model to be fit. The formula #' may include an offset term (e.g. y~offset(n)+x). If #' \code{keep.data = FALSE} in the initial call to \code{gbm} then it is the #' user's responsibility to resupply the offset to \code{\link{gbm.more}}. #' #' @param distribution Either a character string specifying the name of the #' distribution to use or a list with a component \code{name} specifying the #' distribution and any additional parameters needed. If not specified, #' \code{gbm} will try to guess: if the response has only 2 unique values, #' bernoulli is assumed; otherwise, if the response is a factor, multinomial is #' assumed; otherwise, if the response has class \code{"Surv"}, coxph is #' assumed; otherwise, gaussian is assumed. #' #' Currently available options are \code{"gaussian"} (squared error), #' \code{"laplace"} (absolute loss), \code{"tdist"} (t-distribution loss), #' \code{"bernoulli"} (logistic regression for 0-1 outcomes), #' \code{"huberized"} (huberized hinge loss for 0-1 outcomes), #' \code{"adaboost"} (the AdaBoost exponential loss for 0-1 outcomes), #' \code{"poisson"} (count outcomes), \code{"coxph"} (right censored #' observations), \code{"quantile"}, or \code{"pairwise"} (ranking measure #' using the LambdaMart algorithm). #' #' If quantile regression is specified, \code{distribution} must be a list of #' the form \code{list(name = "quantile", alpha = 0.25)} where \code{alpha} is #' the quantile to estimate. The current version's quantile regression method #' does not handle non-constant weights and will stop. 
#' #' If \code{"tdist"} is specified, the default degrees of freedom is 4 and #' this can be controlled by specifying #' \code{distribution = list(name = "tdist", df = DF)} where \code{DF} is your #' chosen degrees of freedom. #' #' If "pairwise" regression is specified, \code{distribution} must be a list of #' the form \code{list(name="pairwise",group=...,metric=...,max.rank=...)} #' (\code{metric} and \code{max.rank} are optional, see below). \code{group} is #' a character vector with the column names of \code{data} that jointly #' indicate the group an instance belongs to (typically a query in Information #' Retrieval applications). For training, only pairs of instances from the same #' group and with different target labels can be considered. \code{metric} is #' the IR measure to use, one of #' \describe{ #' \item{\code{conc}}{Fraction of concordant pairs; for binary labels, this #' is equivalent to the Area under the ROC Curve} #' \item{\code{mrr}}{Mean reciprocal rank of the highest-ranked positive #' instance} #' \item{\code{map}}{Mean average precision, a generalization of \code{mrr} #' to multiple positive instances} #' \item{\code{ndcg}}{Normalized discounted cumulative gain. The score is #' the weighted sum (DCG) of the user-supplied target values, weighted #' by log(rank+1), and normalized to the maximum achievable value. This #' is the default if the user did not specify a metric.} #' } #' #' \code{ndcg} and \code{conc} allow arbitrary target values, while binary #' targets \{0,1\} are expected for \code{map} and \code{mrr}. For \code{ndcg} #' and \code{mrr}, a cut-off can be chosen using a positive integer parameter #' \code{max.rank}.
If left unspecified, all ranks are taken into account. #' #' Note that splitting of instances into training and validation sets follows #' group boundaries and therefore only approximates the specified #' \code{train.fraction} ratio (the same applies to cross-validation folds). #' Internally, queries are randomly shuffled before training, to avoid bias. #' #' Weights can be used in conjunction with pairwise metrics, however it is #' assumed that they are constant for instances from the same group. #' #' For details and background on the algorithm, see e.g. Burges (2010). #' #' @param data an optional data frame containing the variables in the model. By #' default the variables are taken from \code{environment(formula)}, typically #' the environment from which \code{gbm} is called. If \code{keep.data=TRUE} in #' the initial call to \code{gbm} then \code{gbm} stores a copy with the #' object. If \code{keep.data=FALSE} then subsequent calls to #' \code{\link{gbm.more}} must resupply the same dataset. It becomes the user's #' responsibility to resupply the same data at this point. #' #' @param weights an optional vector of weights to be used in the fitting #' process. Must be positive but do not need to be normalized. If #' \code{keep.data=FALSE} in the initial call to \code{gbm} then it is the #' user's responsibility to resupply the weights to \code{\link{gbm.more}}. #' #' @param var.monotone an optional vector, the same length as the number of #' predictors, indicating which variables have a monotone increasing (+1), #' decreasing (-1), or arbitrary (0) relationship with the outcome. #' #' @param n.trees Integer specifying the total number of trees to fit. This is #' equivalent to the number of iterations and the number of basis functions in #' the additive expansion. Default is 100. #' #' @param interaction.depth Integer specifying the maximum depth of each tree #' (i.e., the highest level of variable interactions allowed). 
A value of 1 #' implies an additive model, a value of 2 implies a model with up to 2-way #' interactions, etc. Default is 1. #' #' @param n.minobsinnode Integer specifying the minimum number of observations #' in the terminal nodes of the trees. Note that this is the actual number of #' observations, not the total weight. #' #' @param shrinkage a shrinkage parameter applied to each tree in the #' expansion. Also known as the learning rate or step-size reduction; 0.001 to #' 0.1 usually work, but a smaller learning rate typically requires more trees. #' Default is 0.1. #' #' @param bag.fraction the fraction of the training set observations randomly #' selected to propose the next tree in the expansion. This introduces #' randomness into the model fit. If \code{bag.fraction} < 1 then running the #' same model twice will result in similar but different fits. \code{gbm} uses #' the R random number generator so \code{set.seed} can ensure that the model #' can be reconstructed. Preferably, the user can save the returned #' \code{\link{gbm.object}} using \code{\link{save}}. Default is 0.5. #' #' @param train.fraction The first \code{train.fraction * nrows(data)} #' observations are used to fit the \code{gbm} and the remainder are used for #' computing out-of-sample estimates of the loss function. #' #' @param cv.folds Number of cross-validation folds to perform. If #' \code{cv.folds}>1 then \code{gbm}, in addition to the usual fit, will #' perform a cross-validation and calculate an estimate of generalization #' error, which is returned in \code{cv.error}. #' #' @param keep.data a logical variable indicating whether to keep the data and #' an index of the data stored with the object. Keeping the data and index #' makes subsequent calls to \code{\link{gbm.more}} faster at the cost of #' storing an extra copy of the dataset. #' #' @param verbose Logical indicating whether or not to print out progress and #' performance indicators (\code{TRUE}).
If this option is left unspecified for #' \code{gbm.more}, then it uses \code{verbose} from \code{object}. Default is #' \code{FALSE}. #' #' @param class.stratify.cv Logical indicating whether or not the #' cross-validation should be stratified by class. Defaults to \code{TRUE} for #' \code{distribution = "multinomial"} and is only implemented for #' \code{"multinomial"} and \code{"bernoulli"}. The purpose of stratifying the #' cross-validation is to help avoiding situations in which training sets do #' not contain all classes. #' #' @param n.cores The number of CPU cores to use. The cross-validation loop #' will attempt to send different CV folds off to different cores. If #' \code{n.cores} is not specified by the user, it is guessed using the #' \code{detectCores} function in the \code{parallel} package. Note that the #' documentation for \code{detectCores} makes clear that it is not failsafe and #' could return a spurious number of available cores. #' #' @return A \code{\link{gbm.object}} object. #' #' @details #' This package implements the generalized boosted modeling framework. Boosting #' is the process of iteratively adding basis functions in a greedy fashion so #' that each additional basis function further reduces the selected loss #' function. This implementation closely follows Friedman's Gradient Boosting #' Machine (Friedman, 2001). #' #' In addition to many of the features documented in the Gradient Boosting #' Machine, \code{gbm} offers additional features including the out-of-bag #' estimator for the optimal number of iterations, the ability to store and #' manipulate the resulting \code{gbm} object, and a variety of other loss #' functions that had not previously had associated boosting algorithms, #' including the Cox partial likelihood for censored data, the poisson #' likelihood for count outcomes, and a gradient boosting implementation to #' minimize the AdaBoost exponential loss function. 
This gbm package is no #' longer under further development. Consider #' https://github.com/gbm-developers/gbm3 for the latest version. #' #' @author Greg Ridgeway \email{gregridgeway@@gmail.com} #' #' Quantile regression code developed by Brian Kriegler #' \email{bk@@stat.ucla.edu} #' #' t-distribution, and multinomial code developed by Harry Southworth and #' Daniel Edwards #' #' Pairwise code developed by Stefan Schroedl \email{schroedl@@a9.com} #' #' @seealso \code{\link{gbm.object}}, \code{\link{gbm.perf}}, #' \code{\link{plot.gbm}}, \code{\link{predict.gbm}}, \code{\link{summary.gbm}}, #' and \code{\link{pretty.gbm.tree}}. #' #' @references #' Y. Freund and R.E. Schapire (1997) \dQuote{A decision-theoretic #' generalization of on-line learning and an application to boosting,} #' \emph{Journal of Computer and System Sciences,} 55(1):119-139. #' #' G. Ridgeway (1999). \dQuote{The state of boosting,} \emph{Computing Science #' and Statistics} 31:172-181. #' #' J.H. Friedman, T. Hastie, R. Tibshirani (2000). \dQuote{Additive Logistic #' Regression: a Statistical View of Boosting,} \emph{Annals of Statistics} #' 28(2):337-374. #' #' J.H. Friedman (2001). \dQuote{Greedy Function Approximation: A Gradient #' Boosting Machine,} \emph{Annals of Statistics} 29(5):1189-1232. #' #' J.H. Friedman (2002). \dQuote{Stochastic Gradient Boosting,} #' \emph{Computational Statistics and Data Analysis} 38(4):367-378. #' #' B. Kriegler (2007). Cost-Sensitive Stochastic Gradient Boosting Within a #' Quantitative Regression Framework. Ph.D. Dissertation. University of #' California at Los Angeles, Los Angeles, CA, USA. Advisor(s) Richard A. Berk. #' \url{https://dl.acm.org/doi/book/10.5555/1354603}. #' #' C. Burges (2010). \dQuote{From RankNet to LambdaRank to LambdaMART: An #' Overview,} Microsoft Research Technical Report MSR-TR-2010-82. 
#' #' @export #' #' @examples #' # #' # A least squares regression example #' # #' #' # Simulate data #' set.seed(101) # for reproducibility #' N <- 1000 #' X1 <- runif(N) #' X2 <- 2 * runif(N) #' X3 <- ordered(sample(letters[1:4], N, replace = TRUE), levels = letters[4:1]) #' X4 <- factor(sample(letters[1:6], N, replace = TRUE)) #' X5 <- factor(sample(letters[1:3], N, replace = TRUE)) #' X6 <- 3 * runif(N) #' mu <- c(-1, 0, 1, 2)[as.numeric(X3)] #' SNR <- 10 # signal-to-noise ratio #' Y <- X1 ^ 1.5 + 2 * (X2 ^ 0.5) + mu #' sigma <- sqrt(var(Y) / SNR) #' Y <- Y + rnorm(N, 0, sigma) #' X1[sample(1:N,size=500)] <- NA # introduce some missing values #' X4[sample(1:N,size=300)] <- NA # introduce some missing values #' data <- data.frame(Y, X1, X2, X3, X4, X5, X6) #' #' # Fit a GBM #' set.seed(102) # for reproducibility #' gbm1 <- gbm(Y ~ ., data = data, var.monotone = c(0, 0, 0, 0, 0, 0), #' distribution = "gaussian", n.trees = 100, shrinkage = 0.1, #' interaction.depth = 3, bag.fraction = 0.5, train.fraction = 0.5, #' n.minobsinnode = 10, cv.folds = 5, keep.data = TRUE, #' verbose = FALSE, n.cores = 1) #' #' # Check performance using the out-of-bag (OOB) error; the OOB error typically #' # underestimates the optimal number of iterations #' best.iter <- gbm.perf(gbm1, method = "OOB") #' print(best.iter) #' #' # Check performance using the 50% heldout test set #' best.iter <- gbm.perf(gbm1, method = "test") #' print(best.iter) #' #' # Check performance using 5-fold cross-validation #' best.iter <- gbm.perf(gbm1, method = "cv") #' print(best.iter) #' #' # Plot relative influence of each variable #' par(mfrow = c(1, 2)) #' summary(gbm1, n.trees = 1) # using first tree #' summary(gbm1, n.trees = best.iter) # using estimated best number of trees #' #' # Compactly print the first and last trees for curiosity #' print(pretty.gbm.tree(gbm1, i.tree = 1)) #' print(pretty.gbm.tree(gbm1, i.tree = gbm1$n.trees)) #' #' # Simulate new data #' set.seed(103) # for reproducibility #' N 
<- 1000 #' X1 <- runif(N) #' X2 <- 2 * runif(N) #' X3 <- ordered(sample(letters[1:4], N, replace = TRUE)) #' X4 <- factor(sample(letters[1:6], N, replace = TRUE)) #' X5 <- factor(sample(letters[1:3], N, replace = TRUE)) #' X6 <- 3 * runif(N) #' mu <- c(-1, 0, 1, 2)[as.numeric(X3)] #' Y <- X1 ^ 1.5 + 2 * (X2 ^ 0.5) + mu + rnorm(N, 0, sigma) #' data2 <- data.frame(Y, X1, X2, X3, X4, X5, X6) #' #' # Predict on the new data using the "best" number of trees; by default, #' # predictions will be on the link scale #' Yhat <- predict(gbm1, newdata = data2, n.trees = best.iter, type = "link") #' #' # least squares error #' print(sum((data2$Y - Yhat)^2)) #' #' # Construct univariate partial dependence plots #' plot(gbm1, i.var = 1, n.trees = best.iter) #' plot(gbm1, i.var = 2, n.trees = best.iter) #' plot(gbm1, i.var = "X3", n.trees = best.iter) # can use index or name #' #' # Construct bivariate partial dependence plots #' plot(gbm1, i.var = 1:2, n.trees = best.iter) #' plot(gbm1, i.var = c("X2", "X3"), n.trees = best.iter) #' plot(gbm1, i.var = 3:4, n.trees = best.iter) #' #' # Construct trivariate partial dependence plots #' plot(gbm1, i.var = c(1, 2, 6), n.trees = best.iter, #' continuous.resolution = 20) #' plot(gbm1, i.var = 1:3, n.trees = best.iter) #' plot(gbm1, i.var = 2:4, n.trees = best.iter) #' plot(gbm1, i.var = 3:5, n.trees = best.iter) #' #' # Add more (i.e., 100) boosting iterations to the ensemble #' gbm2 <- gbm.more(gbm1, n.new.trees = 100, verbose = FALSE) gbm <- function(formula = formula(data), distribution = "bernoulli", data = list(), weights, var.monotone = NULL, n.trees = 100, interaction.depth = 1, n.minobsinnode = 10, shrinkage = 0.1, bag.fraction = 0.5, train.fraction = 1.0, cv.folds = 0, keep.data = TRUE, verbose = FALSE, class.stratify.cv = NULL, n.cores = NULL) { # Match the call to gbm mcall <- match.call() # Verbose output? 
lVerbose <- if (!is.logical(verbose)) { FALSE } else { verbose } # Construct model frame, terms object, weights, and offset mf <- match.call(expand.dots = FALSE) m <- match(c("formula", "data", "weights", "offset"), names(mf), 0) mf <- mf[c(1, m)] mf$drop.unused.levels <- TRUE mf$na.action <- na.pass mf[[1]] <- as.name("model.frame") m <- mf mf <- eval(mf, parent.frame()) Terms <- attr(mf, "terms") w <- model.weights(mf) offset <- model.offset(mf) y <- model.response(mf) # extract response values # Determine and check response distribution if (missing(distribution)) { # y <- data[, all.vars(formula)[1L], drop = TRUE] distribution <- guessDist(y) } if (is.character(distribution)) { distribution <- list(name = distribution) } if (!is.element(distribution$name, getAvailableDistributions())) { stop("Distribution ", distribution$name, " is not supported.") } if (distribution$name == "multinomial") { warning("Setting `distribution = \"multinomial\"` is ill-advised as it is ", "currently broken. It exists only for backwards compatibility. ", "Use at your own risk.", call. 
= FALSE) } # Construct data frame of predictor values var.names <- attributes(Terms)$term.labels x <- model.frame(terms(reformulate(var.names)), data = data, na.action = na.pass) # Extract response name as a character string response.name <- as.character(formula[[2L]]) # Stratify cross-validation by class (only for bernoulli and multinomial) class.stratify.cv <- getStratify(class.stratify.cv, d = distribution) # Groups (for pairwise distribution only) group <- NULL num.groups <- 0 # Determine number of training instances if (distribution$name != "pairwise"){ # Number of training instances nTrain <- floor(train.fraction * nrow(x)) } else { # Sampling is by group, so we need to calculate them here distribution.group <- distribution[["group"]] if (is.null(distribution.group)) { stop(paste("For pairwise regression, `distribution` must be a list of", "the form `list(name = \"pairwise\", group = c(\"date\",", "\"session\", \"category\", \"keywords\"))`.")) } # Check if group names are valid i <- match(distribution.group, colnames(data)) if (any(is.na(i))) { stop("Group column does not occur in data: ", distribution.group[is.na(i)], ".") } # Construct group index group <- factor( do.call(paste, c(data[, distribution.group, drop = FALSE], sep = ":")) ) # Check that weights are constant across groups if ((!missing(weights)) && (!is.null(weights))) { w.min <- tapply(w, INDEX = group, FUN = min) w.max <- tapply(w, INDEX = group, FUN = max) if (any(w.min != w.max)) { stop("For `distribution = \"pairwise\"`, all instances for the same ", "group must have the same weight.") } w <- w * length(w.min) / sum(w.min) # normalize across groups } # Shuffle groups to remove bias when split into train/test sets and/or CV # folds perm.levels <- levels(group)[sample(1:nlevels(group))] group <- factor(group, levels = perm.levels) # The C function expects instances to be sorted by group and descending by # target ord.group <- order(group, -y) group <- group[ord.group] y <- y[ord.group] x <- 
x[ord.group, , drop = FALSE] w <- w[ord.group] # Split into train and validation sets at group boundary num.groups.train <- max(1, round(train.fraction * nlevels(group))) # Include all groups up to the num.groups.train nTrain <- max(which(group==levels(group)[num.groups.train])) Misc <- group } # Set up for k-fold cross-validation cv.error <- NULL # FIXME: Is there a better way to handle this? if (cv.folds == 1) { cv.folds <- 0 # o/w, an uninformative error is thrown } if(cv.folds > 1) { cv.results <- gbmCrossVal(cv.folds = cv.folds, nTrain = nTrain, n.cores = n.cores, class.stratify.cv = class.stratify.cv, data = data, x = x, y = y, offset = offset, distribution = distribution, w = w, var.monotone = var.monotone, n.trees = n.trees, interaction.depth = interaction.depth, n.minobsinnode = n.minobsinnode, shrinkage = shrinkage, bag.fraction = bag.fraction, var.names = var.names, response.name = response.name, group = group) cv.error <- cv.results$error p <- cv.results$predictions } # Fit a GBM gbm.obj <- gbm.fit(x = x, y = y, offset = offset, distribution = distribution, w = w, var.monotone = var.monotone, n.trees = n.trees, interaction.depth = interaction.depth, n.minobsinnode = n.minobsinnode, shrinkage = shrinkage, bag.fraction = bag.fraction, nTrain = nTrain, keep.data = keep.data, verbose = lVerbose, var.names = var.names, response.name = response.name, group = group) # Attach further components gbm.obj$train.fraction <- train.fraction gbm.obj$Terms <- Terms gbm.obj$cv.error <- cv.error gbm.obj$cv.folds <- cv.folds gbm.obj$call <- mcall gbm.obj$m <- m if (cv.folds > 1) { # FIXME: Was previously `cv.folds > 0`? gbm.obj$cv.fitted <- p } if (distribution$name == "pairwise") { # Data has been reordered according to queries. We need to permute the # fitted values so that they correspond to the original order. 
gbm.obj$ord.group <- ord.group gbm.obj$fit <- gbm.obj$fit[order(ord.group)] } # Return "gbm" object gbm.obj } gbm/R/gbm.object.R0000644000176200001440000000472514547111627013375 0ustar liggesusers#' Generalized Boosted Regression Model Object #' #' These are objects representing fitted \code{gbm}s. #' #' @return \item{initF}{The "intercept" term, the initial predicted value to #' which trees make adjustments.} \item{fit}{A vector containing the fitted #' values on the scale of regression function (e.g. log-odds scale for #' bernoulli, log scale for poisson).} \item{train.error}{A vector of length #' equal to the number of fitted trees containing the value of the loss #' function for each boosting iteration evaluated on the training data.} #' \item{valid.error}{A vector of length equal to the number of fitted trees #' containing the value of the loss function for each boosting iteration #' evaluated on the validation data.} \item{cv.error}{If \code{cv.folds} < 2 this #' component is \code{NULL}. Otherwise, this component is a vector of length equal to #' the number of fitted trees containing a cross-validated estimate of the loss #' function for each boosting iteration.} \item{oobag.improve}{A vector of #' length equal to the number of fitted trees containing an out-of-bag estimate #' of the marginal reduction in the expected value of the loss function. The #' out-of-bag estimate uses only the training data and is useful for estimating #' the optimal number of boosting iterations. See \code{\link{gbm.perf}}.} #' \item{trees}{A list containing the tree structures. The components are best #' viewed using \code{\link{pretty.gbm.tree}}.} \item{c.splits}{A list of all #' the categorical splits in the collection of trees. If the \code{trees[[i]]} #' component of a \code{gbm} object describes a categorical split then the #' splitting value will refer to a component of \code{c.splits}. 
That component
#' of \code{c.splits} will be a vector of length equal to the number of levels
#' in the categorical split variable. -1 indicates left, +1 indicates right,
#' and 0 indicates that the level was not present in the training data.}
#' \item{cv.fitted}{If cross-validation was performed, the cross-validation
#' predicted values on the scale of the linear predictor. That is, the fitted
#' values from the i-th CV-fold, for the model having been trained on the data
#' in all other folds.}
#'
#' @section Structure: The following components must be included in a
#' legitimate \code{gbm} object.
#'
#' @author Greg Ridgeway \email{gregridgeway@@gmail.com}
#'
#' @seealso \code{\link{gbm}}
#'
#' @keywords methods
#'
#' @name gbm.object
NULL
gbm/R/interact.gbm.R0000644000176200001440000001321614547111627013733 0ustar liggesusers#' Estimate the strength of interaction effects
#'
#' Computes Friedman's H-statistic to assess the strength of variable
#' interactions.
#'
#' @param x A \code{\link{gbm.object}} fitted using a call to \code{\link{gbm}}.
#'
#' @param data The dataset used to construct \code{x}. If the original dataset
#' is large, a random subsample may be used to accelerate the computation in
#' \code{interact.gbm}.
#'
#' @param i.var A vector of indices or the names of the variables for which to
#' compute the interaction effect. If using indices, the variables are indexed
#' in the same order that they appear in the initial \code{gbm} formula.
#'
#' @param n.trees The number of trees used to generate the plot. Only the first
#' \code{n.trees} trees will be used.
#'
#' @return Returns the value of \eqn{H}.
#'
#' @details
#' \code{interact.gbm} computes Friedman's H-statistic to assess the relative
#' strength of interaction effects in non-linear models. H is on the scale of
#' [0,1] with higher values indicating larger interaction effects.
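Usage can be sketched as follows (a minimal, hypothetical example: `gbm1`, `train_data`, and `best.iter` are assumed to come from a prior `gbm()` fit, as in the package examples):

```r
# Hypothetical sketch: H-statistic for the interaction between the first two
# predictors of a previously fitted model `gbm1`.
h <- interact.gbm(gbm1, data = train_data, i.var = c(1, 2), n.trees = best.iter)
# Values of h near 0 suggest little interaction; values near 1, a strong one.
```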
To connect
#' to a more familiar measure, if \eqn{x_1} and \eqn{x_2} are uncorrelated
#' covariates with mean 0 and variance 1 and the model is of the form
#' \deqn{y=\beta_0+\beta_1x_1+\beta_2x_2+\beta_3x_1x_2} then
#' \deqn{H=\frac{\beta_3}{\sqrt{\beta_1^2+\beta_2^2+\beta_3^2}}}
#'
#' Note that if the main effects are weak, the estimated H will be unstable.
#' For example, if (in the case of a two-way interaction) neither main effect
#' is in the selected model (relative influence is zero), the result will be
#' 0/0. Also, with weak main effects, rounding errors can result in values of H
#' > 1 which are not possible.
#'
#' @author Greg Ridgeway \email{gregridgeway@@gmail.com}
#' @seealso \code{\link{gbm}}, \code{\link{gbm.object}}
#' @references J.H. Friedman and B.E. Popescu (2005). \dQuote{Predictive
#' Learning via Rule Ensembles.} Section 8.1
#' @keywords methods
#' @export
interact.gbm <- function(x, data, i.var = 1, n.trees = x$n.trees){
  ###############################################################
  # Do sanity checks on the call
  if (x$interaction.depth < length(i.var)){
    stop("interaction.depth too low in model call")
  }
  if (all(is.character(i.var))){
    i <- match(i.var, x$var.names)
    if (any(is.na(i))) {
      stop("Variables given are not used in gbm model fit: ", i.var[is.na(i)])
    } else {
      i.var <- i
    }
  }
  if ((min(i.var) < 1) || (max(i.var) > length(x$var.names))) {
    warning("i.var must be between 1 and ", length(x$var.names))
  }
  if (n.trees > x$n.trees) {
    warning(paste("n.trees exceeds the number of trees in the model, ", x$n.trees,". 
Using ", x$n.trees, " trees.", sep = "")) n.trees <- x$n.trees } # End of sanity checks ############################################################### unique.tab <- function(z,i.var) { a <- unique(z[,i.var,drop=FALSE]) a$n <- table(factor(apply(z[,i.var,drop=FALSE],1,paste,collapse="\r"), levels=apply(a,1,paste,collapse="\r"))) return(a) } # convert factors for(j in i.var) { if(is.factor(data[,x$var.names[j]])) data[,x$var.names[j]] <- as.numeric(data[,x$var.names[j]])-1 } # generate a list with all combinations of variables a <- apply(expand.grid(rep(list(c(FALSE,TRUE)), length(i.var)))[-1,],1, function(x) as.numeric(which(x))) FF <- vector("list",length(a)) for(j in 1:length(a)) { FF[[j]]$Z <- data.frame(unique.tab(data, x$var.names[i.var[a[[j]]]])) FF[[j]]$n <- as.numeric(FF[[j]]$Z$n) FF[[j]]$Z$n <- NULL FF[[j]]$f <- .Call("gbm_plot", X = as.double(data.matrix(FF[[j]]$Z)), cRows = as.integer(nrow(FF[[j]]$Z)), cCols = as.integer(ncol(FF[[j]]$Z)), n.class = as.integer(x$num.classes), i.var = as.integer(i.var[a[[j]]] - 1), n.trees = as.integer(n.trees), initF = as.double(x$initF), trees = x$trees, c.splits = x$c.splits, var.type = as.integer(x$var.type), PACKAGE = "gbm") # FF[[jj]]$Z is the data, f is the predictions, n is the number of levels for factors # Need to restructure f to deal with multinomial case FF[[j]]$f <- matrix(FF[[j]]$f, ncol=x$num.classes, byrow=FALSE) # center the values FF[[j]]$f <- apply(FF[[j]]$f, 2, function(x, w){ x - weighted.mean(x, w, na.rm=TRUE) }, w=FF[[j]]$n) # precompute the sign of these terms to appear in H FF[[j]]$sign <- ifelse(length(a[[j]]) %% 2 == length(i.var) %% 2, 1, -1) } H <- FF[[length(a)]]$f for(j in 1:(length(a)-1)){ i1 <- apply(FF[[length(a)]]$Z[,a[[j]], drop=FALSE], 1, paste, collapse="\r") i2 <- apply(FF[[j]]$Z,1,paste,collapse="\r") i <- match(i1, i2) H <- H + with(FF[[j]], sign*f[i,]) } # Compute H w <- matrix(FF[[length(a)]]$n, ncol=1) f <- matrix(FF[[length(a)]]$f^2, ncol=x$num.classes, byrow=FALSE) top <- 
apply(H^2, 2, weighted.mean, w = w, na.rm = TRUE) btm <- apply(f, 2, weighted.mean, w = w, na.rm = TRUE) H <- top / btm if (x$distribution$name=="multinomial"){ names(H) <- x$classes } # If H > 1, rounding and tiny main effects have messed things up H[H > 1] <- NaN return(sqrt(H)) } gbm/R/gbm.fit.R0000644000176200001440000005652214547666333012724 0ustar liggesusers#' Generalized Boosted Regression Modeling (GBM) #' #' Workhorse function providing the link between R and the C++ gbm engine. #' \code{gbm} is a front-end to \code{gbm.fit} that uses the familiar R #' modeling formulas. However, \code{\link[stats]{model.frame}} is very slow if #' there are many predictor variables. For power-users with many variables use #' \code{gbm.fit}. For general practice \code{gbm} is preferable. #' #' @param x A data frame or matrix containing the predictor variables. The #' number of rows in \code{x} must be the same as the length of \code{y}. #' #' @param y A vector of outcomes. The number of rows in \code{x} must be the #' same as the length of \code{y}. #' #' @param offset A vector of offset values. #' #' @param misc An R object that is simply passed on to the gbm engine. It can be #' used for additional data for the specific distribution. Currently it is only #' used for passing the censoring indicator for the Cox proportional hazards #' model. #' #' @param distribution Either a character string specifying the name of the #' distribution to use or a list with a component \code{name} specifying the #' distribution and any additional parameters needed. If not specified, #' \code{gbm} will try to guess: if the response has only 2 unique values, #' bernoulli is assumed; otherwise, if the response is a factor, multinomial is #' assumed; otherwise, if the response has class \code{"Surv"}, coxph is #' assumed; otherwise, gaussian is assumed. 
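For illustration, a hedged sketch of the two equivalent ways to pass `distribution` (here `x` and `y` are assumed to be an already prepared predictor matrix and response vector):

```r
# Character shorthand and the explicit list form are interchangeable:
fit1 <- gbm.fit(x, y, distribution = "gaussian")
fit2 <- gbm.fit(x, y, distribution = list(name = "gaussian"))

# Distributions with extra parameters require the list form, e.g. quantile
# regression for the 25th percentile, or a t-distribution loss with df = 6:
fit3 <- gbm.fit(x, y, distribution = list(name = "quantile", alpha = 0.25))
fit4 <- gbm.fit(x, y, distribution = list(name = "tdist", df = 6))
```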
#' #' Currently available options are \code{"gaussian"} (squared error), #' \code{"laplace"} (absolute loss), \code{"tdist"} (t-distribution loss), #' \code{"bernoulli"} (logistic regression for 0-1 outcomes), #' \code{"huberized"} (huberized hinge loss for 0-1 outcomes), #' \code{"adaboost"} (the AdaBoost exponential loss for 0-1 outcomes), #' \code{"poisson"} (count outcomes), \code{"coxph"} (right censored #' observations), \code{"quantile"}, or \code{"pairwise"} (ranking measure #' using the LambdaMart algorithm). #' #' If quantile regression is specified, \code{distribution} must be a list of #' the form \code{list(name = "quantile", alpha = 0.25)} where \code{alpha} is #' the quantile to estimate. The current version's quantile regression method #' does not handle non-constant weights and will stop. #' #' If \code{"tdist"} is specified, the default degrees of freedom is 4 and #' this can be controlled by specifying #' \code{distribution = list(name = "tdist", df = DF)} where \code{DF} is your #' chosen degrees of freedom. #' #' If "pairwise" regression is specified, \code{distribution} must be a list of #' the form \code{list(name="pairwise",group=...,metric=...,max.rank=...)} #' (\code{metric} and \code{max.rank} are optional, see below). \code{group} is #' a character vector with the column names of \code{data} that jointly #' indicate the group an instance belongs to (typically a query in Information #' Retrieval applications). For training, only pairs of instances from the same #' group and with different target labels can be considered. 
\code{metric} is
#' the IR measure to use, one of
#' \describe{
#'   \item{\code{conc}}{Fraction of concordant pairs; for binary labels, this
#'   is equivalent to the Area under the ROC Curve}
#'   \item{\code{mrr}}{Mean reciprocal rank of the highest-ranked positive
#'   instance}
#'   \item{\code{map}}{Mean average precision, a generalization of \code{mrr}
#'   to multiple positive instances}
#'   \item{\code{ndcg}}{Normalized discounted cumulative gain. The score is
#'   the weighted sum (DCG) of the user-supplied target values, weighted
#'   by log(rank+1), and normalized to the maximum achievable value. This
#'   is the default if the user did not specify a metric.}
#' }
#'
#' \code{ndcg} and \code{conc} allow arbitrary target values, while binary
#' targets \{0,1\} are expected for \code{map} and \code{mrr}. For \code{ndcg}
#' and \code{mrr}, a cut-off can be chosen using a positive integer parameter
#' \code{max.rank}. If left unspecified, all ranks are taken into account.
#'
#' Note that splitting of instances into training and validation sets follows
#' group boundaries and therefore only approximates the specified
#' \code{train.fraction} ratio (the same applies to cross-validation folds).
#' Internally, queries are randomly shuffled before training to avoid bias.
#'
#' Weights can be used in conjunction with pairwise metrics; however, it is
#' assumed that they are constant for instances from the same group.
#'
#' For details and background on the algorithm, see e.g. Burges (2010).
#'
#' @param w A vector of weights of the same length as \code{y}.
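A sketch of a pairwise fit (the data frame `rank_data` and its columns `query` and `relevance` are hypothetical):

```r
# Hypothetical learning-to-rank setup: each row is a query/document pair,
# `query` identifies the group, `relevance` holds the graded target labels.
fit <- gbm(relevance ~ . - query, data = rank_data,
           distribution = list(name = "pairwise",
                               group = "query",   # grouping column(s)
                               metric = "ndcg",   # IR measure to optimize
                               max.rank = 10),    # NDCG cut-off rank
           n.trees = 500, shrinkage = 0.05)
```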
#'
#' @param var.monotone An optional vector, the same length as the number of
#' predictors, indicating which variables have a monotone increasing (+1),
#' decreasing (-1), or arbitrary (0) relationship with the outcome.
#'
#' @param n.trees The total number of trees to fit. This is equivalent to the
#' number of iterations and the number of basis functions in the additive
#' expansion.
#'
#' @param interaction.depth The maximum depth of variable interactions. A value
#' of 1 implies an additive model, a value of 2 implies a model with up to 2-way
#' interactions, etc. Default is \code{1}.
#'
#' @param n.minobsinnode Integer specifying the minimum number of observations
#' in each tree's terminal nodes. Note that this is the actual number of
#' observations, not the total weight.
#'
#' @param shrinkage The shrinkage parameter applied to each tree in the
#' expansion. Also known as the learning rate or step-size reduction; 0.001 to
#' 0.1 usually work, but a smaller learning rate typically requires more trees.
#' Default is \code{0.1}.
#'
#' @param bag.fraction The fraction of the training set observations randomly
#' selected to propose the next tree in the expansion. This introduces
#' randomness into the model fit. If \code{bag.fraction} < 1 then running the
#' same model twice will result in similar but different fits. \code{gbm} uses
#' the R random number generator, so \code{set.seed} can ensure that the model
#' can be reconstructed. Preferably, the user can save the returned
#' \code{\link{gbm.object}} using \code{\link{save}}. Default is \code{0.5}.
#'
#' @param nTrain An integer representing the number of cases on which to train.
#' This is the preferred way of specification for \code{gbm.fit}; the option
#' \code{train.fraction} in \code{gbm.fit} is deprecated and only maintained
#' for backward compatibility. These two parameters are mutually exclusive. If
#' both are unspecified, all data is used for training.
#'
#' @param train.fraction The first \code{train.fraction * nrow(data)}
#' observations are used to fit the \code{gbm} and the remainder are used for
#' computing out-of-sample estimates of the loss function.
#'
#' @param keep.data Logical indicating whether or not to keep the data and an
#' index of the data stored with the object. Keeping the data and index makes
#' subsequent calls to \code{\link{gbm.more}} faster at the cost of storing an
#' extra copy of the dataset.
#'
#' @param verbose Logical indicating whether or not to print out progress and
#' performance indicators (\code{TRUE}). If this option is left unspecified for
#' \code{gbm.more}, then it uses \code{verbose} from \code{object}. Default is
#' \code{FALSE}.
#'
#' @param var.names Vector of strings of length equal to the number of columns
#' of \code{x} containing the names of the predictor variables.
#'
#' @param response.name Character string label for the response variable.
#'
#' @param group The \code{group} to use when \code{distribution = "pairwise"}.
#'
#' @return A \code{\link{gbm.object}} object.
#'
#' @details
#' This package implements the generalized boosted modeling framework. Boosting
#' is the process of iteratively adding basis functions in a greedy fashion so
#' that each additional basis function further reduces the selected loss
#' function. This implementation closely follows Friedman's Gradient Boosting
#' Machine (Friedman, 2001).
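The relationship between `nTrain` and the deprecated `train.fraction` described above can be sketched as (hypothetical sizes; `x` and `y` are assumed prepared, with 1000 rows):

```r
# Preferred for gbm.fit: give the number of training rows directly.
fit_a <- gbm.fit(x, y, nTrain = 750)

# Deprecated equivalent: the first 75% of rows train the model and the
# remaining 25% provide the out-of-sample loss estimates.
fit_b <- gbm.fit(x, y, train.fraction = 0.75)  # emits a deprecation warning
```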
#' #' In addition to many of the features documented in the Gradient Boosting #' Machine, \code{gbm} offers additional features including the out-of-bag #' estimator for the optimal number of iterations, the ability to store and #' manipulate the resulting \code{gbm} object, and a variety of other loss #' functions that had not previously had associated boosting algorithms, #' including the Cox partial likelihood for censored data, the poisson #' likelihood for count outcomes, and a gradient boosting implementation to #' minimize the AdaBoost exponential loss function. #' #' @author Greg Ridgeway \email{gregridgeway@@gmail.com} #' #' Quantile regression code developed by Brian Kriegler #' \email{bk@@stat.ucla.edu} #' #' t-distribution, and multinomial code developed by Harry Southworth and #' Daniel Edwards #' #' Pairwise code developed by Stefan Schroedl \email{schroedl@@a9.com} #' #' @seealso \code{\link{gbm.object}}, \code{\link{gbm.perf}}, #' \code{\link{plot.gbm}}, \code{\link{predict.gbm}}, \code{\link{summary.gbm}}, #' and \code{\link{pretty.gbm.tree}}. #' #' @references #' Y. Freund and R.E. Schapire (1997) \dQuote{A decision-theoretic #' generalization of on-line learning and an application to boosting,} #' \emph{Journal of Computer and System Sciences,} 55(1):119-139. #' #' G. Ridgeway (1999). \dQuote{The state of boosting,} \emph{Computing Science #' and Statistics} 31:172-181. #' #' J.H. Friedman, T. Hastie, R. Tibshirani (2000). \dQuote{Additive Logistic #' Regression: a Statistical View of Boosting,} \emph{Annals of Statistics} #' 28(2):337-374. #' #' J.H. Friedman (2001). \dQuote{Greedy Function Approximation: A Gradient #' Boosting Machine,} \emph{Annals of Statistics} 29(5):1189-1232. #' #' J.H. Friedman (2002). \dQuote{Stochastic Gradient Boosting,} #' \emph{Computational Statistics and Data Analysis} 38(4):367-378. #' #' B. Kriegler (2007). Cost-Sensitive Stochastic Gradient Boosting Within a #' Quantitative Regression Framework. Ph.D. 
Dissertation. University of #' California at Los Angeles, Los Angeles, CA, USA. Advisor(s) Richard A. Berk. #' \url{https://dl.acm.org/doi/book/10.5555/1354603}. #' #' C. Burges (2010). \dQuote{From RankNet to LambdaRank to LambdaMART: An #' Overview,} Microsoft Research Technical Report MSR-TR-2010-82. #' #' @export gbm.fit <- function(x, y, offset = NULL, misc = NULL, distribution = "bernoulli", w = NULL, var.monotone = NULL, n.trees = 100, interaction.depth = 1, n.minobsinnode = 10, shrinkage = 0.001, bag.fraction = 0.5, nTrain = NULL, train.fraction = NULL, keep.data = TRUE, verbose = TRUE, var.names = NULL, response.name = "y", group = NULL) { # Reformat distribution into a named list if(is.character(distribution)) { distribution <- list(name = distribution) } # Dimensions of predictor data cRows <- nrow(x) cCols <- ncol(x) if(nrow(x) != ifelse(inherits(y, "Surv"), nrow(y), length(y))) { stop("The number of rows in x does not equal the length of y.") } # The preferred way to specify the number of training instances is via the # parameter `nTrain`. The parameter `train.fraction` is only maintained for # back compatibility. 
if(!is.null(nTrain) && !is.null(train.fraction)) { stop("Parameters `nTrain` and `train.fraction` cannot both be specified.") } else if(!is.null(train.fraction)) { warning("Parameter `train.fraction` is deprecated, please specify ", "`nTrain` instead.") nTrain <- floor(train.fraction*cRows) } else if(is.null(nTrain)) { nTrain <- cRows # both undefined, use all training data } if (is.null(train.fraction)){ train.fraction <- nTrain / cRows } # Extract var.names if NULL if(is.null(var.names)) { var.names <- getVarNames(x) } # Check size of data if(nTrain * bag.fraction <= 2 * n.minobsinnode + 1) { stop("The data set is too small or the subsampling rate is too large: ", "`nTrain * bag.fraction <= 2 * n.minobsinnode + 1`") } if (distribution$name != "pairwise") { w <- w * length(w) / sum(w) # normalize to N } # Sanity checks ch <- checkMissing(x, y) interaction.depth <- checkID(interaction.depth) w <- checkWeights(w, length(y)) offset <- checkOffset(offset, y) Misc <- NA # setup variable types var.type <- rep(0,cCols) var.levels <- vector("list",cCols) for(i in 1:length(var.type)) { if(all(is.na(x[,i]))) { stop("variable ",i,": ",var.names[i]," has only missing values.") } if(is.ordered(x[,i])) { var.levels[[i]] <- levels(factor(x[,i])) x[,i] <- as.numeric(factor(x[,i]))-1 var.type[i] <- 0 } else if(is.factor(x[,i])) { if(length(levels(x[,i]))>1024) stop("gbm does not currently handle categorical variables with more than 1024 levels. 
Variable ",i,": ",var.names[i]," has ",length(levels(x[,i]))," levels.")
      var.levels[[i]] <- levels(factor(x[,i]))
      x[,i] <- as.numeric(factor(x[,i]))-1
      var.type[i] <- max(x[,i],na.rm=TRUE)+1
    } else if(is.numeric(x[,i])) {
      var.levels[[i]] <- quantile(x[,i],prob=(0:10)/10,na.rm=TRUE)
    } else {
      stop("variable ",i,": ",var.names[i]," is not of type numeric, ordered, or factor.")
    }

    # check for some variation in each variable
    if(length(unique(var.levels[[i]])) == 1) {
      warning("variable ",i,": ",var.names[i]," has no variation.")
    }
  }

  nClass <- 1
  if(!("name" %in% names(distribution))) {
    stop("The distribution is missing a `name` component; for example, ",
         "distribution = list(name = \"gaussian\").")
  }
  supported.distributions <- getAvailableDistributions()
  distribution.call.name <- distribution$name

  # Check for potential problems with the distribution
  if(!is.element(distribution$name, supported.distributions)) {
    stop("Distribution ",distribution$name," is not supported")
  }
  if((distribution$name == "bernoulli") &&
     (!is.numeric(y) || !all(is.element(y, 0:1)))) {
    stop("Bernoulli requires the response to be numeric in {0,1}")
  }
  if((distribution$name == "huberized") &&
     (!is.numeric(y) || !all(is.element(y, 0:1)))) {
    stop("Huberized square hinged loss requires the response to be numeric in {0,1}")
  }
  if((distribution$name == "poisson") && any(y<0)) {
    stop("Poisson requires the response to be non-negative")
  }
  if((distribution$name == "poisson") && any(y != trunc(y))) {
    stop("Poisson requires the response to be integer-valued")
  }
  if((distribution$name == "adaboost") &&
     (!is.numeric(y) || !all(is.element(y, 0:1)))) {
    stop("This version of AdaBoost requires the response to be numeric in {0,1}")
  }
  if(distribution$name == "quantile") {
    if(length(unique(w)) > 1) {
      stop("This version of gbm for the quantile regression lacks a weighted quantile. 
For now the weights must be constant.") } if(is.null(distribution$alpha)) { stop("For quantile regression, the distribution parameter must be a list with a parameter 'alpha' indicating the quantile, for example list(name=\"quantile\",alpha=0.95).") } else { if((distribution$alpha < 0) || (distribution$alpha > 1)) { stop("alpha must be between 0 and 1.") } } Misc <- c(alpha=distribution$alpha) } if(distribution$name == "coxph") { if(!inherits(y, "Surv")) { stop("Outcome must be a survival object Surv(time,failure)") } if(attr(y,"type")!="right") { stop("gbm() currently only handles right censored observations") } Misc <- y[,2] y <- y[,1] # reverse sort the failure times to compute risk sets on the fly i.train <- order(-y[1:nTrain]) n.test <- cRows - nTrain if(n.test > 0) { i.test <- order(-y[(nTrain+1):cRows]) + nTrain } else { i.test <- NULL } i.timeorder <- c(i.train,i.test) y <- y[i.timeorder] Misc <- Misc[i.timeorder] x <- x[i.timeorder,,drop=FALSE] w <- w[i.timeorder] if(!is.na(offset)) offset <- offset[i.timeorder] } if(distribution$name == "tdist") { if (is.null(distribution$df) || !is.numeric(distribution$df)){ Misc <- 4 } else { Misc <- distribution$df[1] } } if (distribution$name == "multinomial") { ## Ensure that the training set contains all classes classes <- attr(factor(y), "levels") nClass <- length(classes) if (nClass > nTrain) { stop(paste("Number of classes (", nClass, ") must be less than the", " size of the training set (", nTrain, ").", sep = "")) } new.idx <- as.vector(sapply(classes, function(a,x){ min((1:length(x))[x==a]) }, y)) all.idx <- 1:length(y) new.idx <- c(new.idx, all.idx[!(all.idx %in% new.idx)]) y <- y[new.idx] x <- x[new.idx, ] w <- w[new.idx] if (!is.null(offset)) { offset <- offset[new.idx] } ## Get the factors y <- as.numeric(as.vector(outer(y, classes, "=="))) ## Fill out the weight and offset w <- rep(w, nClass) if (!is.null(offset)) { offset <- rep(offset, nClass) } } # close if (dist... 
== "multinomial" if(distribution$name == "pairwise") { distribution.metric <- distribution[["metric"]] if (!is.null(distribution.metric)) { distribution.metric <- tolower(distribution.metric) supported.metrics <- c("conc", "ndcg", "map", "mrr") if (!is.element(distribution.metric, supported.metrics)) { stop("Metric '", distribution.metric, "' is not supported, use either 'conc', 'ndcg', 'map', or 'mrr'") } metric <- distribution.metric } else { warning("No metric specified, using 'ndcg'") metric <- "ndcg" # default distribution[["metric"]] <- metric } if (any(y<0)) { stop("targets for 'pairwise' should be non-negative") } if (is.element(metric, c("mrr", "map")) && (!all(is.element(y, 0:1)))) { stop("Metrics 'map' and 'mrr' require the response to be in {0,1}") } # Cut-off rank for metrics # Default of 0 means no cutoff max.rank <- 0 if (!is.null(distribution[["max.rank"]]) && distribution[["max.rank"]] > 0) { if (is.element(metric, c("ndcg", "mrr"))) { max.rank <- distribution[["max.rank"]] } else { stop("Parameter 'max.rank' cannot be specified for metric '", distribution.metric, "', only supported for 'ndcg' and 'mrr'") } } # We pass the cut-off rank to the C function as the last element in the Misc vector Misc <- c(group, max.rank) distribution.call.name <- sprintf("pairwise_%s", metric) } # close if (dist... == "pairwise" # create index upfront... 
subtract one for 0 based order x.order <- apply(x[1:nTrain,,drop=FALSE],2,order,na.last=FALSE)-1 x <- as.vector(data.matrix(x)) predF <- rep(0,length(y)) train.error <- rep(0,n.trees) valid.error <- rep(0,n.trees) oobag.improve <- rep(0,n.trees) if(is.null(var.monotone)) { var.monotone <- rep(0,cCols) } else if(length(var.monotone)!=cCols) { stop("Length of var.monotone != number of predictors") } else if(!all(is.element(var.monotone,-1:1))) { stop("var.monotone must be -1, 0, or 1") } fError <- FALSE gbm.obj <- .Call("gbm_fit", Y=as.double(y), Offset=as.double(offset), X=as.double(x), X.order=as.integer(x.order), weights=as.double(w), Misc=as.double(Misc), cRows=as.integer(cRows), cCols=as.integer(cCols), var.type=as.integer(var.type), var.monotone=as.integer(var.monotone), distribution=as.character(distribution.call.name), n.trees=as.integer(n.trees), interaction.depth=as.integer(interaction.depth), n.minobsinnode=as.integer(n.minobsinnode), n.classes = as.integer(nClass), shrinkage=as.double(shrinkage), bag.fraction=as.double(bag.fraction), nTrain=as.integer(nTrain), fit.old=as.double(NA), n.cat.splits.old=as.integer(0), n.trees.old=as.integer(0), verbose=as.integer(verbose), PACKAGE = "gbm") names(gbm.obj) <- c("initF","fit","train.error","valid.error", "oobag.improve","trees","c.splits") gbm.obj$bag.fraction <- bag.fraction gbm.obj$distribution <- distribution gbm.obj$interaction.depth <- interaction.depth gbm.obj$n.minobsinnode <- n.minobsinnode gbm.obj$num.classes <- nClass gbm.obj$n.trees <- length(gbm.obj$trees) / nClass gbm.obj$nTrain <- nTrain gbm.obj$train.fraction <- train.fraction gbm.obj$response.name <- response.name gbm.obj$shrinkage <- shrinkage gbm.obj$var.levels <- var.levels gbm.obj$var.monotone <- var.monotone gbm.obj$var.names <- var.names gbm.obj$var.type <- var.type gbm.obj$verbose <- verbose gbm.obj$Terms <- NULL if(distribution$name == "coxph") { gbm.obj$fit[i.timeorder] <- gbm.obj$fit } ## If K-Classification is used then split the fit 
and tree components if (distribution$name == "multinomial") { gbm.obj$fit <- matrix(gbm.obj$fit, ncol = nClass) dimnames(gbm.obj$fit)[[2]] <- classes gbm.obj$classes <- classes ## Also get the class estimators exp.f <- exp(gbm.obj$fit) denom <- matrix(rep(rowSums(exp.f), nClass), ncol = nClass) gbm.obj$estimator <- exp.f/denom } if(keep.data) { if(distribution$name == "coxph") { # Put the observations back in order gbm.obj$data <- list( y = y, x = x, x.order = x.order, offset = offset, Misc = Misc, w = w, i.timeorder = i.timeorder ) } else if ( distribution$name == "multinomial" ) { # Restore original order of the data new.idx <- order(new.idx) gbm.obj$data <- list( y = as.vector(matrix(y, ncol = length(classes), byrow = FALSE)[new.idx, ]), x = as.vector(matrix(x, ncol = length(var.names), byrow = FALSE)[new.idx, ]), x.order = x.order, offset = offset[new.idx], Misc = Misc, w = w[new.idx] ) } else { gbm.obj$data <- list( y = y, x = x, x.order = x.order, offset = offset, Misc = Misc, w = w ) } } else { gbm.obj$data <- NULL } # Reuturn object of class "gbm" class(gbm.obj) <- "gbm" gbm.obj } gbm/R/zzz.R0000644000176200001440000000045214547261506012213 0ustar liggesusers#' @keywords internal .onAttach <- function(lib, pkg) { vers <- utils::packageVersion("gbm") packageStartupMessage("Loaded gbm ", vers) packageStartupMessage("This version of gbm is no longer under development. Consider transitioning to gbm3, https://github.com/gbm-developers/gbm3") } gbm/R/relative.influence.R0000644000176200001440000001477114547111627015147 0ustar liggesusers#' Methods for estimating relative influence #' #' Helper functions for computing the relative influence of each variable in #' the gbm object. #' #' @details #' This is not intended for end-user use. These functions offer the different #' methods for computing the relative influence in \code{\link{summary.gbm}}. #' \code{gbm.loss} is a helper function for \code{permutation.test.gbm}. 
#'
#' @aliases relative.influence permutation.test.gbm gbm.loss
#'
#' @param object a \code{gbm} object created from an initial call to
#' \code{\link{gbm}}.
#'
#' @param n.trees the number of trees to use for computations. If not provided,
#' the function will guess: if a test set was used in fitting, the number
#' of trees resulting in lowest test set error will be used; otherwise, if
#' cross-validation was performed, the number of trees resulting in lowest
#' cross-validation error will be used; otherwise, all trees will be used.
#'
#' @param scale. whether or not the result should be scaled. Defaults to
#' \code{FALSE}.
#'
#' @param sort. whether or not the results should be (reverse) sorted.
#' Defaults to \code{FALSE}.
#'
#' @param y,f,w,offset,dist,baseline For \code{gbm.loss}: These components are
#' the outcome, predicted value, observation weight, offset, distribution, and
#' comparison loss function, respectively.
#'
#' @param group,max.rank Used internally when \code{distribution = "pairwise"}.
#'
#' @return By default, returns an unprocessed vector of estimated relative
#' influences. If the \code{scale.} and \code{sort.} arguments are used,
#' returns a processed version of the same.
#'
#' @author Greg Ridgeway \email{gregridgeway@@gmail.com}
#'
#' @seealso \code{\link{summary.gbm}}
#'
#' @references J.H. Friedman (2001). "Greedy Function Approximation: A Gradient
#' Boosting Machine," Annals of Statistics 29(5):1189-1232.
#'
#' L. Breiman (2001).
#' \url{https://www.stat.berkeley.edu/users/breiman/randomforest2001.pdf}.
#'
#' @keywords hplot
#'
#' @rdname relative.influence
#'
#' @export
relative.influence <- function(object, n.trees, scale. = FALSE, sort. = FALSE) {

  if( missing( n.trees ) ){
    if ( object$train.fraction < 1 ){
      n.trees <- gbm.perf( object, method="test", plot.it=FALSE )
    } else if ( !is.null( object$cv.error ) ){
      n.trees <- gbm.perf( object, method="cv", plot.it = FALSE )
    } else {
      # If dist=multinomial, object$n.trees = n.trees * num.classes
      # so use the following instead.
      n.trees <- length( object$train.error )
    }
    cat( paste( "n.trees not given. Using", n.trees, "trees.\n" ) )
  }
  if (object$distribution$name == "multinomial") {
    n.trees <- n.trees * object$num.classes
  }
  get.rel.inf <- function(obj) {
    lapply(split(obj[[6]],obj[[1]]),sum) # 6 - Improvement, 1 - var name
  }

  temp <- unlist(lapply(object$trees[1:n.trees],get.rel.inf))
  rel.inf.compact <- unlist(lapply(split(temp,names(temp)),sum))
  rel.inf.compact <- rel.inf.compact[names(rel.inf.compact)!="-1"]

  # rel.inf.compact excludes those variables that never entered the model
  # insert 0's for the excluded variables
  rel.inf <- rep(0,length(object$var.names))
  i <- as.numeric(names(rel.inf.compact))+1
  rel.inf[i] <- rel.inf.compact
  names(rel.inf) <- object$var.names

  if (scale.){
    rel.inf <- rel.inf / max(rel.inf)
  }
  if (sort.){
    rel.inf <- rev(sort(rel.inf))
  }

  return(rel.inf=rel.inf)
}

#' @rdname relative.influence
#' @export
permutation.test.gbm <- function(object, n.trees) {
  # get variables used in the model
  i.vars <- sort(unique(unlist(lapply(object$trees[1:n.trees],
                                      function(x){unique(x[[1]])}))))
  i.vars <- i.vars[i.vars!=-1] + 1
  rel.inf <- rep(0,length(object$var.names))

  if(!is.null(object$data)) {
    y <- object$data$y
    os <- object$data$offset
    Misc <- object$data$Misc
    w <- object$data$w
    x <- matrix(object$data$x, ncol=length(object$var.names))
    object$Terms <- NULL # this makes predict.gbm take x as it is

    if (object$distribution$name == "pairwise") {
      # group and cutoff are only relevant for distribution "pairwise"
      # in this case, the last element specifies the max rank
      # max rank = 0 means no cut off
      group <- Misc[1:length(y)]
      max.rank <- Misc[length(y)+1]
    }
  } else {
    stop("Model was fit with keep.data=FALSE. permutation.test.gbm has not been implemented for that case.")
  }

  # the index shuffler
  j <- sample(1:nrow(x))
  for(i in 1:length(i.vars)) {
    x[ ,i.vars[i]] <- x[j,i.vars[i]]

    new.pred <- predict.gbm(object,newdata=x,n.trees=n.trees)
    rel.inf[i.vars[i]] <- gbm.loss(y,new.pred,w,os,
                                   object$distribution,
                                   object$train.error[n.trees],
                                   group,
                                   max.rank)

    x[j,i.vars[i]] <- x[ ,i.vars[i]]
  }

  return(rel.inf=rel.inf)
}

#' @rdname relative.influence
#' @export
gbm.loss <- function(y, f, w, offset, dist, baseline, group=NULL,
                     max.rank=NULL) {
  if (!is.na(offset)) {
    f <- offset+f
  }

  if (dist$name != "pairwise") {
    switch(dist$name,
           gaussian = weighted.mean((y - f)^2,w) - baseline,
           bernoulli = -2*weighted.mean(y*f - log(1+exp(f)),w) - baseline,
           laplace = weighted.mean(abs(y-f),w) - baseline,
           adaboost = weighted.mean(exp(-(2*y-1)*f),w) - baseline,
           poisson = -2*weighted.mean(y*f-exp(f),w) - baseline,
           stop(paste("Distribution",dist$name,
                      "is not yet supported for method=permutation.test.gbm")))
  } else { # dist$name == "pairwise"
    if (is.null(dist$metric)) {
      stop("No metric specified for distribution 'pairwise'")
    }
    if (!is.element(dist$metric, c("conc", "ndcg", "map", "mrr"))) {
      stop("Invalid metric '", dist$metric,
           "' specified for distribution 'pairwise'")
    }
    if (is.null(group)) {
      stop("For distribution 'pairwise', parameter 'group' has to be supplied")
    }
    # Loss = 1 - utility
    (1 - perf.pairwise(y, f, group, dist$metric, w, max.rank)) - baseline
  }
}
gbm/R/gbmCrossVal.R0000644000176200001440000002033614547636654013605 0ustar liggesusers#' Cross-validate a gbm
#'
#' Functions for cross-validating gbm. These functions are used internally and
#' are not intended for end-user direct usage.
#'
#' These functions are not intended for end-user direct usage, but are used
#' internally by \code{gbm}.
#'
#' @aliases gbmCrossVal gbmCrossValModelBuild gbmDoFold gbmCrossValErr
#' gbmCrossValPredictions
#' @param cv.folds The number of cross-validation folds.
#' @param nTrain The number of training samples.
#' @param n.cores The number of cores to use.
#' @param class.stratify.cv Whether or not stratified cross-validation samples
#' are used.
#' @param data The data.
#' @param x The model matrix.
#' @param y The response variable.
#' @param offset The offset.
#' @param distribution The type of loss function. See \code{\link{gbm}}.
#' @param w Observation weights.
#' @param var.monotone See \code{\link{gbm}}.
#' @param n.trees The number of trees to fit.
#' @param interaction.depth The degree of allowed interactions. See
#' \code{\link{gbm}}.
#' @param n.minobsinnode See \code{\link{gbm}}.
#' @param shrinkage See \code{\link{gbm}}.
#' @param bag.fraction See \code{\link{gbm}}.
#' @param var.names See \code{\link{gbm}}.
#' @param response.name See \code{\link{gbm}}.
#' @param group Used when \code{distribution = "pairwise"}. See
#' \code{\link{gbm}}.
#' @param i.train Items in the training set.
#' @param cv.models A list containing the models for each fold.
#' @param cv.group A vector indicating the cross-validation fold for each
#' member of the training set.
#' @param best.iter.cv The iteration with lowest cross-validation error.
#' @param X Index (cross-validation fold) on which to subset.
#' @param s Random seed.
#' @return A list containing the cross-validation error and predictions.
#' @author Greg Ridgeway \email{gregridgeway@@gmail.com}
#' @seealso \code{\link{gbm}}
#' @references J.H. Friedman (2001). "Greedy Function Approximation: A Gradient
#' Boosting Machine," Annals of Statistics 29(5):1189-1232.
#'
#' L. Breiman (2001).
#' \url{https://www.stat.berkeley.edu/users/breiman/randomforest2001.pdf}.
#' @keywords models

# Perform gbm cross-validation
#
# This function has far too many arguments, but there isn't the
# abstraction in gbm to lose them.
#' @rdname gbmCrossVal
#' @export
gbmCrossVal <- function(cv.folds, nTrain, n.cores, class.stratify.cv, data,
                        x, y, offset, distribution, w, var.monotone,
                        n.trees, interaction.depth, n.minobsinnode, shrinkage,
                        bag.fraction, var.names, response.name, group) {
  i.train <- 1:nTrain
  cv.group <- getCVgroup(distribution, class.stratify.cv, y, i.train,
                         cv.folds, group)
  ## build the models
  cv.models <- gbmCrossValModelBuild(cv.folds, cv.group, n.cores, i.train, x,
                                     y, offset, distribution, w, var.monotone,
                                     n.trees, interaction.depth,
                                     n.minobsinnode, shrinkage, bag.fraction,
                                     var.names, response.name, group)
  ## get the errors
  cv.error <- gbmCrossValErr(cv.models, cv.folds, cv.group, nTrain, n.trees)
  best.iter.cv <- which.min(cv.error)
  ## get the predictions
  predictions <- gbmCrossValPredictions(cv.models, cv.folds, cv.group,
                                        best.iter.cv, distribution,
                                        data[i.train, ], y)
  list(error = cv.error, predictions = predictions)
}

# Get the gbm cross-validation error
#' @rdname gbmCrossVal
#' @export
gbmCrossValErr <- function(cv.models, cv.folds, cv.group, nTrain, n.trees) {
  in.group <- tabulate(cv.group, nbins = cv.folds)
  cv.error <- vapply(1:cv.folds, function(index) {
    model <- cv.models[[index]]
    model$valid.error * in.group[[index]]
  }, double(n.trees))
  ## this is now a (n.trees, cv.folds) matrix
  ## and now a n.trees vector
  rowSums(cv.error) / nTrain
}

#' @rdname gbmCrossVal
#' @export
gbmCrossValPredictions <- function(cv.models, cv.folds, cv.group,
                                   best.iter.cv, distribution, data, y) {
  # Get the predictions for GBM cross validation. This function is not as
  # nice as it could be (i.e., leakage of y)

  # Test that cv.group and data match
  if (nrow(data) != length(cv.group)) {
    stop("Mismatch between `data` and `cv.group`.")
  }

  # This is a little complicated due to multinomial distribution
  num.cols <- if (distribution$name == "multinomial") {
    nlevels(factor(y))
  } else {
    1
  }

  # Initialize results matrix
  res <- matrix(nrow = nrow(data), ncol = num.cols)

  # There's no real reason to do this as other than a for loop
  data.names <- names(data)  # column names
  for (ind in 1:cv.folds) {

    # These are the particular elements
    flag <- cv.group == ind
    model <- cv.models[[ind]]

    # The %in% here is to handle coxph
    # my.data <- data[flag, !(data.names %in% model$response.name)]
    my.data <- data[flag, model$var.names, drop=FALSE]
    predictions <- predict(model, newdata = my.data, n.trees = best.iter.cv)  # FIXME
    predictions <- matrix(predictions, ncol = num.cols)
    res[flag, ] <- predictions

  }

  # Handle multinomial case
  if (distribution$name != "multinomial") {
    res <- as.numeric(res)
  }

  # Return the result
  res
}

# Perform gbm cross-validation
#
# This function has far too many arguments.
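The fold weighting in gbmCrossValErr can be illustrated in isolation. The snippet below is a sketch, not package code: the `cv.models` list and its `valid.error` vectors are fabricated stand-ins for fitted per-fold models, but the aggregation logic (scale each fold's error curve by its fold size, sum across folds, divide by the training-set size) mirrors the function above.

```r
# Toy stand-ins: 3 folds of 3 observations each, 4 trees per model.
cv.folds <- 3; n.trees <- 4; nTrain <- 9
cv.group <- c(1, 2, 3, 1, 2, 3, 1, 2, 3)
cv.models <- lapply(1:cv.folds, function(i) list(valid.error = rep(i, n.trees)))

in.group <- tabulate(cv.group, nbins = cv.folds)  # fold sizes: 3 3 3
cv.error <- vapply(1:cv.folds, function(index) {
  # scale each fold's per-tree error curve by that fold's size
  cv.models[[index]]$valid.error * in.group[[index]]
}, double(n.trees))
# cv.error is an (n.trees, cv.folds) matrix; the row sums divided by nTrain
# give the size-weighted mean CV error per tree
rowSums(cv.error) / nTrain  # 2 2 2 2
```

With equal fold sizes this reduces to a plain mean of the fold curves; unequal folds (e.g., stratified splits) are what make the size weighting necessary.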
#' @rdname gbmCrossVal
#' @export
gbmCrossValModelBuild <- function(cv.folds, cv.group, n.cores, i.train, x, y,
                                  offset, distribution, w, var.monotone,
                                  n.trees, interaction.depth, n.minobsinnode,
                                  shrinkage, bag.fraction, var.names,
                                  response.name, group) {

  # Set random seeds
  seeds <- as.integer(runif(cv.folds, -(2^31 - 1), 2^31))

  # Perform cross-validation model builds
  if (!is.null(n.cores) && n.cores == 1) {
    lapply(1:cv.folds, FUN = gbmDoFold, i.train, x, y, offset, distribution,
           w, var.monotone, n.trees, interaction.depth, n.minobsinnode,
           shrinkage, bag.fraction, cv.group, var.names, response.name, group,
           seeds)
  } else {
    # Set up cluster and add finalizer
    cluster <- gbmCluster(n.cores)
    on.exit(parallel::stopCluster(cluster))
    parallel::parLapply(cl = cluster, X = 1:cv.folds, fun = gbmDoFold,
                        i.train, x, y, offset, distribution, w, var.monotone,
                        n.trees, interaction.depth, n.minobsinnode, shrinkage,
                        bag.fraction, cv.group, var.names, response.name,
                        group, seeds)
  }

}

#' @rdname gbmCrossVal
#' @export
gbmDoFold <- function(X, i.train, x, y, offset, distribution, w, var.monotone,
                      n.trees, interaction.depth, n.minobsinnode, shrinkage,
                      bag.fraction, cv.group, var.names, response.name, group,
                      s) {

  # Do specified cross-validation fold---a self-contained function for passing
  # to individual cores.
  # Load required packages for core
  library(gbm, quietly = TRUE)

  # Print CV information
  cat("CV:", X, "\n")

  # Setup
  set.seed(s[[X]])
  i <- order(cv.group == X)
  x <- x[i.train, , drop = FALSE][i, , drop = FALSE]
  y <- y[i.train][i]
  offset <- offset[i.train][i]
  nTrain <- length(which(cv.group != X))
  group <- group[i.train][i]

  # Return a fitted GBM
  gbm.fit(x = x, y = y, offset = offset, distribution = distribution, w = w,
          var.monotone = var.monotone, n.trees = n.trees,
          interaction.depth = interaction.depth,
          n.minobsinnode = n.minobsinnode, shrinkage = shrinkage,
          bag.fraction = bag.fraction, nTrain = nTrain, keep.data = FALSE,
          verbose = FALSE, response.name = response.name, group = group)

}
gbm/R/calibrate.plot.R0000644000176200001440000001360414547111627014262 0ustar liggesusers#' Quantile rug plot
#'
#' Marks the quantiles on the axes of the current plot.
#'
#' @param x A numeric vector.
#'
#' @param prob The quantiles of x to mark on the x-axis.
#'
#' @param ... Additional optional arguments to be passed onto
#' \code{\link[graphics]{rug}}
#'
#' @return No return values.
#'
#' @author Greg Ridgeway \email{gregridgeway@@gmail.com}.
#'
#' @seealso \code{\link[graphics:plot.default]{plot}}, \code{\link[stats]{quantile}},
#' \code{\link[base]{jitter}}, \code{\link[graphics]{rug}}.
#'
#' @keywords aplot
#'
#' @export quantile.rug
#'
#' @examples
#' x <- rnorm(100)
#' y <- rnorm(100)
#' plot(x, y)
#' quantile.rug(x)
quantile.rug <- function(x, prob = 0:10/10, ...) {
  quants <- quantile(x[!is.na(x)], prob = prob)
  if(length(unique(quants)) < length(prob)) {
    quants <- jitter(quants)
  }
  rug(quants, ...)
}


#' Calibration plot
#'
#' An experimental diagnostic tool that plots the fitted values versus the
#' actual average values. Currently only available when
#' \code{distribution = "bernoulli"}.
#'
#' Uses natural splines to estimate E(y|p). Well-calibrated predictions imply
#' that E(y|p) = p. The plot also includes a pointwise 95% confidence band.
#'
#' @param y The outcome 0-1 variable.
#'
#' @param p The predictions estimating E(y|x).
#'
#' @param distribution The loss function used in creating \code{p}.
#' \code{bernoulli} and \code{poisson} are currently the only special options.
#' All others default to squared error assuming \code{gaussian}.
#'
#' @param replace Determines whether this plot will replace or overlay the
#' current plot. \code{replace=FALSE} is useful for comparing the calibration
#' of several methods.
#'
#' @param line.par Graphics parameters for the line.
#'
#' @param shade.col Color for shading the 2 SE region. \code{shade.col=NA}
#' implies no 2 SE region.
#'
#' @param shade.density The \code{density} parameter for \code{\link{polygon}}.
#'
#' @param rug.par Graphics parameters passed to \code{\link{rug}}.
#'
#' @param xlab x-axis label corresponding to the predicted values.
#'
#' @param ylab y-axis label corresponding to the observed average.
#'
#' @param xlim,ylim x- and y-axis limits. If not specified the function will
#' select limits.
#'
#' @param knots,df These parameters are passed directly to
#' \code{\link[splines]{ns}} for constructing a natural spline smoother for the
#' calibration curve.
#'
#' @param ... Additional optional arguments to be passed onto
#' \code{\link[graphics:plot.default]{plot}}
#'
#' @return No return values.
#'
#' @author Greg Ridgeway \email{gregridgeway@@gmail.com}
#'
#' @references
#' J.F. Yates (1982). "External correspondence: decomposition of
#' the mean probability score," Organisational Behaviour and Human Performance
#' 30:132-156.
#'
#' D.J. Spiegelhalter (1986). "Probabilistic Prediction in Patient Management
#' and Clinical Trials," Statistics in Medicine 5:421-433.
#' @keywords hplot
#'
#' @export
#'
#' @examples
#' # Don't want R CMD check to think there is a dependency on rpart
#' # so comment out the example
#' #library(rpart)
#' #data(kyphosis)
#' #y <- as.numeric(kyphosis$Kyphosis)-1
#' #x <- kyphosis$Age
#' #glm1 <- glm(y~poly(x,2),family=binomial)
#' #p <- predict(glm1,type="response")
#' #calibrate.plot(y, p, xlim=c(0,0.6), ylim=c(0,0.6))
calibrate.plot <- function(y, p, distribution = "bernoulli", replace = TRUE,
                           line.par = list(col = "black"),
                           shade.col = "lightyellow", shade.density = NULL,
                           rug.par = list(side = 1), xlab = "Predicted value",
                           ylab = "Observed average", xlim = NULL, ylim = NULL,
                           knots = NULL, df = 6, ...) {

  # Sanity check
  if (!requireNamespace("splines", quietly = TRUE)) {
    stop("The splines package is needed for this function to work. Please ",
         "install it.", call. = FALSE)
  }
  data <- data.frame(y = y, p = p)

  # Check spline parameters
  if(is.null(knots) && is.null(df)) {
    stop("Either knots or df must be specified")
  }
  if((df != round(df)) || (df < 1)) {
    stop("df must be a positive integer")
  }

  # Check distribution
  if(distribution == "bernoulli") {
    family1 <- binomial
  } else if(distribution == "poisson") {
    family1 <- poisson
  } else {
    family1 <- gaussian
  }

  # Fit a GLM using natural cubic splines
  gam1 <- glm(y ~ splines::ns(p, df = df, knots = knots), data = data,
              family = family1)

  # Plotting data
  x <- seq(min(p), max(p), length = 200)
  yy <- predict(gam1, newdata = data.frame(p = x), se.fit = TRUE,
                type = "response")
  x <- x[!is.na(yy$fit)]
  yy$se.fit <- yy$se.fit[!is.na(yy$fit)]
  yy$fit <- yy$fit[!is.na(yy$fit)]

  # Plotting parameters
  if(!is.na(shade.col)) {
    se.lower <- yy$fit - 2 * yy$se.fit
    se.upper <- yy$fit + 2 * yy$se.fit
    if(distribution == "bernoulli") {
      se.lower[se.lower < 0] <- 0
      se.upper[se.upper > 1] <- 1
    }
    if(distribution == "poisson") {
      se.lower[se.lower < 0] <- 0
    }
    if(is.null(xlim)) {
      xlim <- range(se.lower, se.upper, x)
    }
    if(is.null(ylim)) {
      ylim <- range(se.lower, se.upper, x)
    }
  } else {
    if(is.null(xlim)) {
      xlim <- range(yy$fit,x)
    }
    if(is.null(ylim)) {
      ylim <- range(yy$fit,x)
    }
  }

  # Construct plot
  if(replace) {
    plot(0, 0, type = "n", xlab = xlab, ylab = ylab, xlim = xlim,
         ylim = ylim, ...)
  }
  if(!is.na(shade.col)) {
    polygon(c(x, rev(x), x[1L]), c(se.lower, rev(se.upper), se.lower[1L]),
            col = shade.col, border = NA, density = shade.density)
  }
  lines(x, yy$fit, col = line.par$col)
  quantile.rug(p, side = rug.par$side)
  abline(0, 1, col = "red")
}
gbm/R/utils.R0000644000176200001440000000767614547646516012535 0ustar liggesusers#' @keywords internal
getAvailableDistributions <- function() {
  c("adaboost", "bernoulli", "coxph", "gaussian", "huberized", "laplace",
    "multinomial", "pairwise", "poisson", "quantile", "tdist")
}

#' @keywords internal
guess_error_method <- function(object) {
  if (has_train_test_split(object)) {
    "test"
  } else if (has_cross_validation(object)) {
    "cv"
  } else {
    "OOB"
  }
}

#' @keywords internal
has_train_test_split <- function(object) {
  object$train.fraction < 1
}

#' @keywords internal
has_cross_validation <- function(object) {
  !is.null(object$cv.error)
}

#' @keywords internal
best_iter <- function(object, method) {
  check_if_gbm_fit(object)
  if (method == "OOB") {
    best_iter_out_of_bag(object)
  } else if (method == "test") {
    best_iter_test(object)
  } else if (method == "cv") {
    best_iter_cv(object)
  } else {
    stop("method must be one of \"cv\", \"test\", or \"OOB\"")
  }
}

#' @keywords internal
best_iter_test <- function(object) {
  check_if_gbm_fit(object)
  best_iter_test <- which.min(object$valid.error)
  return(best_iter_test)
}

#' @keywords internal
best_iter_cv <- function(object) {
  check_if_gbm_fit(object)
  if(!has_cross_validation(object)) {
    stop('In order to use method="cv" gbm must be called with cv.folds>1.')
  }
  best_iter_cv <- which.min(object$cv.error)
  return(best_iter_cv)
}

#' @keywords internal
best_iter_out_of_bag <- function(object) {
  check_if_gbm_fit(object)
  if(object$bag.fraction == 1) {
    stop("Cannot compute OOB estimate or the OOB curve when bag.fraction=1.")
  }
  if(all(!is.finite(object$oobag.improve))) {
    stop("Cannot compute OOB estimate or the OOB curve. No finite OOB ",
         "estimates of improvement.")
  }
  message("OOB generally underestimates the optimal number of iterations ",
          "although predictive performance is reasonably competitive. Using ",
          "cv.folds>1 when calling gbm usually results in improved predictive ",
          "performance.")
  smoother <- generate_smoother_oobag(object)
  best_iter_oob <- smoother$x[which.min(-cumsum(smoother$y))]
  attr(best_iter_oob, "smoother") <- smoother
  return(best_iter_oob)
}

#' @keywords internal
generate_smoother_oobag <- function(object) {
  check_if_gbm_fit(object)
  x <- seq_len(object$n.trees)
  smoother <- loess(object$oobag.improve ~ x,
                    enp.target = min(max(4, length(x) / 10), 50))
  smoother$y <- smoother$fitted
  smoother$x <- x
  return(smoother)
}

#' @keywords internal
check_if_gbm_fit <- function(object) {
  if (!inherits(object, "gbm")) {
    stop(deparse(substitute(object)), " is not a valid \"gbm\" object.")
  }
}

#' @keywords internal
get_ylab <- function(object) {
  check_if_gbm_fit(object)
  if (object$distribution$name != "pairwise") {
    switch(substring(object$distribution$name, 1, 2),
           ga = "Squared error loss",
           be = "Bernoulli deviance",
           po = "Poisson deviance",
           ad = "AdaBoost exponential bound",
           co = "Cox partial deviance",
           la = "Absolute loss",
           qu = "Quantile loss",
           mu = "Multinomial deviance",
           td = "t-distribution deviance")
  } else {
    switch(object$distribution$metric,
           conc = "Fraction of concordant pairs",
           ndcg = "Normalized discounted cumulative gain",
           map = "Mean average precision",
           mrr = "Mean reciprocal rank")
  }
}

#' @keywords internal
get_ylim <- function(object, method) {
  check_if_gbm_fit(object)
  if(object$train.fraction == 1) {
    if ( method=="cv" ) {
      range(object$train.error, object$cv.error)
    } else if ( method == "test" ) {
      range( object$train.error, object$valid.error)
    } else {
      range(object$train.error)
    }
  } else {
    range(object$train.error, object$valid.error)
  }
}
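The decision rule implemented by guess_error_method and its helpers can be exercised without a fitted model. The snippet below is a standalone sketch, not package code: the plain lists are mock stand-ins for "gbm" objects carrying only the two fields the rule inspects (prefer a held-out test set, then cross-validation, and fall back to OOB).

```r
# Mock version of guess_error_method()'s rule on bare lists.
guess <- function(object) {
  if (object$train.fraction < 1) "test"        # has_train_test_split()
  else if (!is.null(object$cv.error)) "cv"     # has_cross_validation()
  else "OOB"
}

guess(list(train.fraction = 0.8, cv.error = NULL))       # "test"
guess(list(train.fraction = 1, cv.error = c(1.2, 0.9)))  # "cv"
guess(list(train.fraction = 1, cv.error = NULL))         # "OOB"
```

Note the precedence: a model fit with both a train/test split and cross-validation reports "test", matching the order of the if/else chain above.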
gbm/R/plot.gbm.R0000644000176200001440000003074714547111627013110 0ustar liggesusers#' Marginal plots of fitted gbm objects
#'
#' Plots the marginal effect of the selected variables by "integrating" out the
#' other variables.
#'
#' \code{plot.gbm} produces low dimensional projections of the
#' \code{\link{gbm.object}} by integrating out the variables not included in
#' the \code{i.var} argument. The function selects a grid of points and uses
#' the weighted tree traversal method described in Friedman (2001) to do the
#' integration. Based on the variable types included in the projection,
#' \code{plot.gbm} selects an appropriate display choosing amongst line plots,
#' contour plots, and \code{\link[lattice:Lattice]{lattice}} plots. If the default
#' graphics are not sufficient the user may set \code{return.grid = TRUE}, store
#' the result of the function, and develop another graphic display more
#' appropriate to the particular example.
#'
#' @param x A \code{\link{gbm.object}} that was fit using a call to
#' \code{\link{gbm}}.
#'
#' @param i.var Vector of indices or the names of the variables to plot. If
#' using indices, the variables are indexed in the same order that they appear
#' in the initial \code{gbm} formula. If \code{length(i.var)} is between 1 and
#' 3 then \code{plot.gbm} produces the plots. Otherwise, \code{plot.gbm}
#' returns only the grid of evaluation points and their average predictions
#'
#' @param n.trees Integer specifying the number of trees to use to generate the
#' plot. Default is to use \code{x$n.trees} (i.e., the entire ensemble).
#'
#' @param continuous.resolution Integer specifying the number of equally spaced
#' points at which to evaluate continuous predictors.
#'
#' @param return.grid Logical indicating whether or not to produce graphics
#' \code{FALSE} or only return the grid of evaluation points and their average
#' predictions \code{TRUE}. This is useful for customizing the graphics for
#' special variable types, or for higher dimensional graphs.
#'
#' @param type Character string specifying the type of prediction to plot on the
#' vertical axis. See \code{\link{predict.gbm}} for details.
#'
#' @param level.plot Logical indicating whether or not to use a false color
#' level plot (\code{TRUE}) or a 3-D surface (\code{FALSE}). Default is
#' \code{TRUE}.
#'
#' @param contour Logical indicating whether or not to add contour lines to the
#' level plot. Only used when \code{level.plot = TRUE}. Default is \code{FALSE}.
#'
#' @param number Integer specifying the number of conditional intervals to use
#' for the continuous panel variables. See \code{\link[graphics:coplot]{co.intervals}}
#' and \code{\link[lattice:shingles]{equal.count}} for further details.
#'
#' @param overlap The fraction of overlap of the conditioning variables. See
#' \code{\link[graphics:coplot]{co.intervals}} and \code{\link[lattice:shingles]{equal.count}}
#' for further details.
#'
#' @param col.regions Color vector to be used if \code{level.plot} is
#' \code{TRUE}. Defaults to the wonderful Matplotlib 'viridis' color map
#' provided by the \code{viridis} package. See \code{\link[viridis:reexports]{viridis}}
#' for details.
#'
#' @param ... Additional optional arguments to be passed onto
#' \code{\link[graphics:plot.default]{plot}}.
#'
#' @return If \code{return.grid = TRUE}, a grid of evaluation points and their
#' average predictions. Otherwise, a plot is returned.
#'
#' @note More flexible plotting is available using the
#' \code{\link[pdp]{partial}} and \code{\link[pdp]{plotPartial}} functions.
#'
#' @seealso \code{\link[pdp]{partial}}, \code{\link[pdp]{plotPartial}},
#' \code{\link{gbm}}, and \code{\link{gbm.object}}.
#'
#' @references J. H. Friedman (2001). "Greedy Function Approximation: A Gradient
#' Boosting Machine," Annals of Statistics 29(4).
#'
#' @references B. M. Greenwell (2017). "pdp: An R Package for Constructing
#' Partial Dependence Plots," The R Journal 9(1), 421--436.
#' \url{https://journal.r-project.org/archive/2017/RJ-2017-016/index.html}.
#'
#' @export plot.gbm
#' @export
plot.gbm <- function(x, i.var = 1, n.trees = x$n.trees,
                     continuous.resolution = 100, return.grid = FALSE,
                     type = c("link", "response"), level.plot = TRUE,
                     contour = FALSE, number = 4, overlap = 0.1,
                     col.regions = viridis::viridis, ...) {

  # Match type argument
  type <- match.arg(type)

  # Sanity checks
  if(all(is.character(i.var))) {
    i <- match(i.var, x$var.names)
    if(any(is.na(i))) {
      stop("Requested variables not found in ", deparse(substitute(x)), ": ",
           i.var[is.na(i)])
    } else {
      i.var <- i
    }
  }
  if((min(i.var) < 1) || (max(i.var) > length(x$var.names))) {
    warning("i.var must be between 1 and ", length(x$var.names))
  }
  if(n.trees > x$n.trees) {
    warning(paste("n.trees exceeds the number of tree(s) in the model: ",
                  x$n.trees, ". Using ", x$n.trees, " tree(s) instead.",
                  sep = ""))
    n.trees <- x$n.trees
  }
  if(length(i.var) > 3) {
    warning("plot.gbm() will only create up to (and including) 3-way ",
            "interaction plots.\nBeyond that, plot.gbm() will only return ",
            "the plotting data structure.")
    return.grid <- TRUE
  }

  # Generate grid of predictor values on which to compute the partial
  # dependence values
  grid.levels <- vector("list", length(i.var))
  for(i in 1:length(i.var)) {
    if(is.numeric(x$var.levels[[i.var[i]]])) {  # continuous
      grid.levels[[i]] <- seq(from = min(x$var.levels[[i.var[i]]]),
                              to = max(x$var.levels[[i.var[i]]]),
                              length = continuous.resolution)
    } else {  # categorical
      grid.levels[[i]] <-
        as.numeric(factor(x$var.levels[[i.var[i]]],
                          levels = x$var.levels[[i.var[i]]])) - 1
    }
  }
  X <- expand.grid(grid.levels)
  names(X) <- paste("X", 1:length(i.var), sep = "")

  # For compatibility with gbm version 1.6
  if (is.null(x$num.classes)) {
    x$num.classes <- 1
  }

  # Compute partial dependence values
  y <- .Call("gbm_plot",
             X = as.double(data.matrix(X)),
             cRows = as.integer(nrow(X)),
             cCols =
as.integer(ncol(X)), n.class = as.integer(x$num.classes), i.var = as.integer(i.var - 1), n.trees = as.integer(n.trees), initF = as.double(x$initF), trees = x$trees, c.splits = x$c.splits, var.type = as.integer(x$var.type), PACKAGE = "gbm") if (x$distribution$name == "multinomial") { # reshape into matrix X$y <- matrix(y, ncol = x$num.classes) colnames(X$y) <- x$classes # Convert to class probabilities (if requested) if (type == "response") { X$y <- exp(X$y) X$y <- X$y / matrix(rowSums(X$y), ncol = ncol(X$y), nrow = nrow(X$y)) } } else if(is.element(x$distribution$name, c("bernoulli", "pairwise")) && type == "response") { X$y <- 1 / (1 + exp(-y)) } else if ((x$distribution$name == "poisson") && (type == "response")) { X$y <- exp(y) } else if (type == "response"){ warning("`type = \"response\"` only implemented for \"bernoulli\", ", "\"poisson\", \"multinomial\", and \"pairwise\" distributions. ", "Ignoring." ) X$y <- y # fall back to the link scale so X$y is always defined } else { X$y <- y } # Transform categorical variables back to factors f.factor <- rep(FALSE, length(i.var)) for(i in 1:length(i.var)) { if(!is.numeric(x$var.levels[[i.var[i]]])) { X[,i] <- factor(x$var.levels[[i.var[i]]][X[, i] + 1], levels = x$var.levels[[i.var[i]]]) f.factor[i] <- TRUE } } # Return original variable names names(X)[1:length(i.var)] <- x$var.names[i.var] # Return grid only (if requested) if(return.grid) { return(X) } # Determine number of predictors nx <- length(i.var) # Determine which type of plot to draw based on the number of predictors if (nx == 1L) { # Single predictor plotOnePredictorPDP(X, ...) } else if (nx == 2) { # Two predictors plotTwoPredictorPDP(X, level.plot = level.plot, contour = contour, col.regions = col.regions, ...) } else { # Three predictors (paneled version of plotTwoPredictorPDP) plotThreePredictorPDP(X, nx = nx, level.plot = level.plot, contour = contour, col.regions = col.regions, number = number, overlap = overlap, ...) } } #' @keywords internal plotOnePredictorPDP <- function(X, ...)
{ # Use the first column to determine which type of plot to construct if (is.numeric(X[[1L]])) { # Draw a line plot lattice::xyplot(stats::as.formula(paste("y ~", names(X)[1L])), data = X, type = "l", ...) } else { # Draw a Cleveland dot plot lattice::dotplot(stats::as.formula(paste("y ~", names(X)[1L])), data = X, xlab = names(X)[1L], ...) } } #' @keywords internal plotTwoPredictorPDP <- function(X, level.plot, contour, col.regions, ...) { # Use the first two columns to determine which type of plot to construct if (is.factor(X[[1L]]) && is.factor(X[[2L]])) { # Draw a Cleveland dot plot lattice::dotplot(stats::as.formula( paste("y ~", paste(names(X)[1L:2L], collapse = "|")) ), data = X, xlab = names(X)[1L], ...) } else if (is.factor(X[[1L]]) || is.factor(X[[2L]])) { # Lattice plot formula form <- if (is.factor(X[[1L]])) { stats::as.formula(paste("y ~", paste(names(X)[2L:1L], collapse = "|"))) } else { stats::as.formula(paste("y ~", paste(names(X)[1L:2L], collapse = "|"))) } # Draw a paneled line plot lattice::xyplot(form, data = X, type = "l", ...) } else { # Lattice plot formula form <- stats::as.formula( paste("y ~", paste(names(X)[1L:2L], collapse = "*")) ) # Draw a three-dimensional surface if (level.plot) { # Draw a false color level plot lattice::levelplot(form, data = X, col.regions = col.regions, contour = contour, ...) } else { # Draw a wireframe plot lattice::wireframe(form, data = X, ...) } } } #' @keywords internal plotThreePredictorPDP <- function(X, nx, level.plot, contour, col.regions, number, overlap, ...) 
{ # Factor, numeric, numeric if (is.factor(X[[1L]]) && !is.factor(X[[2L]]) && !is.factor(X[[3L]])) { X[, 1L:3L] <- X[, c(2L, 3L, 1L)] } # Numeric, factor, numeric if (!is.factor(X[[1L]]) && is.factor(X[[2L]]) && !is.factor(X[[3L]])) { X[, 1L:3L] <- X[, c(1L, 3L, 2L)] } # Factor, factor, numeric if (is.factor(X[[1L]]) && is.factor(X[[2L]]) && !is.factor(X[[3L]])) { X[, 1L:3L] <- X[, c(3L, 1L, 2L)] } # Factor, numeric, factor if (is.factor(X[[1L]]) && !is.factor(X[[2L]]) && is.factor(X[[3L]])) { X[, 1L:3L] <- X[, c(2L, 1L, 3L)] } # Convert third predictor to a factor using the equal count algorithm if (is.numeric(X[[3L]])) { X[[3L]] <- lattice::equal.count(X[[3L]], number = number, overlap = overlap) } if (is.factor(X[[1L]]) && is.factor(X[[2L]])) { # Lattice plot formula form <- stats::as.formula( paste("y ~", names(X)[1L], "|", paste(names(X)[2L:nx], collapse = "*")) ) # Produce a paneled dotplot lattice::dotplot(form, data = X, xlab = names(X)[1L], ...) } else if (is.numeric(X[[1L]]) && is.factor(X[[2L]])) { # Lattice plot formula form <- stats::as.formula( paste("y ~", names(X)[1L], "|", paste(names(X)[2L:nx], collapse = "*")) ) # Produce a paneled lineplot lattice::xyplot(form, data = X, type = "l", ...) } else { # Lattice plot formula form <- stats::as.formula( paste("y ~", paste(names(X)[1L:2L], collapse = "*"), "|", paste(names(X)[3L:nx], collapse = "*")) ) # Draw a three-dimensional surface if (level.plot) { # Draw a false color level plot lattice::levelplot(form, data = X, col.regions = col.regions, contour = contour, ...) } else { # Draw a wireframe plot lattice::wireframe(form, data = X, ...) } } }gbm/R/reconstructGBMdata.R0000644000176200001440000000344114547113406015105 0ustar liggesusers#' Reconstruct a GBM's Source Data #' #' Helper function to reconstitute the data for plots and summaries. This #' function is not intended for the user to call directly.
#' #' #' @param x a \code{\link{gbm.object}} initially fit using \code{\link{gbm}} #' @return Returns the data used to fit the gbm in a format that can subsequently #' be used for plots and summaries #' @author Harry Southworth #' @seealso \code{\link{gbm}}, \code{\link{gbm.object}} #' @keywords manip #' @export reconstructGBMdata <- function(x) { if(!inherits(x, "gbm")) { stop( "This function is for use only with objects having class 'gbm'" ) } else if (is.null(x$data)) { stop("Cannot reconstruct data from gbm object. gbm() was called with keep.data=FALSE") } else if (x$distribution$name=="multinomial") { y <- matrix(x$data$y, ncol=x$num.classes, byrow=FALSE) yn <- apply(y, 1, function(z,nc) {(1:nc)[z == 1]}, nc = x$num.classes) y <- factor(yn, labels=x$classes) xdat <- matrix(x$data$x, ncol=ncol(x$data$x.order), byrow=FALSE) d <- data.frame(y, xdat) names(d) <- c(x$response.name, x$var.names) } else if (x$distribution$name == "coxph") { xdat <- matrix(x$data$x, ncol=ncol(x$data$x.order), byrow=FALSE) status <- x$data$Misc y <- x$data$y[order(x$data$i.timeorder)] d <- data.frame(y, status, xdat) names(d) <- c(x$response.name[-1], colnames(x$data$x.order)) } else { y <- x$data$y xdat <- matrix(x$data$x, ncol=ncol(x$data$x.order), byrow=FALSE) d <- data.frame(y, xdat) rn <- ifelse(length(x$response.name) > 1, x$response.name[2], x$response.name) names(d) <- c(rn, colnames(x$data$x.order)) } invisible(d) } gbm/R/ir.measures.R0000644000176200001440000001472714547111627013613 0ustar liggesusers# Functions to compute IR measures for pairwise loss for # a single group # Notes: # * Inputs are passed as a 2-element (y,f) list, to # facilitate the 'by' iteration # * Return the respective metric, or a negative value if # it is undefined for the given group # * For simplicity, we have no special handling for ties; # instead, we break ties randomly. This is slightly # inaccurate for individual groups, but should have # a small effect on the overall measure.
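Before the measure functions themselves, the DCG/NDCG arithmetic they implement can be sketched standalone. This is a toy illustration only, not package code; the relevance vector `y` and score vector `f` below are made-up example inputs:

```r
# Toy sketch of the NDCG computation (illustrative; not part of the package).
y <- c(3, 2, 0, 1)            # graded relevance labels for one group
f <- c(0.2, 0.9, 0.1, 0.4)    # model scores for the same items
k <- length(y)                # no rank cutoff, i.e., the max.rank = 0 case
ord <- order(f, decreasing = TRUE)                 # rank items by score
dcg <- sum(y[ord][1:k] / log2(2:(k + 1)))          # DCG of the predicted ranking
dcg.max <- sum(sort(y, decreasing = TRUE)[1:k] / log2(2:(k + 1)))  # ideal DCG
ndcg <- dcg / dcg.max         # 1 exactly when scores rank items by relevance
```

Here `ndcg` falls in (0, 1]; `ir.measure.ndcg()` below performs the same normalization, with a `max.rank` cutoff and random tie breaking added on top.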
#' Compute Information Retrieval measures. #' #' Functions to compute Information Retrieval measures for pairwise loss for a #' single group. The function returns the respective metric, or a negative #' value if it is undefined for the given group. #' #' @param obs Observed value. #' @param pred Predicted value. #' @param metric What type of performance measure to compute. #' @param y,y.f,f,w,group,max.rank Used internally. #' @param x Numeric vector. #' @return The requested performance measure. #' #' @details #' For simplicity, we have no special handling for ties; instead, we break ties #' randomly. This is slightly inaccurate for individual groups, but should have #' only a small effect on the overall measure. #' #' \code{gbm.conc} computes the concordance index: the fraction of all pairs (i,j) #' with i < j and x[i] != x[j] such that x[j] < x[i]. If \code{obs} is binary, then #' \code{gbm.roc.area(obs, pred) = gbm.conc(obs[order(-pred)])}. \code{gbm.conc} is #' more general in that it allows non-binary targets, but is significantly slower. #' #' @export gbm.roc.area <- function(obs, pred) { n1 <- sum(obs) n <- length(obs) if (n == n1) { return(1) } # Fraction of concordant pairs, i.e., the Wilcoxon-Mann-Whitney statistic # divided by the number of (positive, negative) pairs return ((mean(rank(pred)[obs > 0]) - (n1 + 1)/2)/(n - n1)) } # Concordance Index: # Fraction of all pairs (i,j) with i < j and x[i] != x[j] such that x[j] < x[i] #' @rdname gbm.roc.area #' @export gbm.conc <- function(x) { lx <- length(x) return (sum(mapply(function(r) { sum(x[(r+1):lx] < x[r]) }, 1:(lx-1)))) } #' @rdname gbm.roc.area #' @export ir.measure.conc <- function(y.f, max.rank=0) { # Note: max.rank is meaningless for CONC y <- y.f[[1]] f <- y.f[[2]] tab <- table(y) csum <- cumsum(tab) total.pairs <- sum(tab * (csum - tab)) if (total.pairs == 0) { return (-1.0) } else { return (gbm.conc(y[order(-f)]) / total.pairs) } } #' @rdname gbm.roc.area #' @export ir.measure.auc <- function(y.f, max.rank=0) { # Note: max.rank is meaningless for AUC y <- y.f[[1]] f <- y.f[[2]] num.pos <- sum(y>0) if (length(f) <= 1 || num.pos == 0 || num.pos == length(f)) { return (-1.0) } else { return (gbm.roc.area(obs=y, pred=f)) } } #' @rdname gbm.roc.area #' @export ir.measure.mrr <- function(y.f, max.rank) { y <- y.f[[1]] f <- y.f[[2]] num.pos <- sum(y>0) if (length(f) <= 1 || num.pos == 0 || num.pos == length(f)) { return (-1.0) } ord <- order(f, decreasing=TRUE) min.idx.pos <- min(which(y[ord]>0)) if (min.idx.pos <= max.rank) { return (1.0 / min.idx.pos) } else { return (0.0) } } #' @rdname gbm.roc.area #' @export ir.measure.map <- function(y.f, max.rank=0) { # Note: max.rank is meaningless for MAP y <- y.f[[1]] f <- y.f[[2]] ord <- order(f, decreasing=TRUE) idx.pos <- which(y[ord]>0) num.pos <- length(idx.pos) if (length(f) <= 1 || num.pos == 0 || num.pos == length(f)) { return (-1.0) } # Above and including the rank of the i-th positive result, # there are exactly i positives and rank(i) total results return (sum((1:length(idx.pos))/idx.pos) / num.pos) } #' @rdname gbm.roc.area #' @export ir.measure.ndcg <- function(y.f, max.rank) { y <- y.f[[1]] f <- y.f[[2]] if (length(f)
<= 1 || all(diff(y)==0)) { return (-1.0) } num.items <- min(length(f), max.rank) ord <- order(f, decreasing=TRUE) dcg <- sum(y[ord][1:num.items] / log2(2:(num.items+1))) # The best possible DCG: order by target ord.max <- order(y, decreasing=TRUE) dcg.max <- sum(y[ord.max][1:num.items] / log2(2:(num.items+1))) # Normalize return (dcg / dcg.max) } #' @rdname gbm.roc.area #' @export perf.pairwise <- function(y, f, group, metric="ndcg", w=NULL, max.rank=0) { func.name <- switch(metric, conc = "ir.measure.conc", mrr = "ir.measure.mrr", map = "ir.measure.map", ndcg = "ir.measure.ndcg", stop(paste("Metric",metric,"is not supported")) ) # Optimization: for binary targets, # AUC is equivalent but faster than CONC if (metric == "conc" && all(is.element(y, 0:1))) { func.name <- "ir.measure.auc" } # Max rank = 0 means no cut off if (max.rank <= 0) { max.rank <- length(y)+1 } # Random tie breaking in case of duplicate scores. # (Without tie breaking, we would overestimate if instances are # sorted descending on target) f <- f + 1E-10 * runif(length(f), min=-0.5, max=0.5) measure.by.group <- as.matrix(by(list(y, f), INDICES=group, FUN=get(func.name), max.rank=max.rank)) # Exclude groups with single result or only negative or positive instances idx <- which((!is.null(measure.by.group)) & measure.by.group >= 0) if (is.null(w)) { return (mean(measure.by.group[idx])) } else { # Assumption: weights are constant per group w.by.group <- tapply(w, group, mean) return (weighted.mean(measure.by.group[idx], w=w.by.group[idx])) } } gbm/R/gbm-internals.R0000644000176200001440000001120714547622015014114 0ustar liggesusers#' gbm internal functions #' #' Helper functions for preprocessing data prior to building a \code{"gbm"} #' object. #' #' @param y The response variable. #' @param d,distribution The distribution, either specified by the user or #' implied. #' @param class.stratify.cv Whether or not to stratify, if provided by the user. #' @param i.train Computed internally by \code{gbm}. 
#' @param group The group, if using \code{distribution = "pairwise"}. #' @param strat Whether or not to stratify. #' @param cv.folds The number of cross-validation folds. #' @param x The design matrix. #' @param id The interaction depth. #' @param w The weights. #' @param n The number of cores to use in the cluster. #' @param o The offset. #' #' @details #' These are functions used internally by \code{gbm} and not intended for direct #' use by the user. #' #' @aliases guessDist getStratify getCVgroup checkMissing checkID checkWeights #' checkOffset getVarNames gbmCluster #' #' @rdname gbm-internals #' @export guessDist <- function(y){ # If distribution is not given, try to guess it if (length(unique(y)) == 2){ d <- "bernoulli" } else if (inherits(y, "Surv")){ d <- "coxph" } else if (is.factor(y)){ d <- "multinomial" } else{ d <- "gaussian" } cat(paste("Distribution not specified, assuming", d, "...\n")) list(name=d) } #' @rdname gbm-internals #' @export getCVgroup <- function(distribution, class.stratify.cv, y, i.train, cv.folds, group) { # Construct cross-validation groups depending on the type of model to be fit if (distribution$name %in% c( "bernoulli", "multinomial" ) & class.stratify.cv ){ nc <- table(y[i.train]) # Number in each class uc <- names(nc) if (min(nc) < cv.folds){ stop( paste("The smallest class has only", min(nc), "objects in the training set.
Can't do", cv.folds, "fold cross-validation.")) } cv.group <- vector(length = length(i.train)) for (i in 1:length(uc)){ cv.group[y[i.train] == uc[i]] <- sample(rep(1:cv.folds , length = nc[i])) } } # Close if else if (distribution$name == "pairwise") { # Split into CV folds at group boundaries s <- sample(rep(1:cv.folds, length=nlevels(group))) cv.group <- s[as.integer(group[i.train])] } else { cv.group <- sample(rep(1:cv.folds, length=length(i.train))) } cv.group } #' @rdname gbm-internals #' @export getStratify <- function(strat, d){ if (is.null(strat)){ if (d$name == "multinomial" ){ strat <- TRUE } else { strat <- FALSE } } else { if (!is.element(d$name, c( "bernoulli", "multinomial"))){ warning("You can only use class.stratify.cv when distribution is bernoulli or multinomial. Ignored.") strat <- FALSE } } # Close else strat } #' @rdname gbm-internals #' @export checkMissing <- function(x, y){ nms <- getVarNames(x) #### Check for NaNs in x and NAs in response j <- apply(x, 2, function(z) any(is.nan(z))) if(any(j)) { stop("Use NA for missing values. NaN found in predictor variables:", paste(nms[j],collapse=",")) } if(any(is.na(y))) stop("Missing values are not allowed in the response") invisible(NULL) } #' @rdname gbm-internals #' @export checkWeights <- function(w, n){ # Logical checks on weights if(length(w)==0) { w <- rep(1, n) } else if(any(w < 0)) stop("negative weights not allowed") w } #' @rdname gbm-internals #' @export checkID <- function(id){ # Check for disallowed interaction.depth if(id < 1) { stop("interaction.depth must be at least 1.") } else if(id > 49) { stop("interaction.depth must be less than 50. You should also ask yourself why you want such large interaction terms. 
A value between 1 and 5 should be sufficient for most applications.") } invisible(id) } #' @rdname gbm-internals #' @export checkOffset <- function(o, y){ # Check offset if(is.null(o) | all(o==0)) { o <- NA } else if(length(o) != length(y)) { stop("The length of offset does not equal the length of y.") } o } #' @rdname gbm-internals #' @export getVarNames <- function(x){ if(is.matrix(x)) { var.names <- colnames(x) } else if(is.data.frame(x)) { var.names <- names(x) } else { var.names <- paste("X",1:ncol(x),sep="") } var.names } #' @rdname gbm-internals #' @export gbmCluster <- function(n) { # If the number of cores (n) is not given, default to the number of cores # that appear to be available. if (is.null(n)) { n <- parallel::detectCores() } parallel::makeCluster(n) } gbm/R/gbm.more.R0000644000176200001440000003476614547111627013071 0ustar liggesusers#' Generalized Boosted Regression Modeling (GBM) #' #' Adds additional trees to a \code{\link{gbm.object}} object. #' #' @param object A \code{\link{gbm.object}} object created from an initial call #' to \code{\link{gbm}}. #' #' @param n.new.trees Integer specifying the number of additional trees to add #' to \code{object}. Default is 100. #' #' @param data An optional data frame containing the variables in the model. By #' default the variables are taken from \code{environment(formula)}, typically #' the environment from which \code{gbm} is called. If \code{keep.data=TRUE} in #' the initial call to \code{gbm} then \code{gbm} stores a copy with the #' object. If \code{keep.data=FALSE} then subsequent calls to #' \code{\link{gbm.more}} must resupply the same dataset. It becomes the user's #' responsibility to resupply the same data at this point. #' #' @param weights An optional vector of weights to be used in the fitting #' process. Must be positive but do not need to be normalized.
If #' \code{keep.data=FALSE} in the initial call to \code{gbm} then it is the #' user's responsibility to resupply the weights to \code{\link{gbm.more}}. #' #' @param offset A vector of offset values. #' #' @param verbose Logical indicating whether or not to print out progress and #' performance indicators (\code{TRUE}). If this option is left unspecified for #' \code{gbm.more}, then it uses \code{verbose} from \code{object}. Default is #' \code{FALSE}. #' #' @return A \code{\link{gbm.object}} object. #' #' @export #' #' @examples #' # #' # A least squares regression example #' # #' #' # Simulate data #' set.seed(101) # for reproducibility #' N <- 1000 #' X1 <- runif(N) #' X2 <- 2 * runif(N) #' X3 <- ordered(sample(letters[1:4], N, replace = TRUE), levels = letters[4:1]) #' X4 <- factor(sample(letters[1:6], N, replace = TRUE)) #' X5 <- factor(sample(letters[1:3], N, replace = TRUE)) #' X6 <- 3 * runif(N) #' mu <- c(-1, 0, 1, 2)[as.numeric(X3)] #' SNR <- 10 # signal-to-noise ratio #' Y <- X1 ^ 1.5 + 2 * (X2 ^ 0.5) + mu #' sigma <- sqrt(var(Y) / SNR) #' Y <- Y + rnorm(N, 0, sigma) #' X1[sample(1:N,size=500)] <- NA # introduce some missing values #' X4[sample(1:N,size=300)] <- NA # introduce some missing values #' data <- data.frame(Y, X1, X2, X3, X4, X5, X6) #' #' # Fit a GBM #' set.seed(102) # for reproducibility #' gbm1 <- gbm(Y ~ ., data = data, var.monotone = c(0, 0, 0, 0, 0, 0), #' distribution = "gaussian", n.trees = 100, shrinkage = 0.1, #' interaction.depth = 3, bag.fraction = 0.5, train.fraction = 0.5, #' n.minobsinnode = 10, cv.folds = 5, keep.data = TRUE, #' verbose = FALSE, n.cores = 1) #' #' # Check performance using the out-of-bag (OOB) error; the OOB error typically #' # underestimates the optimal number of iterations #' best.iter <- gbm.perf(gbm1, method = "OOB") #' print(best.iter) #' #' # Check performance using the 50% heldout test set #' best.iter <- gbm.perf(gbm1, method = "test") #' print(best.iter) #' #' # Check performance using 5-fold 
cross-validation #' best.iter <- gbm.perf(gbm1, method = "cv") #' print(best.iter) #' #' # Plot relative influence of each variable #' par(mfrow = c(1, 2)) #' summary(gbm1, n.trees = 1) # using first tree #' summary(gbm1, n.trees = best.iter) # using estimated best number of trees #' #' # Compactly print the first and last trees for curiosity #' print(pretty.gbm.tree(gbm1, i.tree = 1)) #' print(pretty.gbm.tree(gbm1, i.tree = gbm1$n.trees)) #' #' # Simulate new data #' set.seed(103) # for reproducibility #' N <- 1000 #' X1 <- runif(N) #' X2 <- 2 * runif(N) #' X3 <- ordered(sample(letters[1:4], N, replace = TRUE)) #' X4 <- factor(sample(letters[1:6], N, replace = TRUE)) #' X5 <- factor(sample(letters[1:3], N, replace = TRUE)) #' X6 <- 3 * runif(N) #' mu <- c(-1, 0, 1, 2)[as.numeric(X3)] #' Y <- X1 ^ 1.5 + 2 * (X2 ^ 0.5) + mu + rnorm(N, 0, sigma) #' data2 <- data.frame(Y, X1, X2, X3, X4, X5, X6) #' #' # Predict on the new data using the "best" number of trees; by default, #' # predictions will be on the link scale #' Yhat <- predict(gbm1, newdata = data2, n.trees = best.iter, type = "link") #' #' # least squares error #' print(sum((data2$Y - Yhat)^2)) #' #' # Construct univariate partial dependence plots #' plot(gbm1, i.var = 1, n.trees = best.iter) #' plot(gbm1, i.var = 2, n.trees = best.iter) #' plot(gbm1, i.var = "X3", n.trees = best.iter) # can use index or name #' #' # Construct bivariate partial dependence plots #' plot(gbm1, i.var = 1:2, n.trees = best.iter) #' plot(gbm1, i.var = c("X2", "X3"), n.trees = best.iter) #' plot(gbm1, i.var = 3:4, n.trees = best.iter) #' #' # Construct trivariate partial dependence plots #' plot(gbm1, i.var = c(1, 2, 6), n.trees = best.iter, #' continuous.resolution = 20) #' plot(gbm1, i.var = 1:3, n.trees = best.iter) #' plot(gbm1, i.var = 2:4, n.trees = best.iter) #' plot(gbm1, i.var = 3:5, n.trees = best.iter) #' #' # Add more (i.e., 100) boosting iterations to the ensemble #' gbm2 <- gbm.more(gbm1, n.new.trees = 100, verbose = 
FALSE) gbm.more <- function(object, n.new.trees = 100, data = NULL, weights = NULL, offset = NULL, verbose = NULL) { theCall <- match.call() nTrain <- object$nTrain if (object$distribution$name != "pairwise") { distribution.call.name <- object$distribution$name } else { distribution.call.name <- sprintf("pairwise_%s", object$distribution$metric) } if(is.null(object$Terms) && is.null(object$data)) { stop("The gbm model was fit using gbm.fit (rather than gbm) and keep.data was set to FALSE. gbm.more cannot locate the dataset.") } else if(is.null(object$data) && is.null(data)) { stop("keep.data was set to FALSE on original gbm call and argument 'data' is NULL") } else if(is.null(object$data)) { m <- eval(object$m, parent.frame()) Terms <- attr(m, "terms") a <- attributes(Terms) y <- as.vector(model.extract(m, "response")) offset <- model.extract(m,offset) x <- model.frame(delete.response(Terms), data, na.action=na.pass) cRows <- nrow(x) # define up front; the coxph branch below uses it w <- weights if(length(w)==0) w <- rep(1, nrow(x)) if (object$distribution$name != "pairwise") { w <- w*length(w)/sum(w) # normalize to N } if(is.null(offset) || all(offset==0)) { offset <- NA } Misc <- NA if(object$distribution$name == "coxph") { Misc <- as.numeric(y)[-(1:cRows)] y <- as.numeric(y)[1:cRows] # reverse sort the failure times to compute risk sets on the fly i.train <- order(-y[1:nTrain]) i.test <- order(-y[(nTrain+1):cRows]) + nTrain i.timeorder <- c(i.train,i.test) y <- y[i.timeorder] Misc <- Misc[i.timeorder] x <- x[i.timeorder,,drop=FALSE] w <- w[i.timeorder] if(!is.na(offset)) offset <- offset[i.timeorder] object$fit <- object$fit[i.timeorder] } else if(object$distribution$name == "tdist" ){ Misc <- object$distribution$df } else if (object$distribution$name == "pairwise"){ # Check if group names are valid distribution.group <- object$distribution$group i <- match(distribution.group, colnames(data)) if (any(is.na(i))) { stop("Group column does not occur in data: ", distribution.group[is.na(i)]) } # construct group index group <-
factor(do.call(paste, c(data[, distribution.group, drop = FALSE], sep = ":"))) # Check that weights are constant across groups if ((!missing(weights)) && (!is.null(weights))) { w.min <- tapply(w, INDEX=group, FUN=min) w.max <- tapply(w, INDEX=group, FUN=max) if (any(w.min != w.max)) { stop("For distribution 'pairwise', all instances for the same ", "group must have the same weight") } # Normalize across groups w <- w * length(w.min) / sum(w.min) } # Shuffle groups, to remove bias when splitting into train/test set and/or CV folds perm.levels <- levels(group)[sample(1:nlevels(group))] group <- factor(group, levels=perm.levels) # The C function expects instances to be sorted by group and descending by target ord.group <- object$ord.group group <- group[ord.group] y <- y[ord.group] x <- x[ord.group,,drop=FALSE] w <- w[ord.group] object$fit <- object$fit[ord.group] # object$fit is stored in the original order # Split into train and validation set, at group boundary num.groups.train <- max(1, round(object$train.fraction * nlevels(group))) # include all groups up to num.groups.train nTrain <- max(which(group==levels(group)[num.groups.train])) metric <- object$distribution[["metric"]] if (is.element(metric, c("mrr", "map")) && (!all(is.element(y, 0:1)))) { stop("Metrics 'map' and 'mrr' require the response to be in {0,1}") } # Cut-off rank for metrics # We pass this argument as the last element in the Misc vector # Default of 0 means no cutoff max.rank <- 0 if (!is.null(object$distribution[["max.rank"]]) && object$distribution[["max.rank"]] > 0) { if (is.element(metric, c("ndcg", "mrr"))) { max.rank <- object$distribution[["max.rank"]] } else { stop("Parameter 'max.rank' cannot be specified for metric '", metric, "', only supported for 'ndcg' and 'mrr'") } } Misc <- c(group, max.rank) } # create index upfront...
subtract one for 0 based order x.order <- apply(x[1:nTrain,,drop=FALSE],2,order,na.last=FALSE)-1 x <- data.matrix(x) cRows <- nrow(x) cCols <- ncol(x) } else { y <- object$data$y x <- object$data$x x.order <- object$data$x.order offset <- object$data$offset Misc <- object$data$Misc w <- object$data$w nTrain <- object$nTrain cRows <- length(y) cCols <- length(x)/cRows if(object$distribution$name == "coxph") { i.timeorder <- object$data$i.timeorder object$fit <- object$fit[i.timeorder] } if (object$distribution$name == "pairwise") { object$fit <- object$fit[object$ord.group] # object$fit is stored in the original order } } if(is.null(verbose)) { verbose <- object$verbose } x <- as.vector(x) gbm.obj <- .Call("gbm_fit", Y = as.double(y), Offset = as.double(offset), X = as.double(x), X.order = as.integer(x.order), weights = as.double(w), Misc = as.double(Misc), cRows = as.integer(cRows), cCols = as.integer(cCols), var.type = as.integer(object$var.type), var.monotone = as.integer(object$var.monotone), distribution = as.character(distribution.call.name), n.trees = as.integer(n.new.trees), interaction.depth = as.integer(object$interaction.depth), n.minobsinnode = as.integer(object$n.minobsinnode), n.classes = as.integer(object$num.classes), shrinkage = as.double(object$shrinkage), bag.fraction = as.double(object$bag.fraction), train.fraction = as.integer(nTrain), fit.old = as.double(object$fit), n.cat.splits.old = as.integer(length(object$c.splits)), n.trees.old = as.integer(object$n.trees), verbose = as.integer(verbose), PACKAGE = "gbm") names(gbm.obj) <- c("initF","fit","train.error","valid.error", "oobag.improve","trees","c.splits") gbm.obj$initF <- object$initF gbm.obj$train.error <- c(object$train.error, gbm.obj$train.error) gbm.obj$valid.error <- c(object$valid.error, gbm.obj$valid.error) gbm.obj$oobag.improve <- c(object$oobag.improve, gbm.obj$oobag.improve) gbm.obj$trees <- c(object$trees, gbm.obj$trees) gbm.obj$c.splits <- c(object$c.splits, gbm.obj$c.splits) # 
cv.error not updated when using gbm.more gbm.obj$cv.error <- object$cv.error gbm.obj$cv.folds <- object$cv.folds gbm.obj$n.trees <- length(gbm.obj$trees) gbm.obj$distribution <- object$distribution gbm.obj$train.fraction <- object$train.fraction gbm.obj$shrinkage <- object$shrinkage gbm.obj$bag.fraction <- object$bag.fraction gbm.obj$var.type <- object$var.type gbm.obj$var.monotone <- object$var.monotone gbm.obj$var.names <- object$var.names gbm.obj$interaction.depth <- object$interaction.depth gbm.obj$n.minobsinnode <- object$n.minobsinnode gbm.obj$num.classes <- object$num.classes gbm.obj$nTrain <- object$nTrain gbm.obj$response.name <- object$response.name gbm.obj$Terms <- object$Terms gbm.obj$var.levels <- object$var.levels gbm.obj$verbose <- verbose if(object$distribution$name == "coxph") { gbm.obj$fit[i.timeorder] <- gbm.obj$fit } if (object$distribution$name == "pairwise") { # Data has been reordered according to queries. # We need to permute the fitted values to correspond # to the original order. 
gbm.obj$fit <- gbm.obj$fit[order(object$ord.group)] object$fit <- object$fit[order(object$ord.group)] gbm.obj$ord.group <- object$ord.group } if(!is.null(object$data)) { gbm.obj$data <- object$data } else { gbm.obj$data <- NULL } gbm.obj$m <- object$m gbm.obj$call <- theCall class(gbm.obj) <- "gbm" return(gbm.obj) } gbm/demo/0000755000176200001440000000000014547111627011753 5ustar liggesusersgbm/demo/bernoulli.R0000644000176200001440000000654614547111627014104 0ustar liggesusers# LOGISTIC REGRESSION EXAMPLE cat("Running logistic regression example.\n") # create some data N <- 1000 X1 <- runif(N) X2 <- runif(N) X3 <- factor(sample(letters[1:4],N,replace=T)) mu <- c(-1,0,1,2)[as.numeric(X3)] p <- 1/(1+exp(-(sin(3*X1) - 4*X2 + mu))) Y <- rbinom(N,1,p) # random weights if you want to experiment with them w <- rexp(N) w <- N*w/sum(w) data <- data.frame(Y=Y,X1=X1,X2=X2,X3=X3) # fit initial model gbm1 <- gbm(Y~X1+X2+X3, # formula data=data, # dataset weights=w, var.monotone=c(0,0,0), # -1: monotone decrease, +1: monotone increase, 0: no monotone restrictions distribution="bernoulli", n.trees=3000, # number of trees shrinkage=0.001, # shrinkage or learning rate, 0.001 to 0.1 usually work interaction.depth=3, # 1: additive model, 2: two-way interactions, etc bag.fraction = 0.5, # subsampling fraction, 0.5 is probably best train.fraction = 0.5, # fraction of data for training, first train.fraction*N used for training cv.folds=5, # do 5-fold cross-validation n.minobsinnode = 10, # minimum total weight needed in each node verbose = FALSE) # don't print progress # plot the performance best.iter.oob <- gbm.perf(gbm1,method="OOB") # returns out-of-bag estimated best number of trees print(best.iter.oob) best.iter.cv <- gbm.perf(gbm1,method="cv") # returns 5-fold cv estimate of best number of trees print(best.iter.cv) best.iter.test <- gbm.perf(gbm1,method="test") # returns test set estimate of best number of trees print(best.iter.test) best.iter <- best.iter.test # plot variable 
influence summary(gbm1,n.trees=1) # based on the first tree summary(gbm1,n.trees=best.iter) # based on the estimated best number of trees # create marginal plots # plot variable X1,X2,X3 after "best" iterations par(mfrow=c(1,3)) plot.gbm(gbm1,1,best.iter) plot.gbm(gbm1,2,best.iter) plot.gbm(gbm1,3,best.iter) par(mfrow=c(1,1)) plot.gbm(gbm1,1:2,best.iter) # contour plot of variables 1 and 2 after "best" number iterations plot.gbm(gbm1,2:3,best.iter) # lattice plot of variables 2 and 3 after "best" number iterations # 3-way plot plot.gbm(gbm1,1:3,best.iter) # print the first and last trees print(pretty.gbm.tree(gbm1,1)) print(pretty.gbm.tree(gbm1,gbm1$n.trees)) # make some new data N <- 1000 X1 <- runif(N) X2 <- runif(N) X3 <- factor(sample(letters[1:4],N,replace=T)) mu <- c(-1,0,1,2)[as.numeric(X3)] p <- 1/(1+exp(-(sin(3*X1) - 4*X2 + mu))) Y <- rbinom(N,1,p) data2 <- data.frame(Y=Y,X1=X1,X2=X2,X3=X3) # predict on the new data using "best" number of trees # f.predict will be on the canonical scale (logit,log,etc.) f.predict <- predict.gbm(gbm1,data2, n.trees=c(best.iter.oob,best.iter.cv,best.iter.test)) # transform to probability scale for logistic regression p.pred <- 1/(1+exp(-f.predict)) # calibration plot for logistic regression - well calibrated means a 45 degree line par(mfrow=c(1,1)) calibrate.plot(Y,p.pred[,3]) # logistic error sum(data2$Y*f.predict[,1] - log(1+exp(f.predict[,1]))) sum(data2$Y*f.predict[,2] - log(1+exp(f.predict[,2]))) sum(data2$Y*f.predict[,3] - log(1+exp(f.predict[,3]))) gbm/demo/OOB-reps.R0000644000176200001440000004760114547111627013474 0ustar liggesusersset.seed(06182001) # number of replicates n.reps <- 20 # should data be loaded from the web? 
If FALSE use alt.path load.from.web <- TRUE run.all <- TRUE # if data not downloaded from the web, give path to datasets alt.path <- "" n.datasets <- 12 # needs to match the number of datasets i.data <- 0 squared.error.loss <- function(y,f.x) { mean((y-f.x)^2) } bernoulli.loglikelihood <- function(y,f.x) { mean(y*f.x - log(1+exp(f.x))) } if(run.all) { dataset <- vector("list",n.datasets) # abalone i.data <- i.data + 1 dataset[[i.data]] <- list(name="Abalone", distribution="gaussian", urlpath="http://ftp.ics.uci.edu/pub/machine-learning-databases/abalone/", filename="abalone.data", var.names=c("sex","length","diameter","height","whole.weight", "shucked.weight","viscera.weight","shell.weight", "Rings"), outcome="Rings", factors="sex", na.strings="", sep=",", shrinkage=0.02) # Adult i.data <- i.data + 1 dataset[[i.data]] <- list(name="Adult", distribution="bernoulli", urlpath="http://ftp.ics.uci.edu/pub/machine-learning-databases/adult/", filename="adult.data", var.names=c("age","workclass","w","education","education.num", "marital.status","occupation","relationship","race", "male","capital.gain","capital.loss", "hours.per.week","native.country","income"), outcome="income", factors=c("workclass","education","marital.status","occupation", "relationship","race","native.country","male"), na.strings="?", sep=",", shrinkage=0.04) # Housing i.data <- i.data + 1 dataset[[i.data]] <- list(name="Boston housing", distribution="gaussian", urlpath="http://ftp.ics.uci.edu/pub/machine-learning-databases/housing/", filename="housing.data", var.names=c("CRIM","ZN","INDUS","CHAS","NOX","RM","AGE", "DIS","RAD","TAX","PTRATIO","B","LSTAT","MEDV"), factors=NULL, outcome="MEDV", na.strings="", sep="", shrinkage=0.005) # mushrooms i.data <- i.data + 1 dataset[[i.data]] <- list(name="Mushrooms", distribution="bernoulli", urlpath="http://ftp.ics.uci.edu/pub/machine-learning-databases/mushroom/", filename="agaricus-lepiota.data", var.names=c("poisonous","cap-shape","cap-surface","cap-color", 
"bruises","odor","gill-attachment", "gill-spacing","gill-size","gill-color", "stalk-shape","stalk-root","stalk-surface-above-ring", "stalk-surface-below-ring","stalk-color-above-ring", "stalk-color-below-ring","veil-type","veil-color", "ring-number","ring-type","spore-print-color", "population","habitat"), factors=c("cap-shape","cap-surface","cap-color", "bruises","odor","gill-attachment", "gill-spacing","gill-size","gill-color", "stalk-shape","stalk-root","stalk-surface-above-ring", "stalk-surface-below-ring","stalk-color-above-ring", "stalk-color-below-ring","veil-type","veil-color", "ring-number","ring-type","spore-print-color", "population","habitat"), outcome="poisonous", drop.vars=c("veil-type"), na.strings="?", sep=",", shrinkage=0.05) # autoprices 1 i.data <- i.data + 1 dataset[[i.data]] <- list(name="Auto Prices", distribution="gaussian", urlpath="http://ftp.ics.uci.edu/pub/machine-learning-databases/autos/", filename="imports-85.data", var.names=c("symboling","normalizedlosses","make","fueltype", "aspiration","ndoors","bodystyle", "drivewheels","enginelocation", "wheelbase", "length", "width", "height", "curbweight", "enginetype", "numerofcylinders", "enginesize", "fuelsystem", "bore", "stroke", "compressionratio", "horsepower", "peakrpm", "citympg", "highwatmpg", "price"), factors=c("symboling","make","fueltype","aspiration","ndoors", "bodystyle","drivewheels","enginelocation", "enginetype", "numerofcylinders", "fuelsystem"), outcome="price", na.strings="?", sep=",", shrinkage=0.002) # auto MPG i.data <- i.data + 1 dataset[[i.data]] <- list(name="Auto MPG", distribution="gaussian", urlpath="http://ftp.ics.uci.edu/pub/machine-learning-databases/auto-mpg/", filename="auto-mpg.data", var.names=c("mpg","cylinders","displacement","horsepower","weight", "acceleration","modelyear","origin","carname"), factors=c("cylinders", "modelyear", "origin"), outcome="mpg", drop.vars=c("carname"), na.strings="?", sep="", shrinkage=0.005) # CPU i.data <- i.data + 1 
dataset[[i.data]] <- list(name="CPU Performance", distribution="gaussian", urlpath="http://ftp.ics.uci.edu/pub/machine-learning-databases/cpu-performance/", filename="machine.data", var.names=c("vendorname","modelname","myct","mmin","mmax", "cach","chmin","chmax","prp","ERP"), factors=c("vendorname","modelname"), outcome="prp", na.strings="", drop.vars=c("vendorname","modelname"), sep=",", shrinkage=0.01) # credit i.data <- i.data + 1 dataset[[i.data]] <- list(name="Credit rating", distribution="bernoulli", urlpath="http://ftp.ics.uci.edu/pub/machine-learning-databases/credit-screening/", filename="crx.data", var.names=c("A1","A2","A3","A4","A5","A6","A7","A8","A9","A10","A11", "A12", "A13", "A14", "A15","CLASS"), factors=c("A1","A4", "A5", "A6", "A7", "A9", "A10", "A12", "A13","CLASS"), outcome="CLASS", na.strings="?", sep=",", shrinkage=0.005) # Haberman i.data <- i.data + 1 dataset[[i.data]] <- list(name="Haberman", distribution="bernoulli", urlpath="http://ftp.ics.uci.edu/pub/machine-learning-databases/haberman/", filename="haberman.data", var.names=c("age","year","nodes","CLASS"), outcome="CLASS", factors=c("CLASS"), na.strings="", sep=",", shrinkage=0.001) # Diabetes i.data <- i.data + 1 dataset[[i.data]] <- list(name="Diabetes", distribution="bernoulli", urlpath="http://ftp.ics.uci.edu/pub/machine-learning-databases/pima-indians-diabetes/", filename="pima-indians-diabetes.data", var.names=c("n_preg","plasma","blood-pre","triceps","serum", "mass-index","pedigree","age","CLASS"), factors=c("CLASS"), outcome="CLASS", na.strings="?", sep=",", shrinkage=0.005) # Ionosphere i.data <- i.data + 1 dataset[[i.data]] <- list(name="Ionosphere", distribution="bernoulli", urlpath="http://ftp.ics.uci.edu/pub/machine-learning-databases/ionosphere/", filename="ionosphere.data", var.names=c("A1","A2","A3","A4","A5","A6","A7","A8","A9","A10","A11", "A12","A13","A14","A15","A16","A17","A18","A19","A20", "A21","A22","A23","A24","A25","A26","A27","A28","A29", 
"A30","A31","A32","A33","A34","CLASS"), factors=c("CLASS"), outcome="CLASS", na.strings="", sep=",", shrinkage=0.005) # Breast cancer i.data <- i.data + 1 dataset[[i.data]] <- list(name="breast cancer", distribution="bernoulli", urlpath="http://ftp.ics.uci.edu/pub/machine-learning-databases/breast-cancer-wisconsin/", filename="breast-cancer-wisconsin.data", var.names=c("CODE","thickness","cellsize","cellshape","adhension", "singleecell","bnuclei","chromatin","nnucleo","mitoses", "CLASS"), factors=c("CODE","CLASS"), outcome="CLASS", drop.vars=c("CODE"), na.strings="?", sep=",", shrinkage=0.005) if(FALSE) # this dataset is not public, can substitute other datasets { # time in treatment i.data <- i.data + 1 dataset[[i.data]] <- list(name="time in treatment", distribution="gaussian", urlpath="./", filename="txdet.csv", var.names=NULL, factors=c("b1","xsite4","b3new","b8new","s1a1new","m3dnew","e1new","e13anew"), outcome="txdet", drop.vars=c("xpid","xobs","maxcefu","recovfu","nontxdet","s7e5","r2f", "r3a9","e4a6","l5p","v2.4","v2.7","v2.8"), na.strings="NA", sep=",", shrinkage=0.0022) } # Load datasets for(i.data in 1:n.datasets) # for(i.data in which(sapply(dataset,function(x){is.null(x$oob.iter)}))) { # Progress cat("Dataset ",i.data,":",dataset[[i.data]]$name," N = ") filename <- paste(switch(load.from.web+1, alt.path, dataset[[i.data]]$url), dataset[[i.data]]$filename, sep="") dataset[[i.data]]$data <- read.table(file=filename, na.strings=dataset[[i.data]]$na.strings, sep=dataset[[i.data]]$sep, header=is.null(dataset[[i.data]]$var.names)) if(!is.null(dataset[[i.data]]$var.names)) { names(dataset[[i.data]]$data) <- dataset[[i.data]]$var.names } # take care of nominal predictors for(j in dataset[[i.data]]$factors) { dataset[[i.data]]$data[,j] <- factor(dataset[[i.data]]$data[,j]) } # take care of factor binary outcomes if( with(dataset[[i.data]], (distribution=="bernoulli") && is.factor(data[,outcome])) ) { dataset[[i.data]]$data[,dataset[[i.data]]$outcome] <- 
with(dataset[[i.data]], as.numeric(data[,outcome])-1) } # drop observations with missing outcomes i <- with(dataset[[i.data]], !is.na(data[,outcome])) dataset[[i.data]]$data <- dataset[[i.data]]$data[i,] # drop selected predictor variables if(!is.null(dataset[[i.data]]$drop.vars)) { j <- match(dataset[[i.data]]$drop.vars,names(dataset[[i.data]]$data)) dataset[[i.data]]$data <- dataset[[i.data]]$data[,-j] } dataset[[i.data]]$loss <- switch(dataset[[i.data]]$distribution, gaussian=squared.error.loss, bernoulli=bernoulli.loglikelihood) cat(nrow(dataset[[i.data]]$data),"\n") } save(dataset,file="dataset.RData") } # run.all # make sure gbm is installed if(!is.element("gbm",installed.packages()[,1])) { stop("The gbm package is not installed. Use install.packages(\"gbm\") to install. On Unix machines this must be executed in an R session started as root or installed to a local library, see help(install.packages)") } library(gbm) # loop over all the datasets i.datasets <- which(sapply(dataset,function(x){is.null(x$oob.loss)})) for(i.data in i.datasets) # for(i.data in which(sapply(dataset,function(x){is.null(x$oob.iter)}))) { N <- nrow(dataset[[i.data]]$data) # Progress cat("Dataset ",i.data,":",dataset[[i.data]]$name," N = ",N,"\n",sep="") # construct model formula for this dataset formula.fit <- formula(paste(dataset[[i.data]]$outcome,"~ .")) # initialize prediction pred.oob <- pred.base <- pred.test33 <- pred.test20 <- pred.cv5 <- rep(0,N) # track iteration estimates dataset[[i.data]]$oob.iter <- rep(NA,n.reps) dataset[[i.data]]$test33.iter <- rep(NA,n.reps) dataset[[i.data]]$test20.iter <- rep(NA,n.reps) dataset[[i.data]]$cv5.iter <- rep(NA,n.reps) # do replicates for(i.rep in 1:n.reps) { cat("rep:",i.rep,"") i.train <- sample(1:N,size=0.75*N,replace=FALSE) i.valid <- (1:N)[-i.train] # use out-of-bag method cat("OOB, ") gbm1 <- gbm(formula.fit, data=dataset[[i.data]]$data[i.train,], distribution=dataset[[i.data]]$distribution, train.fraction=1.0, bag.fraction=0.5, 
shrinkage=dataset[[i.data]]$shrinkage, n.trees=1000, verbose = FALSE) best.iter.oob <- gbm.perf(gbm1,method="OOB",plot.it=FALSE) while((gbm1$n.trees-best.iter.oob < 1000) && !all(gbm1$oobag.improve[(gbm1$n.trees-100):gbm1$n.trees] < 1e-6)) { gbm1 <- gbm.more(gbm1,1000) best.iter.oob <- gbm.perf(gbm1,method="OOB",plot.it=FALSE) } pred.oob[i.valid] <- predict(gbm1, newdata=dataset[[i.data]]$data[i.valid,], n.trees=best.iter.oob) dataset[[i.data]]$oob.iter[i.rep] <- best.iter.oob # use a 1/3 test set cat("33% test data, ") gbm1 <- gbm(formula.fit, data=dataset[[i.data]]$data[i.train,], distribution=dataset[[i.data]]$distribution, train.fraction=2/3, bag.fraction=0.5, shrinkage=dataset[[i.data]]$shrinkage, n.trees=1000, verbose = FALSE) best.iter.test <- gbm.perf(gbm1,method="test",plot.it=FALSE) while((gbm1$n.trees-best.iter.test < 1000) && !all(abs(gbm1$valid.error[(gbm1$n.trees-100):gbm1$n.trees]) < 1e-6)) { gbm1 <- gbm.more(gbm1,1000) best.iter.test <- gbm.perf(gbm1,method="test",plot.it=FALSE) } pred.test33[i.valid] <- predict(gbm1, newdata=dataset[[i.data]]$data[i.valid,], n.trees=best.iter.test) dataset[[i.data]]$test33.iter[i.rep] <- best.iter.test # use a 20% test set cat("20% test data, ") gbm1 <- gbm(formula.fit, data=dataset[[i.data]]$data[i.train,], distribution=dataset[[i.data]]$distribution, train.fraction=0.80, bag.fraction=0.5, shrinkage=dataset[[i.data]]$shrinkage, n.trees=1000, verbose = FALSE) best.iter.test <- gbm.perf(gbm1,method="test",plot.it=FALSE) while((gbm1$n.trees-best.iter.test < 1000) && !all(abs(gbm1$valid.error[(gbm1$n.trees-100):gbm1$n.trees]) < 1e-6)) { gbm1 <- gbm.more(gbm1,1000) best.iter.test <- gbm.perf(gbm1,method="test",plot.it=FALSE) } pred.test20[i.valid] <- predict(gbm1, newdata=dataset[[i.data]]$data[i.valid,], n.trees=best.iter.test) dataset[[i.data]]$test20.iter[i.rep] <- best.iter.test # use 5-fold cross-validation cat("5-fold CV") n.cv <- 5 cv.group <- sample(rep(1:n.cv,length=length(i.train))) max.iters <- 
round(best.iter.test*1.2) cv.loss <- matrix(0,ncol=n.cv,nrow=max.iters) for(i.cv in 1:n.cv) { cat(".") i <- order(cv.group==i.cv) # used to put the held out obs last gbm1 <- gbm(formula.fit, data=dataset[[i.data]]$data[i.train[i],], distribution=dataset[[i.data]]$distribution, train.fraction=mean(cv.group!=i.cv), bag.fraction=0.5, shrinkage=dataset[[i.data]]$shrinkage, n.trees=max.iters, verbose = FALSE) cv.loss[,i.cv] <- gbm1$valid.error } cat("\n") best.iter.cv <- which.min(apply(cv.loss,1,weighted.mean,w=table(cv.group))) gbm1 <- gbm(formula.fit, data=dataset[[i.data]]$data[i.train,], distribution=dataset[[i.data]]$distribution, train.fraction=1.0, bag.fraction=0.5, shrinkage=dataset[[i.data]]$shrinkage, n.trees=best.iter.cv, verbose = FALSE) pred.cv5[i.valid] <- predict(gbm1, newdata=dataset[[i.data]]$data[i.valid,], n.trees=best.iter.cv) dataset[[i.data]]$cv5.iter[i.rep] <- best.iter.cv # baseline prediction pred.base[i.valid] <- gbm1$initF # evalute the methods dataset[[i.data]]$base.loss[i.rep] <- with(dataset[[i.data]], loss(data[i.valid,outcome],pred.base[i.valid])) dataset[[i.data]]$oob.loss[i.rep] <- with(dataset[[i.data]], loss(data[i.valid,outcome],pred.oob[i.valid])) dataset[[i.data]]$test33.loss[i.rep] <- with(dataset[[i.data]], loss(data[i.valid,outcome],pred.test33[i.valid])) dataset[[i.data]]$test20.loss[i.rep] <- with(dataset[[i.data]], loss(data[i.valid,outcome],pred.test20[i.valid])) dataset[[i.data]]$cv5.loss[i.rep] <- with(dataset[[i.data]], loss(data[i.valid,outcome],pred.cv5[i.valid])) with(dataset[[i.data]], cat(oob.iter[i.rep],test33.iter[i.rep],test20.iter[i.rep], cv5.iter[i.rep],"\n")) } save.image(compress=TRUE) } #rm(dataset) save.image(compress=TRUE) results <- data.frame(problem=sapply(dataset,function(x){x$name}), N=sapply(dataset,function(x){nrow(x$data)}), d=sapply(dataset,function(x){ncol(x$data)-1}), loss=sapply(dataset,function(x){x$distribution}), base=sapply(dataset,function(x){mean(x$base.loss)}), 
oob=sapply(dataset,function(x){mean(x$oob.loss)}), test33=sapply(dataset,function(x){mean(x$test33.loss)}), test20=sapply(dataset,function(x){mean(x$test20.loss)}), cv5=sapply(dataset,function(x){mean(x$cv5.loss)})) j <- match(c("base","oob","test33","test20","cv5"),names(results)) results[results$loss=="bernoulli",j] <- -2*results[results$loss=="bernoulli",j] results$win <- c("base","oob","test33","test20","cv5")[apply(results[,j],1,which.min)] results$oob.rank <- apply(results[,j],1,rank)[2,] results$perf <- (results$base-results$oob)/apply(results$base-results[,j],1,max) plot(0,0,ylim=c(0,14000),xlim=c(0,n.datasets+1), xlab="Dataset",ylab="Number of iterations", type="n",axes=FALSE) lines(sapply(dataset,function(x){mean(x$oob.iter)}), col="blue") lines(sapply(dataset,function(x){mean(x$test33.iter)}), col="red") lines(sapply(dataset,function(x){mean(x$test20.iter)}), col="green") lines(sapply(dataset,function(x){mean(x$cv5.iter)}), col="purple") axis(1,at=1:n.datasets,labels=as.character(results$problem)) gbm/demo/robustReg.R0000644000176200001440000000321614547111627014054 0ustar liggesuserslibrary( MASS ) set.seed( 20090415 ) x <- mvrnorm( 100, mu=rep( 0, 5 ) , Sigma=diag( rep( 1, 5 ) ) ) r <- rnorm( 100 ) r <- ifelse( runif( 100 ) < .25 , r * 4, r ) y <- apply( x, 1, sum ) + r d <- data.frame( y=y , x) gmod <- gbm( y ~ ., data=d, distribution="gaussian", n.tree = 2000, shrinkage = .01 , cv.folds=5, verbose = FALSE, n.cores=1) tmod4 <- gbm( y ~ ., data=d, distribution="tdist", # defaults to 4 df n.tree=2000, shrinkage = .01, cv.folds=5, verbose = FALSE, n.cores=1) tmod6 <- gbm( y ~ ., data=d, distribution=list( name="tdist", df=6 ), n.tree=2000, shrinkage = .01, cv.folds=5, verbose = FALSE, n.cores=1) tmod100 <- gbm( y ~ ., data=d, distribution=list( name="tdist", df=100 ), n.tree=2000, shrinkage = .01, cv.folds=5, verbose = FALSE, n.cores=1) par(mfrow=c( 2, 2 ) ) gbest <- gbm.perf( gmod , method="cv" ) t4best <- gbm.perf( tmod4 , method="cv" ) t6best <- 
gbm.perf( tmod6 , method="cv" ) t100best <- gbm.perf( tmod100 , method="cv" ) qscale <- function( x ){ x / abs( diff( quantile( x , prob=c( .25, .75 ) ) ) ) } rg <- qscale( resid( gmod , n.trees=gbest) ) rt4 <- qscale( resid( tmod4 , n.trees=t4best) ) rt6 <- qscale( resid( tmod6 , n.trees=t6best) ) rt100 <- qscale( resid( tmod100 , n.trees=t100best ) ) ylimits <- range(rg, rt4, rt6, rt100) plot( rg, main="Gaussian", ylim=ylimits ); abline( h=0 ) plot( rt4, main="t(4)", ylim=ylimits ); abline( h=0 ) plot( rt6, main="t(6)", ylim=ylimits ); abline( h=0 ) plot( rt100, main="t(100)", ylim=ylimits ); abline( h=0 ) dev.off() gbm/demo/multinomial.R0000644000176200001440000000060314547111627014427 0ustar liggesusersdata( iris ) set.seed( 20090415 ) mod <- gbm(Species ~ ., data = iris, distribution = "multinomial", n.tree = 5000, shrinkage = 0.001, cv.folds = 2, bag.fraction = 0.8, interaction.depth = 3, verbose = FALSE) gbm.perf( mod, method="cv" ) mod gbm/demo/gaussian.R0000644000176200001440000000777514547111627013730 0ustar liggesusers# LEAST SQUARES EXAMPLE cat("Running least squares regression example.\n") # create some data N <- 1000 X1 <- runif(N) X2 <- 2*runif(N) X3 <- factor(sample(letters[1:4],N,replace=T)) X4 <- ordered(sample(letters[1:6],N,replace=T)) X5 <- factor(sample(letters[1:3],N,replace=T)) X6 <- 3*runif(N) mu <- c(-1,0,1,2)[as.numeric(X3)] SNR <- 10 # signal-to-noise ratio Y <- X1**1.5 + 2 * (X2**.5) + mu sigma <- sqrt(var(Y)/SNR) Y <- Y + rnorm(N,0,sigma) # create a bunch of missing values X1[sample(1:N,size=100)] <- NA X3[sample(1:N,size=300)] <- NA # random weights if you want to experiment with them # w <- rexp(N) # w <- N*w/sum(w) w <- rep(1,N) data <- data.frame(Y=Y,X1=X1,X2=X2,X3=X3,X4=X4,X5=X5,X6=X6) # fit initial model gbm1 <- gbm(Y~X1+X2+X3+X4+X5+X6, # formula data=data, # dataset var.monotone=c(0,0,0,0,0,0), # -1: monotone decrease, +1: monotone increase, 0: no monotone restrictions distribution="gaussian", # bernoulli, adaboost, gaussian, 
                    # poisson, coxph, or
                    # list(name="quantile",alpha=0.05) for quantile regression
             n.trees=2000,                # number of trees
             shrinkage=0.005,             # shrinkage or learning rate, 0.001 to 0.1 usually work
             interaction.depth=3,         # 1: additive model, 2: two-way interactions, etc
             bag.fraction = 0.5,          # subsampling fraction, 0.5 is probably best
             train.fraction = 0.5,        # fraction of data for training, first train.fraction*N used for training
             n.minobsinnode = 10,         # minimum number of obs needed in each node
             keep.data=TRUE,
             cv.folds=10,                 # do 10-fold cross-validation
             verbose = FALSE)             # don't print progress

# plot the performance
best.iter <- gbm.perf(gbm1,method="OOB")  # returns out-of-bag estimated best number of trees
best.iter <- gbm.perf(gbm1,method="test") # returns test set estimate of best number of trees
best.iter <- gbm.perf(gbm1,method="cv")   # returns cv estimate of best number of trees

# plot variable influence
summary(gbm1,n.trees=1)         # based on the first tree
summary(gbm1,n.trees=best.iter) # based on the estimated best number of trees

# print the first and last trees
print(pretty.gbm.tree(gbm1,1))
print(pretty.gbm.tree(gbm1,gbm1$n.trees))
print(gbm1$c.splits[1:3])

# make some new data
N <- 1000
X1 <- runif(N)
X2 <- 2*runif(N)
X3 <- factor(sample(letters[1:4],N,replace=TRUE))
X4 <- ordered(sample(letters[1:6],N,replace=TRUE))
X5 <- factor(sample(letters[1:3],N,replace=TRUE))
X6 <- 3*runif(N)
mu <- c(-1,0,1,2)[as.numeric(X3)]

Y <- X1**1.5 + 2 * (X2**.5) + mu
Y <- Y + rnorm(N,0,sigma)

data2 <- data.frame(Y=Y,X1=X1,X2=X2,X3=X3,X4=X4,X5=X5,X6=X6)
print(data2[1:10,])

# predict on the new data using "best" number of trees
f.predict <- predict(gbm1,data2,best.iter)
# f.predict will be on the canonical scale (logit,log,etc.)
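# Illustrative sketch (not run for this gaussian example): for other
# distributions the inverse link maps f.predict back to the response scale.
# The object names p.hat and mu.hat below are hypothetical.
if(FALSE)
{
   p.hat  <- plogis(f.predict)   # bernoulli/adaboost: logit -> probability
   mu.hat <- exp(f.predict)      # poisson: log -> expected count
}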
print(f.predict[1:10]) # least squares error print(sum((data2$Y-f.predict)^2)) # create marginal plots # plot variable X1,X2,X3 after "best" iterations par(mfrow=c(1,3)) plot(gbm1,1,best.iter) plot(gbm1,2,best.iter) plot(gbm1,3,best.iter) par(mfrow=c(1,1)) plot(gbm1,1:2,best.iter) # contour plot of variables 1 and 2 after "best" number iterations plot(gbm1,2:3,best.iter) # lattice plot of variables 2 and 3 after "best" number iterations plot(gbm1,3:4,best.iter) # lattice plot of variables 2 and 3 after "best" number iterations plot(gbm1,c(1,2,6),best.iter,cont=20) # 3-way plots plot(gbm1,1:3,best.iter) plot(gbm1,2:4,best.iter) plot(gbm1,3:5,best.iter) # check interactions interact.gbm(gbm1,data=data,i.var=1:2,n.trees=best.iter) # get all two way interactions i.var <- subset(expand.grid(x1=1:6,x2=1:6), x1=data2$tt[i])*exp(f.predict) ) } cat("Boosting:",sum( data2$delta*( f.predict - log(risk) ) ),"\n") # linear model coxph1 <- coxph(Surv(tt,delta)~X1+X2+X3,data=data) f.predict <- predict(coxph1,newdata=data2) risk <- rep(0,N) for(i in 1:N) { risk[i] <- sum( (data2$tt>=data2$tt[i])*exp(f.predict) ) } cat("Linear model:",sum( data2$delta*( f.predict - log(risk) ) ),"\n") gbm/demo/pairwise.R0000644000176200001440000001631614547111627013730 0ustar liggesusers# RANKING EXAMPLE cat("Running ranking (LambdaMart) example.\n") # Create synthetic data that shows how pairwise training can be better # Note: no claim to represent 'real world' data! 
generate.data <- function(N) { # create query groups, with an average size of 25 items each num.queries <- floor(N/25) query <- sample(1:num.queries, N, replace=TRUE) # X1 is a variable determined by query group only query.level <- runif(num.queries) X1 <- query.level[query] # X2 varies with each item X2 <- runif(N) # X3 is uncorrelated with target X3 <- runif(N) # The target Y <- X1 + X2 # Add some random noise to X2 that is correlated with # queries, but uncorrelated with items X2 <- X2 + scale(runif(num.queries))[query] # Add some random noise to target SNR <- 5 # signal-to-noise ratio sigma <- sqrt(var(Y)/SNR) Y <- Y + runif(N, 0, sigma) data.frame(Y, query=query, X1, X2, X3) } cat('Generating data\n') N=1000 data.train <- generate.data(N) # Now we fit 3 different models to the same data: # * Gaussian # * Pairwise with NDCG ranking metric # * Pairwise with CONC (fraction of concordant pairs) ranking metric cat('Fitting a model with gaussian loss function\n') gbm.gaussian <- gbm(Y~X1+X2+X3, # formula data=data.train, # dataset distribution='gaussian', # loss function: gaussian n.trees=2000, # number of trees shrinkage=0.005, # learning rate interaction.depth=3, # number per splits per tree bag.fraction = 0.5, # subsampling fraction train.fraction = 1, # fraction of data for training n.minobsinnode = 10, # minimum number of obs for split keep.data=TRUE, # store copy of input data in model cv.folds=5, # number of cross validation folds verbose = FALSE, # don't print progress n.cores = 1) # use a single core (to prevent possible problems caused by wronly detecting cores) # estimate number of trees best.iter.gaussian <- gbm.perf(gbm.gaussian, method="cv") title('Training of gaussian model') cat('Fitting a model with pairwise loss function (ranking metric: normalized discounted cumulative gain)\n') gbm.ndcg <- gbm(Y~X1+X2+X3, # formula data=data.train, # dataset distribution=list( # loss function: name='pairwise', # pairwise metric="ndcg", # ranking metric: 
normalized discounted cumulative gain group='query'), # column indicating query groups n.trees=2000, # number of trees shrinkage=0.005, # learning rate interaction.depth=3, # number per splits per tree bag.fraction = 0.5, # subsampling fraction train.fraction = 1, # fraction of data for training n.minobsinnode = 10, # minimum number of obs for split keep.data=TRUE, # store copy of input data in model cv.folds=5, # number of cross validation folds verbose = FALSE, # don't print progress n.cores = 1) # use a single core # estimate number of trees best.iter.ndcg <- gbm.perf(gbm.ndcg, method='cv') title('Training of pairwise model with ndcg metric') cat('Fit a model with pairwise loss function (ranking metric: fraction of concordant pairs)\n') gbm.conc <- gbm(Y~X1+X2+X3, # formula data=data.train, # dataset distribution=list( # loss function: name='pairwise', # pairwise metric="conc", # ranking metric: concordant pairs group='query'), # column indicating query groups n.trees=2000, # number of trees shrinkage=0.005, # learning rate interaction.depth=3, # number per splits per tree bag.fraction = 0.5, # subsampling fraction train.fraction = 1, # fraction of data for training n.minobsinnode = 10, # minimum number of obs for split keep.data=TRUE, # store copy of input data in model cv.folds=5, # number of cross validation folds verbose = FALSE, # don't print progress n.cores = 1) # use a single core # estimate number of trees best.iter.conc <- gbm.perf(gbm.conc, method='cv') title('Training of pairwise model with conc metric') # plot variable importance par.old <- par(mfrow=c(1,3)) summary(gbm.gaussian, n.trees=best.iter.gaussian, main='gaussian') summary(gbm.ndcg, n.trees=best.iter.ndcg, main='pairwise (ndcg)') summary(gbm.conc, n.trees=best.iter.conc, main='pairwise (conc)') par(par.old) cat("Generating some new data\n") data.test <- generate.data(N) cat("Calculating predictions\n") predictions <- data.frame(random=runif(N), X2=data.test$X2, 
gaussian=predict(gbm.gaussian, data.test, best.iter.gaussian), pairwise.ndcg=predict(gbm.ndcg, data.test, best.iter.ndcg), pairwise.conc=predict(gbm.conc, data.test, best.iter.conc)) cat("Computing loss metrics\n") result.table <- data.frame(measure=c('random', 'X2 only', 'gaussian', 'pairwise (ndcg)', 'pairwise (conc)'), squared.loss=sapply(1:length(predictions), FUN=function(i) { gbm.loss(y=data.test$Y, predictions[[i]], w=rep(1,N), offset=NA, dist=list(name="gaussian"), baseline=0) }), ndcg5.loss=sapply(1:length(predictions), FUN=function(i) { gbm.loss(y=data.test$Y, predictions[[i]], w=rep(1,N), offset=NA, dist=list(name='pairwise', metric="ndcg"), baseline=0, group=data.test$query, max.rank=5) }), concordant.pairs.loss=sapply(1:length(predictions), FUN=function(i) { gbm.loss(y=data.test$Y, predictions[[i]], w=rep(1,N), offset=NA, dist=list(name='pairwise', metric="conc"), baseline=0, group=data.test$query, max.rank=0) }), row.names=NULL) cat('Performance measures for the different models on the test set (smaller is better):\n') print(result.table,digits=2) # Brief explanation: Variable X1 is not correlated with the order of items, only # with queries. Variable X2 is the only one that is correlated with the order of # items within queries. However, it has a high query-correlated variance. # Therefore, the 'optimal' possible ranking is just by X2. Of course, the # pairwise models don't know this and don't completely achieve the same # accuracy, due to noise and data limitation. # # The Gaussian model uses mostly X1, due to the high variance of X2; on the # contrary, the pairwise models rely mainly on X2. The loss table shows that # both pairwise models are better in terms of the ranking metrics, but worse in # terms of squared loss. 
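# Illustrative follow-up to the explanation above (mean.rank.cor is a
# hypothetical helper, reusing the data.test and predictions objects created
# earlier in this script): average the within-query Spearman rank correlation
# between each model's scores and Y. Higher values mean better per-query ranking.
mean.rank.cor <- function(pred)
{
   mean(sapply(split(seq_len(N), data.test$query),
               function(i) cor(pred[i], data.test$Y[i], method="spearman")),
        na.rm=TRUE)
}
print(sapply(predictions, mean.rank.cor))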
gbm/vignettes/0000755000176200001440000000000014637005163013034 5ustar liggesusersgbm/vignettes/oobperf2.pdf0000644000176200001440000002317114547111634015252 0ustar liggesusers[binary PDF figure data omitted]
gbm/vignettes/gbm.Rnw0000644000176200001440000007342614547111634014302 0ustar liggesusers\documentclass{article}
\bibliographystyle{plain}

\newcommand{\EV}{\mathrm{E}}
\newcommand{\Var}{\mathrm{Var}}
\newcommand{\aRule}{\begin{center} \rule{5in}{1mm} \end{center}}

\title{Generalized Boosted Models:\\A guide to the gbm package}
\author{Greg Ridgeway}

%\VignetteEngine{knitr::knitr}
%\VignetteIndexEntry{Generalized Boosted Models: A guide to the gbm package}

\newcommand{\mathgbf}[1]{{\mbox{\boldmath$#1$\unboldmath}}}

\begin{document}

\maketitle

Boosting takes on various forms with different programs using different loss
functions, different base models, and different optimization schemes. The gbm
package takes the approach described in \cite{Friedman:2001} and
\cite{Friedman:2002}. Some of the terminology differs, mostly due to an
effort to cast boosting terms into more standard statistical terminology
(e.g.\ deviance). In addition, the gbm package implements boosting for models
commonly used in statistics but not commonly associated with boosting. The
Cox proportional hazards model, for example, is an incredibly useful model
and the boosting framework applies quite readily with only slight
modification \cite{Ridgeway:1999}. Also, some algorithms implemented in the
gbm package differ from the standard implementations.
The AdaBoost algorithm \cite{FreundSchapire:1997} has a particular loss
function and a particular optimization algorithm associated with it. The gbm
implementation of AdaBoost adopts AdaBoost's exponential loss function (its
bound on misclassification rate) but uses Friedman's gradient descent
algorithm rather than the original one proposed.

So the main purpose of this document is to spell out in detail what the gbm
package implements.

\section{Gradient boosting}

This section essentially presents the derivation of boosting described in
\cite{Friedman:2001}. The gbm package also adopts the stochastic gradient
boosting strategy, a small but important tweak on the basic algorithm,
described in \cite{Friedman:2002}.

\subsection{Friedman's gradient boosting machine}
\label{sec:GradientBoostingMachine}

\begin{figure}
\aRule
Initialize $\hat f(\mathbf{x})$ to be a constant,
$\hat f(\mathbf{x}) = \arg \min_{\rho} \sum_{i=1}^N \Psi(y_i,\rho)$. \\
For $t$ in $1,\ldots,T$ do
\begin{enumerate}
\item Compute the negative gradient as the working response
\begin{equation}
z_i = -\frac{\partial}{\partial f(\mathbf{x}_i)}
      \Psi(y_i,f(\mathbf{x}_i)) \mbox{\Huge $|$}_{f(\mathbf{x}_i)=\hat f(\mathbf{x}_i)}
\end{equation}
\item Fit a regression model, $g(\mathbf{x})$, predicting $z_i$ from the
covariates $\mathbf{x}_i$.
\item Choose a gradient descent step size as
\begin{equation}
\rho = \arg \min_{\rho} \sum_{i=1}^N \Psi(y_i,\hat f(\mathbf{x}_i)+\rho g(\mathbf{x}_i))
\end{equation}
\item Update the estimate of $f(\mathbf{x})$ as
\begin{equation}
\hat f(\mathbf{x}) \leftarrow \hat f(\mathbf{x}) + \rho g(\mathbf{x})
\end{equation}
\end{enumerate}
\aRule
\caption{Friedman's Gradient Boost algorithm}
\label{fig:GradientBoost}
\end{figure}

Friedman (2001) and the companion paper Friedman (2002) extended the work of
Friedman, Hastie, and Tibshirani (2000) and laid the groundwork for a new
generation of boosting algorithms.
Using the connection between boosting and optimization, this new work proposes the Gradient Boosting Machine.

In any function estimation problem we wish to find a regression function, $\hat f(\mathbf{x})$, that minimizes the expectation of some loss function, $\Psi(y,f)$, as shown in (\ref{NonparametricRegression1}).
\begin{eqnarray}
\hspace{0.5in}
\hat f(\mathbf{x}) &=& \arg \min_{f(\mathbf{x})} \EV_{y,\mathbf{x}} \Psi(y,f(\mathbf{x})) \nonumber \\
\label{NonparametricRegression1}
&=& \arg \min_{f(\mathbf{x})} \EV_x \left[ \EV_{y|\mathbf{x}} \Psi(y,f(\mathbf{x})) \Big| \mathbf{x} \right]
\end{eqnarray}

We will focus on finding estimates of $f(\mathbf{x})$ such that
\begin{equation}
\label{NonparametricRegression2}
\hspace{0.5in}
\hat f(\mathbf{x}) = \arg \min_{f(\mathbf{x})} \EV_{y|\mathbf{x}} \left[ \Psi(y,f(\mathbf{x}))|\mathbf{x} \right]
\end{equation}
Parametric regression models assume that $f(\mathbf{x})$ is a function with a finite number of parameters, $\beta$, and estimate them by selecting those values that minimize a loss function (e.g. squared error loss) over a training sample of $N$ observations on $(y,\mathbf{x})$ pairs as in (\ref{eq:Friedman1}).
\begin{equation}
\label{eq:Friedman1}
\hspace{0.5in}
\hat\beta = \arg \min_{\beta} \sum_{i=1}^N \Psi(y_i,f(\mathbf{x}_i;\beta))
\end{equation}
When we wish to estimate $f(\mathbf{x})$ non-parametrically the task becomes more difficult. Again we can proceed similarly to \cite{FHT:2000} and modify our current estimate of $f(\mathbf{x})$ by adding a new function in a greedy fashion. Letting $f_i = f(\mathbf{x}_i)$, we see that we want to decrease the $N$ dimensional function
\begin{eqnarray}
\label{EQ:Friedman2}
\hspace{0.5in}
J(\mathbf{f}) &=& \sum_{i=1}^N \Psi(y_i,f(\mathbf{x}_i)) \nonumber \\
&=& \sum_{i=1}^N \Psi(y_i,f_i).
\end{eqnarray}
The negative gradient of $J(\mathbf{f})$ indicates the direction of the locally greatest decrease in $J(\mathbf{f})$.
Gradient descent would then have us modify $\mathbf{f}$ as
\begin{equation}
\label{eq:Friedman3}
\hspace{0.5in}
\hat \mathbf{f} \leftarrow \hat \mathbf{f} - \rho \nabla J(\mathbf{f})
\end{equation}
where $\rho$ is the size of the step along the direction of greatest descent. Clearly, this step alone is far from our desired goal. First, it only fits $f$ at values of $\mathbf{x}$ for which we have observations. Second, it does not take into account that observations with similar $\mathbf{x}$ are likely to have similar values of $f(\mathbf{x})$. Both these problems would have disastrous effects on generalization error. However, Friedman suggests selecting a class of functions that use the covariate information to approximate the gradient, usually a regression tree. This line of reasoning produces his Gradient Boosting algorithm shown in Figure~\ref{fig:GradientBoost}. At each iteration the algorithm determines the direction, the gradient, in which it needs to improve the fit to the data and selects a particular model from the allowable class of functions that is most in agreement with that direction. In the case of squared-error loss, $\Psi(y_i,f(\mathbf{x}_i)) = (y_i-f(\mathbf{x}_i))^2$, this algorithm corresponds exactly to residual fitting.

There are various ways to extend and improve upon the basic framework suggested in Figure~\ref{fig:GradientBoost}. For example, Friedman (2001) substituted several choices for $\Psi$ to develop new boosting algorithms for robust regression with least absolute deviation and Huber loss functions. Friedman (2002) showed that a simple subsampling trick can greatly improve predictive performance while simultaneously reducing computation time. Section~\ref{GBMModifications} discusses some of these modifications.
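The squared-error case above makes the loop in Figure~\ref{fig:GradientBoost} concrete: the working response $z_i$ is just the residual $y_i - \hat f(\mathbf{x}_i)$. The following minimal Python sketch (illustrative only; gbm itself is implemented in R and C++, and the names \texttt{fit\_stump} and \texttt{boost} are ours) runs that loop with an exhaustive-search regression stump on a single covariate:

```python
def fit_stump(x, z):
    """Return (split, left_mean, right_mean) minimizing squared error on z."""
    best = None
    for s in sorted(set(x)):
        left = [zi for xi, zi in zip(x, z) if xi <= s]
        right = [zi for xi, zi in zip(x, z) if xi > s]
        if not left or not right:
            continue  # split must leave observations on both sides
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((zi - lm) ** 2 for zi in left)
               + sum((zi - rm) ** 2 for zi in right))
        if best is None or sse < best[0]:
            best = (sse, s, lm, rm)
    return best[1:]

def boost(x, y, T=50):
    """Friedman's loop for squared-error loss: repeatedly fit the residuals."""
    f = [sum(y) / len(y)] * len(y)  # initialize to the constant minimizer
    for _ in range(T):
        z = [yi - fi for yi, fi in zip(y, f)]  # negative gradient = residual
        s, lm, rm = fit_stump(x, z)
        f = [fi + (lm if xi <= s else rm) for fi, xi in zip(f, x)]
    return f
```

For squared-error loss the step-size search in step 3 of the algorithm is unnecessary, since the terminal-node means are already the optimal constants.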
\section{Improving boosting methods using control of the learning rate, sub-sampling, and a decomposition for interpretation}
\label{GBMModifications}

This section explores the variations of the previous algorithms that have the potential to improve their predictive performance and interpretability. In particular, by controlling the optimization speed or learning rate, introducing low-variance regression methods, and applying ideas from robust regression we can produce non-parametric regression procedures with many desirable properties. As a by-product some of these modifications lead directly into implementations for learning from massive datasets. All these methods take advantage of the general form of boosting
\begin{equation}
\hat f(\mathbf{x}) \leftarrow \hat f(\mathbf{x}) + \EV(z(y,\hat f(\mathbf{x}))|\mathbf{x}).
\end{equation}
So far we have taken advantage of this form only by substituting in our favorite regression procedure for $\EV_w(z|\mathbf{x})$. I will discuss some modifications to estimating $\EV_w(z|\mathbf{x})$ that have the potential to improve our algorithm.

\subsection{Decreasing the learning rate}

As several authors have phrased slightly differently, ``...boosting, whatever flavor, seldom seems to overfit, no matter how many terms are included in the additive expansion''. This is not true, as the discussion to \cite{FHT:2000} points out. In the update step of any boosting algorithm we can introduce a learning rate to dampen the proposed move.
\begin{equation}
\label{eq:shrinkage}
\hat f(\mathbf{x}) \leftarrow \hat f(\mathbf{x}) + \lambda \EV(z(y,\hat f(\mathbf{x}))|\mathbf{x}).
\end{equation}
By multiplying the gradient step by $\lambda$ as in equation~\ref{eq:shrinkage} we have control over the rate at which the boosting algorithm descends the error surface (or ascends the likelihood surface). When $\lambda=1$ we return to performing full gradient steps. Friedman (2001) relates the learning rate to regularization through shrinkage.
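A toy numeric caricature of equation~\ref{eq:shrinkage}: for a single observation under squared-error loss the full gradient step is the residual $y-\hat f$, so the shrunken update closes only a fraction $\lambda$ of the remaining gap at each iteration, and smaller $\lambda$ requires correspondingly more iterations. The Python helper below is purely illustrative (its name is ours):

```python
def iterations_to_fit(lam, tol=1e-3):
    """Count shrunken gradient steps until a single residual falls below tol.

    One observation, squared-error loss: the ideal step is the residual
    y - f, and each update f <- f + lam*(y - f) shrinks the gap by (1 - lam).
    """
    f, y, t = 0.0, 1.0, 0
    while abs(y - f) > tol:
        f += lam * (y - f)  # shrunken gradient step, equation (shrinkage)
        t += 1
    return t
```

In this caricature the gap after $t$ steps is $(1-\lambda)^t$, so the iteration count grows roughly like $1/\lambda$ for small $\lambda$, which previews the shrinkage/iterations trade-off discussed in the next subsection.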
The optimal number of iterations, $T$, and the learning rate, $\lambda$, depend on each other. In practice I set $\lambda$ to be as small as possible and then select $T$ by cross-validation. Performance is best when $\lambda$ is as small as possible, with decreasing marginal utility for smaller and smaller $\lambda$. Slower learning rates do not necessarily scale the number of optimal iterations. That is, if $\lambda=1.0$ and the optimal $T$ is 100 iterations, this does {\it not} necessarily imply that when $\lambda=0.1$ the optimal $T$ is 1000 iterations.

\subsection{Variance reduction using subsampling}

Friedman (2002) proposed the stochastic gradient boosting algorithm that simply samples uniformly without replacement from the dataset before estimating the next gradient step. He found that this additional step greatly improved performance. We estimate the regression $\EV(z(y,\hat f(\mathbf{x}))|\mathbf{x})$ using a random subsample of the dataset.

\subsection{ANOVA decomposition}

Certain function approximation methods are decomposable in terms of a ``functional ANOVA decomposition''. That is, a function is decomposable as
\begin{equation}
\label{ANOVAdecomp}
f(\mathbf{x}) = \sum_j f_j(x_j) + \sum_{jk} f_{jk}(x_j,x_k) + \sum_{jk\ell} f_{jk\ell}(x_j,x_k,x_\ell) + \cdots.
\end{equation}
This applies to boosted trees. Regression stumps (one split decision trees) depend on only one variable and fall into the first term of (\ref{ANOVAdecomp}). Trees with two splits fall into the second term of (\ref{ANOVAdecomp}) and so on. By restricting the depth of the trees produced on each boosting iteration we can control the order of approximation. Often additive components are sufficient to approximate a multivariate function well; generalized additive models, the na\"{\i}ve Bayes classifier, and boosted stumps are examples.
When the approximation is restricted to a first order we can also produce plots of $x_j$ versus $f_j(x_j)$ to demonstrate how changes in $x_j$ might affect changes in the response variable.

\subsection{Relative influence}

Friedman (2001) also develops an extension of a variable's ``relative influence'' for boosted estimates. For tree based methods the approximate relative influence of a variable $x_j$ is
\begin{equation}
\label{RelInfluence}
\hspace{0.5in}
\hat J_j^2 = \hspace{-0.1in}\sum_{\mathrm{splits~on~}x_j}\hspace{-0.2in}I_t^2
\end{equation}
where $I_t^2$ is the empirical improvement by splitting on $x_j$ at that point. Friedman's extension to boosted models is to average the relative influence of variable $x_j$ across all the trees generated by the boosting algorithm.

\begin{figure}
\aRule
Select
\begin{itemize}
\item a loss function (\texttt{distribution})
\item the number of iterations, $T$ (\texttt{n.trees})
\item the depth of each tree, $K$ (\texttt{interaction.depth})
\item the shrinkage (or learning rate) parameter, $\lambda$ (\texttt{shrinkage})
\item the subsampling rate, $p$ (\texttt{bag.fraction})
\end{itemize}
Initialize $\hat f(\mathbf{x})$ to be a constant, $\hat f(\mathbf{x}) = \arg \min_{\rho} \sum_{i=1}^N \Psi(y_i,\rho)$ \\
For $t$ in $1,\ldots,T$ do
\begin{enumerate}
\item Compute the negative gradient as the working response
\begin{equation}
z_i = -\frac{\partial}{\partial f(\mathbf{x}_i)} \Psi(y_i,f(\mathbf{x}_i)) \mbox{\Huge $|$}_{f(\mathbf{x}_i)=\hat f(\mathbf{x}_i)}
\end{equation}
\item Randomly select $p\times N$ cases from the dataset
\item Fit a regression tree with $K$ terminal nodes, $g(\mathbf{x})=\EV(z|\mathbf{x})$.
This tree is fit using only those randomly selected observations
\item Compute the optimal terminal node predictions, $\rho_1,\ldots,\rho_K$, as
\begin{equation}
\rho_k = \arg \min_{\rho} \sum_{\mathbf{x}_i\in S_k} \Psi(y_i,\hat f(\mathbf{x}_i)+\rho)
\end{equation}
where $S_k$ is the set of $\mathbf{x}$s that define terminal node $k$. Again this step uses only the randomly selected observations.
\item Update $\hat f(\mathbf{x})$ as
\begin{equation}
\hat f(\mathbf{x}) \leftarrow \hat f(\mathbf{x}) + \lambda\rho_{k(\mathbf{x})}
\end{equation}
where $k(\mathbf{x})$ indicates the index of the terminal node into which an observation with features $\mathbf{x}$ would fall.
\end{enumerate}
\aRule
\caption{Boosting as implemented in \texttt{gbm()}}
\label{fig:gbm}
\end{figure}

\section{Common user options}

This section discusses the options to gbm that most users will need to change or tune.

\subsection{Loss function}

The first and foremost choice is \texttt{distribution}. This should be easily dictated by the application. For most classification problems either \texttt{bernoulli} or \texttt{adaboost} will be appropriate, the former being recommended. For continuous outcomes the choices are \texttt{gaussian} (for minimizing squared error), \texttt{laplace} (for minimizing absolute error), and quantile regression (for estimating percentiles of the conditional distribution of the outcome). Censored survival outcomes call for \texttt{coxph}. Count outcomes may use \texttt{poisson} although one might also consider \texttt{gaussian} or \texttt{laplace} depending on the analytical goals.

\subsection{The relationship between shrinkage and number of iterations}

The issues that most new users of gbm struggle with are the choice of \texttt{n.trees} and \texttt{shrinkage}. It is important to know that smaller values of \texttt{shrinkage} (almost) always give improved predictive performance.
That is, setting \texttt{shrinkage=0.001} will almost certainly result in a model with better out-of-sample predictive performance than setting \texttt{shrinkage=0.01}. However, there are computational costs, both storage and CPU time, associated with setting \texttt{shrinkage} to be low. The model with \texttt{shrinkage=0.001} will likely require ten times as many iterations as the model with \texttt{shrinkage=0.01}, increasing storage and computation time by a factor of 10. Figure~\ref{fig:shrinkViters} shows the relationship between predictive performance, the number of iterations, and the shrinkage parameter. Note that the increase in the optimal number of iterations between two choices for shrinkage is roughly equal to the ratio of the shrinkage parameters. It is generally the case that for small shrinkage parameters, 0.001 for example, there is a fairly long plateau in which predictive performance is at its best. My rule of thumb is to set \texttt{shrinkage} as small as possible while still being able to fit the model in a reasonable amount of time and storage. I usually aim for 3,000 to 10,000 iterations with shrinkage rates between 0.01 and 0.001.

\begin{figure}[ht]
\begin{center}
\includegraphics[width=5in]{shrinkage-v-iterations}
\end{center}
\caption{Out-of-sample predictive performance by number of iterations and shrinkage. Smaller values of the shrinkage parameter offer improved predictive performance, but with decreasing marginal improvement.}
\label{fig:shrinkViters}
\end{figure}

\subsection{Estimating the optimal number of iterations}

gbm offers three methods for estimating the optimal number of iterations after the gbm model has been fit: an independent test set (\texttt{test}), out-of-bag estimation (\texttt{OOB}), and $v$-fold cross validation (\texttt{cv}). The function \texttt{gbm.perf} computes the iteration estimate.
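Whichever of the three methods produces the per-iteration error curve, the estimate itself is simply the curve's minimizer. A hypothetical one-function Python helper makes this explicit (the name is ours, not gbm's):

```python
def best_n_trees(error_curve):
    """1-based index of the minimum of a per-iteration validation error curve,
    as a test-set, OOB, or cross-validation criterion would report it."""
    return min(range(len(error_curve)), key=error_curve.__getitem__) + 1
```

The methods below differ only in how trustworthy the error curve itself is, not in how the minimizer is read off.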
Like Friedman's MART software, the independent test set method uses a single holdout test set to select the optimal number of iterations. If \texttt{train.fraction} is set to be less than 1, then only the \textit{first} \texttt{train.fraction}$\times$\texttt{nrow(data)} observations will be used to fit the model. Note that if the data are sorted in a systematic way (such as cases for which $y=1$ come first), then the data should be shuffled before running gbm. Those observations not used in the model fit can be used to get an unbiased estimate of the optimal number of iterations. The downside of this method is that a considerable number of observations are used to estimate the single regularization parameter (number of iterations) leaving a reduced dataset for estimating the entire multivariate model structure. Use \texttt{gbm.perf(...,method="test")} to obtain an estimate of the optimal number of iterations using the held out test set.

If \texttt{bag.fraction} is set to be greater than 0 (0.5 is recommended), gbm computes an out-of-bag estimate of the improvement in predictive performance. It evaluates the reduction in deviance on those observations not used in selecting the next regression tree. The out-of-bag estimator underestimates the reduction in deviance. As a result, it almost always is too conservative in its selection of the optimal number of iterations. The motivation behind this method was to avoid having to set aside a large independent dataset, which reduces the information available for learning the model structure. Use \texttt{gbm.perf(...,method="OOB")} to obtain the OOB estimate.

Lastly, gbm offers $v$-fold cross validation for estimating the optimal number of iterations. If \texttt{cv.folds=5} when fitting the gbm model, then gbm will do 5-fold cross validation. gbm will fit five gbm models in order to compute the cross validation error estimate and then will fit a sixth and final gbm model with \texttt{n.trees} iterations using all of the data.
The returned model object will have a component labeled \texttt{cv.error}. Note that \texttt{gbm.more} will do additional gbm iterations but will not add to the \texttt{cv.error} component. Use \texttt{gbm.perf(...,method="cv")} to obtain the cross validation estimate.

\begin{figure}[ht]
\begin{center}
\includegraphics[width=5in]{oobperf2}
\end{center}
\caption{Out-of-sample predictive performance of four methods of selecting the optimal number of iterations. The vertical axis plots performance relative to the best. The boxplots indicate relative performance across thirteen real datasets from the UCI repository. See \texttt{demo(OOB-reps)}.}
\label{fig:oobperf}
\end{figure}

Figure~\ref{fig:oobperf} compares the three methods for estimating the optimal number of iterations across 13 datasets. The boxplots show each method's performance relative to the best method on that dataset. For most datasets the methods perform similarly; however, 5-fold cross validation is consistently the best of them. OOB, using a 33\% test set, and using a 20\% test set all have datasets for which they perform considerably worse than the best method. My recommendation is to use 5- or 10-fold cross validation if you can afford the computing time. Otherwise you may choose among the other options, knowing that OOB is conservative.

\section{Available distributions}

This section gives some of the mathematical detail for each of the distribution options that gbm offers. The gbm engine written in C++ has access to a C++ class for each of these distributions. Each class contains methods for computing the associated deviance, initial value, the gradient, and the constants to predict in each terminal node.

In the equations shown below, for non-zero offset terms, replace $f(\mathbf{x}_i)$ with $o_i + f(\mathbf{x}_i)$.
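As a concrete instance of these per-distribution computations, the Gaussian case (next subsection) has closed forms. The Python sketch below folds in the offset $o_i$ exactly as just described; the function names are ours for illustration, not part of the package:

```python
def gaussian_init(y, w, o):
    """Gaussian initial value: the weighted mean of y_i - o_i."""
    return sum(wi * (yi - oi) for yi, wi, oi in zip(y, w, o)) / sum(w)

def gaussian_gradient(y, f, o):
    """Gaussian negative gradient z_i = y_i - (o_i + f(x_i)): the residual."""
    return [yi - (oi + fi) for yi, fi, oi in zip(y, f, o)]
```

With all offsets zero this reduces to the familiar weighted mean and ordinary residuals; the other distributions replace these two formulas (and the terminal-node constant) with their own.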
\subsection{Gaussian}

\begin{tabular}{ll}
Deviance & $\displaystyle \frac{1}{\sum w_i} \sum w_i(y_i-f(\mathbf{x}_i))^2$ \\
Initial value & $\displaystyle f(\mathbf{x})=\frac{\sum w_i(y_i-o_i)}{\sum w_i}$ \\
Gradient & $z_i=y_i - f(\mathbf{x}_i)$ \\
Terminal node estimates & $\displaystyle \frac{\sum w_i(y_i-f(\mathbf{x}_i))}{\sum w_i}$
\end{tabular}

\subsection{AdaBoost}

\begin{tabular}{ll}
Deviance & $\displaystyle \frac{1}{\sum w_i} \sum w_i\exp(-(2y_i-1)f(\mathbf{x}_i))$ \\
Initial value & $\displaystyle \frac{1}{2}\log\frac{\sum y_iw_ie^{-o_i}}{\sum (1-y_i)w_ie^{o_i}}$ \\
Gradient & $\displaystyle z_i= -(2y_i-1)\exp(-(2y_i-1)f(\mathbf{x}_i))$ \\
Terminal node estimates & $\displaystyle \frac{\sum (2y_i-1)w_i\exp(-(2y_i-1)f(\mathbf{x}_i))} {\sum w_i\exp(-(2y_i-1)f(\mathbf{x}_i))}$
\end{tabular}

\subsection{Bernoulli}

\begin{tabular}{ll}
Deviance & $\displaystyle -2\frac{1}{\sum w_i} \sum w_i(y_if(\mathbf{x}_i)-\log(1+\exp(f(\mathbf{x}_i))))$ \\
Initial value & $\displaystyle \log\frac{\sum w_iy_i}{\sum w_i(1-y_i)}$ \\
Gradient & $\displaystyle z_i=y_i-\frac{1}{1+\exp(-f(\mathbf{x}_i))}$ \\
Terminal node estimates & $\displaystyle \frac{\sum w_i(y_i-p_i)}{\sum w_ip_i(1-p_i)}$ \\
 & where $\displaystyle p_i = \frac{1}{1+\exp(-f(\mathbf{x}_i))}$ \\
\end{tabular}

Notes:
\begin{itemize}
\item For non-zero offset terms, the computation of the initial value requires Newton-Raphson. Initialize $f_0=0$ and iterate $\displaystyle f_0 \leftarrow f_0 + \frac{\sum w_i(y_i-p_i)}{\sum w_ip_i(1-p_i)}$ where $\displaystyle p_i = \frac{1}{1+\exp(-(o_i+f_0))}$.
\end{itemize}

\subsection{Laplace}

\begin{tabular}{ll}
Deviance & $\frac{1}{\sum w_i} \sum w_i|y_i-f(\mathbf{x}_i)|$ \\
Initial value & $\mbox{median}_w(y)$ \\
Gradient & $z_i=\mbox{sign}(y_i-f(\mathbf{x}_i))$ \\
Terminal node estimates & $\mbox{median}_w(z)$
\end{tabular}

Notes:
\begin{itemize}
\item $\mbox{median}_w(y)$ denotes the weighted median, defined as the solution to the equation $\frac{\sum w_iI(y_i\leq m)}{\sum w_i}=\frac{1}{2}$
\item \texttt{gbm()} currently does not implement the weighted median and issues a warning when the user uses weighted data with \texttt{distribution="laplace"}.
\end{itemize}

\subsection{Quantile regression}

Contributed by Brian Kriegler (see \cite{Kriegler:2010}).

\begin{tabular}{ll}
Deviance & $\frac{1}{\sum w_i} \left(\alpha\sum_{y_i>f(\mathbf{x}_i)} w_i(y_i-f(\mathbf{x}_i))\right. +$ \\
 & \hspace{0.5in}$\left.(1-\alpha)\sum_{y_i\leq f(\mathbf{x}_i)} w_i(f(\mathbf{x}_i)-y_i)\right)$ \\
Initial value & $\mathrm{quantile}^{(\alpha)}_w(y)$ \\
Gradient & $z_i=\alpha I(y_i>f(\mathbf{x}_i))-(1-\alpha)I(y_i\leq f(\mathbf{x}_i))$ \\
Terminal node estimates & $\mathrm{quantile}^{(\alpha)}_w(z)$
\end{tabular}

Notes:
\begin{itemize}
\item $\mathrm{quantile}^{(\alpha)}_w(y)$ denotes the weighted quantile, defined as the solution to the equation $\frac{\sum w_iI(y_i\leq q)}{\sum w_i}=\alpha$
\item \texttt{gbm()} currently does not implement the weighted quantile and issues a warning when the user uses weighted data with \texttt{distribution=list(name="quantile")}.
\end{itemize}

\subsection{Cox Proportional Hazard}

\begin{tabular}{ll}
Deviance & $-2\sum w_i(\delta_i(f(\mathbf{x}_i)-\log(R_i/w_i)))$\\
Gradient & $\displaystyle z_i=\delta_i - \sum_j \delta_j \frac{w_jI(t_i\geq t_j)e^{f(\mathbf{x}_i)}} {\sum_k w_kI(t_k\geq t_j)e^{f(\mathbf{x}_k)}}$ \\
Initial value & 0 \\
Terminal node estimates & Newton-Raphson algorithm
\end{tabular}

\begin{enumerate}
\item Initialize the terminal node predictions to 0, $\mathgbf{\rho}=0$
\item Let $\displaystyle p_i^{(k)}=\frac{\sum_j I(k(j)=k)I(t_j\geq t_i)e^{f(\mathbf{x}_j)+\rho_k}} {\sum_j I(t_j\geq t_i)e^{f(\mathbf{x}_j)+\rho_{k(j)}}}$
\item Let $g_k=\sum w_i\delta_i\left(I(k(i)=k)-p_i^{(k)}\right)$
\item Let $\mathbf{H}$ be a $k\times k$ matrix
\begin{enumerate}
\item Set diagonal elements $H_{mm}=\sum w_i\delta_i p_i^{(m)}\left(1-p_i^{(m)}\right)$
\item Set off diagonal elements $H_{mn}=-\sum w_i\delta_i p_i^{(m)}p_i^{(n)}$
\end{enumerate}
\item Newton-Raphson update $\mathgbf{\rho} \leftarrow \mathgbf{\rho} - \mathbf{H}^{-1}\mathbf{g}$
\item Return to step 2 until convergence
\end{enumerate}

Notes:
\begin{itemize}
\item $t_i$ is the survival time and $\delta_i$ is the death indicator.
\item $R_i$ denotes the hazard for the risk set, $R_i=\sum_{j=1}^N w_jI(t_j\geq t_i)e^{f(\mathbf{x}_j)}$
\item $k(i)$ indexes the terminal node of observation $i$
\item For speed, \texttt{gbm()} does only one step of the Newton-Raphson algorithm rather than iterating to convergence. There is no appreciable loss of accuracy since the next boosting iteration will simply correct for the prior iteration's inadequacy.
\item \texttt{gbm()} initially sorts the data by survival time. Doing this reduces the computation of the risk set from $O(n^2)$ to $O(n)$ at the cost of a single up front sort on survival time. After the model is fit, the data are then put back in their original order.
\end{itemize}

\subsection{Poisson}

\begin{tabular}{ll}
Deviance & $\displaystyle -2\frac{1}{\sum w_i} \sum w_i(y_if(\mathbf{x}_i)-\exp(f(\mathbf{x}_i)))$ \\
Initial value & $\displaystyle f(\mathbf{x})= \log\left(\frac{\sum w_iy_i}{\sum w_ie^{o_i}}\right)$ \\
Gradient & $z_i=y_i - \exp(f(\mathbf{x}_i))$ \\
Terminal node estimates & $\displaystyle \log\frac{\sum w_iy_i}{\sum w_i\exp(f(\mathbf{x}_i))}$
\end{tabular}

The Poisson class includes special safeguards so that the most extreme predicted values are $e^{-19}$ and $e^{+19}$. This behavior is consistent with \texttt{glm()}.

\subsection{Pairwise}

This distribution implements ranking measures following the \emph{LambdaMart} algorithm \cite{Burges:2010}. Instances belong to \emph{groups}; all pairs of items with different labels, belonging to the same group, are used for training. In \emph{Information Retrieval} applications, groups correspond to user queries, and items to (feature vectors of) documents in the associated match set to be ranked.

For consistency with typical usage, our goal is to \emph{maximize} one of the \emph{utility} functions listed below. Consider a group with instances $x_1, \dots, x_n$, ordered such that $f(x_1) \geq f(x_2) \geq \dots \geq f(x_n)$; i.e., the \emph{rank} of $x_i$ is $i$, where smaller ranks are preferable. Let $P$ be the set of all ordered pairs such that $y_i > y_j$.

\begin{enumerate}
\item[{\bf Concordance:}] Fraction of concordant (i.e., correctly ordered) pairs. For the special case of binary labels, this is equivalent to the Area under the ROC Curve.
$$\left\{ \begin{array}{l l}\frac{\|\{(i,j)\in P | f(x_i)>f(x_j)\}\|}{\|P\|} & P \neq \emptyset\\
0 & \mbox{otherwise.} \end{array}\right.
$$
\item[{\bf MRR:}] Mean reciprocal rank of the highest-ranked positive instance (it is assumed $y_i\in\{0,1\}$):
$$\left\{ \begin{array}{l l}\frac{1}{\min\{1 \leq i \leq n |y_i=1\}} & \exists i: \, 1 \leq i \leq n, y_i=1\\
0 & \mbox{otherwise.}\end{array}\right.$$
\item[{\bf MAP:}] Mean average precision, a generalization of MRR to multiple positive instances:
$$\left\{ \begin{array}{l l} \frac{\sum_{1\leq i\leq n | y_i=1} \|\{1\leq j\leq i |y_j=1\}\|\,/\,i}{\|\{1\leq i\leq n | y_i=1\}\|} & \exists i: \, 1 \leq i \leq n, y_i=1\\
0 & \mbox{otherwise.}\end{array}\right.$$
\item[{\bf nDCG:}] Normalized discounted cumulative gain:
$$\frac{\sum_{1\leq i\leq n} y_i\,/\,\log_2(i+1)}{\sum_{1\leq i\leq n} y'_i\,/\,\log_2(i+1)},$$
where $y'_1, \dots, y'_n$ is a reordering of $y_1, \dots, y_n$ with $y'_1 \geq y'_2 \geq \dots \geq y'_n$.
\end{enumerate}

The generalization to multiple (possibly weighted) groups is straightforward. Sometimes a cut-off rank $k$ is given for \emph{MRR} and \emph{nDCG}, in which case we replace the outer index $n$ by $\min(n,k)$.

The initial value for $f(x_i)$ is always zero. We derive the gradient of a cost function whose gradient locally approximates the gradient of the IR measure for a fixed ranking:
\begin{eqnarray*}
\Phi & = & \sum_{(i,j) \in P} \Phi_{ij}\\
 & = & \sum_{(i,j) \in P} |\Delta Z_{ij}| \log \left( 1 + e^{-(f(x_i) - f(x_j))}\right),
\end{eqnarray*}
where $|\Delta Z_{ij}|$ is the absolute utility difference when swapping the ranks of $i$ and $j$, while leaving all other instances the same.
Define
\begin{eqnarray*}
\lambda_{ij} & = & \frac{\partial\Phi_{ij}}{\partial f(x_i)}\\
 & = & - |\Delta Z_{ij}| \frac{1}{1 + e^{f(x_i) - f(x_j)}}\\
 & = & - |\Delta Z_{ij}| \, \rho_{ij},
\end{eqnarray*}
with
$$ \rho_{ij} = - \frac{\lambda_{ij}}{|\Delta Z_{ij}|} = \frac{1}{1 + e^{f(x_i) - f(x_j)}}.$$
For the gradient of $\Phi$ with respect to $f(x_i)$, define
\begin{eqnarray*}
\lambda_i & = & \frac{\partial \Phi}{\partial f(x_i)}\\
 & = & \sum_{j|(i,j) \in P} \lambda_{ij} - \sum_{j|(j,i) \in P} \lambda_{ji}\\
 & = & - \sum_{j|(i,j) \in P} |\Delta Z_{ij}| \, \rho_{ij}\\
 & & \mbox{} + \sum_{j|(j,i) \in P} |\Delta Z_{ji}| \, \rho_{ji}.
\end{eqnarray*}
The second derivative is
\begin{eqnarray*}
\gamma_i & \stackrel{def}{=} & \frac{\partial^2\Phi}{\partial f(x_i)^2}\\
 & = & \sum_{j|(i,j) \in P} |\Delta Z_{ij}| \, \rho_{ij} \, (1-\rho_{ij})\\
 & & \mbox{} + \sum_{j|(j,i) \in P} |\Delta Z_{ji}| \, \rho_{ji} \, (1-\rho_{ji}).
\end{eqnarray*}
Now consider again all groups with associated weights. For a given terminal node, let $i$ range over all contained instances. Then its estimate is
$$-\frac{\sum_i v_i\lambda_{i}}{\sum_i v_i \gamma_i},$$
where $v_i=w(\mbox{\em group}(i))/\|\{(j,k)\in\mbox{\em group}(i)\}\|.$

In each iteration, instances are reranked according to the preliminary scores $f(x_i)$ to determine the $|\Delta Z_{ij}|$. Note that in order to avoid ranking bias, we break ties by adding a small amount of random noise.
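The $\lambda_i$ and $\gamma_i$ accumulations above can be sketched directly. The Python fragment below treats a single group with unit group weight and, purely for illustration, sets every $|\Delta Z_{ij}|$ to 1; the function names and that simplification are ours, not gbm's:

```python
import math

def lambdas(f, pairs, dz=1.0):
    """Accumulate lambda_i and gamma_i over preference pairs (i, j), y_i > y_j."""
    n = len(f)
    lam = [0.0] * n
    gam = [0.0] * n
    for i, j in pairs:
        rho = 1.0 / (1.0 + math.exp(f[i] - f[j]))  # rho_ij as defined above
        lam[i] -= dz * rho                          # first sum in lambda_i
        lam[j] += dz * rho                          # second sum, for index j
        gam[i] += dz * rho * (1.0 - rho)            # both indices get the
        gam[j] += dz * rho * (1.0 - rho)            # same gamma contribution
    return lam, gam

def node_estimate(idx, lam, gam):
    """Terminal-node estimate -sum(lambda)/sum(gamma) with equal weights v_i."""
    return -sum(lam[i] for i in idx) / sum(gam[i] for i in idx)
```

For two tied scores with one preference pair, $\rho=1/2$, so the preferred item receives $\lambda=-1/2$, $\gamma=1/4$, and a node containing only it gets the positive estimate $2$, pushing its score upward as intended.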
\bibliography{gbm}

\end{document}

gbm/vignettes/shrinkage-v-iterations.pdf
kߤ|PÛn¬;O ôsw]Q¢êŒÁT³íiÑÏ=d‡# 1¤¡²+.½€Ìºs˜¨{kÉÌ7Úvª¿ž›m;õlª¸¦nÊXUÅ-•#Q VåäjWçççéš•À]»áš©U©>eWr9µ:+R>„d »¤oE .'Vvaѳ¾eÇâ¡“3-‰Æ+gÕöu î ̲¾žÙ–QëZÜÃw¾”¹1põQnº69áÍun‹aÏ6`˜Q[×ý#6Ýœ~[ÐÜcÚÆm¯œhaÖ±YPÈq¢ŒÜ[þ[^Éuéb yN¶îØÂí5غ},sˆ2]7¹ØÌp­˜~ ¥\ UÎÜÀ ξ$§sÎÜ\òÂsá"µ³e0º(iÊo À(kݼ“`×\žÓCT?—‚¤tÜcL‰P×Ò*XƒM\óVº;[¹›¶nƒ±E3 }«D“¹@Žœ*š8{Sѵ•n©¸ˆWïþ¦ëU*Æ-”g]~r®bààç'®e«­¶Êyšp·í°ÍÍ«·ÛnóæØÜo8õÕ µD5žyЊÛÍJ‹N,2헞Ɠ­XœÌ-ÊX†®[èü¾“„û:¿÷-ƒ®€7 VàÆ…vž§ §qÀTç³I)‚îxxÄ–fêðùÒpÍŠk±‚[à¤Ëâ1‰Ë ;N I•¶Žôu§¹ÎJ¯Ílm6Vš{XýhÜQãëÔ~åÂŽ›ƒû7·§¸@E*ªç¼ÿ.¢·y©Y=ýß\èõ¦ÓN…O›4ühE.ô§„ÎE•+ݦ¼ua£ .d³†‡hæñéš}ݘ̷âÅS±±fsƼ—|Xbî‰p9°Æ¨¿5•7°œ‘»åFð –»àöj—yj|Œ5\Ž64­ÞrxÃhð=i.ÆÀIYF¬™Ã—D½ð”娷±ae<ÓI™xˆ÷ e¸¨!M’—­šm³âÊý³-Ô\S4T°«(ë h“è"v%6©]~¨áf/î$•L²¹6É‚6 ,Ê›¡EË¥ ] ò”ÇVÈT]nFÀóµ8ïØ ì”|ز1-·Í¶Zj«/¶myLu/jaZµ-2°)«t êoiãË}ÛuºÃM®’‰¹çq¢tpn<‚ŽÓôD]†¯ ¬y†pz#ÞÊ“åÁÓP=C›]LYdUÛì·½ê#}ׯ°§0X¡tŽÄÞSåéJÓhÄs°<0lMé¬?ï.d=Ô:Ü\äc¹ Îí­ÚÀ®UÉúšîoZÈíÐûmw”ŸÎokwµò¢"™­r‹Ö-nÑž-0 a2)¹e權¥^8hiYtÕì…K¯õÐ^é ÍdA¯°-XœÎ?È0õ~ÑˬHÝNd˜znË"ÃÔ‡àƒàv˜¢|T;wy Ô½›uÀ¶¨{iÞ:¯)`ˆ¥ö´HuÙµ©nSö]4Wë[Ðj¯dç°4ŽDd·F Oo­á@ƒÝÜfuÀnš¹å}xŠô(\æ„Ó¬8"¬lŽ{0h§PS‘Øè¨sڭ鸠䀌ë°w@¶¨e훃 _ÃÁÕïJ£µî'‹ÜÒ,ã¶&w–sÅ­h…(·‡ÜÊÑ<¤b]oçäÑ–ôpÓºÝJÞâ<˜ÓœW|Û¨™·”În6ø‡]šAÿ$íš­ŠÙ©yªZôƒ.‚¶y2¤æÜÚ/^H5tq†Á£<ÖðâÕ@ͪß9£ˆu×>1©Á$½“}HÃ(¶µwÆ|¥,0¬ÖØo:úù8}Á¥á\’‘†Ó{¯íIÓé½™K´¢€ž ò‘è€u;XFÁ0·— Y O6Íh`äõ}‡çIYé$V?dnºvÙú £Ñ€èAZÜ pÿ$or½ð%r½{ê$ë³íÝË}ÇícJ’Iöî-)Pö‘ý쓤TÛw'Ö×YºË¢»›²Àù>éí¦Ù]þž8|®ö{òØiˆYRi›gêq´CÇ $[všû L7[v9õ´=(«+2 ÑE]¶Æ}1’#²g“}>eìmÈayÒ‡;¾uEP¤Ñvß0Ýè›vÐ[Ôv¼"Þ¢¶c ·éÊ¦ì‘ØôdÓ¨+Ú ‹Ëµ¯¯éÅ’VjgšWߦ›_þ@ßtaŸ ¤¾éÁ¦ìsd¥såúnOá²£¶y7eºì"›Õ·SëÛÞ:ðÛþAGw¸U9´A5'𥠿AX?Ÿ ´¦ãR¥}æ"¥}‘“®‘ç'ÎÁ ÖArÉÀ¤Ñ~ùÀ`V•°¤¨lN›!E¨a¨hÍE+®gžŽ/>èåI(µãW¥ zÚë TpÇ«EЬtør5 ã(z¶£‚J×¾9¸*ú› ô9»RL9¢Û6Ãvþ×÷i³“ODò§Èˆå_Lÿ£0ïX«e¿rŸ•-V~_¸˜¾B+=¡‡u7WœÇòqHN/Úþ=mîÀC¶8ž®¸Åe·Ä§|ËR8Ïî—{&WsšgWãFŠ+8‚ÜpD;+ü Íÿ`†IDèkqçõÏ`žºŸà qÃËNüå†W¹¡Eâ†g3çIngŠ$qÃãácÅ ÷éÐ"sÃ=²¸á<ÍçŠn¸3>*/3ÃŒ§¹áˆ[úe‡•*n¸š‹¹ÜpvŲ¸aL•Ø]qÃÀÀÅÖŠ® ¯ÖiIÜp&¿ÞÌp-Í<³˜á‡q73\¶¾/ /™ý4/\Öm‰xaDÐC¾Ä0¬Ç”BÌ0Œ²–©áGE¢¸a¤« UpûBl1̾sˆ.Fà^H!¾Še…ÊoòÉ£Ú"…(c(Î'g .>Ic6T)Ä#²A SÇèë0W\4·ó&q•Oö¸8¸Xìñ[{\|4_þø-ϦX IÜVf˜s7cH³½É©•m.tHÙÄp3и3ÌüV²j8wDÜ׉yñÕQ½¢WWl„’–¡U÷ÐÆX…qÓ 
£¥ì•´ˆ¢UY*­`µZÊn‹9ãÜ)úÚœqõµî²ÆØíõf«¯°—5®7”Ϭqí ²ÆUfæŒët<Œ9ãœ×½éö׺<Ðð;6ަ€ã1¨p°Â Â<3\°äÕ7ERÇ“ŠùæŽÇTôrVä˜-æY“—ÉãNÑÔ>Øc‰\s ÇcS!îZìqÜ3{QÌJÓü¯øãðµûòÇiÜ_oöðªø`±ÇaøKìqørÙã7Uì1 ¼Øa±Ç†ÎëÍR[ÄÇlf›ÅÇô÷æùàâõæq@ˆ¾˜‹?Ç÷_þ8pùãØíþ8ö-]üqøD½üqœpéâqà|òÇáè”Ë·âq7,ÔøòÇ­˜)7}ÜŠ¹³Ç-Ì­˜¬0Õ^M&¦G`N—8Y*ºkš¸â;óÆS!†Í¦‰Ó,•š¸âQ‡«$MÌCYEŠ&®ˆA¾¼ñ¡_2ú%«Z1Ë41ƒñ?ibžÝ.T‘Žý¡\‡Bû¾”ùP´cßý*î˜÷¡[‚âóêòAw)Ü E<އBžÄÎê¸ðÙá Åü ‡\²Ëo§ë&Ñ„.ñ³:î-‡¼pwì’CÚ¹Ì1!4*Dí‹f¨Ëà¶®€Ä4À¢…ëÔm^8]£=&†©PµË/ìžÈPÃåH£®,Uéw\â¬#¯¦©'RÄŒ¤Z7>Ñ; ¦ºí¨h*3m¤Š57Pd1k(þ¥gÄX¹vЬýÒZ¤Œq\Ìbx‘¤ñfìÕ­ûdjîˆã&ÔÜñƒ®Çõj†qãÒɤ阋N&}LßHt1éãLOú¾’‰ZÒÇyÛh‰@NÙd¶d}‰$&\Auˆ`nH"— W\`”—rúV&ÊE §‰·1œò­k„‡ù …\§Ã‚M"§§eG$rÊæD"Óó"’.™²ÒI!§_avE rÊBcL 3ÀŒH½øãt;ŠI뉯Ö\Ùc†Ÿ)•ä1ÞÔ¼æ8%…˜8NÙ¬’xcúyëõÐÆu9ÄÙ¬1ƒÙXºHãŠ'gì›8㔿'†V×ìÎO -eì"¿ò0UêÖånm·Ž§/ªt=2#ª8}J͸™b³qD§—yÓåÜ®Ëïˆ&Î:Y—è,¿Í×C§l~E qÝÆéMÓ'åÈ€® Œ”*×v·riWôu·v©[bjT˜©%¨Vw/7QµTÜàÑÃéÆÞXÐÃ)ž[¡µôD~¤‡ëžO ×Rq‰pÑÃéý>EòâP ŠS=\÷›pUtä¾D¤‡+¶¯EÅGîs‹=œ.n\¾X’§<9"iÚä0P°FRL®y+Ö’è"7Wý©O‘‡Ëp¬š=ˆ¯¥â’éƒøZ=ÏñHRÈ_±‚‹ÿ4Yd„¸/LB¸¶•‚Œp*ÎUàuà¹g†9኷‚¤p~è‡X{³Î‡¥gWƪ˜šáþƒ†]Gh€‚ ÀojæÍÓºJÞ7cv èU—§Ž» ¨WíW\­q­ž¼:Øö“/ÔÜ 0ƃís›C†qZ{à'5w/ñó ©ºÀ# Ñab{˜niô®Œ»ˆYßá€Fó˜âS‡Î%rÈq6˜CNÙûZr_:Ì!GY—#&‡Ì»8_rÈ£,N˜2ï:úžräšCλNýäùZQ39ä¸O%Ì!§l–Wrˆ²|8äÀ…^,.9ä¼|˜#‡ì›ÒÃ!G½qâ£úŒ6‡œò0GìÊfkAé3Ûrà‚.Yñ“uˆ1&ƒu„k§¡Žûƒ rÔËb‰ANÙ ‰ä¨“6ƒ8P”ŸˆbÊ7?dľþ8¥ucôëT³Óä)ë[òÇq/åæóNgVJü±ïtƒŒ;8 MLJL˜£b @!ó·ŠÅ£8ÒK¶ŠDÎí2b‘Sá`ÓÈ»¸ä‘ÇÚ' r]é žš¿åÕ4¯ŽýQp­ÇìˆGæsWq›â‘©É!9Êšã%œ7Î~Y`òÈ©°k9âM,“GÞ¼­p$«+%œŠKƈG\­MÔ’GNÅm–xdþ€‰[EJ¶úð¿ uaäë›FF¡÷Ù(YdO-¼§â2½€{½¸ü[kW±õ>øÆ’@ÆáMþ¸Aqg%×ÄsÙdrÇŸ ’Ç9î»Lö˜oÕNÑÇû¼s?Λ¶¯À&ƒ/u¤ ƒÌ»øU0†¿z"9p×b‰¼µS‹œ “W¦‘ùÚ Ùø¼Àß%<g•š20tƒÎeI³¡+rôêøAòÉ83zÜ%6ªžš}[³6¾‘WfÉ7T»û½]æPÜ2ñ€‡YŽÞÍÈŠYŽÞå)“[NirËÑõdn9e[+qËÑõdn9º‰ sË·Gß“]Žîç{f—£ûñ”Ùe†-3]ìrtÃÂd—ã¾'2»ýlç&»L™ölá{šÍ6Ç(âŽÉ5£ ¿+®9ðI¥-—康‡«^±Í);ÖGl3!1Ä œdˆÞëa›cDw~…MŸZ曉ˆ(A“€CÄ/+fr8vÍ|s ÿ…ùæ”i$¾9Í¢¾&Û¸¾‹Û&Û  D_ƒl^å_×Ì7û QÍüQŽ„˜æçGFL4Ç}4ežù É6ÍÌkÖ.–™!Ù*ŸœAÌRœNf*î³*RÌ1‹|3Ì1‹î~&˜S¯‡]æ·çõËŠî™[NI£jjÁÝN%³œ²üË1›v‹ye†¯×C+çYCºWœr`“˜†&|— _\Ì)Ǽ1Óæ”‰Ñ"ýA8es­"§¾øS$Ÿü‹é>á«'æK åÿÏSîª^Æàí‡Q6–×[+l(¯G¿2˜^/". 
¯×ûz8=§x=0^/f4Š—â•Ç€ë׃á¥?|ßqÂï¼ü޾¾ÚB,†· ÊBïÚ4sdð®=o„ݵaPN?0Ñ Èéç%z7¶¾üóç–Í7 ø¸òO„E–î7KA¦ ×›b*o:Z]j_t4Pßï{ŽV¶ÇLï9Œ÷|¿ï9Ò¯1Ðä÷ç¾Uñ{ŽãM¥÷ÛFý¾çØؼï9¶C,ï{ŽÅŸ6¹¯9.ps_s,ÿØóšãôûºCï9æ¯9̦ø5Ç8ç ÆÑˆ²e~φW©zÏ1ÊØôž£ÛŒÞ÷ý¾ 䋎î°Úû¢£¿Áñ›Žæ—ð÷M}¬×ÇÑÕkÂ×ô¦›2¤Ð›Žìœhãû¦ƒÏ¸? ¹«Ö¥Ð›°>C ¼é¨hƒD½é )…Þt€)u½é€¡kÐ\ £Zá7ñ|ßtàf# Ëo:Ð!‘ûMG)OÂç H!€hƒs¡;OK ÑÙFåˆÒk)Ÿ0éªüIž}ÇÓ@N^cmBê°ÙU©¡:¾$•BXÐ6õÄ`ÝBðÞ'^‡ýæ2Ø!ÌQhÄ€€Ëd3¬$f·`% ´Ã暑v3W†ºnÐÀqµB¨ôtPA;‚ç7¨ÁóvǼõað|¸!@î¦!w  »F >n9+ ø?š-nx &Àšcºáö¹é_r«Ý4õÀˆJcž¸:ÞMTÃÓèv3Õ@Íü•©jl[õÂÐn¯û¥ôÅðz7øl¯_¨Û^6öõ†ïðC?Jz‡ø[‚ïz¹7„ݵchÄÐ~oàܵŖ¶kv/j‡+”êâSŽÚ†âŒü”£6Ÿf~ÌQ[3ì­Ç)ßômk»¿žsÔV˜¥UƒLzÎQqeOôœ£â!Ã~}¿Ï9jžsWyä=]J>‘Ð1pÉé„èø[;.’]ÅMÂí$HÇ8åJW˵V†éøû<ªE8çÇŸ€«Æ‰ì—"âªËöX§KE¹9†ç)R\5bë--œŽ?ÄãÄéô«@RœÐiø…Ó1ܬZÁP¦] ãtü­ +ˆÓéç|¤ NWk»PâtTh„Ó1ÊÍµè· Þ1~ꂚúhÄ C£X7ÏG€Ñ:$x÷' àºr•§r€×-jîŠif©ì§abGïºì*.¢Jç½E±¯®ƒüz°:þúoî‚êjÅ¥mýT“/?ê ¿”®âÁ#KH§ÐÅ׃Ñ1ØQ©¡(Ä.ÞÒ‰êè˜Îº„ÏQÖû©ÄûbBèÜÇ÷Kñ‡Ý(‚°¹”UÛa[ª_s6ãF‚åjñåǯ;˜Î²ôºƒ2KÓ뎊È/¥wíÿŽŒ_wPVþ©õß<6zßA™}Óûȯï÷uGJóóue¿æ çYükbÆà( E‹-‰s* Ž¿¦Tbpµ8ÔØÜG~y¾ïüÄà ¿Ž©ª›\½?Kfî- cmBÉFUü{±Däúzð¸wÙÂãв׃ƽGEhÜ“:ö­WXÝ\wLTÑ8¦ }Û ¾£ 4®Þ_©5÷Î/4Žå «·v’и;ƒÂâÞµ‹£$dN5f_e‰Ÿ¾ßŠœ~ÖŠ8¬µ×ƒÄqå)÷Ñ*¿oþ€ÄݼÂá)´Æo›„ÂÝZ…Á½Û$ Žý?¯ƒ{÷AÜGikºÿÆÜèU=uŸþn³8§ã—’Ã5/chñÔ¬¯›s÷׃À¹Æß>R§¸^Týpë¼w.¸~ÞH®•ñ€p´¹Þüáxø­A¸8osñ‡á Oy¿> /Contents 6 0 R >> endobj 3 0 obj << /Type /Pages /Kids [ 5 0 R ] /Count 1 >> endobj 1 0 obj <> endobj 4 0 obj <> endobj 11 0 obj <> endobj 12 0 obj <> endobj 9 0 obj <> endobj 8 0 obj <>stream xœUU{\מaav“ÝÙ`ª¼ä%(AE`,òX]>peW@—G`AE´&$&pA£ÕH­ ĶŠ]"‚ŠhXQ]QD­VÒž!—þÚYLÓ_ÿ¹¿ùÝ;ç»ß9ßwÎ% K ‚$IÛ°Oâ>‰ŒwÔhó5ºô•ys&ïDòS-øi"Š~)ÿ%Æj!?ðĺj*h­¶ð‡IPð!"ÉeûeeoÊIOMÓI]–&,suw÷øßŽO`` tͦÿžHÃ4¹é©™ÒÂG¾F›•¡ÉÔI kµé)ÒTí¦ì´\©J­Ö¨ÍaJ•V³^‘®MÏÎÎÊ—º,r•úz{ûÌßÅékòr¥ Uf®T.MФæiU9ÿ·I„}f–ZóiN®.]‘¥òóŸè)õöñ%ˆX"Ž'"ˆBA,!–Ñ„¡$܈P‚%&ö„#1…˜D„-aG0„ƒP,Â’#Љ!RF"û-¢-Š,ªE^¢u¢m–Ž–1–µVŽV唘r œ¨ljPg÷ J>¾;­Âõ„ôÃŽ>{æÔ-¾’eþ=ý0tHö§ÌïP½ãùÓu]7Nd.)“Ø@®›wë"o÷‰`·@Ó> À†ÉnI)å‰ŒÎŠŠ²oj$·ÄÛ¾ÞZ²Ñ©Ÿí­çˆm GHrCÄT¥B¬›0VÙGfXÎuî§lF½tù×;dŠ‚—&Ù¹ºCü>¶¦»öáyB*~.ØÁ÷œâyQË­˜ÇOs›Ö&LAêüm^憕ÛâÐ<´ôPZSöñßÿµô¬UidyrmJ³¬?DȈnWœ;¡o¬ëDhPqÝå(þXïÀôÌ®^_Ó6ån§þ%ÐJÒqu'šÙÚ¡>Ø)輑ϟÌWàf1sVª÷‹Rìðâ€1±_§âï=ú£7õfc¨fª « 
gbm/vignettes/gbm.bib
@article{FreundSchapire:1997,
  author  = {Y. Freund and R. E. Schapire},
  title   = {A decision-theoretic generalization of on-line learning and an application to boosting},
  journal = {Journal of Computer and System Sciences},
  volume  = {55},
  number  = {1},
  pages   = {119--139},
  year    = {1997}
}

@article{Friedman:2001,
  author  = {J. H. Friedman},
  title   = {Greedy Function Approximation: A Gradient Boosting Machine},
  journal = {Annals of Statistics},
  volume  = {29},
  number  = {5},
  pages   = {1189--1232},
  year    = {2001}
}

@article{Friedman:2002,
  author  = {J. H. Friedman},
  title   = {Stochastic Gradient Boosting},
  journal = {Computational Statistics and Data Analysis},
  volume  = {38},
  number  = {4},
  pages   = {367--378},
  year    = {2002}
}

@article{FHT:2000,
  author  = {J. H. Friedman and T. Hastie and R. Tibshirani},
  title   = {Additive Logistic Regression: a Statistical View of Boosting},
  journal = {Annals of Statistics},
  volume  = {28},
  number  = {2},
  pages   = {337--374},
  year    = {2000}
}

@article{Kriegler:2010,
  author  = {B. Kriegler and R. Berk},
  title   = {Small Area Estimation of the Homeless in Los Angeles: An Application of Cost-Sensitive Stochastic Gradient Boosting},
  journal = {Annals of Applied Statistics},
  volume  = {4},
  number  = {3},
  pages   = {1234--1255},
  year    = {2010}
}

@article{Ridgeway:1999,
  author  = {G. Ridgeway},
  title   = {The state of boosting},
  journal = {Computing Science and Statistics},
  volume  = {31},
  pages   = {172--181},
  year    = {1999}
}

@article{Burges:2010,
  author  = {C. Burges},
  title   = {From RankNet to LambdaRank to LambdaMART: An Overview},
  journal = {Microsoft Research Technical Report MSR-TR-2010-82},
  year    = {2010}
}

gbm/src/gbm.h
//------------------------------------------------------------------------------
// GBM by Greg Ridgeway  Copyright (C) 2003
//
// File:        gbm.h
//
// License:     GNU GPL (version 2 or later)
//
// Contents:    Entry point for gbm.dll
//
// Owner:       gregr@rand.org
//
// History:     2/14/2003   gregr created
//              6/11/2007   gregr added quantile regression
//                          written by Brian Kriegler
//
//------------------------------------------------------------------------------

#include <vector> // include target restored; the extraction dropped text inside angle brackets
#include "dataset.h"
#include "distribution.h"
#include "bernoulli.h"
#include "adaboost.h"
#include "poisson.h"
#include "gaussian.h"
#include "coxph.h"
#include "laplace.h"
#include "quantile.h"
#include "tdist.h"
#include "multinomial.h"
#include "pairwise.h"
#include "gbm_engine.h"
#include "locationm.h"
#include "huberized.h"

typedef std::vector<int> VEC_CATEGORIES;                 // template arguments restored
typedef std::vector<VEC_CATEGORIES> VEC_VEC_CATEGORIES;

GBMRESULT gbm_setup
(
    double *adY,
    double *adOffset,
    double *adX,
    int *aiXOrder,
    double *adWeight,
    double *adMisc,
    int cRows,
    int cCols,
    int *acVarClasses,
    int *alMonotoneVar,
    const char *pszFamily,
    int cTrees,
    int cLeaves,
    int cMinObsInNode,
    int cNumClasses,
    double dShrinkage,
    double dBagFraction,
    int cTrain,
    CDataset *pData,
    PCDistribution &pDist,
    int& cGroups
);

GBMRESULT gbm_transfer_to_R
(
    CGBM *pGBM,
    VEC_VEC_CATEGORIES &vecSplitCodes,
    int *aiSplitVar,
    double *adSplitPoint,
    int *aiLeftNode,
    int *aiRightNode,
    int *aiMissingNode,
    double *adErrorReduction,
    double *adWeight,
    double *adPred,
    int cCatSplitsOld
);

GBMRESULT gbm_transfer_catsplits_to_R
(
    int iCatSplit,
    VEC_VEC_CATEGORIES &vecSplitCodes,
    int *aiSplitCodes
);

int size_of_vector
(
    VEC_VEC_CATEGORIES &vec,
    int i
);

gbm/src/gbmentry.cpp
// GBM by Greg Ridgeway Copyright (C) 2003
#include "gbm.h"

extern "C" {

#define R_NO_REMAP // https://rstudio.github.io/r-manuals/r-exts/The-R-API.html
#include <R.h>          // include targets restored; the extraction dropped
#include <Rinternals.h> // text inside angle brackets

SEXP gbm_fit
(
    SEXP radY,       // outcome or response
    SEXP radOffset,  // offset for f(x), NA for no offset
    SEXP radX,
    SEXP raiXOrder,
    SEXP radWeight,
    SEXP radMisc,    // other row specific data (eg failure time), NA=no Misc
    SEXP rcRows,
    SEXP rcCols,
    SEXP racVarClasses,
    SEXP ralMonotoneVar,
    SEXP rszFamily,
    SEXP rcTrees,
    SEXP rcDepth,    // interaction depth
    SEXP rcMinObsInNode,
    SEXP rcNumClasses,
    SEXP rdShrinkage,
    SEXP rdBagFraction,
    SEXP rcTrain,
    SEXP radFOld,
    SEXP rcCatSplitsOld,
    SEXP rcTreesOld,
    SEXP rfVerbose
)
{
    unsigned long hr = 0;

    SEXP rAns = NULL;
    SEXP rNewTree = NULL;
    SEXP riSplitVar = NULL;
    SEXP rdSplitPoint = NULL;
    SEXP riLeftNode = NULL;
    SEXP riRightNode = NULL;
    SEXP riMissingNode = NULL;
    SEXP rdErrorReduction = NULL;
    SEXP rdWeight = NULL;
    SEXP rdPred = NULL;

    SEXP rdInitF = NULL;
    SEXP radF = NULL;
    SEXP radTrainError = NULL;
    SEXP radValidError = NULL;
    SEXP radOOBagImprove = NULL;

    SEXP rSetOfTrees = NULL;
    SEXP rSetSplitCodes = NULL;
    SEXP rSplitCode = NULL;

    VEC_VEC_CATEGORIES vecSplitCodes;

    int i = 0;
    int iT = 0;
    int iK = 0;
    int cTrees = INTEGER(rcTrees)[0];
    const int cResultComponents = 7;
    // rdInitF, radF, radTrainError, radValidError, radOOBagImprove
    // rSetOfTrees, rSetSplitCodes
    const int cTreeComponents = 8;
    // riSplitVar, rdSplitPoint, riLeftNode,
    // riRightNode, riMissingNode,
rdErrorReduction, rdWeight, rdPred int cNodes = 0; int cTrain = INTEGER(rcTrain)[0]; int cNumClasses = INTEGER(rcNumClasses)[0]; double dTrainError = 0.0; double dValidError = 0.0; double dOOBagImprove = 0.0; CGBM *pGBM = NULL; CDataset *pData = NULL; CDistribution *pDist = NULL; int cGroups = -1; // set up the dataset pData = new CDataset(); if(pData==NULL) { hr = GBM_OUTOFMEMORY; goto Error; } // initialize R's random number generator GetRNGstate(); // initialize some things hr = gbm_setup(REAL(radY), REAL(radOffset), REAL(radX), INTEGER(raiXOrder), REAL(radWeight), REAL(radMisc), INTEGER(rcRows)[0], INTEGER(rcCols)[0], INTEGER(racVarClasses), INTEGER(ralMonotoneVar), CHAR(STRING_ELT(rszFamily,0)), INTEGER(rcTrees)[0], INTEGER(rcDepth)[0], INTEGER(rcMinObsInNode)[0], INTEGER(rcNumClasses)[0], REAL(rdShrinkage)[0], REAL(rdBagFraction)[0], INTEGER(rcTrain)[0], pData, pDist, cGroups); if(GBM_FAILED(hr)) { goto Error; } // allocate the GBM pGBM = new CGBM(); if(pGBM==NULL) { hr = GBM_OUTOFMEMORY; goto Error; } // initialize the GBM hr = pGBM->Initialize(pData, pDist, REAL(rdShrinkage)[0], cTrain, REAL(rdBagFraction)[0], INTEGER(rcDepth)[0], INTEGER(rcMinObsInNode)[0], INTEGER(rcNumClasses)[0], cGroups); if(GBM_FAILED(hr)) { goto Error; } // allocate the main return object Rf_protect(rAns = Rf_allocVector(VECSXP, cResultComponents)); // allocate the initial value Rf_protect(rdInitF = Rf_allocVector(REALSXP, 1)); SET_VECTOR_ELT(rAns,0,rdInitF); Rf_unprotect(1); // rdInitF // allocate the predictions Rf_protect(radF = Rf_allocVector(REALSXP, (pData->cRows) * cNumClasses)); SET_VECTOR_ELT(rAns,1,radF); Rf_unprotect(1); // radF hr = pDist->Initialize(pData->adY, pData->adMisc, pData->adOffset, pData->adWeight, pData->cRows); if(GBM_FAILED(hr)) { goto Error; } if(ISNA(REAL(radFOld)[0])) // check for old predictions { // set the initial value of F as a constant hr = pDist->InitF(pData->adY, pData->adMisc, pData->adOffset, pData->adWeight, REAL(rdInitF)[0], cTrain); 
        if(GBM_FAILED(hr))
        {
            goto Error;
        }
        for(i=0; i < (pData->cRows) * cNumClasses; i++)
        {
            REAL(radF)[i] = REAL(rdInitF)[0];
        }
    }
    else
    {
        for(i=0; i < (pData->cRows) * cNumClasses; i++)
        {
            REAL(radF)[i] = REAL(radFOld)[i];
        }
    }

    // allocate space for the performance measures
    Rf_protect(radTrainError = Rf_allocVector(REALSXP, cTrees));
    Rf_protect(radValidError = Rf_allocVector(REALSXP, cTrees));
    Rf_protect(radOOBagImprove = Rf_allocVector(REALSXP, cTrees));
    SET_VECTOR_ELT(rAns,2,radTrainError);
    SET_VECTOR_ELT(rAns,3,radValidError);
    SET_VECTOR_ELT(rAns,4,radOOBagImprove);
    Rf_unprotect(3); // radTrainError, radValidError, radOOBagImprove

    // allocate the component for the tree structures
    Rf_protect(rSetOfTrees = Rf_allocVector(VECSXP, cTrees * cNumClasses));
    SET_VECTOR_ELT(rAns,5,rSetOfTrees);
    Rf_unprotect(1); // rSetOfTrees

    if(INTEGER(rfVerbose)[0])
    {
        Rprintf("Iter TrainDeviance ValidDeviance StepSize Improve\n");
    }

    // loop header restored from context; the extraction dropped the text
    // between '<' and the next '>'
    for(iT=0; iT<cTrees; iT++)
    {
        hr = pDist->UpdateParams(REAL(radF), pData->adOffset,
                                 pData->adWeight, cTrain);
        if(GBM_FAILED(hr))
        {
            goto Error;
        }

        REAL(radTrainError)[iT] = 0.0;
        REAL(radValidError)[iT] = 0.0;
        REAL(radOOBagImprove)[iT] = 0.0;

        for (iK = 0; iK < cNumClasses; iK++)
        {
            hr = pGBM->iterate(REAL(radF),
                               dTrainError,dValidError,dOOBagImprove,
                               cNodes, cNumClasses, iK);
            if(GBM_FAILED(hr))
            {
                goto Error;
            }

            // store the performance measures
            REAL(radTrainError)[iT] += dTrainError;
            REAL(radValidError)[iT] += dValidError;
            REAL(radOOBagImprove)[iT] += dOOBagImprove;

            // allocate the new tree component for the R list structure
            Rf_protect(rNewTree = Rf_allocVector(VECSXP, cTreeComponents));
            // riNodeID,riSplitVar,rdSplitPoint,riLeftNode,
            // riRightNode,riMissingNode,rdErrorReduction,rdWeight
            Rf_protect(riSplitVar = Rf_allocVector(INTSXP, cNodes));
            Rf_protect(rdSplitPoint = Rf_allocVector(REALSXP, cNodes));
            Rf_protect(riLeftNode = Rf_allocVector(INTSXP, cNodes));
            Rf_protect(riRightNode = Rf_allocVector(INTSXP, cNodes));
            Rf_protect(riMissingNode = Rf_allocVector(INTSXP, cNodes));
Rf_protect(rdErrorReduction = Rf_allocVector(REALSXP, cNodes)); Rf_protect(rdWeight = Rf_allocVector(REALSXP, cNodes)); Rf_protect(rdPred = Rf_allocVector(REALSXP, cNodes)); SET_VECTOR_ELT(rNewTree,0,riSplitVar); SET_VECTOR_ELT(rNewTree,1,rdSplitPoint); SET_VECTOR_ELT(rNewTree,2,riLeftNode); SET_VECTOR_ELT(rNewTree,3,riRightNode); SET_VECTOR_ELT(rNewTree,4,riMissingNode); SET_VECTOR_ELT(rNewTree,5,rdErrorReduction); SET_VECTOR_ELT(rNewTree,6,rdWeight); SET_VECTOR_ELT(rNewTree,7,rdPred); Rf_unprotect(cTreeComponents); SET_VECTOR_ELT(rSetOfTrees,(iK + iT * cNumClasses),rNewTree); Rf_unprotect(1); // rNewTree hr = gbm_transfer_to_R(pGBM, vecSplitCodes, INTEGER(riSplitVar), REAL(rdSplitPoint), INTEGER(riLeftNode), INTEGER(riRightNode), INTEGER(riMissingNode), REAL(rdErrorReduction), REAL(rdWeight), REAL(rdPred), INTEGER(rcCatSplitsOld)[0]); } // Close for iK // print the information if((iT <= 9) || ((iT+1+INTEGER(rcTreesOld)[0])/20 == (iT+1+INTEGER(rcTreesOld)[0])/20.0) || (iT==cTrees-1)) { R_CheckUserInterrupt(); if(INTEGER(rfVerbose)[0]) { Rprintf("%6d %13.4f %15.4f %10.4f %9.4f\n", iT+1+INTEGER(rcTreesOld)[0], REAL(radTrainError)[iT], REAL(radValidError)[iT], REAL(rdShrinkage)[0], REAL(radOOBagImprove)[iT]); } } } if(INTEGER(rfVerbose)[0]) Rprintf("\n"); // transfer categorical splits to R Rf_protect(rSetSplitCodes = Rf_allocVector(VECSXP, vecSplitCodes.size())); SET_VECTOR_ELT(rAns,6,rSetSplitCodes); Rf_unprotect(1); // rSetSplitCodes for(i=0; i<(int)vecSplitCodes.size(); i++) { Rf_protect(rSplitCode = Rf_allocVector(INTSXP, size_of_vector(vecSplitCodes,i))); SET_VECTOR_ELT(rSetSplitCodes,i,rSplitCode); Rf_unprotect(1); // rSplitCode hr = gbm_transfer_catsplits_to_R(i, vecSplitCodes, INTEGER(rSplitCode)); } // dump random number generator seed #ifdef NOISY_DEBUG Rprintf("PutRNGstate\n"); #endif PutRNGstate(); Cleanup: Rf_unprotect(1); // rAns #ifdef NOISY_DEBUG Rprintf("destructing\n"); #endif if(pGBM != NULL) { delete pGBM; pGBM = NULL; } if(pDist != NULL) { 
        delete pDist;
        pDist = NULL;
    }
    if(pData != NULL)
    {
        delete pData;
        pData = NULL;
    }

    return rAns;

Error:
    goto Cleanup;
}

SEXP gbm_pred
(
    SEXP radX,         // the data matrix
    SEXP rcRows,       // number of rows
    SEXP rcCols,       // number of columns
    SEXP rcNumClasses, // number of classes
    SEXP rcTrees,      // number of trees, may be a vector
    SEXP rdInitF,      // the initial value
    SEXP rTrees,       // the list of trees
    SEXP rCSplits,     // the list of categorical splits
    SEXP raiVarType,   // indicator of continuous/nominal
    SEXP riSingleTree  // boolean whether to return only results for one tree
)
{
    int iTree = 0;
    int iObs = 0;
    int cRows = INTEGER(rcRows)[0];
    int cPredIterations = LENGTH(rcTrees);
    int iPredIteration = 0;
    int cTrees = 0;
    int iClass = 0;
    int cNumClasses = INTEGER(rcNumClasses)[0];

    SEXP rThisTree = NULL;
    int *aiSplitVar = NULL;
    double *adSplitCode = NULL;
    int *aiLeftNode = NULL;
    int *aiRightNode = NULL;
    int *aiMissingNode = NULL;
    int iCurrentNode = 0;
    double dX = 0.0;
    int iCatSplitIndicator = 0;
    bool fSingleTree = (INTEGER(riSingleTree)[0]==1);

    SEXP radPredF = NULL;

    // allocate the predictions to return
    Rf_protect(radPredF = Rf_allocVector(REALSXP, cRows*cNumClasses*cPredIterations));
    if(radPredF == NULL)
    {
        goto Error;
    }

    // initialize the predicted values
    if(!fSingleTree)
    {
        // initialize with the intercept for only the smallest rcTrees
        // (loop restored from context; the extraction dropped the text
        //  between '<' and the next '>')
        for(iObs=0; iObs<cRows*cNumClasses; iObs++)
        {
            REAL(radPredF)[iObs] = REAL(rdInitF)[0];
        }
    }
    else if((iPredIteration>0))
    {
        // copy over from the last rcTrees
        for(iObs=0; iObs<cRows*cNumClasses; iObs++)
        // [gap: the remainder of gbm_pred and the opening of the plotting
        //  entry point that follows it were lost in extraction; the code
        //  resumes inside that routine's node-stack walk, which is why the
        //  variables aiNodeStack, adWeightStack, iPredVar, raiWhichVar, adW
        //  and dCurrentW appear below without declarations]
        while(cStackNodes > 0)
        {
            cStackNodes--;
            iCurrentNode = aiNodeStack[cStackNodes];
            if(aiSplitVar[iCurrentNode] == -1) // terminal node
            {
                REAL(radPredF)[iClass*cRows + iObs] +=
                    adWeightStack[cStackNodes]*adSplitCode[iCurrentNode];
            }
            else // non-terminal node
            {
                // is this a split variable that interests me?
                iPredVar = -1;
                for(i=0; (iPredVar == -1) && (i < cCols); i++)
                {
                    if(INTEGER(raiWhichVar)[i] == aiSplitVar[iCurrentNode])
                    {
                        iPredVar = i; // split is on one that interests me
                    }
                }

                if(iPredVar != -1) // this split is among raiWhichVar
                {
                    dX = REAL(radX)[iPredVar*cRows + iObs];
                    // missing?
                    if(ISNA(dX))
                    {
                        aiNodeStack[cStackNodes] = aiMissingNode[iCurrentNode];
                        cStackNodes++;
                    }
                    // continuous?
                    else if(INTEGER(raiVarType)[aiSplitVar[iCurrentNode]] == 0)
                    {
                        if(dX < adSplitCode[iCurrentNode])
                        {
                            aiNodeStack[cStackNodes] = aiLeftNode[iCurrentNode];
                            cStackNodes++;
                        }
                        else
                        {
                            aiNodeStack[cStackNodes] = aiRightNode[iCurrentNode];
                            cStackNodes++;
                        }
                    }
                    else // categorical
                    {
                        iCatSplitIndicator = INTEGER(
                            VECTOR_ELT(rCSplits,
                                       (int)adSplitCode[iCurrentNode]))[(int)dX];
                        if(iCatSplitIndicator==-1)
                        {
                            aiNodeStack[cStackNodes] = aiLeftNode[iCurrentNode];
                            cStackNodes++;
                        }
                        else if(iCatSplitIndicator==1)
                        {
                            aiNodeStack[cStackNodes] = aiRightNode[iCurrentNode];
                            cStackNodes++;
                        }
                        else // handle unused level
                        {
                            iCurrentNode = aiMissingNode[iCurrentNode];
                        }
                    }
                } // iPredVar != -1
                else // not interested in this split, average left and right
                {
                    aiNodeStack[cStackNodes] = aiRightNode[iCurrentNode];
                    dCurrentW = adWeightStack[cStackNodes];
                    adWeightStack[cStackNodes] = dCurrentW *
                        adW[aiRightNode[iCurrentNode]]/
                        (adW[aiLeftNode[iCurrentNode]]+
                         adW[aiRightNode[iCurrentNode]]);
                    cStackNodes++;
                    aiNodeStack[cStackNodes] = aiLeftNode[iCurrentNode];
                    adWeightStack[cStackNodes] = dCurrentW-adWeightStack[cStackNodes-1];
                    cStackNodes++;
                }
            } // non-terminal node
        } // while(cStackNodes > 0)
        } // iObs
        } // iClass
    } // iTree

Cleanup:
    Rf_unprotect(1); // radPredF
    return radPredF;
Error:
    goto Cleanup;
} // gbm_plot

} // end extern "C"

gbm/src/matrix.h
// header file for matrix template class
// NOTE: all matrices handled here must be SQUARE
// (i.e., # rows = # columns)
// in addition, all DIAGONAL ELEMENTS MUST BE NONZERO
// written by Mike Dinolfo 12/98
// version 1.0
#ifndef __mjdmatrix_h
#define __mjdmatrix_h
#include <iostream> // include target restored; std::cout is used below

// generic object (class) definition of matrix:
template <class D> // template parameter restored
class matrix{
    // NOTE: maxsize determines available memory storage, but
    // actualsize determines the actual size of the stored matrix in use
    // at a particular time.
int maxsize; // max number of rows (same as max number of columns) int actualsize; // actual size (rows, or columns) of the stored matrix D* data; // where the data contents of the matrix are stored void allocateD() { delete[] data; data = new D [maxsize*maxsize]; }; public: matrix() { maxsize = 5; actualsize = 5; data = 0; allocateD(); }; // private ctor's matrix(int newmaxsize) {matrix(newmaxsize,newmaxsize);}; matrix(int newmaxsize, int newactualsize) { // the only public ctor if (newmaxsize <= 0) newmaxsize = 5; maxsize = newmaxsize; if ((newactualsize <= newmaxsize)&&(newactualsize>0)) actualsize = newactualsize; else actualsize = newmaxsize; // since allocateD() will first call delete[] on data: data = 0; allocateD(); }; ~matrix() { delete[] data; }; void dumpMatrixValues() { bool xyz; double rv; for (int i=0; i < actualsize; i++) { std::cout << "i=" << i << ": "; for (int j=0; j maxunitydeviation ) { maxunitydeviation = currentunitydeviation; worstdiagonal = i; } } int worstoffdiagonalrow = 0; int worstoffdiagonalcolumn = 0; D maxzerodeviation = 0.0; D currentzerodeviation ; for ( i = 0; i < actualsize; i++ ) { for ( int j = 0; j < actualsize; j++ ) { if ( i == j ) continue; // we look only at non-diagonal terms currentzerodeviation = data[i*maxsize+j]; if ( currentzerodeviation < 0.0) currentzerodeviation *= -1.0; if ( currentzerodeviation > maxzerodeviation ) { maxzerodeviation = currentzerodeviation; worstoffdiagonalrow = i; worstoffdiagonalcolumn = j; } } } std::cout << "Worst diagonal value deviation from unity: " << maxunitydeviation << " at row/column " << worstdiagonal << std::endl; std::cout << "Worst off-diagonal value deviation from zero: " << maxzerodeviation << " at row = " << worstoffdiagonalrow << ", column = " << worstoffdiagonalcolumn << std::endl; } void settoproduct(matrix& left, matrix& right) { actualsize = left.getactualsize(); if ( maxsize < left.getactualsize() ) { maxsize = left.getactualsize(); allocateD(); } for ( int i = 0; i < 
actualsize; i++ ) { for ( int j = 0; j < actualsize; j++ ) { D sum = 0.0; D leftvalue, rightvalue; bool success; for (int c = 0; c < actualsize; c++) { left.getvalue(i,c,leftvalue,success); right.getvalue(c,j,rightvalue,success); sum += leftvalue * rightvalue; } setvalue(i,j,sum); } } } void copymatrix(matrix& source) { actualsize = source.getactualsize(); if ( maxsize < source.getactualsize() ) { maxsize = source.getactualsize(); allocateD(); } for ( int i = 0; i < actualsize; i++ ) { for ( int j = 0; j < actualsize; j++ ) { D value; bool success; source.getvalue(i,j,value,success); data[i*maxsize+j] = value; } } }; void setactualsize(int newactualsize) { if ( newactualsize > maxsize ) { maxsize = newactualsize ; // * 2; // wastes memory but saves // time otherwise required for // operation new[] allocateD(); } if (newactualsize >= 0) actualsize = newactualsize; }; int getactualsize() { return actualsize; }; void getvalue(int row, int column, D& returnvalue, bool& success) { if ( (row>=maxsize) || (column>=maxsize) || (row<0) || (column<0) ) { success = false; return; } returnvalue = data[ row * maxsize + column ]; success = true; }; bool setvalue(int row, int column, D newvalue) { if ( (row >= maxsize) || (column >= maxsize) || (row<0) || (column<0) ) return false; data[ row * maxsize + column ] = newvalue; return true; }; void invert() { int i = 0; int j = 0; int k = 0; if (actualsize <= 0) return; // sanity check if (actualsize == 1) { data[0] = 1.0/data[0]; return; } for (i=1; i < actualsize; i++) data[i] /= data[0]; // normalize row 0 for (i=1; i < actualsize; i++) { for ( j=i; j < actualsize; j++) { // do a column of L D sum = 0.0; for ( k = 0; k < i; k++) sum += data[j*maxsize+k] * data[k*maxsize+i]; data[j*maxsize+i] -= sum; } if (i == actualsize-1) continue; for ( j=i+1; j < actualsize; j++) { // do a row of U D sum = 0.0; for ( k = 0; k < i; k++) sum += data[i*maxsize+k]*data[k*maxsize+j]; data[i*maxsize+j] = (data[i*maxsize+j]-sum) / data[i*maxsize+i]; 
            }
        }
        for ( i = 0; i < actualsize; i++ ) // invert L
        {
            for ( j = i; j < actualsize; j++ )
            {
                D x = 1.0;
                if ( i != j )
                {
                    x = 0.0;
                    for ( k = i; k < j; k++ )
                        x -= data[j*maxsize+k]*data[k*maxsize+i];
                }
                data[j*maxsize+i] = x / data[j*maxsize+j];
            }
        }
        for ( i = 0; i < actualsize; i++ ) // invert U
        {
            for ( j = i; j < actualsize; j++ )
            {
                if ( i == j ) continue;
                D sum = 0.0;
                for ( k = i; k < j; k++ )
                    sum += data[k*maxsize+j]*( (i==k) ? 1.0 : data[i*maxsize+k] );
                data[i*maxsize+j] = -sum;
            }
        }
        for ( i = 0; i < actualsize; i++ ) // final inversion
        {
            for ( j = 0; j < actualsize; j++ )
            {
                D sum = 0.0;
                for ( k = ((i>j)?i:j); k < actualsize; k++ )
                    sum += ((j==k)?1.0:data[j*maxsize+k])*data[k*maxsize+i];
                data[j*maxsize+i] = sum;
            }
        }
    };
};
#endif

gbm/src/node.h
//------------------------------------------------------------------------------
// GBM by Greg Ridgeway  Copyright (C) 2003
//
// File:        node.h
//
// License:     GNU GPL (version 2 or later)
//
// Contents:    a node in the tree
//
// Owner:       gregr@rand.org
//
// History:     3/26/2001   gregr created
//              2/14/2003   gregr: adapted for R implementation
//
//------------------------------------------------------------------------------

#ifndef NODGBM_H
#define NODGBM_H

#include <vector> // include target restored; the extraction dropped text inside angle brackets
#include "dataset.h"
#include "buildinfo.h"

class CNodeFactory;

typedef std::vector<int> VEC_CATEGORIES;                 // template arguments restored
typedef std::vector<VEC_CATEGORIES> VEC_VEC_CATEGORIES;

class CNode
{
public:
    CNode();
    virtual ~CNode();

    virtual GBMRESULT Adjust(unsigned long cMinObsInNode);
    virtual GBMRESULT Predict(CDataset *pData, unsigned long iRow,
                              double &dFadj);
    virtual GBMRESULT Predict(double *adX, unsigned long cRow,
                              unsigned long cCol, unsigned long iRow,
                              double &dFadj) = 0;

    static double Improvement
    (
        double dLeftW,
        double dRightW,
        double dMissingW,
        double dLeftSum,
        double dRightSum,
        double dMissingSum
    )
    {
        double dTemp = 0.0;
        double dResult = 0.0;

        if(dMissingW == 0.0)
        {
            dTemp = dLeftSum/dLeftW - dRightSum/dRightW;
            dResult = dLeftW*dRightW*dTemp*dTemp/(dLeftW+dRightW);
        }
        else
        {
            dTemp = dLeftSum/dLeftW - dRightSum/dRightW;
            dResult += dLeftW*dRightW*dTemp*dTemp;

            dTemp = dLeftSum/dLeftW - dMissingSum/dMissingW;
            dResult += dLeftW*dMissingW*dTemp*dTemp;

            dTemp = dRightSum/dRightW - dMissingSum/dMissingW;
            dResult += dRightW*dMissingW*dTemp*dTemp;

            dResult /= (dLeftW + dRightW + dMissingW);
        }

        return dResult;
    }

    virtual GBMRESULT PrintSubtree(unsigned long cIndent);
    virtual GBMRESULT TransferTreeToRList(int &iNodeID,
                                          CDataset *pData,
                                          int *aiSplitVar,
                                          double *adSplitPoint,
                                          int *aiLeftNode,
                                          int *aiRightNode,
                                          int *aiMissingNode,
                                          double *adErrorReduction,
                                          double *adWeight,
                                          double *adPred,
                                          VEC_VEC_CATEGORIES &vecSplitCodes,
                                          int cCatSplitsOld,
                                          double dShrinkage);

    double TotalError();
    virtual GBMRESULT GetVarRelativeInfluence(double *adRelInf);
    virtual GBMRESULT RecycleSelf(CNodeFactory *pNodeFactory) = 0;

    double dPrediction;
    double dTrainW;   // total training weight in node
    unsigned long cN; // number of training observations in node
    bool isTerminal;

protected:
    double GetXEntry(CDataset *pData, unsigned long iRow, unsigned long iCol)
    {
        return pData->adX[iCol*(pData->cRows) + iRow];
    }
};

typedef CNode *PCNode;

#endif // NODGBM_H

gbm/src/gbm.cpp
//------------------------------------------------------------------------------
//
// GBM by Greg Ridgeway  Copyright (C) 2003
// File: gbm.cpp
//
//------------------------------------------------------------------------------

#include "gbm.h"

// Count the number of distinct groups in the input data
int num_groups(const double* adMisc, int cTrain)
{
    if (cTrain <= 0)
    {
        return 0;
    }

    double dLastGroup = adMisc[0];
    int cGroups = 1;

    // loop restored from context; the extraction dropped the text between
    // '<' and the next '>', which also swallowed the opening of gbm_setup
    for(int i=1; i<cTrain; i++)
    {
        if (adMisc[i] != dLastGroup)
        {
            dLastGroup = adMisc[i];
            cGroups++;
        }
    }
    return cGroups;
}

// [gbm_setup's signature below is restored from its declaration in gbm.h]
GBMRESULT gbm_setup
(
    double *adY,
    double *adOffset,
    double *adX,
    int *aiXOrder,
    double *adWeight,
    double *adMisc,
    int cRows,
    int cCols,
    int *acVarClasses,
    int *alMonotoneVar,
    const char *pszFamily,
    int cTrees,
    int cLeaves,
    int cMinObsInNode,
    int cNumClasses,
    double dShrinkage,
    double dBagFraction,
    int cTrain,
    CDataset *pData,
    PCDistribution &pDist,
    int& cGroups
)
{
    GBMRESULT hr = GBM_OK;

    hr = pData->SetData(adX,aiXOrder,adY,adOffset,adWeight,adMisc,
                        cRows,cCols,acVarClasses,alMonotoneVar);
    if(GBM_FAILED(hr))
    {
        goto Error;
    }

    // set the distribution
    if(strncmp(pszFamily,"bernoulli",2) == 0)
    {
        pDist = new CBernoulli();
        if(pDist==NULL)
        {
            hr = GBM_OUTOFMEMORY;
goto Error; } } else if(strncmp(pszFamily,"gaussian",2) == 0) { pDist = new CGaussian(); if(pDist==NULL) { hr = GBM_OUTOFMEMORY; goto Error; } } else if(strncmp(pszFamily,"poisson",2) == 0) { pDist = new CPoisson(); if(pDist==NULL) { hr = GBM_OUTOFMEMORY; goto Error; } } else if(strncmp(pszFamily,"adaboost",2) == 0) { pDist = new CAdaBoost(); if(pDist==NULL) { hr = GBM_OUTOFMEMORY; goto Error; } } else if(strncmp(pszFamily,"coxph",2) == 0) { pDist = new CCoxPH(); if(pDist==NULL) { hr = GBM_OUTOFMEMORY; goto Error; } } else if(strncmp(pszFamily,"laplace",2) == 0) { pDist = new CLaplace(); if(pDist==NULL) { hr = GBM_OUTOFMEMORY; goto Error; } } else if(strncmp(pszFamily,"quantile",2) == 0) { pDist = new CQuantile(adMisc[0]); if(pDist==NULL) { hr = GBM_OUTOFMEMORY; goto Error; } } else if(strncmp(pszFamily,"tdist",2) == 0) { pDist = new CTDist(adMisc[0]); if(pDist==NULL) { hr = GBM_OUTOFMEMORY; goto Error; } } else if(strncmp(pszFamily,"multinomial",2) == 0) { pDist = new CMultinomial(cNumClasses, cRows); if(pDist==NULL) { hr = GBM_OUTOFMEMORY; goto Error; } } else if(strncmp(pszFamily,"huberized",2) == 0) { pDist = new CHuberized(); if(pDist==NULL) { hr = GBM_OUTOFMEMORY; goto Error; } } else if(strcmp(pszFamily,"pairwise_conc") == 0) { pDist = new CPairwise("conc"); if(pDist==NULL) { hr = GBM_OUTOFMEMORY; goto Error; } } else if(strcmp(pszFamily,"pairwise_ndcg") == 0) { pDist = new CPairwise("ndcg"); if(pDist==NULL) { hr = GBM_OUTOFMEMORY; goto Error; } } else if(strcmp(pszFamily,"pairwise_map") == 0) { pDist = new CPairwise("map"); if(pDist==NULL) { hr = GBM_OUTOFMEMORY; goto Error; } } else if(strcmp(pszFamily,"pairwise_mrr") == 0) { pDist = new CPairwise("mrr"); if(pDist==NULL) { hr = GBM_OUTOFMEMORY; goto Error; } } else { hr = GBM_INVALIDARG; goto Error; } if(pDist==NULL) { hr = GBM_INVALIDARG; goto Error; } if (!strncmp(pszFamily, "pairwise", strlen("pairwise"))) { cGroups = num_groups(adMisc, cTrain); } Cleanup: return hr; Error: goto Cleanup; } GBMRESULT 
gbm_transfer_to_R
(
    CGBM *pGBM,
    VEC_VEC_CATEGORIES &vecSplitCodes,
    int *aiSplitVar,
    double *adSplitPoint,
    int *aiLeftNode,
    int *aiRightNode,
    int *aiMissingNode,
    double *adErrorReduction,
    double *adWeight,
    double *adPred,
    int cCatSplitsOld
)
{
    GBMRESULT hr = GBM_OK;

    hr = pGBM->TransferTreeToRList(aiSplitVar,
                                   adSplitPoint,
                                   aiLeftNode,
                                   aiRightNode,
                                   aiMissingNode,
                                   adErrorReduction,
                                   adWeight,
                                   adPred,
                                   vecSplitCodes,
                                   cCatSplitsOld);
    if(GBM_FAILED(hr)) goto Error;

Cleanup:
    return hr;
Error:
    goto Cleanup;
}

GBMRESULT gbm_transfer_catsplits_to_R
(
    int iCatSplit,
    VEC_VEC_CATEGORIES &vecSplitCodes,
    int *aiSplitCodes
)
{
    unsigned long i=0;

    // loop completed from context; the extraction dropped text between
    // '<' and the next '>'
    for(i=0; i<vecSplitCodes[iCatSplit].size(); i++)
    {
        aiSplitCodes[i] = vecSplitCodes[iCatSplit][i];
    }
    return GBM_OK;
}

// [gap: the end of gbm.cpp (size_of_vector) and nearly all of
//  gbm/src/huberized.h were lost in extraction; only the tail of the
//  CHuberized class survives]
private:
    std::vector<double> vecdNum; // template arguments restored
    std::vector<double> vecdDen;
};
#endif // HUBERIZED_H

gbm/src/bernoulli.cpp
// GBM by Greg Ridgeway Copyright (C) 2003
#include "bernoulli.h"

CBernoulli::CBernoulli()
{
}

CBernoulli::~CBernoulli()
{
}

GBMRESULT CBernoulli::ComputeWorkingResponse
(
    double *adY,
    double *adMisc,
    double *adOffset,
    double *adF,
    double *adZ,
    double *adWeight,
    bool *afInBag,
    unsigned long nTrain,
    int cIdxOff
)
{
    unsigned long i = 0;
    double dProb = 0.0;
    double dF = 0.0;

    // [gap: the loop body was lost in extraction; for the Bernoulli
    //  deviance the working response is the residual adZ[i] = adY[i] - dProb]
    for(i=0; i<nTrain; i++)

// [gap: the remainder of ComputeWorkingResponse, CBernoulli::InitF and the
//  opening of CBernoulli::FitBestConstant were lost in extraction; the
//  surviving fragments are from a Newton iteration and FitBestConstant's
//  tail:]
//
//      ... > 0.0001) { dNum=0.0; dDen=0.0; for(i=0; i< ...
//      ... ->dPrediction = 0.0; } else {
//          vecpTermNodes[iNode]->dPrediction = vecdNum[iNode]/vecdDen[iNode];
//      } } } return hr; }

double CBernoulli::BagImprovement
(
    double *adY,
    double *adMisc,
    double *adOffset,
    double *adWeight,
    double *adF,
    double *adFadj,
    bool *afInBag,
    double dStepSize,
    unsigned long nTrain
)
{
    double dReturnValue = 0.0;
    double dF = 0.0;
    double dW = 0.0;
    unsigned long i = 0;

    // [gap: the loop body, the end of bernoulli.cpp, and the opening of the
    //  terminal-node implementation file were lost in extraction; the code
    //  resumes mid-way through that file]

    pNodeFactory->RecycleNode(this);
    return GBM_OK;
};

GBMRESULT CNodeTerminal::TransferTreeToRList
(
    int &iNodeID,
    CDataset *pData,
    int *aiSplitVar,
    double *adSplitPoint,
    int *aiLeftNode,
    int *aiRightNode,
    int *aiMissingNode,
    double *adErrorReduction,
    double *adWeight,
    double *adPred,
    VEC_VEC_CATEGORIES &vecSplitCodes,
    int cCatSplitsOld,
    double dShrinkage
)
{
    GBMRESULT hr = GBM_OK;

    aiSplitVar[iNodeID] = -1;
    adSplitPoint[iNodeID] = dShrinkage*dPrediction;
    aiLeftNode[iNodeID] = -1;
    aiRightNode[iNodeID] = -1;
    aiMissingNode[iNodeID] = -1;
    adErrorReduction[iNodeID] = 0.0;
    adWeight[iNodeID] = dTrainW;
    adPred[iNodeID] = dShrinkage*dPrediction;
    iNodeID++;

    return hr;
}

gbm/src/adaboost.h
//------------------------------------------------------------------------------
// GBM by Greg Ridgeway  Copyright (C) 2003
//
// File:        adaboost.h
//
// License:     GNU GPL (version 2 or later)
//
// Contents:    Object for fitting the AdaBoost loss function
//
// Owner:       gregr@rand.org
//
// History:     3/26/2001   gregr created
//              2/14/2003   gregr: adapted for R implementation
//
//------------------------------------------------------------------------------

#ifndef ADABOOST_H
#define ADABOOST_H

#include "distribution.h"

class CAdaBoost : public CDistribution
{
public:
    CAdaBoost();
    virtual ~CAdaBoost();

    GBMRESULT UpdateParams(double *adF, double *adOffset, double *adWeight,
                           unsigned long cLength)
    {
        return GBM_OK;
    };

    GBMRESULT ComputeWorkingResponse(double *adY, double *adMisc,
                                     double *adOffset, double *adWeight,
                                     double *adF, double *adZ,
                                     bool *afInBag, unsigned long nTrain,
                                     int cIdxOff);

    GBMRESULT InitF(double *adY, double *adMisc, double *adOffset,
                    double *adWeight, double &dInitF,
                    unsigned long cLength);

    GBMRESULT FitBestConstant(double *adY, double *adMisc, double *adOffset,
                              double *adW, double *adF, double *adZ,
                              unsigned long *aiNodeAssign,
                              unsigned long nTrain,
                              VEC_P_NODETERMINAL vecpTermNodes,
                              unsigned long cTermNodes,
                              unsigned long cMinObsInNode,
                              bool *afInBag, double *adFadj, int cIdxOff);

    double Deviance(double *adY, double *adMisc, double *adOffset,
                    double *adWeight, double *adF,
                    unsigned long cLength, int cIdxOff);

    double BagImprovement(double *adY, double *adMisc, double *adOffset,
                          double *adWeight, double *adF, double *adFadj,
                          bool *afInBag, double dStepSize,
                          unsigned long nTrain);

private:
    std::vector<double> vecdNum; // template arguments restored
std::vector vecdDen; }; #endif // ADABOOST_H gbm/src/quantile.cpp0000644000176200001440000001215114547624323014146 0ustar liggesusers// GBM by Greg Ridgeway Copyright (C) 2003 #include "quantile.h" CQuantile::CQuantile(double dAlpha) { this->dAlpha = dAlpha; } CQuantile::~CQuantile() { } GBMRESULT CQuantile::ComputeWorkingResponse ( double *adY, double *adMisc, double *adOffset, double *adF, double *adZ, double *adWeight, bool *afInBag, unsigned long nTrain, int cIdxOff ) { unsigned long i = 0; if(adOffset == NULL) { for(i=0; i adF[i]) ? dAlpha : -(1.0-dAlpha); } } else { for(i=0; i adF[i]+adOffset[i]) ? dAlpha : -(1.0-dAlpha); } } return GBM_OK; } // DEBUG: needs weighted quantile GBMRESULT CQuantile::InitF ( double *adY, double *adMisc, double *adOffset, double *adWeight, double &dInitF, unsigned long cLength ) { double dOffset=0.0; unsigned long i=0; vecd.resize(cLength); for(i=0; i adF[i]) { dL += adWeight[i]*dAlpha *(adY[i] - adF[i]); } else { dL += adWeight[i]*(1.0-dAlpha)*(adF[i] - adY[i]); } dW += adWeight[i]; } } else { for(i=cIdxOff; i adF[i] + adOffset[i]) { dL += adWeight[i]*dAlpha *(adY[i] - adF[i]-adOffset[i]); } else { dL += adWeight[i]*(1.0-dAlpha)*(adF[i]+adOffset[i] - adY[i]); } dW += adWeight[i]; } } return dL/dW; } // DEBUG: needs weighted quantile GBMRESULT CQuantile::FitBestConstant ( double *adY, double *adMisc, double *adOffset, double *adW, double *adF, double *adZ, unsigned long *aiNodeAssign, unsigned long nTrain, VEC_P_NODETERMINAL vecpTermNodes, unsigned long cTermNodes, unsigned long cMinObsInNode, bool *afInBag, double *adFadj, int cIdxOff ) { GBMRESULT hr = GBM_OK; unsigned long iNode = 0; unsigned long iObs = 0; unsigned long iVecd = 0; double dOffset; vecd.resize(nTrain); // should already be this size from InitF for(iNode=0; iNodecN >= cMinObsInNode) { iVecd = 0; for(iObs=0; iObsdPrediction = *std::max_element(vecd.begin(), vecd.begin()+iVecd); } else { nth_element(vecd.begin(), vecd.begin() + int(iVecd*dAlpha), vecd.begin() + 
int(iVecd));
            vecpTermNodes[iNode]->dPrediction = *(vecd.begin() + int(iVecd*dAlpha));
        }
    }
    return hr;
}

double CQuantile::BagImprovement
(
    double *adY,
    double *adMisc,
    double *adOffset,
    double *adWeight,
    double *adF,
    double *adFadj,
    bool *afInBag,
    double dStepSize,
    unsigned long nTrain
)
{
    double dReturnValue = 0.0;
    double dF = 0.0;
    double dW = 0.0;
    unsigned long i = 0;

    for(i=0; i<nTrain; i++)
    {
        if(!afInBag[i])
        {
            // out-of-bag observation: measure the change in quantile
            // loss from taking a step of size dStepSize
            dF = adF[i] + ((adOffset==NULL) ? 0.0 : adOffset[i]);

            if(adY[i] > dF)
            {
                dReturnValue += adWeight[i]*dAlpha*(adY[i]-dF);
            }
            else
            {
                dReturnValue += adWeight[i]*(1-dAlpha)*(dF-adY[i]);
            }

            if(adY[i] > dF+dStepSize*adFadj[i])
            {
                dReturnValue -= adWeight[i]*dAlpha*
                                (adY[i] - dF-dStepSize*adFadj[i]);
            }
            else
            {
                dReturnValue -= adWeight[i]*(1-dAlpha)*
                                (dF+dStepSize*adFadj[i] - adY[i]);
            }
            dW += adWeight[i];
        }
    }

    return dReturnValue/dW;
}
gbm/src/locationm.cpp0000644000176200001440000001145514547624323014317 0ustar liggesusers//------------------------------------------------------------------------------
// GBM alteration by Daniel Edwards
// File: locationm.cpp
//
// Purpose: Class to provide methods to calculate the location M-estimates
//          of a variety of functions
//
// History: 31/03/2008 created
//
//------------------------------------------------------------------------------

#include "locationm.h"
#include <algorithm>

/////////////////////////////////////////////////
// Constructor
//
// Creates a new instance of this class
/////////////////////////////////////////////////

CLocationM::CLocationM(const char *sType, int iN, double *adParams)
{
    int ii;

    msType = sType;
    mdEps = 1e-8;

    madParams = new double[iN];
    for (ii = 0; ii < iN; ii++)
    {
        madParams[ii] = adParams[ii];
    }
}

/////////////////////////////////////////////////
// Destructor
//
// Frees any memory from variables in this class
/////////////////////////////////////////////////

CLocationM::~CLocationM()
{
    if (madParams != NULL)
    {
        delete[] madParams;
    }
}

/////////////////////////////////////////////////
// Median
//
// Function to return the weighted median of
// a vector of a given length
//
// Parameters: iN  - Length of vector
//             adV - Vector of doubles
//             adW - Array of weights
//
// Returns : Weighted median
/////////////////////////////////////////////////

double CLocationM::Median(int iN, double *adV, double *adW)
{
    // Local variables
    int ii, iMedIdx;
    std::vector<double> vecW;
    std::vector< std::pair<int, double> > vecV;
    double dCumSum, dWSum, dMed;

    // Check the vector size
    if (iN == 0)
    {
        return 0.0;
    }
    else if (iN == 1)
    {
        return adV[0];
    }

    // Create a vector of (index, value) pairs
    vecV.resize(iN);
    for (ii = 0; ii < iN; ii++)
    {
        vecV[ii] = std::make_pair(ii, adV[ii]);
    }

    // Sort the pairs by value
    std::stable_sort(vecV.begin(), vecV.end(), comp());

    // Reorder the weights to match and calculate their sum
    vecW.resize(iN);
    dWSum = 0.0;
    for (ii = 0; ii < iN; ii++)
    {
        vecW[ii] = adW[vecV[ii].first];
        dWSum += adW[ii];
    }

    // Get the first index where the cumulative weight reaches
    // half of the total weight
    iMedIdx = -1;
    dCumSum = 0.0;
    while (dCumSum < 0.5 * dWSum)
    {
        iMedIdx++;
        dCumSum += vecW[iMedIdx];
    }

    // Get the index of the next non-zero weight
    int iNextNonZero = iN;
    for (ii = (iN - 1); ii > iMedIdx; ii--)
    {
        if (vecW[ii] > 0)
        {
            iNextNonZero = ii;
        }
    }

    // Use this index unless the cumulative sum lands exactly on half
    // of the total weight, in which case average the two values
    if (iNextNonZero == iN || dCumSum > 0.5 * dWSum)
    {
        dMed = vecV[iMedIdx].second;
    }
    else
    {
        dMed = 0.5 * (vecV[iMedIdx].second + vecV[iNextNonZero].second);
    }

    return dMed;
}

/////////////////////////////////////////////////
// PsiFun
//
// Function to calculate the psi of the supplied
// value, given the type of function to use and
// the supplied parameters
//
// Parameters: dX - Value
//
// Returns : Psi(X)
/////////////////////////////////////////////////

double CLocationM::PsiFun(double dX)
{
    // Local variables
    double dPsiVal = 0.0;

    // Switch on the type of function; compare the full type name
    // rather than only its first two characters
    if (strncmp(msType, "tdist", 5) == 0)
    {
        dPsiVal = dX / (madParams[0] + (dX * dX));
    }
    else
    {
        // TODO: Handle the error
        Rprintf("Error: Function type %s not found\n", msType);
    }

    return dPsiVal;
}
///////////////////////////////////////////////// // LocationM // // Function to calculate location M estimate for // the supplied weighted data, with the psi-function // type and parameters specified in this class // // Parameters: iN - Number of data points // adX - Data vector // adW - Weight vector // // Returns : Location M-Estimate of (X, W) ///////////////////////////////////////////////// double CLocationM::LocationM(int iN, double *adX, double *adW) { // Local variables int ii; // Get the initial estimate of location double dBeta0 = Median(iN, adX, adW); // Get the initial estimate of scale double *adDiff = new double[iN]; for (ii = 0; ii < iN; ii++) { adDiff[ii] = fabs(adX[ii] - dBeta0); } double dScale0 = 1.4826 * Median(iN, adDiff, adW); dScale0 = fmax(dScale0, mdEps); // Loop over until the error is low enough double dErr = 1.0; int iCount = 0; while (iCount < 50) { double dSumWX = 0.0; double dSumW = 0.0; for (ii = 0; ii < iN; ii++) { double dT = fabs(adX[ii] - dBeta0) / dScale0; dT = fmax(dT, mdEps); double dWt = adW[ii] * PsiFun(dT) / dT; dSumWX += dWt * adX[ii]; dSumW += dWt; } double dBeta = dBeta0; if (dSumW > 0){ dBeta = dSumWX / dSumW; } dErr = fabs(dBeta - dBeta0); if (dErr > mdEps) { dErr /= fabs(dBeta0); } dBeta0 = dBeta; if (dErr < mdEps) { iCount = 100; } else { iCount++; } } // Cleanup memory delete[] adDiff; return dBeta0; } gbm/src/coxph.h0000644000176200001440000000607414547624323013121 0ustar liggesusers//------------------------------------------------------------------------------ // GBM by Greg Ridgeway Copyright (C) 2003 // // File: coxph.h // // License: GNU GPL (version 2 or later) // // Contents: Cox proportional hazard object // // Owner: gregr@rand.org // // History: 3/26/2001 gregr created // 2/14/2003 gregr: adapted for R implementation // //------------------------------------------------------------------------------ #ifndef COXPH_H #define COXPH_H #include "distribution.h" #include "matrix.h" #include // for ULONG_MAX 
class CCoxPH : public CDistribution { public: CCoxPH(); virtual ~CCoxPH(); GBMRESULT UpdateParams(double *adF, double *adOffset, double *adWeight, unsigned long cLength) { return GBM_OK; }; GBMRESULT ComputeWorkingResponse(double *adT, double *adDelta, double *adOffset, double *adF, double *adZ, double *adWeight, bool *afInBag, unsigned long nTrain, int cIdxOff); GBMRESULT InitF(double *adT, double *adDelta, double *adOffset, double *adWeight, double &dInitF, unsigned long cLength); GBMRESULT FitBestConstant(double *adT, double *adDelta, double *adOffset, double *adW, double *adF, double *adZ, unsigned long *aiNodeAssign, unsigned long nTrain, VEC_P_NODETERMINAL vecpTermNodes, unsigned long cTermNodes, unsigned long cMinObsInNode, bool *afInBag, double *adFadj, int cIdxOff); double Deviance(double *adT, double *adDelta, double *adOffset, double *adWeight, double *adF, unsigned long cLength, int cIdxOff); double BagImprovement(double *adT, double *adDelta, double *adOffset, double *adWeight, double *adF, double *adFadj, bool *afInBag, double dStepSize, unsigned long nTrain); private: std::vector vecdP; std::vector vecdRiskTot; std::vector vecdG; std::vector veciK2Node; std::vector veciNode2K; matrix matH; matrix matHinv; }; #endif // COXPH_H gbm/src/bernoulli.h0000644000176200001440000000556514547624323013777 0ustar liggesusers//------------------------------------------------------------------------------ // GBM by Greg Ridgeway Copyright (C) 2003 // // File: bernoulli.h // // License: GNU GPL (version 2 or later) // // Contents: bernoulli object // // Owner: gregr@rand.org // // History: 3/26/2001 gregr created // 2/14/2003 gregr: adapted for R implementation // //------------------------------------------------------------------------------ #ifndef BERNOULLI_H #define BERNOULLI_H #include "distribution.h" #include "buildinfo.h" class CBernoulli : public CDistribution { public: CBernoulli(); virtual ~CBernoulli(); GBMRESULT UpdateParams(double *adF, double 
*adOffset, double *adWeight, unsigned long cLength) { return GBM_OK; }; GBMRESULT ComputeWorkingResponse(double *adY, double *adMisc, double *adOffset, double *adF, double *adZ, double *adWeight, bool *afInBag, unsigned long nTrain, int cIdxOff); double Deviance(double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, unsigned long cLength, int cIdxOff); GBMRESULT InitF(double *adY, double *adMisc, double *adOffset, double *adWeight, double &dInitF, unsigned long cLength); GBMRESULT FitBestConstant(double *adY, double *adMisc, double *adOffset, double *adW, double *adF, double *adZ, unsigned long *aiNodeAssign, unsigned long nTrain, VEC_P_NODETERMINAL vecpTermNodes, unsigned long cTermNodes, unsigned long cMinObsInNode, bool *afInBag, double *adFadj, int cIdxOff); double BagImprovement(double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, double *adFadj, bool *afInBag, double dStepSize, unsigned long nTrain); private: std::vector vecdNum; std::vector vecdDen; }; #endif // BERNOULLI_H gbm/src/quantile.h0000644000176200001440000000567114547624323013624 0ustar liggesusers//------------------------------------------------------------------------------ // GBM by Greg Ridgeway Copyright (C) 2003 // File: quantile.h // // License: GNU GPL (version 2 or later) // // Contents: laplace object // // Owner: gregr@rand.org // // History: 10/8/2006 Created by Brian Kriegler (bk@stat.ucla.edu) // 6/11/2007 gregr merged with official gbm // //------------------------------------------------------------------------------ #ifndef QUANTILE_H #define QUANTILE_H #include #include "distribution.h" class CQuantile: public CDistribution { public: CQuantile(double dAlpha); virtual ~CQuantile(); GBMRESULT UpdateParams(double *adF, double *adOffset, double *adWeight, unsigned long cLength) { return GBM_OK; }; GBMRESULT ComputeWorkingResponse(double *adY, double *adMisc, double *adOffset, double *adF, double *adZ, double *adWeight, bool *afInBag, 
unsigned long nTrain, int cIdxOff); GBMRESULT InitF(double *adY, double *adMisc, double *adOffset, double *adWeight, double &dInitF, unsigned long cLength); GBMRESULT FitBestConstant(double *adY, double *adMisc, double *adOffset, double *adW, double *adF, double *adZ, unsigned long *aiNodeAssign, unsigned long nTrain, VEC_P_NODETERMINAL vecpTermNodes, unsigned long cTermNodes, unsigned long cMinObsInNode, bool *afInBag, double *adFadj, int cIdxOff); double Deviance(double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, unsigned long cLength, int cIdxOff); double BagImprovement(double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, double *adFadj, bool *afInBag, double dStepSize, unsigned long nTrain); private: std::vector vecd; double dAlpha; }; #endif // QUANTILE_H gbm/src/node_nonterminal.h0000644000176200001440000000470714547111634015330 0ustar liggesusers//------------------------------------------------------------------------------ // GBM by Greg Ridgeway Copyright (C) 2003 // // File: node_nonterminal.h // // License: GNU GPL (version 2 or later) // // Contents: a node in the tree // // Owner: gregr@rand.org // // History: 3/26/2001 gregr created // 2/14/2003 gregr: adapted for R implementation // //------------------------------------------------------------------------------ #ifndef NODENONTERMINAL_H #define NODENONTERMINAL_H #include "node.h" #include "node_terminal.h" class CNodeNonterminal : public CNode { public: CNodeNonterminal(); virtual ~CNodeNonterminal(); virtual GBMRESULT Adjust(unsigned long cMinObsInNode); virtual signed char WhichNode(CDataset *pData, unsigned long iObs) = 0; virtual signed char WhichNode(double *adX, unsigned long cRow, unsigned long cCol, unsigned long iRow) = 0; virtual GBMRESULT TransferTreeToRList(int &iNodeID, CDataset *pData, int *aiSplitVar, double *adSplitPoint, int *aiLeftNode, int *aiRightNode, int *aiMissingNode, double *adErrorReduction, double *adWeight, double 
*adPred, VEC_VEC_CATEGORIES &vecSplitCodes, int cCatSplitsOld, double dShrinkage) = 0; GBMRESULT Predict(CDataset *pData, unsigned long iRow, double &dFadj); GBMRESULT Predict(double *adX, unsigned long cRow, unsigned long cCol, unsigned long iRow, double &dFadj); GBMRESULT GetVarRelativeInfluence(double *adRelInf); virtual GBMRESULT RecycleSelf(CNodeFactory *pNodeFactory) = 0; CNode *pLeftNode; CNode *pRightNode; CNode *pMissingNode; unsigned long iSplitVar; double dImprovement; }; typedef CNodeNonterminal *PCNodeNonterminal; #endif // NODENONTERMINAL_H gbm/src/node_factory.h0000644000176200001440000000316614547624323014453 0ustar liggesusers//------------------------------------------------------------------------------ // GBM by Greg Ridgeway Copyright (C) 2003 // // File: node_factory.h // // License: GNU GPL (version 2 or later) // // Contents: manager for allocation and destruction of all nodes // // Owner: gregr@rand.org // // History: 3/26/2001 gregr created // 2/14/2003 gregr: adapted for R implementation // //------------------------------------------------------------------------------ #ifndef NODEFACTORY_H #define NODEFACTORY_H #include #include #include "node_terminal.h" #include "node_continuous.h" #include "node_categorical.h" #define NODEFACTORY_NODGBM_RESERVE ((unsigned long)101) class CNodeFactory { public: CNodeFactory(); ~CNodeFactory(); GBMRESULT Initialize(unsigned long cDepth); CNodeTerminal* GetNewNodeTerminal(); CNodeContinuous* GetNewNodeContinuous(); CNodeCategorical* GetNewNodeCategorical(); GBMRESULT RecycleNode(CNodeTerminal *pNode); GBMRESULT RecycleNode(CNodeContinuous *pNode); GBMRESULT RecycleNode(CNodeCategorical *pNode); private: std::stack TerminalStack; std::stack ContinuousStack; std::stack CategoricalStack; CNodeTerminal* pNodeTerminalTemp; CNodeContinuous* pNodeContinuousTemp; CNodeCategorical* pNodeCategoricalTemp; CNodeTerminal aBlockTerminal[NODEFACTORY_NODGBM_RESERVE]; CNodeContinuous 
aBlockContinuous[NODEFACTORY_NODGBM_RESERVE]; CNodeCategorical aBlockCategorical[NODEFACTORY_NODGBM_RESERVE]; }; #endif // NODEFACTORY_H gbm/src/locationm.h0000644000176200001440000000161014547624323013754 0ustar liggesusers//------------------------------------------------------------------------------ // GBM alteration by Daniel Edwards // File: locationm.h // // History: 27/3/2008 created // //------------------------------------------------------------------------------ #ifndef LOCMCGBM_H #define LOCMCGBM_H #include #include #include #include using namespace std; class CLocationM { public: CLocationM(const char *sType, int iN, double *adParams); virtual ~CLocationM(); double Median(int iN, double *adV, double *adW); double PsiFun(double dX); double LocationM(int iN, double *adX, double *adW); private: double *madParams; const char *msType; double mdEps; struct comp{ bool operator()(std::pair prP, std::pair prQ) { return (prP.second < prQ.second); } }; }; #endif // LOCMCGBM_H gbm/src/laplace.cpp0000644000176200001440000001033414547655675013744 0ustar liggesusers// GBM by Greg Ridgeway Copyright (C) 2003 #include "laplace.h" CLaplace::CLaplace() { mpLocM = NULL; adArr = NULL; adW2 = NULL; } CLaplace::~CLaplace() { if(mpLocM != NULL) { delete mpLocM; } if(adArr!=NULL) { delete[] adArr; } if(adW2!=NULL) { delete[] adW2; } } GBMRESULT CLaplace::ComputeWorkingResponse ( double *adY, double *adMisc, double *adOffset, double *adF, double *adZ, double *adWeight, bool *afInBag, unsigned long nTrain, int cIdxOff ) { unsigned long i = 0; if(adOffset == NULL) { for(i=0; i 0.0 ? 1.0 : -1.0; } } else { for(i=0; i 0.0 ? 
1.0 : -1.0; } } return GBM_OK; } GBMRESULT CLaplace::InitF ( double *adY, double *adMisc, double *adOffset, double *adWeight, double &dInitF, unsigned long cLength ) { GBMRESULT hr = GBM_OK; double dOffset = 0.0; unsigned long ii = 0; int nLength = int(cLength); double *pTemp = NULL; // Create a new LocationM object (for weighted medians) mpLocM = new CLocationM("Other", 0, pTemp); if(mpLocM == NULL) { hr = GBM_OUTOFMEMORY; goto Error; } adArr = new double[cLength]; if(adArr == NULL) { hr = GBM_OUTOFMEMORY; goto Error; } adW2 = new double[cLength]; if(adW2==NULL) { hr = GBM_OUTOFMEMORY; goto Error; } for (ii = 0; ii < cLength; ii++) { dOffset = (adOffset==NULL) ? 0.0 : adOffset[ii]; adArr[ii] = adY[ii] - dOffset; } dInitF = mpLocM->Median(nLength, adArr, adWeight); Cleanup: return hr; Error: goto Cleanup; } double CLaplace::Deviance ( double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, unsigned long cLength, int cIdxOff ) { unsigned long i=0; double dL = 0.0; double dW = 0.0; if(adOffset == NULL) { for(i=cIdxOff; icN >= cMinObsInNode) { iVecd = 0; for(iObs=0; iObsdPrediction = mpLocM->Median(iVecd, adArr, adW2); } } return GBM_OK; } double CLaplace::BagImprovement ( double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, double *adFadj, bool *afInBag, double dStepSize, unsigned long nTrain ) { double dReturnValue = 0.0; double dF = 0.0; double dW = 0.0; unsigned long i = 0; for(i=0; i #include "buildinfo.h" #include "distribution.h" #include "tree.h" #include "dataset.h" #include "node_factory.h" class CGBM { public: CGBM(); ~CGBM(); GBMRESULT Initialize(CDataset *pData, CDistribution *pDist, double dLambda, unsigned long nTrain, double dBagFraction, unsigned long cLeaves, unsigned long cMinObsInNode, unsigned long cNumClasses, int cGroups); GBMRESULT iterate(double *adF, double &dTrainError, double &dValidError, double &dOOBagImprove, int &cNodes, int cNumClasses, int cClassIdx); GBMRESULT TransferTreeToRList(int 
*aiSplitVar, double *adSplitPoint, int *aiLeftNode, int *aiRightNode, int *aiMissingNode, double *adErrorReduction, double *adWeight, double *adPred, VEC_VEC_CATEGORIES &vecSplitCodes, int cCatSplitsOld); GBMRESULT Predict(unsigned long iVar, unsigned long cTrees, double *adF, double *adX, unsigned long cLength); GBMRESULT Predict(double *adX, unsigned long cRow, unsigned long cCol, unsigned long cTrees, double *adF); GBMRESULT GetVarRelativeInfluence(double *adRelInf, unsigned long cTrees); GBMRESULT PrintTree(); bool IsPairwise() const { return (cGroups >= 0); } CDataset *pData; // the data CDistribution *pDist; // the distribution bool fInitialized; // indicates whether the GBM has been initialized CNodeFactory *pNodeFactory; // these objects are for the tree growing // allocate them once here for all trees to use bool *afInBag; unsigned long *aiNodeAssign; CNodeSearch *aNodeSearch; PCCARTTree ptreeTemp; VEC_P_NODETERMINAL vecpTermNodes; double *adZ; double *adFadj; private: double dLambda; unsigned long cTrain; unsigned long cValid; unsigned long cTotalInBag; double dBagFraction; unsigned long cDepth; unsigned long cMinObsInNode; int cGroups; }; #endif // GBM_ENGINGBM_H gbm/src/node_continuous.h0000644000176200001440000000351014547111634015177 0ustar liggesusers//------------------------------------------------------------------------------ // GBM by Greg Ridgeway Copyright (C) 2003 // // File: node_continuous.h // // License: GNU GPL (version 2 or later) // // Contents: a node with a continuous split // // Owner: gregr.rand.org // // History: 3/26/2001 gregr created // 2/14/2003 gregr: adapted for R implementation // //------------------------------------------------------------------------------ #ifndef NODECONTINUOUS_H #define NODECONTINUOUS_H #include #include "node_nonterminal.h" class CNodeContinuous : public CNodeNonterminal { public: CNodeContinuous(); ~CNodeContinuous(); GBMRESULT PrintSubtree(unsigned long cIndent); GBMRESULT TransferTreeToRList(int 
&iNodeID, CDataset *pData, int *aiSplitVar, double *adSplitPoint, int *aiLeftNode, int *aiRightNode, int *aiMissingNode, double *adErrorReduction, double *adWeight, double *adPred, VEC_VEC_CATEGORIES &vecSplitCodes, int cCatSplitsOld, double dShrinkage); signed char WhichNode(CDataset *pData, unsigned long iObs); signed char WhichNode(double *adX, unsigned long cRow, unsigned long cCol, unsigned long iRow); GBMRESULT RecycleSelf(CNodeFactory *pNodeFactory); double dSplitValue; }; typedef CNodeContinuous *PCNodeContinuous; #endif // NODECONTINUOUS_H gbm/src/tree.cpp0000644000176200001440000002666014547111634013271 0ustar liggesusers// GBM by Greg Ridgeway Copyright (C) 2003 #include "tree.h" CCARTTree::CCARTTree() { pRootNode = NULL; pNodeFactory = NULL; dShrink = 1.0; } CCARTTree::~CCARTTree() { if(pRootNode != NULL) { pRootNode->RecycleSelf(pNodeFactory); } } GBMRESULT CCARTTree::Initialize ( CNodeFactory *pNodeFactory ) { GBMRESULT hr = GBM_OK; this->pNodeFactory = pNodeFactory; return hr; } GBMRESULT CCARTTree::Reset() { GBMRESULT hr = GBM_OK; if(pRootNode != NULL) { // delete the old tree and start over hr = pRootNode->RecycleSelf(pNodeFactory); } if(GBM_FAILED(hr)) { goto Error; } iBestNode = 0; dBestNodeImprovement = 0.0; schWhichNode = 0; pNewSplitNode = NULL; pNewLeftNode = NULL; pNewRightNode = NULL; pNewMissingNode = NULL; pInitialRootNode = NULL; Cleanup: return hr; Error: goto Cleanup; } //------------------------------------------------------------------------------ // Grows a regression tree //------------------------------------------------------------------------------ GBMRESULT CCARTTree::grow ( double *adZ, CDataset *pData, double *adW, double *adF, unsigned long nTrain, unsigned long nBagged, double dLambda, unsigned long cMaxDepth, unsigned long cMinObsInNode, bool *afInBag, unsigned long *aiNodeAssign, CNodeSearch *aNodeSearch, VEC_P_NODETERMINAL &vecpTermNodes ) { GBMRESULT hr = GBM_OK; #ifdef NOISY_DEBUG Rprintf("Growing tree\n"); #endif 
if((adZ==NULL) || (pData==NULL) || (adW==NULL) || (adF==NULL) || (cMaxDepth < 1)) { hr = GBM_INVALIDARG; goto Error; } dSumZ = 0.0; dSumZ2 = 0.0; dTotalW = 0.0; #ifdef NOISY_DEBUG Rprintf("initial tree calcs\n"); #endif for(iObs=0; iObsGetNewNodeTerminal(); pInitialRootNode->dPrediction = dSumZ/dTotalW; pInitialRootNode->dTrainW = dTotalW; vecpTermNodes.resize(2*cMaxDepth + 1,NULL); // accounts for missing nodes vecpTermNodes[0] = pInitialRootNode; pRootNode = pInitialRootNode; aNodeSearch[0].Set(dSumZ,dTotalW,nBagged, pInitialRootNode, &pRootNode, pNodeFactory); // build the tree structure #ifdef NOISY_DEBUG Rprintf("Building tree 1 "); #endif cTotalNodeCount = 1; cTerminalNodes = 1; for(cDepth=0; cDepthWhichNode(pData,iObs); if(schWhichNode == 1) // goes right { aiNodeAssign[iObs] = cTerminalNodes-2; } else if(schWhichNode == 0) // is missing { aiNodeAssign[iObs] = cTerminalNodes-1; } // those to the left stay with the same node assignment } } // set up the node search for the new right node aNodeSearch[cTerminalNodes-2].Set(aNodeSearch[iBestNode].dBestRightSumZ, aNodeSearch[iBestNode].dBestRightTotalW, aNodeSearch[iBestNode].cBestRightN, pNewRightNode, &(pNewSplitNode->pRightNode), pNodeFactory); // set up the node search for the new missing node aNodeSearch[cTerminalNodes-1].Set(aNodeSearch[iBestNode].dBestMissingSumZ, aNodeSearch[iBestNode].dBestMissingTotalW, aNodeSearch[iBestNode].cBestMissingN, pNewMissingNode, &(pNewSplitNode->pMissingNode), pNodeFactory); // set up the node search for the new left node // must be done second since we need info for right node first aNodeSearch[iBestNode].Set(aNodeSearch[iBestNode].dBestLeftSumZ, aNodeSearch[iBestNode].dBestLeftTotalW, aNodeSearch[iBestNode].cBestLeftN, pNewLeftNode, &(pNewSplitNode->pLeftNode), pNodeFactory); } // end tree growing // DEBUG // Print(); Cleanup: return hr; Error: goto Cleanup; } GBMRESULT CCARTTree::GetBestSplit ( CDataset *pData, unsigned long nTrain, CNodeSearch *aNodeSearch, unsigned long 
cTerminalNodes, unsigned long *aiNodeAssign, bool *afInBag, double *adZ, double *adW, unsigned long &iBestNode, double &dBestNodeImprovement ) { GBMRESULT hr = GBM_OK; int iVar = 0; unsigned long iNode = 0; unsigned long iOrderObs = 0; unsigned long iWhichObs = 0; unsigned long cVarClasses = 0; double dX = 0.0; for(iVar=0; iVar < pData->cCols; iVar++) { cVarClasses = pData->acVarClasses[iVar]; for(iNode=0; iNode < cTerminalNodes; iNode++) { hr = aNodeSearch[iNode].ResetForNewVar(iVar,cVarClasses); } // distribute the observations in order to the correct node search for(iOrderObs=0; iOrderObs < nTrain; iOrderObs++) { iWhichObs = pData->aiXOrder[iVar*nTrain + iOrderObs]; if(afInBag[iWhichObs]) { iNode = aiNodeAssign[iWhichObs]; dX = pData->adX[iVar*(pData->cRows) + iWhichObs]; hr = aNodeSearch[iNode].IncorporateObs (dX, adZ[iWhichObs], adW[iWhichObs], pData->alMonotoneVar[iVar]); if(GBM_FAILED(hr)) { goto Error; } } } for(iNode=0; iNode dBestNodeImprovement) { iBestNode = iNode; dBestNodeImprovement = aNodeSearch[iNode].BestImprovement(); } } Cleanup: return hr; Error: goto Cleanup; } GBMRESULT CCARTTree::GetNodeCount ( int &cNodes ) { cNodes = cTotalNodeCount; return GBM_OK; } GBMRESULT CCARTTree::PredictValid ( CDataset *pData, unsigned long nValid, double *adFadj ) { GBMRESULT hr = GBM_OK; int i=0; for(i=pData->cRows - nValid; icRows; i++) { pRootNode->Predict(pData, i, adFadj[i]); adFadj[i] *= dShrink; } return hr; } GBMRESULT CCARTTree::Predict ( double *adX, unsigned long cRow, unsigned long cCol, unsigned long iRow, double &dFadj ) { if(pRootNode != NULL) { pRootNode->Predict(adX,cRow,cCol,iRow,dFadj); dFadj *= dShrink; } else { dFadj = 0.0; } return GBM_OK; } GBMRESULT CCARTTree::Adjust ( unsigned long *aiNodeAssign, double *adFadj, unsigned long cTrain, VEC_P_NODETERMINAL &vecpTermNodes, unsigned long cMinObsInNode ) { unsigned long hr = GBM_OK; unsigned long iObs = 0; hr = pRootNode->Adjust(cMinObsInNode); if(GBM_FAILED(hr)) { goto Error; } // predict for 
the training observations for(iObs=0; iObsdPrediction; } Cleanup: return hr; Error: goto Cleanup; } GBMRESULT CCARTTree::Print() { GBMRESULT hr = GBM_OK; if(pRootNode != NULL) { pRootNode->PrintSubtree(0); Rprintf("shrinkage: %f\n",dShrink); Rprintf("initial error: %f\n\n",dError); } return hr; } GBMRESULT CCARTTree::GetVarRelativeInfluence ( double *adRelInf ) { GBMRESULT hr = GBM_OK; if(pRootNode != NULL) { hr = pRootNode->GetVarRelativeInfluence(adRelInf); if(GBM_FAILED(hr)) { goto Error; } } Cleanup: return hr; Error: goto Cleanup; } GBMRESULT CCARTTree::TransferTreeToRList ( CDataset *pData, int *aiSplitVar, double *adSplitPoint, int *aiLeftNode, int *aiRightNode, int *aiMissingNode, double *adErrorReduction, double *adWeight, double *adPred, VEC_VEC_CATEGORIES &vecSplitCodes, int cCatSplitsOld, double dShrinkage ) { GBMRESULT hr = GBM_OK; int iNodeID = 0; if(pRootNode != NULL) { hr = pRootNode->TransferTreeToRList(iNodeID, pData, aiSplitVar, adSplitPoint, aiLeftNode, aiRightNode, aiMissingNode, adErrorReduction, adWeight, adPred, vecSplitCodes, cCatSplitsOld, dShrinkage); } else { hr = GBM_FAIL; } return hr; } gbm/src/laplace.h0000644000176200001440000000603714547652770013407 0ustar liggesusers//------------------------------------------------------------------------------ // GBM by Greg Ridgeway Copyright (C) 2003 // File: laplace.h // // License: GNU GPL (version 2 or later) // // Contents: laplace object // // Owner: gregr@rand.org // // History: 3/26/2001 gregr created // 2/14/2003 gregr: adapted for R implementation // //------------------------------------------------------------------------------ #ifndef LAPLACGBM_H #define LAPLACGBM_H #include #include "distribution.h" #include "locationm.h" class CLaplace : public CDistribution { public: CLaplace(); virtual ~CLaplace(); GBMRESULT UpdateParams(double *adF, double *adOffset, double *adWeight, unsigned long cLength) { return GBM_OK; }; GBMRESULT ComputeWorkingResponse(double *adY, double *adMisc, double 
*adOffset, double *adF, double *adZ, double *adWeight, bool *afInBag, unsigned long nTrain, int cIdxOff); GBMRESULT InitF(double *adY, double *adMisc, double *adOffset, double *adWeight, double &dInitF, unsigned long cLength); GBMRESULT FitBestConstant(double *adY, double *adMisc, double *adOffset, double *adW, double *adF, double *adZ, unsigned long *aiNodeAssign, unsigned long nTrain, VEC_P_NODETERMINAL vecpTermNodes, unsigned long cTermNodes, unsigned long cMinObsInNode, bool *afInBag, double *adFadj, int cIdxOff); double Deviance(double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, unsigned long cLength, int cIdxOff); double BagImprovement(double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, double *adFadj, bool *afInBag, double dStepSize, unsigned long nTrain); private: std::vector vecd; std::vector::iterator itMedian; CLocationM *mpLocM; double *adArr; // for temp calculations double *adW2; }; #endif // LAPLACGBM_H gbm/src/tdist.h0000644000176200001440000000543714547111634013125 0ustar liggesusers//------------------------------------------------------------------------------ // GBM alteration by Daniel Edwards // // File: tdist.h // // Contains: Distribution object to implement t-distribution // // History: 04/04/2008 Created // //------------------------------------------------------------------------------ #ifndef TDISTCGBM_H #define TDISTCGBM_H #include #include "distribution.h" #include "locationm.h" class CTDist : public CDistribution { public: CTDist(double adNu); virtual ~CTDist(); GBMRESULT UpdateParams(double *adF, double *adOffset, double *adWeight, unsigned long cLength) { return GBM_OK; }; GBMRESULT ComputeWorkingResponse(double *adY, double *adMisc, double *adOffset, double *adF, double *adZ, double *adWeight, bool *afInBag, unsigned long nTrain, int cIdxOff); GBMRESULT InitF(double *adY, double *adMisc, double *adOffset, double *adWeight, double &dInitF, unsigned long cLength); GBMRESULT 
FitBestConstant(double *adY, double *adMisc, double *adOffset, double *adW, double *adF, double *adZ, unsigned long *aiNodeAssign, unsigned long nTrain, VEC_P_NODETERMINAL vecpTermNodes, unsigned long cTermNodes, unsigned long cMinObsInNode, bool *afInBag, double *adFadj, int cIdxOff); double Deviance(double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, unsigned long cLength, int cIdxOff); double BagImprovement(double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, double *adFadj, bool *afInBag, double dStepSize, unsigned long nTrain); private: double mdNu; CLocationM *mpLocM; }; #endif // TDISTCGBM_H gbm/src/tree.h0000644000176200001440000000756714547111634012743 0ustar liggesusers//------------------------------------------------------------------------------ // GBM by Greg Ridgeway Copyright (C) 2003 // // File: tree.h // // License: GNU GPL (version 2 or later) // // Contents: regression tree // // Owner: gregr@rand.org // // History: 3/26/2001 gregr created // 2/14/2003 gregr: adapted for R implementation // //------------------------------------------------------------------------------ #ifndef TREGBM_H #define TREGBM_H #include #include #include #include "dataset.h" #include "node_factory.h" #include "node_search.h" class CCARTTree { public: CCARTTree(); ~CCARTTree(); GBMRESULT Initialize(CNodeFactory *pNodeFactory); GBMRESULT grow(double *adZ, CDataset *pData, double *adAlgW, double *adF, unsigned long nTrain, unsigned long nBagged, double dLambda, unsigned long cMaxDepth, unsigned long cMinObsInNode, bool *afInBag, unsigned long *aiNodeAssign, CNodeSearch *aNodeSearch, VEC_P_NODETERMINAL &vecpTermNodes); GBMRESULT Reset(); GBMRESULT TransferTreeToRList(CDataset *pData, int *aiSplitVar, double *adSplitPoint, int *aiLeftNode, int *aiRightNode, int *aiMissingNode, double *adErrorReduction, double *adWeight, double *adPred, VEC_VEC_CATEGORIES &vecSplitCodes, int cCatSplitsOld, double dShrinkage); GBMRESULT 
PredictValid(CDataset *pData, unsigned long nValid, double *adFadj); GBMRESULT Predict(double *adX, unsigned long cRow, unsigned long cCol, unsigned long iRow, double &dFadj); GBMRESULT Adjust(unsigned long *aiNodeAssign, double *adFadj, unsigned long cTrain, VEC_P_NODETERMINAL &vecpTermNodes, unsigned long cMinObsInNode); GBMRESULT GetNodeCount(int &cNodes); GBMRESULT SetShrinkage(double dShrink) { this->dShrink = dShrink; return GBM_OK; } double GetShrinkage() {return dShrink;} GBMRESULT Print(); GBMRESULT GetVarRelativeInfluence(double *adRelInf); double dError; // total squared error before carrying out the splits private: GBMRESULT GetBestSplit(CDataset *pData, unsigned long nTrain, CNodeSearch *aNodeSearch, unsigned long cTerminalNodes, unsigned long *aiNodeAssign, bool *afInBag, double *adZ, double *adW, unsigned long &iBestNode, double &dBestNodeImprovement); CNode *pRootNode; double dShrink; // objects used repeatedly unsigned long cDepth; unsigned long cTerminalNodes; unsigned long cTotalNodeCount; unsigned long iObs; unsigned long iWhichNode; unsigned long iBestNode; double dBestNodeImprovement; double dSumZ; double dSumZ2; double dTotalW; signed char schWhichNode; CNodeFactory *pNodeFactory; CNodeNonterminal *pNewSplitNode; CNodeTerminal *pNewLeftNode; CNodeTerminal *pNewRightNode; CNodeTerminal *pNewMissingNode; CNodeTerminal *pInitialRootNode; }; typedef CCARTTree *PCCARTTree; #endif // TREGBM_H gbm/src/node_continuous.cpp0000644000176200001440000001211414547114320015525 0ustar liggesusers// GBM by Greg Ridgeway Copyright (C) 2003 #include "node_continuous.h" #include "node_factory.h" CNodeContinuous::CNodeContinuous() { dSplitValue = 0.0; } CNodeContinuous::~CNodeContinuous() { #ifdef NOISY_DEBUG Rprintf("continuous destructor\n"); #endif } GBMRESULT CNodeContinuous::PrintSubtree ( unsigned long cIndent ) { GBMRESULT hr = GBM_OK; unsigned long i = 0; for(i=0; i< cIndent; i++) Rprintf(" "); Rprintf("N=%f, Improvement=%f, Prediction=%f, NA pred=%f\n", 
dTrainW, dImprovement, dPrediction, (pMissingNode == NULL ? 0.0 : pMissingNode->dPrediction)); for(i=0; i< cIndent; i++) Rprintf(" "); Rprintf("V%lu < %f\n", iSplitVar, dSplitValue); hr = pLeftNode->PrintSubtree(cIndent+1); for(i=0; i< cIndent; i++) Rprintf(" "); Rprintf("V%lu > %f\n", iSplitVar, dSplitValue); hr = pRightNode->PrintSubtree(cIndent+1); for(i=0; i< cIndent; i++) Rprintf(" "); Rprintf("missing\n"); hr = pMissingNode->PrintSubtree(cIndent+1); return hr; } signed char CNodeContinuous::WhichNode ( CDataset *pData, unsigned long iObs ) { signed char ReturnValue = 0; double dX = pData->adX[iSplitVar*(pData->cRows) + iObs]; if(!ISNA(dX)) { if(dX < dSplitValue) { ReturnValue = -1; } else { ReturnValue = 1; } } // if missing value returns 0 return ReturnValue; } signed char CNodeContinuous::WhichNode ( double *adX, unsigned long cRow, unsigned long cCol, unsigned long iRow ) { signed char ReturnValue = 0; double dX = adX[iSplitVar*cRow + iRow]; if(!ISNA(dX)) { if(dX < dSplitValue) { ReturnValue = -1; } else { ReturnValue = 1; } } // if missing value returns 0 return ReturnValue; } GBMRESULT CNodeContinuous::RecycleSelf ( CNodeFactory *pNodeFactory ) { GBMRESULT hr = GBM_OK; pNodeFactory->RecycleNode(this); return hr; }; GBMRESULT CNodeContinuous::TransferTreeToRList ( int &iNodeID, CDataset *pData, int *aiSplitVar, double *adSplitPoint, int *aiLeftNode, int *aiRightNode, int *aiMissingNode, double *adErrorReduction, double *adWeight, double *adPred, VEC_VEC_CATEGORIES &vecSplitCodes, int cCatSplitsOld, double dShrinkage ) { GBMRESULT hr = GBM_OK; int iThisNodeID = iNodeID; aiSplitVar[iThisNodeID] = iSplitVar; adSplitPoint[iThisNodeID] = dSplitValue; adErrorReduction[iThisNodeID] = dImprovement; adWeight[iThisNodeID] = dTrainW; adPred[iThisNodeID] = dShrinkage*dPrediction; iNodeID++; aiLeftNode[iThisNodeID] = iNodeID; hr = pLeftNode->TransferTreeToRList(iNodeID, pData, aiSplitVar, adSplitPoint, aiLeftNode, aiRightNode, aiMissingNode, adErrorReduction, 
adWeight, adPred, vecSplitCodes, cCatSplitsOld, dShrinkage); if(GBM_FAILED(hr)) goto Error; aiRightNode[iThisNodeID] = iNodeID; hr = pRightNode->TransferTreeToRList(iNodeID, pData, aiSplitVar, adSplitPoint, aiLeftNode, aiRightNode, aiMissingNode, adErrorReduction, adWeight, adPred, vecSplitCodes, cCatSplitsOld, dShrinkage); if(GBM_FAILED(hr)) goto Error; aiMissingNode[iThisNodeID] = iNodeID; hr = pMissingNode->TransferTreeToRList(iNodeID, pData, aiSplitVar, adSplitPoint, aiLeftNode, aiRightNode, aiMissingNode, adErrorReduction, adWeight, adPred, vecSplitCodes, cCatSplitsOld, dShrinkage); if(GBM_FAILED(hr)) goto Error; Cleanup: return hr; Error: goto Cleanup; } gbm/src/distribution.h0000644000176200001440000001312114547111634014502 0ustar liggesusers//------------------------------------------------------------------------------ // GBM by Greg Ridgeway Copyright (C) 2003 // // File: distribution.h // // License: GNU GPL (version 2 or later) // // Contents: distribution object // // Owner: gregr@rand.org // // History: 3/26/2001 gregr created // 2/14/2003 gregr: adapted for R implementation // //------------------------------------------------------------------------------ #ifndef DISTRIBUTION_H #define DISTRIBUTION_H #include "node_terminal.h" class CDistribution { public: CDistribution(); virtual ~CDistribution(); // In the subsequent functions, parameters have the following meaning: // * adY - The target // * adMisc - Optional auxiliary data (the precise meaning is specific to the // derived class) // * adOffset - An optional offset to the score (adF) // * adWeight - Instance training weight // * adF - Current score (sum of all trees generated so far) // * adZ - (Negative) gradient of loss function, to be predicted by tree // * adFadj - Output of current tree, to be added to adF // * cLength - Number of instances (size of vectors) // * afInBag - true if instance is part of training set for current tree // (depends on random subsampling) // * cIdxOff - Offset used 
for multi-class training (CMultinomial).

    // Initialize() is called once, before training starts.
    // It gives derived classes a chance for custom preparations, e.g., to allocate
    // memory or to pre-compute values that do not change between iterations.
    virtual GBMRESULT Initialize(double *adY, double *adMisc, double *adOffset,
                                 double *adWeight, unsigned long cLength)
    {
        return GBM_OK;
    }

    // UpdateParams() is called at the start of each iteration.
    // CMultinomial uses it to normalize predictions across multiple classes.
    virtual GBMRESULT UpdateParams(double *adF, double *adOffset,
                                   double *adWeight, unsigned long cLength) = 0;

    // ComputeWorkingResponse() calculates the negative gradients of the
    // loss function, and stores them in adZ.
    virtual GBMRESULT ComputeWorkingResponse(double *adY, double *adMisc,
                                             double *adOffset, double *adF,
                                             double *adZ, double *adWeight,
                                             bool *afInBag,
                                             unsigned long cLength,
                                             int cIdxOff) = 0;

    // InitF() computes the best constant prediction for all instances, and
    // stores it in dInitF.
    virtual GBMRESULT InitF(double *adY, double *adMisc, double *adOffset,
                            double *adWeight, double &dInitF,
                            unsigned long cLength) = 0;

    // Deviance() returns the value of the loss function, based on the
    // current predictions (adF).
    virtual double Deviance(double *adY, double *adMisc, double *adOffset,
                            double *adWeight, double *adF,
                            unsigned long cLength, int cIdxOff) = 0;

    // FitBestConstant() calculates and sets prediction values for all terminal
    // nodes of the tree being currently constructed.
    // Assumptions:
    // * cTermNodes is the number of terminal nodes of the tree.
    // * vecpTermNodes is a vector of (pointers to) the terminal nodes of the
    //   tree, of size cTermNodes.
    // * aiNodeAssign is a vector of size cLength, that maps each instance to an
    //   index into vecpTermNodes for the corresponding terminal node.
virtual GBMRESULT FitBestConstant(double *adY, double *adMisc,
                                      double *adOffset, double *adWeight,
                                      double *adF, double *adZ,
                                      unsigned long *aiNodeAssign,
                                      unsigned long cLength,
                                      VEC_P_NODETERMINAL vecpTermNodes,
                                      unsigned long cTermNodes,
                                      unsigned long cMinObsInNode,
                                      bool *afInBag, double *adFadj,
                                      int cIdxOff) = 0;

    // BagImprovement() returns the incremental difference in the loss
    // function induced by scoring with (adF + dStepSize * adFAdj) instead of
    // adF, for all instances that were not part of the training set for the
    // current tree (i.e., afInBag set to false).
    virtual double BagImprovement(double *adY, double *adMisc, double *adOffset,
                                  double *adWeight, double *adF,
                                  double *adFadj, bool *afInBag,
                                  double dStepSize,
                                  unsigned long cLength) = 0;
};
typedef CDistribution *PCDistribution;

#endif // DISTRIBUTION_H
gbm/src/dataset.cpp0000644000176200001440000000314114547111634013744 0ustar liggesusers// GBM by Greg Ridgeway Copyright (C) 2003

#include "dataset.h"

CDataset::CDataset()
{
    fHasOffset = false;
    adX = NULL;
    aiXOrder = NULL;
    adXTemp4Order = NULL;
    adY = NULL;
    adOffset = NULL;
    adWeight = NULL;
    apszVarNames = NULL;
    cRows = 0;
    cCols = 0;
}

CDataset::~CDataset()
{
}

GBMRESULT CDataset::ResetWeights()
{
    GBMRESULT hr = GBM_OK;
    int i = 0;

    if(adWeight == NULL)
    {
        hr = GBM_INVALIDARG;
        goto Error;
    }

    for(i=0; i<cRows; i++)
    {
        adWeight[i] = 1.0;
    }

Cleanup:
    return hr;
Error:
    goto Cleanup;
}

GBMRESULT CDataset::SetData
(
    double *adX,
    int *aiXOrder,
    double *adY,
    double *adOffset,
    double *adWeight,
    double *adMisc,
    int cRows,
    int cCols,
    int *acVarClasses,
    int *alMonotoneVar
)
{
    GBMRESULT hr = GBM_OK;

    this->cRows = cRows;
    this->cCols = cCols;
    this->adX = adX;
    this->aiXOrder = aiXOrder;
    this->adY = adY;
    this->adOffset = adOffset;
    this->adWeight = adWeight;
    this->acVarClasses = acVarClasses;
    this->alMonotoneVar = alMonotoneVar;

    if((adOffset != NULL) && !ISNA(*adOffset))
    {
        this->adOffset = adOffset;
        fHasOffset = true;
    }
    else
    {
        this->adOffset = NULL;
        fHasOffset = false;
    }
    if((adMisc != NULL) && !ISNA(*adMisc))
    {
        this->adMisc = adMisc;
    }
    else
    {
        this->adMisc = NULL;
    }

Cleanup:
    return hr;
Error:
    goto Cleanup;
}
gbm/src/buildinfo.h0000644000176200001440000000122614562474325013747 0ustar liggesusers// GBM by Greg Ridgeway Copyright (C) 2003
// License: GNU GPL (version 2 or
later) #ifndef BUILDINFO_H #define BUILDINFO_H #undef ERROR #define R_NO_REMAP // https://rstudio.github.io/r-manuals/r-exts/The-R-API.html #include #define GBM_FAILED(hr) ((unsigned long)hr != 0) typedef unsigned long GBMRESULT; #define GBM_OK 0 #define GBM_FAIL 1 #define GBM_INVALIDARG 2 #define GBM_OUTOFMEMORY 3 #define GBM_INVALID_DATA 4 #define GBM_NOTIMPL 5 #define LEVELS_PER_CHUNK ((unsigned long) 1) typedef unsigned long ULONG; typedef char *PCHAR; // #define NOISY_DEBUG #endif // BUILDINFO_H gbm/src/gbm-init.c0000644000176200001440000000160314547111634013466 0ustar liggesusers#include #include #include // for NULL #include /* FIXME: Check these declarations against the C/Fortran source code. */ /* .Call calls */ extern SEXP gbm_fit(SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP); extern SEXP gbm_plot(SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP); extern SEXP gbm_pred(SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP, SEXP); static const R_CallMethodDef CallEntries[] = { {"gbm_fit", (DL_FUNC) &gbm_fit, 22}, {"gbm_plot", (DL_FUNC) &gbm_plot, 10}, {"gbm_pred", (DL_FUNC) &gbm_pred, 10}, {NULL, NULL, 0} }; void R_init_gbm(DllInfo *dll) { R_registerRoutines(dll, NULL, CallEntries, NULL, NULL); R_useDynamicSymbols(dll, FALSE); } gbm/src/node_search.cpp0000644000176200001440000003045314562474421014602 0ustar liggesusers//------------------------------------------------------------------------------ // GBM by Greg Ridgeway Copyright (C) 2003 // // File: node_search.cpp // //------------------------------------------------------------------------------ #include "node_search.h" CNodeSearch::CNodeSearch() :k_cMaxClasses(1024) { iBestSplitVar = 0; dBestSplitValue = 0.0; fIsSplit = false; dBestMissingTotalW = 0.0; dCurrentMissingTotalW = 0.0; dBestMissingSumZ = 0.0; dCurrentMissingSumZ = 0.0; adGroupSumZ = NULL; adGroupW = NULL; acGroupN = NULL; adGroupMean = NULL; 
aiCurrentCategory = NULL; aiBestCategory = NULL; iRank = UINT_MAX; } CNodeSearch::~CNodeSearch() { if(adGroupSumZ != NULL) { delete [] adGroupSumZ; adGroupSumZ = NULL; } if(adGroupW != NULL) { delete [] adGroupW; adGroupW = NULL; } if(acGroupN != NULL) { delete [] acGroupN; acGroupN = NULL; } if(adGroupMean != NULL) { delete [] adGroupMean; adGroupMean = NULL; } if(aiCurrentCategory != NULL) { delete [] aiCurrentCategory; aiCurrentCategory = NULL; } if(aiBestCategory != NULL) { delete [] aiBestCategory; aiBestCategory = NULL; } } GBMRESULT CNodeSearch::Initialize ( unsigned long cMinObsInNode ) { GBMRESULT hr = GBM_OK; adGroupSumZ = new double[k_cMaxClasses]; if(adGroupSumZ == NULL) { hr = GBM_OUTOFMEMORY; goto Error; } adGroupW = new double[k_cMaxClasses]; if(adGroupW == NULL) { hr = GBM_OUTOFMEMORY; goto Error; } acGroupN = new ULONG[k_cMaxClasses]; if(acGroupN == NULL) { hr = GBM_OUTOFMEMORY; goto Error; } adGroupMean = new double[k_cMaxClasses]; if(adGroupMean == NULL) { hr = GBM_OUTOFMEMORY; goto Error; } aiCurrentCategory = new int[k_cMaxClasses]; if(aiCurrentCategory == NULL) { hr = GBM_OUTOFMEMORY; goto Error; } aiBestCategory = new ULONG[k_cMaxClasses]; if(aiBestCategory == NULL) { hr = GBM_OUTOFMEMORY; goto Error; } this->cMinObsInNode = cMinObsInNode; Cleanup: return hr; Error: goto Cleanup; } GBMRESULT CNodeSearch::IncorporateObs ( double dX, double dZ, double dW, long lMonotone ) { GBMRESULT hr = GBM_OK; static double dWZ = 0.0; if(fIsSplit) goto Cleanup; dWZ = dW*dZ; if(ISNA(dX)) { dCurrentMissingSumZ += dWZ; dCurrentMissingTotalW += dW; cCurrentMissingN++; dCurrentRightSumZ -= dWZ; dCurrentRightTotalW -= dW; cCurrentRightN--; } else if(cCurrentVarClasses == 0) // variable is continuous { if(dLastXValue > dX) { Rf_error("Observations are not in order. gbm() was unable to build an index for the design matrix. 
Could be a bug in gbm or an unusual data type in data.\n"); hr = GBM_FAIL; goto Error; } // Evaluate the current split // the newest observation is still in the right child dCurrentSplitValue = 0.5*(dLastXValue + dX); if((dLastXValue != dX) && (cCurrentLeftN >= cMinObsInNode) && (cCurrentRightN >= cMinObsInNode) && ((lMonotone==0) || (lMonotone*(dCurrentRightSumZ*dCurrentLeftTotalW - dCurrentLeftSumZ*dCurrentRightTotalW) > 0))) { dCurrentImprovement = CNode::Improvement(dCurrentLeftTotalW,dCurrentRightTotalW, dCurrentMissingTotalW, dCurrentLeftSumZ,dCurrentRightSumZ, dCurrentMissingSumZ); if(dCurrentImprovement > dBestImprovement) { iBestSplitVar = iCurrentSplitVar; dBestSplitValue = dCurrentSplitValue; cBestVarClasses = 0; dBestLeftSumZ = dCurrentLeftSumZ; dBestLeftTotalW = dCurrentLeftTotalW; cBestLeftN = cCurrentLeftN; dBestRightSumZ = dCurrentRightSumZ; dBestRightTotalW = dCurrentRightTotalW; cBestRightN = cCurrentRightN; dBestImprovement = dCurrentImprovement; } } // now move the new observation to the left // if another observation arrives we will evaluate this dCurrentLeftSumZ += dWZ; dCurrentLeftTotalW += dW; cCurrentLeftN++; dCurrentRightSumZ -= dWZ; dCurrentRightTotalW -= dW; cCurrentRightN--; dLastXValue = dX; } else // variable is categorical, evaluates later { adGroupSumZ[(unsigned long)dX] += dWZ; adGroupW[(unsigned long)dX] += dW; acGroupN[(unsigned long)dX] ++; } Cleanup: return hr; Error: goto Cleanup; } GBMRESULT CNodeSearch::Set ( double dSumZ, double dTotalW, unsigned long cTotalN, CNodeTerminal *pThisNode, CNode **ppParentPointerToThisNode, CNodeFactory *pNodeFactory ) { GBMRESULT hr = GBM_OK; dInitSumZ = dSumZ; dInitTotalW = dTotalW; cInitN = cTotalN; dBestLeftSumZ = 0.0; dBestLeftTotalW = 0.0; cBestLeftN = 0; dCurrentLeftSumZ = 0.0; dCurrentLeftTotalW = 0.0; cCurrentLeftN = 0; dBestRightSumZ = dSumZ; dBestRightTotalW = dTotalW; cBestRightN = cTotalN; dCurrentRightSumZ = 0.0; dCurrentRightTotalW = dTotalW; cCurrentRightN = cTotalN; 
dBestMissingSumZ = 0.0;
    dBestMissingTotalW = 0.0;
    cBestMissingN = 0;
    dCurrentMissingSumZ = 0.0;
    dCurrentMissingTotalW = 0.0;
    cCurrentMissingN = 0;

    dBestImprovement = 0.0;
    iBestSplitVar = UINT_MAX;

    dCurrentImprovement = 0.0;
    iCurrentSplitVar = UINT_MAX;
    dCurrentSplitValue = -HUGE_VAL;

    fIsSplit = false;

    this->pThisNode = pThisNode;
    this->ppParentPointerToThisNode = ppParentPointerToThisNode;
    this->pNodeFactory = pNodeFactory;

    return hr;
}


GBMRESULT CNodeSearch::ResetForNewVar
(
    unsigned long iWhichVar,
    long cCurrentVarClasses
)
{
    GBMRESULT hr = GBM_OK;
    long i=0;

    if(fIsSplit) goto Cleanup;

    for(i=0; i<cCurrentVarClasses; i++)
    {
        adGroupSumZ[i] = 0.0;
        adGroupW[i] = 0.0;
        acGroupN[i] = 0;
    }

    iCurrentSplitVar = iWhichVar;
    this->cCurrentVarClasses = cCurrentVarClasses;

    dCurrentLeftSumZ      = 0.0;
    dCurrentLeftTotalW    = 0.0;
    cCurrentLeftN         = 0;
    dCurrentRightSumZ     = dInitSumZ;
    dCurrentRightTotalW   = dInitTotalW;
    cCurrentRightN        = cInitN;
    dCurrentMissingSumZ   = 0.0;
    dCurrentMissingTotalW = 0.0;
    cCurrentMissingN      = 0;

    dCurrentImprovement = 0.0;

    dLastXValue = -HUGE_VAL;

Cleanup:
    return hr;
}


GBMRESULT CNodeSearch::WrapUpCurrentVariable()
{
    GBMRESULT hr = GBM_OK;
    if(iCurrentSplitVar == iBestSplitVar)
    {
        if(cCurrentMissingN > 0)
        {
            dBestMissingSumZ   = dCurrentMissingSumZ;
            dBestMissingTotalW = dCurrentMissingTotalW;
            cBestMissingN      = cCurrentMissingN;
        }
        else // DEBUG: consider a weighted average with parent node?
{ dBestMissingSumZ = dInitSumZ; dBestMissingTotalW = dInitTotalW; cBestMissingN = 0; } } return hr; } GBMRESULT CNodeSearch::EvaluateCategoricalSplit() { GBMRESULT hr = GBM_OK; long i=0; long j=0; unsigned long cFiniteMeans = 0; if(fIsSplit) goto Cleanup; if(cCurrentVarClasses == 0) { hr = GBM_INVALIDARG; goto Error; } cFiniteMeans = 0; for(i=0; i1) && ((ULONG)i= cMinObsInNode) && (cCurrentRightN >= cMinObsInNode) && (dCurrentImprovement > dBestImprovement)) { dBestSplitValue = dCurrentSplitValue; if(iBestSplitVar != iCurrentSplitVar) { iBestSplitVar = iCurrentSplitVar; cBestVarClasses = cCurrentVarClasses; for(j=0; jGetNewNodeTerminal(); pNewRightNode = pNodeFactory->GetNewNodeTerminal(); pNewMissingNode = pNodeFactory->GetNewNodeTerminal(); // set up a continuous split if(cBestVarClasses==0) { pNewNodeContinuous = pNodeFactory->GetNewNodeContinuous(); pNewNodeContinuous->dSplitValue = dBestSplitValue; pNewNodeContinuous->iSplitVar = iBestSplitVar; pNewSplitNode = pNewNodeContinuous; } else { // get a new categorical node and its branches pNewNodeCategorical = pNodeFactory->GetNewNodeCategorical(); // set up the categorical split pNewNodeCategorical->iSplitVar = iBestSplitVar; pNewNodeCategorical->cLeftCategory = (ULONG)dBestSplitValue + 1; pNewNodeCategorical->aiLeftCategory = new ULONG[pNewNodeCategorical->cLeftCategory]; for(i=0; icLeftCategory; i++) { pNewNodeCategorical->aiLeftCategory[i] = aiBestCategory[i]; } pNewSplitNode = pNewNodeCategorical; } *ppParentPointerToThisNode = pNewSplitNode; pNewSplitNode->dPrediction = pThisNode->dPrediction; pNewSplitNode->dImprovement = dBestImprovement; pNewSplitNode->dTrainW = pThisNode->dTrainW; pNewSplitNode->pLeftNode = pNewLeftNode; pNewSplitNode->pRightNode = pNewRightNode; pNewSplitNode->pMissingNode = pNewMissingNode; pNewLeftNode->dPrediction = dBestLeftSumZ/dBestLeftTotalW; pNewLeftNode->dTrainW = dBestLeftTotalW; pNewLeftNode->cN = cBestLeftN; pNewRightNode->dPrediction = dBestRightSumZ/dBestRightTotalW; 
pNewRightNode->dTrainW = dBestRightTotalW; pNewRightNode->cN = cBestRightN; pNewMissingNode->dPrediction = dBestMissingSumZ/dBestMissingTotalW; pNewMissingNode->dTrainW = dBestMissingTotalW; pNewMissingNode->cN = cBestMissingN; pThisNode->RecycleSelf(pNodeFactory); return hr; } gbm/src/node_nonterminal.cpp0000644000176200001440000000467514547111634015667 0ustar liggesusers// GBM by Greg Ridgeway Copyright (C) 2003 #include "node_nonterminal.h" CNodeNonterminal::CNodeNonterminal() { pLeftNode = NULL; pRightNode = NULL; iSplitVar = 0; dImprovement = 0.0; pMissingNode = NULL; } CNodeNonterminal::~CNodeNonterminal() { } GBMRESULT CNodeNonterminal::Adjust ( unsigned long cMinObsInNode ) { GBMRESULT hr = GBM_OK; hr = pLeftNode->Adjust(cMinObsInNode); hr = pRightNode->Adjust(cMinObsInNode); if(pMissingNode->isTerminal && (pMissingNode->cN < cMinObsInNode)) { dPrediction = ((pLeftNode->dTrainW)*(pLeftNode->dPrediction) + (pRightNode->dTrainW)*(pRightNode->dPrediction))/ (pLeftNode->dTrainW + pRightNode->dTrainW); pMissingNode->dPrediction = dPrediction; } else { hr = pMissingNode->Adjust(cMinObsInNode); dPrediction = ((pLeftNode->dTrainW)* (pLeftNode->dPrediction) + (pRightNode->dTrainW)* (pRightNode->dPrediction) + (pMissingNode->dTrainW)*(pMissingNode->dPrediction))/ (pLeftNode->dTrainW + pRightNode->dTrainW + pMissingNode->dTrainW); } return hr; } GBMRESULT CNodeNonterminal::Predict ( CDataset *pData, unsigned long iRow, double &dFadj ) { GBMRESULT hr = GBM_OK; signed char schWhichNode = WhichNode(pData,iRow); if(schWhichNode == -1) { hr = pLeftNode->Predict(pData, iRow, dFadj); } else if(schWhichNode == 1) { hr = pRightNode->Predict(pData, iRow, dFadj); } else { hr = pMissingNode->Predict(pData, iRow, dFadj); } return hr; } GBMRESULT CNodeNonterminal::Predict ( double *adX, unsigned long cRow, unsigned long cCol, unsigned long iRow, double &dFadj ) { GBMRESULT hr = GBM_OK; signed char schWhichNode = WhichNode(adX,cRow,cCol,iRow); if(schWhichNode == -1) { hr = 
pLeftNode->Predict(adX,cRow,cCol,iRow,dFadj);
    }
    else if(schWhichNode == 1)
    {
        hr = pRightNode->Predict(adX,cRow,cCol,iRow,dFadj);
    }
    else
    {
        hr = pMissingNode->Predict(adX,cRow,cCol,iRow,dFadj);
    }

    return hr;
}


GBMRESULT CNodeNonterminal::GetVarRelativeInfluence
(
    double *adRelInf
)
{
    GBMRESULT hr = GBM_OK;

    adRelInf[iSplitVar] += dImprovement;
    pLeftNode->GetVarRelativeInfluence(adRelInf);
    pRightNode->GetVarRelativeInfluence(adRelInf);

    return hr;
}
gbm/src/pairwise.h0000644000176200001440000003266614547624323013621 0ustar liggesusers//---------------------------------------------------------------------------------
// GBM alteration by Stefan Schroedl (schroedl@a9.com)
//
// File: pairwise
//
// Contains: Distribution object to implement pairwise distributions for ranking
//
// History: 12/15/2011 Created
//
//---------------------------------------------------------------------------------

// This file implements the LambdaMart algorithm for learning ranking functions.
// The main idea is to model p_ij, the probability that item i should rank higher
// than j, as
//    p_ij = 1 / (1 + exp(s_j - s_i)),
// where s_i, s_j are the model scores for the two items.
//
// While scores are still generated one item at a time, gradients for learning
// depend on _pairs_ of items. The algorithm is aware of _groups_; all pairs of
// items with different labels, belonging to the same group, are used for
// training. A typical application is ranking for web search: groups correspond
// to user queries, and items to (feature vectors of) web pages in the
// associated match set.
//
// Different IR measures can be chosen, to weight instances based on their rank.
// Generally, changes in top ranks should have more influence than changes at
// the bottom of the result list. This function provides the following options:
//
// * CONC (concordance index, fraction of correctly ranked pairs). This is a
//   generalization of Area under the ROC Curve (AUC) from binary to
//   multivalued labels.
// * Normalized Discounted Cumulative Gain (NDCG)
// * Mean Reciprocal Rank (MRR) of the highest-ranked positive instance.
// * Mean Average Precision (MAP), a generalization of MRR to multiple
//   positive instances.
//
// While MRR and MAP expect binary target labels, CONC and NDCG can equally
// work with continuous values. More precisely, NDCG is defined as
//    \Sum_{r=1..n} val_r / log2(r+1),
// where val_r is the user-specified target for the item at rank r. Note that
// this is in contrast to some definitions of NDCG that assume integer targets
// s_i, and implicitly transform val_r = 2^{s_i}-1.
//
// Groups are specified using an integer vector of the same length as the
// training instances.
//
// Optionally, item weights can be supplied; it is assumed that all instances
// belonging to the same group have the same weight.
//
// For background information on LambdaMart, please see e.g. the following
// papers:
//
// * Burges, C., "From RankNet to LambdaRank to LambdaMART: An Overview",
//   Microsoft Research Technical Report MSR-TR-2010-82, 2010
// * Donmez, P.,
Svore, K., and Burges, C., "On the Local Optimality of // LambdaRank", SIGIR 2009 // * Burges, C., Ragno, R., and Le, Q., "Learning to Rank with Non-Smooth Cost // Functions", NIPS 2006 #ifndef PAIRWISE_H #define PAIRWISE_H #include "distribution.h" #include "buildinfo.h" #include // for UINT_MAX // A class to rerank groups based on (intermediate) scores // Note: Smaller ranks are better, the top rank is 1 class CRanker { public: // Auxiliary structure to store score and rank typedef std::pair CDoubleUintPair; // Buffer memory allocation void Init(unsigned int cMaxItemsPerGroup); // Initialize ranker with scores of items belonging to the same group // - adScores is a score array, (at least) cNumItems long bool SetGroupScores(const double* const adScores, unsigned int cNumItems); // Perform the ranking // - Return true if any item changed its rank bool Rank(); // Getter / setter unsigned int GetNumItems() const { return cNumItems; } unsigned int GetRank(int i) const { return vecdipScoreRank[i].second; } unsigned int GetItem(unsigned int iRank) const { return (vecpdipScoreRank[iRank-1] - &(vecdipScoreRank[0])); } void SetRank(int i, unsigned int r) { vecdipScoreRank[i].second = r; } void AddToScore(int i, double delta) { vecdipScoreRank[i].first += delta; } protected: // Number of items in current group unsigned int cNumItems; // Pairs of (score, rank) for current group std::vector vecdipScoreRank; // Array of pointers to elements of vecdipScoreRank, used for sorting // Note: We need a separate array for sorting in order to be able to // quickly look up the rank for any given item. 
std::vector vecpdipScoreRank; }; // Abstract base class for all IR Measures class CIRMeasure { public: // Constructor CIRMeasure() : cRankCutoff(UINT_MAX) {} // Destructor virtual ~CIRMeasure() { } // Getter / Setter unsigned int GetCutoffRank() const { return cRankCutoff; } void SetCutoffRank(unsigned int cRankCutoff) { this->cRankCutoff = cRankCutoff; } // Auxiliary function for sanity check bool AnyPairs(const double* const adY, unsigned int cNumItems) const { return (cNumItems >= 2 // at least two instances && adY[0] > 0.0 // at least one positive example (targets are non-increasing) && adY[cNumItems-1] != adY[0]); // at least two different targets } // Memory allocation virtual void Init(unsigned long cMaxGroup, unsigned long cNumItems, unsigned int cRankCutoff = UINT_MAX) { this->cRankCutoff = cRankCutoff; } // Calculate the IR measure for the group of items set in the ranker. // Precondition: CRanker::SetGroupScores() has been called // - adY are the target scores virtual double Measure(const double* const adY, const CRanker& ranker) = 0; // Calculate the maximum achievable IR measure for a given group. // Side effect: the ranker state might change // Default implementation for MRR and MAP: if any positive items exist, // ranking them at the top yields a perfect measure of 1. virtual double MaxMeasure(unsigned int iGroup, const double* const adY, unsigned int cNumItems) { return (AnyPairs(adY, cNumItems) ? 1.0 : 0.0); } // Calculate the difference in the IR measure caused by swapping the ranks of two items. // Assumptions: // * iItemBetter has a higher label than iItemWorse (i.e., adY[iItemBetter] > adY[iItemWorse]). // * ranker.setGroup() has been called. virtual double SwapCost(int iItemBetter, int iItemWorse, const double* const adY, const CRanker& ranker) const = 0; protected: // Cut-off rank below which items are ignored for measure unsigned int cRankCutoff; }; // Class to implement IR Measure 'CONC' (fraction of concordant pairs). 
For the case of binary labels, this is // equivalent to the area under the ROC curve (AUC). class CConc : public CIRMeasure { public: virtual ~CConc() { } void Init(unsigned long cMaxGroup, unsigned long cNumItems, unsigned int cRankCutoff = UINT_MAX); double Measure(const double* const adY, const CRanker& ranker); // The maximum number of correctly classified pairs is simply all pairs with different labels double MaxMeasure(unsigned int iGroup, const double* const adY, unsigned int cNumItems) { return PairCount(iGroup, adY, cNumItems); } // (Cached) calculation of the number of pairs with different labels unsigned int PairCount(unsigned int iGroup, const double* const adY, unsigned int cNumItems); double SwapCost(int iItemBetter, int iItemWorse, const double* const adY, const CRanker& ranker) const; protected: // Calculate the number of pairs with different labels int ComputePairCount(const double* const adY, unsigned int cNumItems); // Caches the number of pairs with different labels, for each group std::vector veccPairCount; }; // Class to implement IR Measure 'Normalized Discounted Cumulative Gain' // Note: Labels can have any non-negative value class CNDCG : public CIRMeasure { public: void Init(unsigned long cMaxGroup, unsigned long cNumItems, unsigned int cRankCutoff = UINT_MAX); // Compute DCG double Measure(const double* const adY, const CRanker& ranker); // Compute best possible DCG double MaxMeasure(unsigned int iGroup, const double* const adY, unsigned int cNumItems); double SwapCost(int iItemBetter, int iItemWorse, const double* const adY, const CRanker& ranker) const; protected: // Lookup table for rank weight (w(rank) = 1/log2(1+rank)) std::vector vecdRankWeight; // Caches the maximum achievable DCG, for each group std::vector vecdMaxDCG; }; // Class to implement IR Measure 'Mean Reciprocal Rank' // Assumption: Labels are 0 or 1 class CMRR : public CIRMeasure { public: double Measure(const double* const adY, const CRanker& ranker); double 
SwapCost(int iItemPos, int iItemNeg, const double* const adY, const CRanker& ranker) const; }; // Class to implement IR Measure 'Mean Average Precision' // Assumption: Labels are 0 or 1 class CMAP : public CIRMeasure { public: void Init(unsigned long cMaxGroup, unsigned long cNumItems, unsigned int cRankCutoff = UINT_MAX); double Measure(const double* const adY, const CRanker& ranker); double SwapCost(int iItemPos, int iItemNeg, const double* const adY, const CRanker& ranker) const; protected: // Buffer to hold positions of positive examples mutable std::vector veccRankPos; }; // Main class for 'pairwise' distribution // Notes and Assumptions: // * The items are sorted such that // * Instances belonging to the same group occur in // a contiguous range // * Within a group, labels are non-increasing. // * adGroup supplies the group ID (positive integer, but double // format for compliance with the base class interface). // * The targets adY are non-negative values, and binary {0,1} // for measures MRR and MAP. // * Higher IR measures are better. // * Only pairs with different labels are used for training. // * Instance weights (adWeight) are constant among groups. // * CPairwise::Initialize() is called before any of the other // functions, with same values for adY, adGroup, adWeight, and // nTrain. Certain values have to be precomputed for // efficiency. 
class CPairwise : public CDistribution { public: // Constructor: determine IR measure as either "conc", "map", "mrr", or "ndcg" CPairwise(const char* szIRMeasure); virtual ~CPairwise(); GBMRESULT Initialize(double *adY, double *adGroup, double *adOffset, double *adWeight, unsigned long cLength); GBMRESULT UpdateParams(double *adF, double *adOffset, double *adWeight, unsigned long cLength) { return GBM_OK; }; GBMRESULT ComputeWorkingResponse(double *adY, double *adGroup, double *adOffset, double *adF, double *adZ, double *adWeight, bool *afInBag, unsigned long nTrain, int cIdxOff); double Deviance(double *adY, double *adGroup, double *adOffset, double *adWeight, double *adF, unsigned long cLength, int cIdxOff); GBMRESULT InitF(double *adY, double *adGroup, double *adOffset, double *adWeight, double &dInitF, unsigned long cLength); GBMRESULT FitBestConstant(double *adY, double *adGroup, double *adOffset, double *adW, double *adF, double *adZ, unsigned long *aiNodeAssign, unsigned long nTrain, VEC_P_NODETERMINAL vecpTermNodes, unsigned long cTermNodes, unsigned long cMinObsInNode, bool *afInBag, double *adFadj, int cIdxOff); double BagImprovement(double *adY, double *adGroup, double *adOffset, double *adWeight, double *adF, double *adFadj, bool *afInBag, double dStepSize, unsigned long nTrain); protected: // Calculate and accumulate up the gradients and Hessians from all training pairs void ComputeLambdas(int iGroup, unsigned int cNumItems, const double* const adY, const double* const adF, const double* const adWeight, double* adZ, double* adDeriv); CIRMeasure* pirm; // The IR measure to use CRanker ranker; // The ranker std::vector vecdHessian; // Second derivative of loss function, for each training instance; used for Newton step std::vector vecdNum; // Buffer used for numerator in FitBestConstant(), for each node std::vector vecdDenom; // Buffer used for denominator in FitBestConstant(), for each node std::vector vecdFPlusOffset; // Temporary buffer for (adF + 
adOffset), if the latter is not null }; #endif // PAIRWISE_H gbm/src/multinomial.h0000644000176200001440000000566114547653005014332 0ustar liggesusers//------------------------------------------------------------------------------ // GBM alteration by Daniel Edwards // // File: multinomial.h // // // Contains: Distribution object to implement multinomial // // History: 04/04/2008 Created // //------------------------------------------------------------------------------ #ifndef KMULTICGBM_H #define KMULTICGBM_H #include #include "distribution.h" #include "locationm.h" class CMultinomial : public CDistribution { public: CMultinomial(int cNumClasses, int cRows); virtual ~CMultinomial(); GBMRESULT UpdateParams(double *adF, double *adOffset, double *adWeight, unsigned long cLength); GBMRESULT ComputeWorkingResponse(double *adY, double *adMisc, double *adOffset, double *adF, double *adZ, double *adWeight, bool *afInBag, unsigned long nTrain, int cIdxOff); GBMRESULT InitF(double *adY, double *adMisc, double *adOffset, double *adWeight, double &dInitF, unsigned long cLength); GBMRESULT FitBestConstant(double *adY, double *adMisc, double *adOffset, double *adW, double *adF, double *adZ, unsigned long *aiNodeAssign, unsigned long nTrain, VEC_P_NODETERMINAL vecpTermNodes, unsigned long cTermNodes, unsigned long cMinObsInNode, bool *afInBag, double *adFadj, int cIdxOff); double Deviance(double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, unsigned long cLength, int cIdxOff); double BagImprovement(double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, double *adFadj, bool *afInBag, double dStepSize, unsigned long nTrain); private: unsigned long mcNumClasses; unsigned long mcRows; double *madProb; double *adStepProb; // used in BagImprovement() }; #endif // KMULTICGBM_H gbm/src/tdist.cpp0000644000176200001440000001055714547111634013457 0ustar liggesusers// GBM by Greg Ridgeway Copyright (C) 2003 #include "tdist.h" 
CTDist::CTDist(double adNu) { mdNu = adNu; double *adParams = new double[1]; adParams[0] = adNu; mpLocM = new CLocationM("tdist", 1, adParams); delete[] adParams; } CTDist::~CTDist() { delete mpLocM; } GBMRESULT CTDist::ComputeWorkingResponse ( double *adY, double *adMisc, double *adOffset, double *adF, double *adZ, double *adWeight, bool *afInBag, unsigned long nTrain, int cIdxOff ) { unsigned long i = 0; double dU = 0.0; if(adOffset == NULL) { for(i=0; iLocationM(iN, adArr, adWeight); delete[] adArr; return GBM_OK; } double CTDist::Deviance ( double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, unsigned long cLength, int cIdxOff ) { unsigned long i=0; double dL = 0.0; double dW = 0.0; double dU = 0.0; if(adOffset == NULL) { for(i=cIdxOff; icN >= cMinObsInNode) { // Get the number of nodes here int iNumNodes = 0; for (iObs = 0; iObs < nTrain; iObs++) { if(afInBag[iObs] && (aiNodeAssign[iObs] == iNode)) { iNumNodes++; } } // Create the arrays to centre double *adArr = new double[iNumNodes]; double *adWeight = new double[iNumNodes]; int iIdx = 0; for(iObs=0; iObsdPrediction = mpLocM->LocationM(iNumNodes, adArr, adWeight); delete[] adArr; delete[] adWeight; } } return hr; } double CTDist::BagImprovement ( double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, double *adFadj, bool *afInBag, double dStepSize, unsigned long nTrain ) { double dReturnValue = 0.0; double dF = 0.0; double dW = 0.0; unsigned long i = 0; double dU = 0.0; double dV = 0.0; for(i=0; idPrediction = 0.0; } return pNodeTerminalTemp; } CNodeContinuous* CNodeFactory::GetNewNodeContinuous() { if(ContinuousStack.empty()) { #ifdef NOISY_DEBUG Rprintf("Continuous stack is empty\n"); #endif pNodeContinuousTemp = NULL; } else { pNodeContinuousTemp = ContinuousStack.top(); ContinuousStack.pop(); pNodeContinuousTemp->dPrediction = 0.0; pNodeContinuousTemp->dImprovement = 0.0; pNodeContinuousTemp->pMissingNode = NULL; pNodeContinuousTemp->pLeftNode = NULL; 
pNodeContinuousTemp->pRightNode = NULL; pNodeContinuousTemp->iSplitVar = 0; pNodeContinuousTemp->dSplitValue = 0.0; } return pNodeContinuousTemp; } CNodeCategorical* CNodeFactory::GetNewNodeCategorical() { if(CategoricalStack.empty()) { #ifdef NOISY_DEBUG Rprintf("Categorical stack is empty\n"); #endif pNodeCategoricalTemp = NULL; } else { pNodeCategoricalTemp = CategoricalStack.top(); CategoricalStack.pop(); pNodeCategoricalTemp->dPrediction = 0.0; pNodeCategoricalTemp->dImprovement = 0.0; pNodeCategoricalTemp->pMissingNode = NULL; pNodeCategoricalTemp->pLeftNode = NULL; pNodeCategoricalTemp->pRightNode = NULL; pNodeCategoricalTemp->iSplitVar = 0; pNodeCategoricalTemp->aiLeftCategory = NULL; pNodeCategoricalTemp->cLeftCategory = 0; } return pNodeCategoricalTemp; } GBMRESULT CNodeFactory::RecycleNode ( CNodeTerminal *pNode ) { if(pNode != NULL) { TerminalStack.push(pNode); } return GBM_OK; } GBMRESULT CNodeFactory::RecycleNode ( CNodeContinuous *pNode ) { if(pNode != NULL) { if(pNode->pLeftNode != NULL) pNode->pLeftNode->RecycleSelf(this); if(pNode->pRightNode != NULL) pNode->pRightNode->RecycleSelf(this); if(pNode->pMissingNode != NULL) pNode->pMissingNode->RecycleSelf(this); ContinuousStack.push(pNode); } return GBM_OK; } GBMRESULT CNodeFactory::RecycleNode ( CNodeCategorical *pNode ) { if(pNode != NULL) { if(pNode->pLeftNode != NULL) pNode->pLeftNode->RecycleSelf(this); if(pNode->pRightNode != NULL) pNode->pRightNode->RecycleSelf(this); if(pNode->pMissingNode != NULL) pNode->pMissingNode->RecycleSelf(this); if(pNode->aiLeftCategory != NULL) { delete [] pNode->aiLeftCategory; pNode->aiLeftCategory = NULL; } CategoricalStack.push(pNode); } return GBM_OK; } gbm/src/node.cpp0000644000176200001440000000224014547111634013243 0ustar liggesusers// GBM by Greg Ridgeway Copyright (C) 2003 #include "node.h" CNode::CNode() { dPrediction = 0.0; dTrainW = 0.0; isTerminal = false; } CNode::~CNode() { // the nodes get deleted by deleting the node factory } GBMRESULT 
CNode::Adjust
(
    unsigned long cMinObsInNode
)
{
    GBMRESULT hr = GBM_NOTIMPL;
    return hr;
}


GBMRESULT CNode::Predict
(
    CDataset *pData,
    unsigned long iRow,
    double &dFadj
)
{
    GBMRESULT hr = GBM_NOTIMPL;
    return hr;
}


double CNode::TotalError()
{
    GBMRESULT hr = GBM_NOTIMPL;
    return hr;
}


GBMRESULT CNode::PrintSubtree
(
    unsigned long cIndent
)
{
    GBMRESULT hr = GBM_NOTIMPL;
    return hr;
}


GBMRESULT CNode::GetVarRelativeInfluence
(
    double *adRelInf
)
{
    GBMRESULT hr = GBM_NOTIMPL;
    return hr;
}


GBMRESULT CNode::TransferTreeToRList
(
    int &iNodeID,
    CDataset *pData,
    int *aiSplitVar,
    double *adSplitPoint,
    int *aiLeftNode,
    int *aiRightNode,
    int *aiMissingNode,
    double *adErrorReduction,
    double *adWeight,
    double *adPred,
    VEC_VEC_CATEGORIES &vecSplitCodes,
    int cCatSplitsOld,
    double dShrinkage
)
{
    return GBM_NOTIMPL;
}

// ---- gbm/src/multinomial.cpp ----

#include "multinomial.h"

CMultinomial::CMultinomial(int cNumClasses, int cRows)
{
    mcNumClasses = cNumClasses;
    mcRows = cRows;
    madProb = NULL;
    adStepProb = NULL;
}

CMultinomial::~CMultinomial()
{
    if(madProb != NULL)
    {
        delete [] madProb;
    }
    if(adStepProb != NULL)
    {
        delete [] adStepProb;
    }
}

GBMRESULT CMultinomial::UpdateParams
(
    double *adF,
    double *adOffset,
    double *adWeight,
    unsigned long cLength
)
{
    // Local variables
    unsigned long ii = 0;
    unsigned long kk = 0;

    // Set the probabilities for each observation in each class
    for (ii = 0; ii < mcRows; ii++)
    {
        double dClassSum = 0.0;
        for (kk = 0; kk < mcNumClasses; kk++)
        {
            int iIdx = ii + kk * mcRows;
            double dF = (adOffset == NULL) ? adF[iIdx] : adF[iIdx] + adOffset[iIdx];
            madProb[iIdx] = adWeight[iIdx] * exp(dF);
            dClassSum += adWeight[iIdx] * exp(dF);
        }
        dClassSum = (dClassSum > 0) ? 
dClassSum : 1e-8; for (kk = 0; kk < mcNumClasses; kk++) { madProb[ii + kk * mcRows] /= dClassSum; } } return GBM_OK; } GBMRESULT CMultinomial::ComputeWorkingResponse ( double *adY, double *adMisc, double *adOffset, double *adF, double *adZ, double *adWeight, bool *afInBag, unsigned long nTrain, int cIdxOff ) { unsigned long i = 0; for(i=cIdxOff; icN >= cMinObsInNode) { // Get the number of nodes here double dNum = 0.0; double dDenom = 0.0; for (iObs = 0; iObs < nTrain; iObs++) { if(afInBag[iObs] && (aiNodeAssign[iObs] == iNode)) { int iIdx = iObs + cIdxOff; dNum += adW[iIdx] * adZ[iIdx]; dDenom += adW[iIdx] * fabs(adZ[iIdx]) * (1 - fabs(adZ[iIdx])); } } dDenom = (dDenom > 0) ? dDenom : 1e-8; vecpTermNodes[iNode]->dPrediction = dNum / dDenom; } } return hr; } double CMultinomial::BagImprovement ( double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, double *adFadj, bool *afInBag, double dStepSize, unsigned long nTrain ) { double dReturnValue = 0.0; double dW = 0.0; unsigned long ii; unsigned long kk; // Assume that this is last class - calculate new prob as in updateParams but // using (F_ik + ss*Fadj_ik) instead of F_ik. Then calculate OOB improve for (ii = 0; ii < mcRows; ii++) { double dClassSum = 0.0; for (kk = 0; kk < mcNumClasses; kk++) { int iIdx = ii + kk * mcRows; double dF = (adOffset == NULL) ? adF[iIdx] : adF[iIdx] + adOffset[iIdx]; dF += dStepSize * adFadj[iIdx]; adStepProb[iIdx] = adWeight[iIdx] * exp(dF); dClassSum += adWeight[iIdx] * exp(dF); } dClassSum = (dClassSum > 0) ? 
dClassSum : 1e-8; for (kk = 0; kk < mcNumClasses; kk++) { adStepProb[ii + kk * mcRows] /= dClassSum; } } // Calculate the improvement for(ii=0; iidPrediction = 0.0; } else { vecpTermNodes[iNode]->dPrediction = vecdNum[iNode]/vecdDen[iNode]; } } } return hr; } double CHuberized::BagImprovement ( double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, double *adFadj, bool *afInBag, double dStepSize, unsigned long nTrain ) { double dReturnValue = 0.0; double dF = 0.0; double dW = 0.0; unsigned long i = 0; for(i=0; ipData = pData; this->pDist = pDist; this->dLambda = dLambda; this->cTrain = cTrain; this->dBagFraction = dBagFraction; this->cDepth = cDepth; this->cMinObsInNode = cMinObsInNode; this->cGroups = cGroups; // allocate the tree structure ptreeTemp = new CCARTTree; if(ptreeTemp == NULL) { hr = GBM_OUTOFMEMORY; goto Error; } cValid = pData->cRows - cTrain; cTotalInBag = (unsigned long)(dBagFraction*cTrain); adZ = new double[(pData->cRows) * cNumClasses]; if(adZ == NULL) { hr = GBM_OUTOFMEMORY; goto Error; } adFadj = new double[(pData->cRows) * cNumClasses]; if(adFadj == NULL) { hr = GBM_OUTOFMEMORY; goto Error; } for (i=0; i<(pData->cRows)*cNumClasses; i++) { adFadj[i] = 0.0; } pNodeFactory = new CNodeFactory(); if(pNodeFactory == NULL) { hr = GBM_OUTOFMEMORY; goto Error; } hr = pNodeFactory->Initialize(cDepth); if(GBM_FAILED(hr)) { goto Error; } ptreeTemp->Initialize(pNodeFactory); // array for flagging those observations in the bag afInBag = new bool[cTrain]; if(afInBag==NULL) { hr = GBM_OUTOFMEMORY; goto Error; } // aiNodeAssign tracks to which node each training obs belongs aiNodeAssign = new ULONG[cTrain]; if(aiNodeAssign==NULL) { hr = GBM_OUTOFMEMORY; goto Error; } // NodeSearch objects help decide which nodes to split aNodeSearch = new CNodeSearch[2*cDepth+1]; if(aNodeSearch==NULL) { hr = GBM_OUTOFMEMORY; goto Error; } for(i=0; i<2*cDepth+1; i++) { aNodeSearch[i].Initialize(cMinObsInNode); } vecpTermNodes.resize(2*cDepth+1,NULL); 
fInitialized = true; Cleanup: return hr; Error: goto Cleanup; } GBMRESULT CGBM::Predict ( unsigned long iVar, unsigned long cTrees, double *adF, double *adX, unsigned long cLength ) { GBMRESULT hr = GBM_OK; return hr; } GBMRESULT CGBM::Predict ( double *adX, unsigned long cRow, unsigned long cCol, unsigned long cTrees, double *adF ) { GBMRESULT hr = GBM_OK; return hr; } GBMRESULT CGBM::GetVarRelativeInfluence ( double *adRelInf, unsigned long cTrees ) { GBMRESULT hr = GBM_OK; int iVar=0; for(iVar=0; iVarcCols; iVar++) { adRelInf[iVar] = 0.0; } return hr; } GBMRESULT CGBM::PrintTree() { GBMRESULT hr = GBM_OK; hr = ptreeTemp->Print(); if(GBM_FAILED(hr)) goto Error; Cleanup: return hr; Error: goto Cleanup; } GBMRESULT CGBM::iterate ( double *adF, double &dTrainError, double &dValidError, double &dOOBagImprove, int &cNodes, int cNumClasses, int cClassIdx ) { GBMRESULT hr = GBM_OK; unsigned long i = 0; unsigned long cBagged = 0; int cIdxOff = cClassIdx * (cTrain + cValid); // for(i=0; i < cTrain + cIdxOff; i++){ adF[i] = 0;} if(!fInitialized) { hr = GBM_FAIL; goto Error; } dTrainError = 0.0; dValidError = 0.0; dOOBagImprove = 0.0; vecpTermNodes.assign(2*cDepth+1,NULL); // randomly assign observations to the Bag if (cClassIdx == 0) { if (!IsPairwise()) { // regular instance based training for(i=0; i= cTotalInBag){ break; } */ } // the remainder is not in the bag for( ; iadMisc[i]; if (dGroup != dLastGroup) { if (cBaggedGroups >= cTotalGroupsInBag) { break; } // Group changed, make a new decision chosen = (unif_rand()*(cGroups - cSeenGroups) < cTotalGroupsInBag - cBaggedGroups); if (chosen) { cBaggedGroups++; } dLastGroup = dGroup; cSeenGroups++; } if (chosen) { afInBag[i] = true; cBagged++; } else { afInBag[i] = false; } } // the remainder is not in the bag for( ; iComputeWorkingResponse(pData->adY, pData->adMisc, pData->adOffset, adF, adZ, pData->adWeight, afInBag, cTrain, cIdxOff); if(GBM_FAILED(hr)) { goto Error; } #ifdef NOISY_DEBUG Rprintf("Reset tree\n"); #endif hr 
= ptreeTemp->Reset(); #ifdef NOISY_DEBUG Rprintf("grow tree\n"); #endif hr = ptreeTemp->grow(&(adZ[cIdxOff]), pData, &(pData->adWeight[cIdxOff]), &(adFadj[cIdxOff]), cTrain, cTotalInBag, dLambda, cDepth, cMinObsInNode, afInBag, aiNodeAssign, aNodeSearch, vecpTermNodes); if(GBM_FAILED(hr)) { goto Error; } #ifdef NOISY_DEBUG Rprintf("get node count\n"); #endif hr = ptreeTemp->GetNodeCount(cNodes); if(GBM_FAILED(hr)) { goto Error; } // Now I have adF, adZ, and vecpTermNodes (new node assignments) // Fit the best constant within each terminal node #ifdef NOISY_DEBUG Rprintf("fit best constant\n"); #endif hr = pDist->FitBestConstant(pData->adY, pData->adMisc, pData->adOffset, pData->adWeight, adF, adZ, aiNodeAssign, cTrain, vecpTermNodes, (2*cNodes+1)/3, // number of terminal nodes cMinObsInNode, afInBag, adFadj, cIdxOff); if(GBM_FAILED(hr)) { goto Error; } // update training predictions // fill in missing nodes where N < cMinObsInNode hr = ptreeTemp->Adjust(aiNodeAssign,&(adFadj[cIdxOff]),cTrain, vecpTermNodes,cMinObsInNode); if(GBM_FAILED(hr)) { goto Error; } ptreeTemp->SetShrinkage(dLambda); if (cClassIdx == (cNumClasses - 1)) { dOOBagImprove = pDist->BagImprovement(pData->adY, pData->adMisc, pData->adOffset, pData->adWeight, adF, adFadj, afInBag, dLambda, cTrain); } // update the training predictions for(i=0; i < cTrain; i++) { int iIdx = i + cIdxOff; adF[iIdx] += dLambda * adFadj[iIdx]; } dTrainError = pDist->Deviance(pData->adY, pData->adMisc, pData->adOffset, pData->adWeight, adF, cTrain, cIdxOff); // update the validation predictions hr = ptreeTemp->PredictValid(pData,cValid,&(adFadj[cIdxOff])); for(i=cTrain; i < cTrain+cValid; i++) { adF[i + cIdxOff] += adFadj[i + cIdxOff]; } if(pData->fHasOffset) { dValidError = pDist->Deviance(pData->adY, pData->adMisc, pData->adOffset, pData->adWeight, adF, cValid, cIdxOff + cTrain); } else { dValidError = pDist->Deviance(pData->adY, pData->adMisc, NULL, pData->adWeight, adF, cValid, cIdxOff + cTrain); } Cleanup: return hr; 
Error: goto Cleanup; } GBMRESULT CGBM::TransferTreeToRList ( int *aiSplitVar, double *adSplitPoint, int *aiLeftNode, int *aiRightNode, int *aiMissingNode, double *adErrorReduction, double *adWeight, double *adPred, VEC_VEC_CATEGORIES &vecSplitCodes, int cCatSplitsOld ) { GBMRESULT hr = GBM_OK; hr = ptreeTemp->TransferTreeToRList(pData, aiSplitVar, adSplitPoint, aiLeftNode, aiRightNode, aiMissingNode, adErrorReduction, adWeight, adPred, vecSplitCodes, cCatSplitsOld, dLambda); return hr; } gbm/src/node_search.h0000644000176200001440000000626714547624323014256 0ustar liggesusers//------------------------------------------------------------------------------ // GBM by Greg Ridgeway Copyright (C) 2003 // // File: node_search.h // // License: GNU GPL (version 2 or later) // // Contents: does the searching for where to split a node // // Owner: gregr@rand.org // // History: 3/26/2001 gregr created // 2/14/2003 gregr: adapted for R implementation // //------------------------------------------------------------------------------ #ifndef NODESEARCH_H #define NODESEARCH_H #include "node_factory.h" #include "dataset.h" #include // for UINT_MAX class CNodeSearch { public: CNodeSearch(); ~CNodeSearch(); GBMRESULT Initialize(unsigned long cMinObsInNode); GBMRESULT IncorporateObs(double dX, double dZ, double dW, long lMonotone); GBMRESULT Set(double dSumZ, double dTotalW, unsigned long cTotalN, CNodeTerminal *pThisNode, CNode **ppParentPointerToThisNode, CNodeFactory *pNodeFactory); GBMRESULT ResetForNewVar(unsigned long iWhichVar, long cVarClasses); double BestImprovement() { return dBestImprovement; } GBMRESULT SetToSplit() { fIsSplit = true; return GBM_OK; }; GBMRESULT SetupNewNodes(PCNodeNonterminal &pNewSplitNode, PCNodeTerminal &pNewLeftNode, PCNodeTerminal &pNewRightNode, PCNodeTerminal &pNewMissingNode); GBMRESULT EvaluateCategoricalSplit(); GBMRESULT WrapUpCurrentVariable(); double ThisNodePrediction() {return pThisNode->dPrediction;} bool operator<(const CNodeSearch 
&ns) {return dBestImprovement #include "dataset.h" #include "node.h" class CNodeTerminal : public CNode { public: CNodeTerminal(); ~CNodeTerminal(); GBMRESULT Adjust(unsigned long cMinObsInNode); GBMRESULT PrintSubtree(unsigned long cIndent); GBMRESULT TransferTreeToRList(int &iNodeID, CDataset *pData, int *aiSplitVar, double *adSplitPoint, int *aiLeftNode, int *aiRightNode, int *aiMissingNode, double *adErrorReduction, double *adWeight, double *adPred, VEC_VEC_CATEGORIES &vecSplitCodes, int cCatSplitsOld, double dShrinkage); GBMRESULT ApplyShrinkage(double dLambda); GBMRESULT Predict(CDataset *pData, unsigned long i, double &dFadj); GBMRESULT Predict(double *adX, unsigned long cRow, unsigned long cCol, unsigned long iRow, double &dFadj); GBMRESULT GetVarRelativeInfluence(double *adRelInf); GBMRESULT RecycleSelf(CNodeFactory *pNodeFactory); }; typedef CNodeTerminal *PCNodeTerminal; typedef std::vector VEC_P_NODETERMINAL; #endif // NODETERMINAL_H gbm/src/node_categorical.cpp0000644000176200001440000001404514547114306015605 0ustar liggesusers// GBM by Greg Ridgeway Copyright (C) 2003 #include "node_categorical.h" #include "node_factory.h" CNodeCategorical::CNodeCategorical() { aiLeftCategory = NULL; cLeftCategory = 0; } CNodeCategorical::~CNodeCategorical() { #ifdef NOISY_DEBUG Rprintf("categorical destructor\n"); #endif if(aiLeftCategory != NULL) { delete [] aiLeftCategory; aiLeftCategory = NULL; } } GBMRESULT CNodeCategorical::PrintSubtree ( unsigned long cIndent ) { GBMRESULT hr = GBM_OK; unsigned long i = 0; for(i=0; i< cIndent; i++) Rprintf(" "); Rprintf("N=%f, Improvement=%f, Prediction=%f, NA pred=%f\n", dTrainW, dImprovement, dPrediction, (pMissingNode == NULL ? 
0.0 : pMissingNode->dPrediction)); for(i=0; i< cIndent; i++) Rprintf(" "); Rprintf("V%lu in ",iSplitVar); for(i=0; iPrintSubtree(cIndent+1); for(i=0; i< cIndent; i++) Rprintf(" "); Rprintf("V%lu not in ",iSplitVar); for(i=0; iPrintSubtree(cIndent+1); for(i=0; i< cIndent; i++) Rprintf(" "); Rprintf("missing\n"); hr = pMissingNode->PrintSubtree(cIndent+1); return hr; } signed char CNodeCategorical::WhichNode ( CDataset *pData, unsigned long iObs ) { signed char ReturnValue = 0; double dX = pData->adX[iSplitVar*(pData->cRows) + iObs]; if(!ISNA(dX)) { if(std::find(aiLeftCategory, aiLeftCategory+cLeftCategory, (ULONG)dX) != aiLeftCategory+cLeftCategory) { ReturnValue = -1; } else { ReturnValue = 1; } } // if missing value returns 0 return ReturnValue; } signed char CNodeCategorical::WhichNode ( double *adX, unsigned long cRow, unsigned long cCol, unsigned long iRow ) { signed char ReturnValue = 0; double dX = adX[iSplitVar*cRow + iRow]; if(!ISNA(dX)) { if(std::find(aiLeftCategory, aiLeftCategory+cLeftCategory, (ULONG)dX) != aiLeftCategory+cLeftCategory) { ReturnValue = -1; } else { ReturnValue = 1; } } // if missing value returns 0 return ReturnValue; } GBMRESULT CNodeCategorical::RecycleSelf ( CNodeFactory *pNodeFactory ) { GBMRESULT hr = GBM_OK; hr = pNodeFactory->RecycleNode(this); return hr; }; GBMRESULT CNodeCategorical::TransferTreeToRList ( int &iNodeID, CDataset *pData, int *aiSplitVar, double *adSplitPoint, int *aiLeftNode, int *aiRightNode, int *aiMissingNode, double *adErrorReduction, double *adWeight, double *adPred, VEC_VEC_CATEGORIES &vecSplitCodes, int cCatSplitsOld, double dShrinkage ) { GBMRESULT hr = GBM_OK; int iThisNodeID = iNodeID; unsigned long cCatSplits = vecSplitCodes.size(); unsigned long i = 0; int cLevels = pData->acVarClasses[iSplitVar]; aiSplitVar[iThisNodeID] = iSplitVar; adSplitPoint[iThisNodeID] = cCatSplits+cCatSplitsOld; // 0 based adErrorReduction[iThisNodeID] = dImprovement; adWeight[iThisNodeID] = dTrainW; adPred[iThisNodeID] = 
dShrinkage*dPrediction; vecSplitCodes.push_back(VEC_CATEGORIES()); vecSplitCodes[cCatSplits].resize(cLevels,1); for(i=0; iTransferTreeToRList(iNodeID, pData, aiSplitVar, adSplitPoint, aiLeftNode, aiRightNode, aiMissingNode, adErrorReduction, adWeight, adPred, vecSplitCodes, cCatSplitsOld, dShrinkage); if(GBM_FAILED(hr)) goto Error; aiRightNode[iThisNodeID] = iNodeID; hr = pRightNode->TransferTreeToRList(iNodeID, pData, aiSplitVar, adSplitPoint, aiLeftNode, aiRightNode, aiMissingNode, adErrorReduction, adWeight, adPred, vecSplitCodes, cCatSplitsOld, dShrinkage); if(GBM_FAILED(hr)) goto Error; aiMissingNode[iThisNodeID] = iNodeID; hr = pMissingNode->TransferTreeToRList(iNodeID, pData, aiSplitVar, adSplitPoint, aiLeftNode, aiRightNode, aiMissingNode, adErrorReduction, adWeight, adPred, vecSplitCodes, cCatSplitsOld, dShrinkage); if(GBM_FAILED(hr)) goto Error; Cleanup: return hr; Error: goto Cleanup; } gbm/src/gaussian.cpp0000644000176200001440000000643114547111634014136 0ustar liggesusers// GBM by Greg Ridgeway Copyright (C) 2003 #include "gaussian.h" CGaussian::CGaussian() { } CGaussian::~CGaussian() { } GBMRESULT CGaussian::ComputeWorkingResponse ( double *adY, double *adMisc, double *adOffset, double *adF, double *adZ, double *adWeight, bool *afInBag, unsigned long nTrain, int cIdxOff ) { GBMRESULT hr = GBM_OK; unsigned long i = 0; if((adY == NULL) || (adF == NULL) || (adZ == NULL) || (adWeight == NULL)) { hr = GBM_INVALIDARG; goto Error; } if(adOffset == NULL) { for(i=0; icN >= cMinObsInNode) { veciK2Node[K] = i; veciNode2K[i] = K; K++; } } vecdP.resize(K); matH.setactualsize(K-1); vecdG.resize(K-1); vecdG.assign(K-1,0.0); // zero the Hessian for(k=0; kcN >= cMinObsInNode)) { dF = adF[i] + ((adOffset==NULL) ? 
0.0 : adOffset[i]); vecdP[veciNode2K[aiNodeAssign[i]]] += adW[i]*exp(dF); dRiskTot += adW[i]*exp(dF); if(adDelta[i]==1.0) { // compute g and H for(k=0; kdPrediction = 0.0; } for(m=0; mdPrediction = 0.0; break; } else { vecpTermNodes[veciK2Node[k]]->dPrediction -= dTemp*vecdG[m]; } } } // vecpTermNodes[veciK2Node[K-1]]->dPrediction = 0.0; // already set to 0.0 return hr; } double CCoxPH::BagImprovement ( double *adT, double *adDelta, double *adOffset, double *adWeight, double *adF, double *adFadj, bool *afInBag, double dStepSize, unsigned long nTrain ) { double dReturnValue = 0.0; double dNum = 0.0; double dDen = 0.0; double dF = 0.0; double dW = 0.0; unsigned long i = 0; dNum = 0.0; dDen = 0.0; for(i=0; idPrediction = -19.0; } else if(vecdDen[iNode] == 0.0) { vecpTermNodes[iNode]->dPrediction = 0.0; } else { vecpTermNodes[iNode]->dPrediction = log(vecdNum[iNode]/vecdDen[iNode]); } vecpTermNodes[iNode]->dPrediction = fmin2(vecpTermNodes[iNode]->dPrediction, 19-vecdMax[iNode]); vecpTermNodes[iNode]->dPrediction = fmax2(vecpTermNodes[iNode]->dPrediction, -19-vecdMin[iNode]); } } return hr; } double CPoisson::BagImprovement ( double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, double *adFadj, bool *afInBag, double dStepSize, unsigned long nTrain ) { double dReturnValue = 0.0; double dF = 0.0; double dW = 0.0; unsigned long i = 0; for(i=0; i= cRows) || (iCol >= cCols)) { hr = GBM_INVALIDARG; goto Error; } dValue = adX[iCol*cRows + iRow]; Cleanup: return hr; Error: goto Cleanup; } bool fHasOffset; double *adX; int *aiXOrder; double *adXTemp4Order; double *adY; double *adOffset; double *adWeight; double *adMisc; char **apszVarNames; int *acVarClasses; int *alMonotoneVar; int cRows; int cCols; private: }; #endif // DATASET_H gbm/src/poisson.h0000644000176200001440000000572014547624323013467 0ustar liggesusers//------------------------------------------------------------------------------ // GBM by Greg Ridgeway Copyright (C) 2003 // File: 
poisson.h // // License: GNU GPL (version 2 or later) // // Contents: poisson object // // Owner: gregr@rand.org // // History: 3/26/2001 gregr created // 2/14/2003 gregr: adapted for R implementation // //------------------------------------------------------------------------------ #ifndef POISSON_H #define POISSON_H #include #include "distribution.h" class CPoisson : public CDistribution { public: CPoisson(); virtual ~CPoisson(); GBMRESULT UpdateParams(double *adF, double *adOffset, double *adWeight, unsigned long cLength) { return GBM_OK; }; GBMRESULT ComputeWorkingResponse(double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, double *adZ, bool *afInBag, unsigned long nTrain, int cIdxOff); double Deviance(double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, unsigned long cLength, int cIdxOff); GBMRESULT InitF(double *adY, double *adMisc, double *adOffset, double *adWeight, double &dInitF, unsigned long cLength); GBMRESULT FitBestConstant(double *adY, double *adMisc, double *adOffset, double *adW, double *adF, double *adZ, unsigned long *aiNodeAssign, unsigned long nTrain, VEC_P_NODETERMINAL vecpTermNodes, unsigned long cTermNodes, unsigned long cMinObsInNode, bool *afInBag, double *adFadj, int cIdxOff); double BagImprovement(double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, double *adFadj, bool *afInBag, double dStepSize, unsigned long nTrain); private: std::vector vecdNum; std::vector vecdDen; std::vector vecdMax; std::vector vecdMin; }; #endif // POISSON_H gbm/src/gaussian.h0000644000176200001440000000551314547652532013612 0ustar liggesusers//------------------------------------------------------------------------------ // GBM by Greg Ridgeway Copyright (C) 2003 // // File: gaussian.h // // License: GNU GPL (version 2 or later) // // Contents: gaussian object // // Owner: gregr@rand.org // // History: 3/26/2001 gregr created // 2/14/2003 gregr: adapted for R implementation // 
//------------------------------------------------------------------------------ #ifndef GAUSSIAN_H #define GAUSSIAN_H #include "distribution.h" class CGaussian : public CDistribution { public: CGaussian(); virtual ~CGaussian(); GBMRESULT UpdateParams(double *adF, double *adOffset, double *adWeight, unsigned long cLength) { return GBM_OK; }; GBMRESULT ComputeWorkingResponse(double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, double *adZ, bool *afInBag, unsigned long nTrain, int cIdxOff); GBMRESULT InitF(double *adY, double *adMisc, double *adOffset, double *adWeight, double &dInitF, unsigned long cLength); GBMRESULT FitBestConstant(double *adY, double *adMisc, double *adOffset, double *adW, double *adF, double *adZ, unsigned long *aiNodeAssign, unsigned long nTrain, VEC_P_NODETERMINAL vecpTermNodes, unsigned long cTermNodes, unsigned long cMinObsInNode, bool *afInBag, double *adFadj, int cIdxOff); double Deviance(double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, unsigned long cLength, int cIdxOff); double BagImprovement(double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, double *adFadj, bool *afInBag, double dStepSize, unsigned long nTrain); }; #endif // GAUSSIAN_H gbm/src/adaboost.cpp0000644000176200001440000001027714547614462014132 0ustar liggesusers// GBM by Greg Ridgeway Copyright (C) 2003 #include "adaboost.h" CAdaBoost::CAdaBoost() { } CAdaBoost::~CAdaBoost() { } GBMRESULT CAdaBoost::ComputeWorkingResponse ( double *adY, double *adMisc, double *adOffset, double *adF, double *adZ, double *adWeight, bool *afInBag, unsigned long nTrain, int cIdxOff ) { unsigned long i = 0; if(adOffset == NULL) { for(i=0; idPrediction = 0.0; } else { vecpTermNodes[iNode]->dPrediction = vecdNum[iNode]/vecdDen[iNode]; } } } return hr; } double CAdaBoost::BagImprovement ( double *adY, double *adMisc, double *adOffset, double *adWeight, double *adF, double *adFadj, bool *afInBag, double dStepSize, 
unsigned long nTrain ) { double dReturnValue = 0.0; double dF = 0.0; double dW = 0.0; unsigned long i = 0; for(i=0; i #include #include #include //#define NOISY_DEBUG #ifdef NOISY_DEBUG #endif void CRanker::Init(unsigned int cMaxItemsPerGroup) { // Allocate sorting buffers vecdipScoreRank.resize(cMaxItemsPerGroup); vecpdipScoreRank.resize(cMaxItemsPerGroup); } bool CRanker::SetGroupScores(const double* const adScores, const unsigned int cNumItems) { const double dEPS = 1e-10; if (cNumItems > vecdipScoreRank.size()) { // Allocate additional space // (We should never get here if CPairwise::Initialize has been called before, as expected) Init(cNumItems); } this->cNumItems = cNumItems; // Copy scores to buffer, and // initialize pointer array to score entries for(unsigned int i = 0; i < cNumItems; i++) { // Add small random number to break possible ties vecdipScoreRank[i].first = adScores[i] + dEPS * (unif_rand() - 0.5); vecpdipScoreRank[i] = &(vecdipScoreRank[i]); } return true; } // Auxiliary struct to compare pair pointers // decreasing order based on the first component (score) struct CDoubleUintPairPtrComparison { bool operator() (const CRanker::CDoubleUintPair* lhs, const CRanker::CDoubleUintPair* rhs) { return (lhs->first > rhs->first); } }; bool CRanker::Rank() { // Sort the pointer array, based on decreasing score CDoubleUintPairPtrComparison comp; std::sort(vecpdipScoreRank.begin(), vecpdipScoreRank.begin() + cNumItems, comp); bool bChanged = false; // Create inverted rank lookup for(unsigned int i = 0; i < cNumItems; i++) { // Note: ranks are 1-based const unsigned int cNewRank = i + 1; if (!bChanged) { bChanged = (cNewRank != vecpdipScoreRank[i]->second); } // Store the rank with the corresponding score in the vecdipScoreRank array vecpdipScoreRank[i]->second = cNewRank; } return bChanged; } void CConc::Init ( unsigned long cMaxGroup, unsigned long cMaxItemsPerGroup, unsigned int cRankCutoff ) { CIRMeasure::Init(cMaxGroup, cMaxItemsPerGroup, cRankCutoff); 
veccPairCount.resize(cMaxGroup + 1, -1); } unsigned int CConc::PairCount(unsigned int iGroup, const double* const adY, unsigned int cNumItems) { if (iGroup >= veccPairCount.size()) { // Allocate additional space // (We should never get here if CPairwise::Initialize has been called before, as expected) veccPairCount.resize(iGroup + 1, -1); } if (veccPairCount[iGroup] < 0.0) { // Not yet initialized veccPairCount[iGroup] = ComputePairCount(adY, cNumItems); } return veccPairCount[iGroup]; } // Calculate the number of pairs with different labels, and store in veccPairCount // Assumption: instances are sorted such that labels are non-increasing int CConc::ComputePairCount(const double* const adY, unsigned int cNumItems) { if (!AnyPairs(adY, cNumItems)) { return 0; } double dLabelCurrent = adY[0]; int iLabelEnd = 0; // End of range with higher labels int cPairs = 0; for (unsigned int j = 1; j < cNumItems; j++) { if (adY[j] != dLabelCurrent) { // i.e., dYj < dLabelCurrent iLabelEnd = j; dLabelCurrent = adY[j]; } // All items in 0 .. iLabelEnd - 1 are better than item j; // i.e, we have pairs (j,0), (j,1), ... (j, iLabelEnd - 1) cPairs += iLabelEnd; } return cPairs; } // Count the number of correctly ranked pairs with different labels double CConc::Measure(const double* const adY, const CRanker& ranker) { double dLabelCurrent = adY[0]; int iLabelEnd = 0; // End of the range with higher labels int cGoodPairs = 0; for (unsigned int j = 1; j < ranker.GetNumItems(); j++) { const double dYj = adY[j]; if (dYj != dLabelCurrent) { // i.e., dYj < dLabelCurrent iLabelEnd = j; dLabelCurrent = dYj; } // All items in 0 .. iLabelEnd - 1 are better than this item for (int i = 0; i < iLabelEnd; i++) { if (ranker.GetRank(i) < ranker.GetRank(j)) { cGoodPairs++; } } } return cGoodPairs; } double CConc::SwapCost(int iItemBetter, int iItemWorse, const double* const adY, const CRanker& ranker) const { // Note: this implementation can handle arbitrary non-negative target values. 
// For binary (0/1) targets, the swap cost would reduce to the much simpler expression: // (int)ranker.GetRank(iItemBetter) - (int)ranker.GetRank(iItemWorse) const unsigned int cRankBetter = ranker.GetRank(iItemBetter); const unsigned int cRankWorse = ranker.GetRank(iItemWorse); // Which one of the two has the higher rank? unsigned int cRankUpper, cRankLower; double dYUpper, dYLower; int cDiff; if (cRankBetter > cRankWorse) { // Concordance increasing cRankUpper = cRankWorse; cRankLower = cRankBetter; dYUpper = adY[iItemWorse]; dYLower = adY[iItemBetter]; cDiff = 1; // The direct impact of the pair (iItemBetter, iItemWorse) } else { // Concordance decreasing cRankUpper = cRankBetter; cRankLower = cRankWorse; dYUpper = adY[iItemBetter]; dYLower = adY[iItemWorse]; cDiff = -1; // // The direct impact of the pair (iItemBetter, iItemWorse) } // Compute indirect impact for pairs involving items in between the two for (unsigned int cRank = cRankUpper + 1; cRank < cRankLower; cRank++) { const double dYi = adY[ranker.GetItem(cRank)]; double dScoreDiff = dYi - dYLower; if (dScoreDiff != 0) { cDiff += (dScoreDiff < 0) ? 1 : -1; } dScoreDiff = dYi - dYUpper; if (dScoreDiff != 0) { cDiff += (dScoreDiff < 0) ? 
-1 : 1; } } return cDiff; } void CNDCG::Init ( unsigned long cMaxGroup, unsigned long cMaxItemsPerGroup, unsigned int cRankCutoff ) { CIRMeasure::Init(cMaxGroup, cMaxItemsPerGroup, cRankCutoff); // Initialize rank weights (note: ranks are 1-based) vecdRankWeight.resize(cMaxItemsPerGroup + 1, 0.0); const unsigned int cMaxRank = std::min((unsigned int)cMaxItemsPerGroup, GetCutoffRank()); // Precompute rank weights for (unsigned int i = 1; i <= cMaxRank; i++) { vecdRankWeight[i] = log((double)2) / log((double)(i+1)); } // Allocate buffer vecdMaxDCG.resize(cMaxGroup + 1, -1.0); } // Sum of target values, weighted by rank weight double CNDCG::Measure(const double* const adY, const CRanker& ranker) { double dScore = 0; for (unsigned int i = 0; i < ranker.GetNumItems(); i++) { dScore += adY[i] * vecdRankWeight[ranker.GetRank(i)]; } return dScore; } double CNDCG::MaxMeasure(unsigned int iGroup, const double* const adY, unsigned int cNumItems) { if (iGroup >= vecdMaxDCG.size()) { // Allocate additional space // (We should never get here if CPairwise::Initialize has been called before, as expected) vecdMaxDCG.resize(iGroup + 1, -1.0); } if (vecdMaxDCG[iGroup] < 0.0) { // Not initialized if (!AnyPairs(adY, cNumItems)) { // No training pairs exist vecdMaxDCG[iGroup] = 0.0; } else { // Compute maximum possible DCG. // Note: By assumption, items are pre-sorted by descending score. double dScore = 0; unsigned int i = 0; while (i < cNumItems && adY[i] > 0) { // Note: Due to sorting, we can terminate early for a zero score. 
dScore += adY[i] * vecdRankWeight[i + 1]; i++; } vecdMaxDCG[iGroup] = dScore; #ifdef NOISY_DEBUG if (vecdMaxDCG[iGroup] == 0) { // Note: use cNumItems here; no ranker is in scope in MaxMeasure Rprintf("max score is 0: iGroup = %d, maxScore = %f, sz = %u\n", iGroup, vecdMaxDCG[iGroup], cNumItems); assert(false); } #endif } } return vecdMaxDCG[iGroup]; } double CNDCG::SwapCost(int iItemBetter, int iItemWorse, const double* const adY, const CRanker& ranker) const { const unsigned int cRanki = ranker.GetRank(iItemBetter); const unsigned int cRankj = ranker.GetRank(iItemWorse); return (vecdRankWeight[cRanki] - vecdRankWeight[cRankj]) * (adY[iItemBetter] - adY[iItemWorse]); } // Auxiliary function to find the top rank of a positive item (cRankTop), and the number of positive items (cPos) inline void TopRankPos(const double* const adY, const CRanker& ranker, unsigned int& cRankTop, unsigned int& cPos) { const unsigned int cNumItems = ranker.GetNumItems(); cRankTop = cNumItems + 1; // Ranks are 1-based for (cPos = 0; cPos < cNumItems; cPos++) { if (adY[cPos] <= 0.0) { // All subsequent items are zero, because of presorting return; } cRankTop = std::min(cRankTop, ranker.GetRank(cPos)); } } double CMRR::Measure(const double* const adY, const CRanker& ranker) { unsigned int cRankTop, cPos; TopRankPos(adY, ranker, cRankTop, cPos); const unsigned int cNumItems = std::min(ranker.GetNumItems(), GetCutoffRank()); if (cRankTop >= cNumItems + 1) { // No positive item found return 0.0; } // Ranks start at 1 return 1.0 / cRankTop; } double CMRR::SwapCost(int iItemPos, int iItemNeg, const double* const adY, const CRanker& ranker) const { unsigned int cRankTop, cPos; TopRankPos(adY, ranker, cRankTop, cPos); const unsigned int cNumItems = ranker.GetNumItems(); if (cRankTop >= cNumItems + 1 // No positive item (ranks are 1-based) || cPos >= cNumItems) // No negative item { return 0.0; } const unsigned int cRankPos = ranker.GetRank(iItemPos); const unsigned int cRankNeg = ranker.GetRank(iItemNeg); const unsigned int cCutoffRank =
GetCutoffRank(); const double dMeasureCurrent = (cRankTop > cCutoffRank) ? 0.0 : 1.0 / cRankTop; const double dMeasureNeg = (cRankNeg > cCutoffRank) ? 0.0 : 1.0 / cRankNeg; // Only pairs where the negative item is above the top positive result, // or else where the positive item *is* the top item, can change the MRR return ((cRankNeg < cRankTop || cRankPos == cRankTop) ? (dMeasureNeg - dMeasureCurrent) : 0.0); } void CMAP::Init ( unsigned long cMaxGroup, unsigned long cMaxItemsPerGroup, unsigned int cRankCutoff ) { CIRMeasure::Init(cMaxGroup, cMaxItemsPerGroup, cRankCutoff); // Allocate rank buffer (note: ranks are 1-based) veccRankPos.resize(cMaxItemsPerGroup + 1); } // Auxiliary function to find the sorted ranks of positive items (veccRankPos), and their number (cPos) inline void SortRankPos(const double* const adY, const CRanker& ranker, std::vector<unsigned int>& veccRankPos, unsigned int& cPos) { // Store all ranks of positive items in veccRankPos for (cPos = 0; cPos < ranker.GetNumItems(); cPos++) { if (adY[cPos] <= 0.0) { // All subsequent items are zero, because of presorting break; } veccRankPos[cPos] = ranker.GetRank(cPos); } std::sort(veccRankPos.begin(), veccRankPos.begin() + cPos); } double CMAP::SwapCost(int iItemPos, int iItemNeg, const double* const adY, const CRanker& ranker) const { unsigned int cPos; SortRankPos(adY, ranker, veccRankPos, cPos); if (cPos == 0) { return 0.0; } // Now veccRankPos[i] is the i-th highest rank of a positive item, and // cPos is the total number of positive items.
const int iRankItemPos = ranker.GetRank(iItemPos); const int iRankItemNeg = ranker.GetRank(iItemNeg); // Search for the position of the two items to swap const std::vector<unsigned int>::iterator itItemPos = std::upper_bound(veccRankPos.begin(), veccRankPos.begin() + cPos, iRankItemPos); const std::vector<unsigned int>::iterator itItemNeg = std::upper_bound(veccRankPos.begin(), veccRankPos.begin() + cPos, iRankItemNeg); // The number of positive items up to and including iItemPos const unsigned int cNumPosNotBelowItemPos = (unsigned int)(itItemPos - veccRankPos.begin()); // The number of positive items up to iItemNeg (Note: Cannot include iItemNeg itself) const unsigned int cNumPosAboveItemNeg = (unsigned int)(itItemNeg - veccRankPos.begin()); // Range of indices of positive items between iRankItemPos and iRankItemNeg (exclusively) int cIntermediateHigh, cIntermediateLow; // Current contribution of iItemPos double dContribBefore = (double) cNumPosNotBelowItemPos / iRankItemPos; double dSign, dContribAfter; if (iRankItemNeg > iRankItemPos) { // MAP is decreasing dSign = -1.0; // The first positive item after iRankItemPos cIntermediateLow = cNumPosNotBelowItemPos; // The last positive item before iRankItemNeg cIntermediateHigh = cNumPosAboveItemNeg - 1; // Note: iItemPos already counted in cNumPosAboveItemNeg dContribAfter = (double)cNumPosAboveItemNeg / iRankItemNeg; } else { // MAP is increasing dSign = 1.0; // The first positive result after iRankItemNeg cIntermediateLow = cNumPosAboveItemNeg; // The first positive result after iRankItemPos, minus iItemPos itself cIntermediateHigh = cNumPosNotBelowItemPos - 2; // Note: iItemPos not yet counted in cNumPosAboveItemNeg dContribAfter = (double) (cNumPosAboveItemNeg + 1) / iRankItemNeg; } // The direct effect of switching iItemPos double dDiff = dContribAfter - dContribBefore; // The indirect effect for all items in between the two items for (int j = cIntermediateLow; j <= cIntermediateHigh; j++) { dDiff += dSign / veccRankPos[j]; } return dDiff / cPos; }
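The incremental swap-cost bookkeeping above can be cross-checked against a direct recomputation of average precision. The following self-contained sketch (the `AveragePrecision` helper is illustrative only, not part of the package) computes the measure the same way `CMAP::Measure` does — the j-th best positive item (0-based j) sitting at rank r contributes (j+1)/r — so the exact effect of a swap is simply Measure(after) minus Measure(before):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Illustrative re-implementation of the MAP measure (not package code):
// adY[i] > 0 marks a relevant item, rank[i] is the 1-based position of item i.
double AveragePrecision(const std::vector<double>& adY,
                        const std::vector<unsigned int>& rank) {
    // Collect and sort the ranks of all positive (relevant) items
    std::vector<unsigned int> posRanks;
    for (std::size_t i = 0; i < adY.size(); i++) {
        if (adY[i] > 0.0) { posRanks.push_back(rank[i]); }
    }
    if (posRanks.empty()) { return 0.0; }
    std::sort(posRanks.begin(), posRanks.end());
    // The j-th best positive item (0-based j) at rank r contributes (j+1)/r
    double dPrec = 0.0;
    for (std::size_t j = 0; j < posRanks.size(); j++) {
        dPrec += double(j + 1) / posRanks[j];
    }
    return dPrec / posRanks.size();
}
```

For example, with labels {1,1,0,0} and ranks {1,3,2,4}, the average precision is (1/1 + 2/3)/2 = 5/6; swapping the second positive item with the negative item above it yields ranks {1,2,3,4} and average precision 1, so the swap cost is 1/6 — the same quantity `CMAP::SwapCost` accumulates incrementally.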
double CMAP::Measure(const double* const adY, const CRanker& ranker) { unsigned int cPos; SortRankPos(adY, ranker, veccRankPos, cPos); if (cPos == 0) { return 0.0; } // Now veccRankPos[i] is the i-th highest rank of a positive item double dPrec = 0.0; for (unsigned int j = 0; j < cPos; j++) { dPrec += double(j + 1) / veccRankPos[j]; } return dPrec / cPos; } CPairwise::CPairwise(const char* szIRMeasure) { // Construct the IR Measure if (!strcmp(szIRMeasure, "conc")) { pirm = new CConc(); } else if (!strcmp(szIRMeasure, "map")) { pirm = new CMAP(); } else if (!strcmp(szIRMeasure, "mrr")) { pirm = new CMRR(); } else { if (strcmp(szIRMeasure, "ndcg")) { Rprintf("Unknown IR measure '%s' in initialization, using 'ndcg' instead\n", szIRMeasure); } pirm = new CNDCG(); } } CPairwise::~CPairwise() { delete pirm; } // Auxiliary function for addition of optional offset parameter inline const double* OffsetVector(const double* const adX, const double* const adOffset, unsigned int iStart, unsigned int iEnd, std::vector& vecBuffer) { if (adOffset == NULL) { // Optional second argument is not set, just return first one return adX + iStart; } else { for (unsigned int i = iStart, iOut = 0; i < iEnd; i++, iOut++) { vecBuffer[iOut] = adX[i] + adOffset[i]; } return &vecBuffer[0]; } } GBMRESULT CPairwise::ComputeWorkingResponse ( double *adY, double *adGroup, double *adOffset, double *adF, double *adZ, double *adWeight, bool *afInBag, unsigned long nTrain, int cIdxOff ) { #ifdef NOISY_DEBUG Rprintf("compute working response, nTrain = %u, cIdxOff = %d\n", nTrain, cIdxOff); #endif if (nTrain <= 0) { return GBM_OK; } try { // Iterate through all groups, compute gradients unsigned int iItemStart = 0; unsigned int iItemEnd = 0; while (iItemStart < nTrain) { adZ[iItemEnd] = 0; vecdHessian[iItemEnd] = 0; const double dGroup = adGroup[iItemStart]; // Find end of current group, initialize working response for (iItemEnd = iItemStart + 1; iItemEnd < nTrain && adGroup[iItemEnd] == dGroup; 
iItemEnd++) { // Clear gradients from last iteration adZ[iItemEnd] = 0; vecdHessian[iItemEnd] = 0; } #ifdef NOISY_DEBUG // Check sorting for (unsigned int i = iItemStart; i < iItemEnd-1; i++) { assert(adY[i] >= adY[i+1]); } #endif if (afInBag[iItemStart]) { // Group is part of the training set const int cNumItems = iItemEnd - iItemStart; // If offset given, add up current scores const double* adFPlusOffset = OffsetVector(adF, adOffset, iItemStart, iItemEnd, vecdFPlusOffset); // Accumulate gradients ComputeLambdas((int)dGroup, cNumItems, adY + iItemStart, adFPlusOffset, adWeight + iItemStart, adZ + iItemStart, &vecdHessian[iItemStart]); } // Next group iItemStart = iItemEnd; } } catch (std::bad_alloc&) { return GBM_OUTOFMEMORY; } return GBM_OK; } // Referring to MSR-TR-2010-82-2, section 7 (see also the vignette): // // Let P be the set of pairs (i,j) where Y(i)>Y(j) (i is better than j). // The approximation to the IR measure is the utility function C (to be maximized) // C // = \Sum_{(i,j) in P} |Delta Z_ij| C(s_i - s_j) // = \Sum_{(i,j) in P} |Delta Z_ij| / (1 + exp(-(s_i - s_j))), // where |Delta Z_ij| is the cost of swapping (only) i and j in the current ranking, // and s_i, s_j are the prediction scores (sum of the tree predictions) for items // i and j. // // For (i,j) in P, define // lambda_ij // = dC(s_i-s_j) / ds_i // = - |Delta Z_ij| / (1 + exp(s_i - s_j)) // = - |Delta Z_ij| * rho_ij, // with // rho_ij = - lambda_ij / |Delta Z_ij| = 1 / (1 + exp(s_i - s_j)) // // So the gradient of C with respect to s_i is // dC / ds_i // =(def) lambda_i // = \Sum_{j|(i,j) in P} lambda_ij - \Sum_{j|(j,i) in P} lambda_ji // = - \Sum_{j|(i,j) in P} |Delta Z_ij| * rho_ij // + \Sum_{j|(j,i) in P} |Delta Z_ji| * rho_ji; // it is stored in adZ[i]. // // The second derivative is // d^2C / ds_i^2 // =(def) gamma_i // = \Sum_{j|(i,j) in P} |Delta Z_ij| * rho_ij * (1-rho_ij) // - \Sum_{j|(j,i) in P} |Delta Z_ji| * rho_ji * (1-rho_ji); // it is stored in vecdHessian[i]. 
// // The Newton step for a particular leaf node is (a fraction of) // g'/g'', where g' (resp. g'') is the sum of dC/ds_i = lambda_i // (resp. d^2C/d^2s_i = gamma_i) over all instances falling into this leaf. This // summation is calculated later in CPairwise::FitBestConstant(). void CPairwise::ComputeLambdas(int iGroup, unsigned int cNumItems, const double* const adY, const double* const adF, const double* const adWeight, double* adZ, double* adDeriv) { // Assumption: Weights are constant within group if (adWeight[0] <= 0) { return; } // Normalize for maximum achievable group score const double dMaxScore = pirm->MaxMeasure(iGroup, adY, cNumItems); if (dMaxScore <= 0.0) { // No pairs return; } // Rank items by current score ranker.SetGroupScores(adF, cNumItems); ranker.Rank(); double dLabelCurrent = adY[0]; // First index of instance that has dLabelCurrent // (i.e., each smaller index corresponds to better item) unsigned int iLabelCurrentStart = 0; // Number of pairs with unequal labels unsigned int cPairs = 0; #ifdef NOISY_DEBUG double dMeasureBefore = pirm->Measure(adY, ranker); #endif for (unsigned int j = 1; j < cNumItems; j++) { const double dYj = adY[j]; if (dYj != dLabelCurrent) { iLabelCurrentStart = j; dLabelCurrent = dYj; } for (unsigned int i = 0; i < iLabelCurrentStart; i++) { // Instance i is better than j const double dSwapCost = fabs(pirm->SwapCost(i, j, adY, ranker)); #ifdef NOISY_DEBUG double dDelta = fabs(pirm->SwapCost(i, j, adY, ranker)); const int cRanki = ranker.GetRank(i); const int cRankj = ranker.GetRank(j); ranker.SetRank(i, cRankj); ranker.SetRank(j, cRanki); double dMeasureAfter = pirm->Measure(adY, ranker); if (fabs(dMeasureBefore-dMeasureAfter) - dDelta > 1e-5) { Rprintf("%f %f %f %f %f %d %d\n", pirm->SwapCost(i, j, adY, ranker), dMeasureBefore, dMeasureAfter, dMeasureBefore - dMeasureAfter, dDelta , i, j); for (unsigned int k = 0; k < cNumItems; k++) { Rprintf("%d\t%d\t%f\t%f\n", k, ranker.GetRank(k), adY[k], adF[k]); } 
assert(false); } assert(fabs(dMeasureBefore - dMeasureAfter) - fabs(dDelta) < 1e-5); ranker.SetRank(j, cRankj); ranker.SetRank(i, cRanki); #endif assert(std::isfinite(dSwapCost)); if (dSwapCost > 0.0) { cPairs++; const double dRhoij = 1.0 / (1.0 + exp(adF[i]- adF[j])) ; assert(std::isfinite(dRhoij)); const double dLambdaij = dSwapCost * dRhoij; adZ[i] += dLambdaij; adZ[j] -= dLambdaij; const double dDerivij = dLambdaij * (1.0 - dRhoij); assert(dDerivij >= 0); adDeriv[i] += dDerivij; adDeriv[j] += dDerivij; } } } if (cPairs > 0) { // Normalize for number of training pairs const double dQNorm = 1.0 / (dMaxScore * cPairs); for (unsigned int j = 0; j < cNumItems; j++) { adZ[j] *= dQNorm; adDeriv[j] *= dQNorm; } } } GBMRESULT CPairwise::Initialize ( double *adY, double *adGroup, double *adOffset, double *adWeight, unsigned long cLength ) { if (cLength <= 0) { return GBM_OK; } try { // Allocate memory for derivative buffer vecdHessian.resize(cLength); // Count the groups and number of items per group unsigned int cMaxItemsPerGroup = 0; double dMaxGroup = 0; unsigned int iItemStart = 0; unsigned int iItemEnd = 0; while (iItemStart < cLength) { const double dGroup = adGroup[iItemStart]; // Find end of current group for (iItemEnd = iItemStart + 1; iItemEnd < cLength && adGroup[iItemEnd] == dGroup; iItemEnd++); const unsigned int cNumItems = iItemEnd - iItemStart; if (cNumItems > cMaxItemsPerGroup) { cMaxItemsPerGroup = cNumItems; } if (dGroup > dMaxGroup) { dMaxGroup = dGroup; } // Next group iItemStart = iItemEnd; } // Allocate buffer for offset addition vecdFPlusOffset.resize(cMaxItemsPerGroup); // Allocate ranker memory ranker.Init(cMaxItemsPerGroup); // Allocate IR measure memory // The last element of adGroup specifies the cutoff // (zero means no cutoff) unsigned int cRankCutoff = cMaxItemsPerGroup; if (adGroup[cLength] > 0) { cRankCutoff = (unsigned int)adGroup[cLength]; } pirm->Init((unsigned long)dMaxGroup, cMaxItemsPerGroup, cRankCutoff); #ifdef NOISY_DEBUG 
Rprintf("Initialization: instances=%ld, groups=%u, max items per group=%u, rank cutoff=%u, offset specified: %d\n", cLength, (unsigned long)dMaxGroup, cMaxItemsPerGroup, cRankCutoff, (adOffset != NULL)); #endif } catch (std::bad_alloc&) { return GBM_OUTOFMEMORY; } return GBM_OK; } GBMRESULT CPairwise::InitF ( double *adY, double *adGroup, double *adOffset, double *adWeight, double &dInitF, unsigned long cLength ) { dInitF = 0.0; return GBM_OK; } double CPairwise::Deviance ( double *adY, double *adGroup, double *adOffset, double *adWeight, double *adF, unsigned long cLength, int cIdxOff ) { #ifdef NOISY_DEBUG Rprintf("Deviance, cLength = %u, cIdxOff = %d\n", cLength, cIdxOff); #endif if (cLength <= 0) { return 0; } double dL = 0.0; double dW = 0.0; unsigned int iItemStart = cIdxOff; unsigned int iItemEnd = iItemStart; const unsigned int cEnd = cLength + cIdxOff; while (iItemStart < cEnd) { const double dGroup = adGroup[iItemStart]; const double dWi = adWeight[iItemStart]; // Find end of current group for (iItemEnd = iItemStart + 1; iItemEnd < cEnd && adGroup[iItemEnd] == dGroup; iItemEnd++) ; const int cNumItems = iItemEnd - iItemStart; const double dMaxScore = pirm->MaxMeasure((int)dGroup, adY + iItemStart, cNumItems); if (dMaxScore > 0.0) { // Rank items by current score // If offset given, add up current scores const double* adFPlusOffset = OffsetVector(adF, adOffset, iItemStart, iItemEnd, vecdFPlusOffset); ranker.SetGroupScores(adFPlusOffset, cNumItems); ranker.Rank(); dL += dWi * pirm->Measure(adY + iItemStart, ranker) / dMaxScore; dW += dWi; } // Next group iItemStart = iItemEnd; } // Loss = 1 - utility return 1.0 - dL / dW; } GBMRESULT CPairwise::FitBestConstant ( double *adY, double *adGroup, double *adOffset, double *adW, double *adF, double *adZ, unsigned long *aiNodeAssign, unsigned long nTrain, VEC_P_NODETERMINAL vecpTermNodes, unsigned long cTermNodes, unsigned long cMinObsInNode, bool *afInBag, double *adFadj, int cIdxOff ) { #ifdef NOISY_DEBUG 
Rprintf("FitBestConstant, nTrain = %u, cIdxOff = %d, cTermNodes = %d, \n", nTrain, cIdxOff, cTermNodes); #endif // Assumption: ComputeWorkingResponse() has been executed before with // the same arguments try { // Allocate space for numerators and denominators, and set to zero // (resize, not reserve, so that operator[] below is well-defined) vecdNum.resize(cTermNodes); vecdDenom.resize(cTermNodes); for (unsigned int i = 0; i < cTermNodes; i++) { vecdNum[i] = 0.0; vecdDenom[i] = 0.0; } } catch (std::bad_alloc&) { return GBM_OUTOFMEMORY; } for (unsigned int iObs = 0; iObs < nTrain; iObs++) { if (afInBag[iObs]) { assert(std::isfinite(adW[iObs])); assert(std::isfinite(adZ[iObs])); assert(std::isfinite(vecdHessian[iObs])); vecdNum[aiNodeAssign[iObs]] += adW[iObs] * adZ[iObs]; vecdDenom[aiNodeAssign[iObs]] += adW[iObs] * vecdHessian[iObs]; } } for (unsigned int iNode = 0; iNode < cTermNodes; iNode++) { if (vecpTermNodes[iNode] != NULL) { if (vecdDenom[iNode] <= 0.0) { vecpTermNodes[iNode]->dPrediction = 0.0; } else { vecpTermNodes[iNode]->dPrediction = vecdNum[iNode]/vecdDenom[iNode]; } } } return GBM_OK; } double CPairwise::BagImprovement ( double *adY, double *adGroup, double *adOffset, double *adWeight, double *adF, double *adFadj, bool *afInBag, double dStepSize, unsigned long nTrain ) { #ifdef NOISY_DEBUG Rprintf("BagImprovement, nTrain = %u\n", nTrain); #endif if (nTrain <= 0) { return 0; } double dL = 0.0; double dW = 0.0; unsigned int iItemStart = 0; unsigned int iItemEnd = 0; while (iItemStart < nTrain) { const double dGroup = adGroup[iItemStart]; // Find end of current group for (iItemEnd = iItemStart + 1; iItemEnd < nTrain && adGroup[iItemEnd] == dGroup; iItemEnd++) ; if (!afInBag[iItemStart]) { // Group was held out of training set const unsigned int cNumItems = iItemEnd - iItemStart; const double dMaxScore = pirm->MaxMeasure((int)dGroup, adY + iItemStart, cNumItems); if (dMaxScore > 0.0) { // If offset given, add up current scores const double* adFPlusOffset =
OffsetVector(adF, adOffset, iItemStart, iItemEnd, vecdFPlusOffset); // Compute score according to old score, adF ranker.SetGroupScores(adFPlusOffset, cNumItems); ranker.Rank(); const double dOldScore = pirm->Measure(adY + iItemStart, ranker); // Compute score according to new score: adF' = adF + dStepSize * adFadj for (unsigned int i = 0; i < cNumItems; i++) { ranker.AddToScore(i, adFadj[i+iItemStart] * dStepSize); } const double dWi = adWeight[iItemStart]; if (ranker.Rank()) { // Ranking changed const double dNewScore = pirm->Measure(adY + iItemStart, ranker); dL += dWi * (dNewScore - dOldScore) / dMaxScore; } dW += dWi; } } // Next group iItemStart = iItemEnd; } return dL / dW; } gbm/src/distribution.cpp0000644000176200001440000000022614547111634015037 0ustar liggesusers// GBM by Greg Ridgeway Copyright (C) 2003 #include "distribution.h" CDistribution::CDistribution() { } CDistribution::~CDistribution() { } gbm/src/node_categorical.h0000644000176200001440000000362314547111634015253 0ustar liggesusers//------------------------------------------------------------------------------ // GBM by Greg Ridgeway Copyright (C) 2003 // // File: node_categorical.h // // License: GNU GPL (version 2 or later) // // Contents: a node with a categorical split // // Owner: gregr@rand.org // // History: 3/26/2001 gregr created // 2/14/2003 gregr: adapted for R implementation // //------------------------------------------------------------------------------ #ifndef NODECATEGORICAL_H #define NODECATEGORICAL_H #include #include #include "node_nonterminal.h" class CNodeCategorical : public CNodeNonterminal { public: CNodeCategorical(); ~CNodeCategorical(); GBMRESULT PrintSubtree(unsigned long cIndent); GBMRESULT TransferTreeToRList(int &iNodeID, CDataset *pData, int *aiSplitVar, double *adSplitPoint, int *aiLeftNode, int *aiRightNode, int *aiMissingNode, double *adErrorReduction, double *adWeight, double *adPred, VEC_VEC_CATEGORIES &vecSplitCodes, int cCatSplitsOld, double dShrinkage); 
signed char WhichNode(CDataset *pData, unsigned long iObs); signed char WhichNode(double *adX, unsigned long cRow, unsigned long cCol, unsigned long iRow); GBMRESULT RecycleSelf(CNodeFactory *pNodeFactory); unsigned long *aiLeftCategory; unsigned long cLeftCategory; }; typedef CNodeCategorical *PCNodeCategorical; #endif // NODECATEGORICAL_H gbm/NAMESPACE0000644000176200001440000000424414547111627012252 0ustar liggesusers# Generated by roxygen2: do not edit by hand S3method(plot,gbm) S3method(predict,gbm) S3method(print,gbm) S3method(summary,gbm) export(basehaz.gbm) export(calibrate.plot) export(checkID) export(checkMissing) export(checkOffset) export(checkWeights) export(gbm) export(gbm.conc) export(gbm.fit) export(gbm.loss) export(gbm.more) export(gbm.perf) export(gbm.roc.area) export(gbmCluster) export(gbmCrossVal) export(gbmCrossValErr) export(gbmCrossValModelBuild) export(gbmCrossValPredictions) export(gbmDoFold) export(getCVgroup) export(getStratify) export(getVarNames) export(guessDist) export(interact.gbm) export(ir.measure.auc) export(ir.measure.conc) export(ir.measure.map) export(ir.measure.mrr) export(ir.measure.ndcg) export(perf.pairwise) export(permutation.test.gbm) export(plot.gbm) export(predict.gbm) export(pretty.gbm.tree) export(quantile.rug) export(reconstructGBMdata) export(relative.influence) export(show.gbm) export(summary.gbm) export(test.gbm) export(test.relative.influence) export(validate.gbm) import(lattice) importFrom(grDevices,rainbow) importFrom(graphics,abline) importFrom(graphics,axis) importFrom(graphics,barplot) importFrom(graphics,lines) importFrom(graphics,mtext) importFrom(graphics,par) importFrom(graphics,plot) importFrom(graphics,polygon) importFrom(graphics,rug) importFrom(graphics,segments) importFrom(graphics,title) importFrom(stats,approx) importFrom(stats,binomial) importFrom(stats,delete.response) importFrom(stats,gaussian) importFrom(stats,glm) importFrom(stats,loess) importFrom(stats,model.extract) 
importFrom(stats,model.frame) importFrom(stats,model.offset) importFrom(stats,model.response) importFrom(stats,model.weights) importFrom(stats,na.pass) importFrom(stats,poisson) importFrom(stats,predict) importFrom(stats,quantile) importFrom(stats,rbinom) importFrom(stats,reformulate) importFrom(stats,reorder) importFrom(stats,rexp) importFrom(stats,rnorm) importFrom(stats,runif) importFrom(stats,sd) importFrom(stats,supsmu) importFrom(stats,terms) importFrom(stats,var) importFrom(stats,weighted.mean) importFrom(survival,Surv) useDynLib(gbm, .registration = TRUE) gbm/LICENSE0000644000176200001440000000124214547111627012033 0ustar liggesusersGeneralized Boosted Regression package for the R environment Copyright (C) 2003 Greg Ridgeway This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. Copies of the relevant licenses can be found at: https://www.r-project.org/Licenses/ gbm/NEWS.md0000644000176200001440000000751514547260745012143 0ustar liggesusers# gbm 2.1.9 * Maintenance update to address new R standards # gbm 2.1.8 * Removed experimental functions `shrink.gbm()` and `shrink.gbm.pred()`; the latter seemed broken anyway. Happy to accept a PR if anyone wants to fix them. # gbm 2.1.7 * Fix `Non-file package-anchored link(s) in documentation...` warning. # gbm 2.1.6 * Corrected the number of arguments for `gbm_shrink_gradient()` in `gbm-init.c` [(#50)](https://github.com/gbm-developers/gbm/issues/50). (Thanks to CRAN for highlighting the issue.) * Removed unnecessary dependency on [gridExtra](https://cran.r-project.org/package=gridExtra). 
* Switched to using `lapply()` instead of `parallel::parLapply()` whenever `n.cores = 1`. * Calling `gbm()` with `distribution = "bernoulli"` will now throw an error whenever the response is non-numeric (e.g., 0/1 factors will throw an error instead of possibly crashing the session.) [(#6)](https://github.com/gbm-developers/gbm/issues/6). (Thanks to @mzoll.) * Calling `gbm()` with `distribution = "multinomial"` now comes with a warning message; multinomial support has always been problematic and since this package is only being maintained for backwards compatibility, it likely will not be fixed unless someone makes a PR. * Switched from [RUnit](https://cran.r-project.org/package=RUnit) to [tinytest](https://cran.r-project.org/package=tinytest) framework. The `test.gbm()`, `test.relative.influence()`, and `validate.gbm()` functions will remain for backwards compatibility. This is just the start, as more tests will be added in the future [(#51)](https://github.com/gbm-developers/gbm/issues/51). #### Bug fixes * Fixed a long-standing bug that could occur when using k-fold cross-validation with a response that's been transformed in the model formula [(#30)](https://github.com/gbm-developers/gbm/issues/30). * Fixed a bug that would crash the session when giving "bad" input for `n.trees` in the call to `predict.gbm()` [(#45)](https://github.com/gbm-developers/gbm/issues/45). (Thanks to @ngreifer.) * Fixed a bug where calling `predict()` could throw an error in some cases when `n.trees` was not specified. # gbm 2.1.5 * Fixed bug that occurred whenever `distribution` was a list (e.g., "pairwise" regression) [(#27)](https://github.com/gbm-developers/gbm/issues/27). * Fixed a bug that occurred when making predictions on new data with different factor levels [(#28)](https://github.com/gbm-developers/gbm/issues/28).
* Fixed a bug that caused `relative.influence()` to give different values whenever `n.trees` was/wasn't given for multinomial distributions [(#31)](https://github.com/gbm-developers/gbm/issues/31). * The `plot.it` argument of `gbm.perf()` is no longer ignored [(#34)](https://github.com/gbm-developers/gbm/issues/34). * Fixed an error that occurred in `gbm.perf()` whenever `oobag.curve = FALSE` and `overlay = FALSE`. # gbm 2.1.4 * Switched from `CHANGES` to `NEWS` file. * Updated links and maintainer field in `DESCRIPTION` file. * Fixed bug caused by factors with unused levels [(#5)](https://github.com/gbm-developers/gbm/issues/5). * Fixed bug with axis labels in the `plot()` method for `"gbm"` objects [(#17)](https://github.com/gbm-developers/gbm/issues/17). * The `plot()` method for `"gbm"` objects is now more consistent and always returns a `"trellis"` object [(#19)](https://github.com/gbm-developers/gbm/issues/19). Consequently, setting graphical parameters via `par` will no longer have an effect on the output from `plot.gbm()`. * The `plot()` method for `"gbm"` objects gained five new arguments: `level.plot`, `contour`, `number`, `overlap`, and `col.regions`; see `?plot.gbm` for details. * The default color palette for false color level plots in `plot.gbm()` has changed to the Matplotlib 'viridis' color map. * Fixed a number of references and URLs. 
gbm/inst/0000755000176200001440000000000014637005163012001 5ustar liggesusersgbm/inst/doc/0000755000176200001440000000000014637005163012546 5ustar liggesusersgbm/inst/doc/gbm.Rnw0000644000176200001440000007342614547111634014010 0ustar liggesusers\documentclass{article} \bibliographystyle{plain} \newcommand{\EV}{\mathrm{E}} \newcommand{\Var}{\mathrm{Var}} \newcommand{\aRule}{\begin{center} \rule{5in}{1mm} \end{center}} \title{Generalized Boosted Models:\\A guide to the gbm package} \author{Greg Ridgeway} %\VignetteEngine{knitr::knitr} %\VignetteIndexEntry{Generalized Boosted Models: A guide to the gbm package} \newcommand{\mathgbf}[1]{{\mbox{\boldmath$#1$\unboldmath}}} \begin{document} \maketitle Boosting takes on various forms with different programs using different loss functions, different base models, and different optimization schemes. The gbm package takes the approach described in \cite{Friedman:2001} and \cite{Friedman:2002}. Some of the terminology differs, mostly due to an effort to cast boosting terms into more standard statistical terminology (e.g. deviance). In addition, the gbm package implements boosting for models commonly used in statistics but not commonly associated with boosting. The Cox proportional hazard model, for example, is an incredibly useful model and the boosting framework applies quite readily with only slight modification \cite{Ridgeway:1999}. Also, some algorithms implemented in the gbm package differ from the standard implementation. The AdaBoost algorithm \cite{FreundSchapire:1997} has a particular loss function and a particular optimization algorithm associated with it. The gbm implementation of AdaBoost adopts AdaBoost's exponential loss function (its bound on misclassification rate) but uses Friedman's gradient descent algorithm rather than the original one proposed. So the main purpose of this document is to spell out in detail what the gbm package implements.
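To make the AdaBoost connection concrete, with a 0/1 response coding the exponential loss and the resulting working response take the form below (a sketch for intuition; the distribution source code is authoritative on the exact coding and signs):
\begin{equation*}
\Psi(y,f) = \exp\left(-(2y-1)f\right), \qquad
z = -\frac{\partial}{\partial f}\Psi(y,f) = (2y-1)\exp\left(-(2y-1)f\right).
\end{equation*}
Minimizing this loss by gradient descent, rather than by AdaBoost's original reweighting scheme, is what lets the exponential loss fit into the same framework as deviance-based losses.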
\section{Gradient boosting} This section essentially presents the derivation of boosting described in \cite{Friedman:2001}. The gbm package also adopts the stochastic gradient boosting strategy, a small but important tweak on the basic algorithm, described in \cite{Friedman:2002}. \subsection{Friedman's gradient boosting machine} \label{sec:GradientBoostingMachine} \begin{figure} \aRule Initialize $\hat f(\mathbf{x})$ to be a constant, $\hat f(\mathbf{x}) = \arg \min_{\rho} \sum_{i=1}^N \Psi(y_i,\rho)$. \\ For $t$ in $1,\ldots,T$ do \begin{enumerate} \item Compute the negative gradient as the working response \begin{equation} z_i = -\frac{\partial}{\partial f(\mathbf{x}_i)} \Psi(y_i,f(\mathbf{x}_i)) \mbox{\Huge $|$}_{f(\mathbf{x}_i)=\hat f(\mathbf{x}_i)} \end{equation} \item Fit a regression model, $g(\mathbf{x})$, predicting $z_i$ from the covariates $\mathbf{x}_i$. \item Choose a gradient descent step size as \begin{equation} \rho = \arg \min_{\rho} \sum_{i=1}^N \Psi(y_i,\hat f(\mathbf{x}_i)+\rho g(\mathbf{x}_i)) \end{equation} \item Update the estimate of $f(\mathbf{x})$ as \begin{equation} \hat f(\mathbf{x}) \leftarrow \hat f(\mathbf{x}) + \rho g(\mathbf{x}) \end{equation} \end{enumerate} \aRule \caption{Friedman's Gradient Boost algorithm} \label{fig:GradientBoost} \end{figure} Friedman (2001) and the companion paper Friedman (2002) extended the work of Friedman, Hastie, and Tibshirani (2000) and laid the ground work for a new generation of boosting algorithms. Using the connection between boosting and optimization, this new work proposes the Gradient Boosting Machine. In any function estimation problem we wish to find a regression function, $\hat f(\mathbf{x})$, that minimizes the expectation of some loss function, $\Psi(y,f)$, as shown in (\ref{NonparametricRegression1}). 
\begin{eqnarray} \hspace{0.5in} \hat f(\mathbf{x}) &=& \arg \min_{f(\mathbf{x})} \EV_{y,\mathbf{x}} \Psi(y,f(\mathbf{x})) \nonumber \\ \label{NonparametricRegression1} &=& \arg \min_{f(\mathbf{x})} \EV_x \left[ \EV_{y|\mathbf{x}} \Psi(y,f(\mathbf{x})) \Big| \mathbf{x} \right] \end{eqnarray} We will focus on finding estimates of $f(\mathbf{x})$ such that \begin{equation} \label{NonparametricRegression2} \hspace{0.5in} \hat f(\mathbf{x}) = \arg \min_{f(\mathbf{x})} \EV_{y|\mathbf{x}} \left[ \Psi(y,f(\mathbf{x}))|\mathbf{x} \right] \end{equation} Parametric regression models assume that $f(\mathbf{x})$ is a function with a finite number of parameters, $\beta$, and estimates them by selecting those values that minimize a loss function (e.g. squared error loss) over a training sample of $N$ observations on $(y,\mathbf{x})$ pairs as in (\ref{eq:Friedman1}). \begin{equation} \label{eq:Friedman1} \hspace{0.5in} \hat\beta = \arg \min_{\beta} \sum_{i=1}^N \Psi(y_i,f(\mathbf{x}_i;\beta)) \end{equation} When we wish to estimate $f(\mathbf{x})$ non-parametrically the task becomes more difficult. Again we can proceed similarly to \cite{FHT:2000} and modify our current estimate of $f(\mathbf{x})$ by adding a new function $f(\mathbf{x})$ in a greedy fashion. Letting $f_i = f(\mathbf{x}_i)$, we see that we want to decrease the $N$ dimensional function \begin{eqnarray} \label{EQ:Friedman2} \hspace{0.5in} J(\mathbf{f}) &=& \sum_{i=1}^N \Psi(y_i,f(\mathbf{x}_i)) \nonumber \\ &=& \sum_{i=1}^N \Psi(y_i,f_i). \end{eqnarray} The negative gradient of $J(\mathbf{f})$ indicates the direction of the locally greatest decrease in $J(\mathbf{f})$. Gradient descent would then have us modify $\mathbf{f}$ as \begin{equation} \label{eq:Friedman3} \hspace{0.5in} \hat{\mathbf{f}} \leftarrow \hat{\mathbf{f}} - \rho \nabla J(\mathbf{f}) \end{equation} where $\rho$ is the size of the step along the direction of greatest descent. Clearly, this step alone is far from our desired goal.
First, it only fits $f$ at values of $\mathbf{x}$ for which we have observations. Second, it does not take into account that observations with similar $\mathbf{x}$ are likely to have similar values of $f(\mathbf{x})$. Both these problems would have disastrous effects on generalization error. However, Friedman suggests selecting a class of functions that use the covariate information to approximate the gradient, usually a regression tree. This line of reasoning produces his Gradient Boosting algorithm shown in Figure~\ref{fig:GradientBoost}. At each iteration the algorithm determines the direction, the gradient, in which it needs to improve the fit to the data and selects a particular model from the allowable class of functions that is most in agreement with that direction. In the case of squared-error loss, $\sum_{i=1}^N \Psi(y_i,f(\mathbf{x}_i)) = \sum_{i=1}^N (y_i-f(\mathbf{x}_i))^2$, this algorithm corresponds exactly to residual fitting.
There are various ways to extend and improve upon the basic framework suggested in Figure~\ref{fig:GradientBoost}. For example, Friedman (2001) substituted several choices for $\Psi$ to develop new boosting algorithms for robust regression with least absolute deviation and Huber loss functions. Friedman (2002) showed that a simple subsampling trick can greatly improve predictive performance while simultaneously reducing computation time. Section~\ref{GBMModifications} discusses some of these modifications.
\section{Improving boosting methods using control of the learning rate, sub-sampling, and a decomposition for interpretation}
\label{GBMModifications}
This section explores variations on the previous algorithms that have the potential to improve their predictive performance and interpretability. In particular, by controlling the optimization speed or learning rate, introducing low-variance regression methods, and applying ideas from robust regression we can produce non-parametric regression procedures with many desirable properties.
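The loop in Figure~\ref{fig:GradientBoost} is compact enough to sketch directly. Below is a minimal, purely illustrative Python implementation (not the gbm engine, which is C++) using hand-rolled regression stumps as the base learner and squared-error loss, for which the working response $z_i$ is simply the residual and the line search yields $\rho=1$:

```python
import numpy as np

def fit_stump(x, z):
    """Least-squares depth-1 tree: one split, leaf values = mean of z."""
    order = np.argsort(x)
    xs, zs = x[order], z[order]
    best = (np.inf, xs[0] - 1.0, zs.mean(), zs.mean())  # (sse, split, left, right)
    for i in range(1, len(xs)):
        left, right = zs[:i].mean(), zs[i:].mean()
        sse = ((zs[:i] - left) ** 2).sum() + ((zs[i:] - right) ** 2).sum()
        if sse < best[0]:
            best = (sse, (xs[i - 1] + xs[i]) / 2.0, left, right)
    _, split, left, right = best
    return lambda q: np.where(q <= split, left, right)

def gradient_boost(x, y, T=50):
    """The gradient boosting loop for squared-error loss."""
    f = np.full(len(y), y.mean())   # initialize to the best constant
    for _ in range(T):
        z = y - f                   # negative gradient = residual
        g = fit_stump(x, z)         # fit base learner to working response
        f = f + g(x)                # line search gives rho = 1 for this loss
    return f

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 200)
y = np.sin(2.0 * np.pi * x) + rng.normal(0.0, 0.1, 200)
loss_const = ((y - y.mean()) ** 2).mean()
loss_boost = ((y - gradient_boost(x, y, T=50)) ** 2).mean()
```

Even this toy version steadily drives down the training loss; the modifications below adjust pieces of exactly this loop.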
As a by-product some of these modifications lead directly into implementations for learning from massive datasets. All these methods take advantage of the general form of boosting
\begin{equation}
\hat f(\mathbf{x}) \leftarrow \hat f(\mathbf{x}) + \EV(z(y,\hat f(\mathbf{x}))|\mathbf{x}).
\end{equation}
So far we have taken advantage of this form only by substituting in our favorite regression procedure for $\EV_w(z|\mathbf{x})$. I will discuss some modifications to estimating $\EV_w(z|\mathbf{x})$ that have the potential to improve our algorithm.
\subsection{Decreasing the learning rate}
As several authors have phrased slightly differently, ``...boosting, whatever flavor, seldom seems to overfit, no matter how many terms are included in the additive expansion''. This is not true as the discussion to \cite{FHT:2000} points out. In the update step of any boosting algorithm we can introduce a learning rate to dampen the proposed move.
\begin{equation}
\label{eq:shrinkage}
\hat f(\mathbf{x}) \leftarrow \hat f(\mathbf{x}) + \lambda \EV(z(y,\hat f(\mathbf{x}))|\mathbf{x}).
\end{equation}
By multiplying the gradient step by $\lambda$ as in equation~\ref{eq:shrinkage} we have control over the rate at which the boosting algorithm descends the error surface (or ascends the likelihood surface). When $\lambda=1$ we return to performing full gradient steps. Friedman (2001) relates the learning rate to regularization through shrinkage. The optimal number of iterations, $T$, and the learning rate, $\lambda$, depend on each other. In practice I set $\lambda$ to be as small as possible and then select $T$ by cross-validation. Performance improves as $\lambda$ decreases, although with diminishing marginal utility for smaller and smaller $\lambda$. Slower learning rates do not necessarily scale the number of optimal iterations.
That is, if the optimal $T$ is 100 iterations when $\lambda=1.0$, it does {\it not} necessarily follow that the optimal $T$ is 1000 iterations when $\lambda=0.1$.
\subsection{Variance reduction using subsampling}
Friedman (2002) proposed the stochastic gradient boosting algorithm that simply samples uniformly without replacement from the dataset before estimating the next gradient step. He found that this additional step greatly improved performance. We estimate the regression $\EV(z(y,\hat f(\mathbf{x}))|\mathbf{x})$ using a random subsample of the dataset.
\subsection{ANOVA decomposition}
Certain function approximation methods are decomposable in terms of a ``functional ANOVA decomposition''. That is, a function is decomposable as
\begin{equation}
\label{ANOVAdecomp}
f(\mathbf{x}) = \sum_j f_j(x_j) + \sum_{jk} f_{jk}(x_j,x_k) + \sum_{jk\ell} f_{jk\ell}(x_j,x_k,x_\ell) + \cdots.
\end{equation}
This applies to boosted trees. Regression stumps (one split decision trees) depend on only one variable and fall into the first term of (\ref{ANOVAdecomp}). Trees with two splits fall into the second term of (\ref{ANOVAdecomp}) and so on. By restricting the depth of the trees produced on each boosting iteration we can control the order of approximation. Often additive components are sufficient to approximate a multivariate function well; generalized additive models, the na\"{\i}ve Bayes classifier, and boosted stumps are examples. When the approximation is restricted to first order we can also produce plots of $x_j$ versus $f_j(x_j)$ to demonstrate how changes in $x_j$ might affect changes in the response variable.
\subsection{Relative influence}
Friedman (2001) also develops an extension of a variable's ``relative influence'' for boosted estimates.
For tree-based methods the approximate relative influence of a variable $x_j$ is
\begin{equation}
\label{RelInfluence}
\hspace{0.5in}
\hat J_j^2 = \hspace{-0.1in}\sum_{\mathrm{splits~on~}x_j}\hspace{-0.2in}I_t^2
\end{equation}
where $I_t^2$ is the empirical improvement by splitting on $x_j$ at that point. Friedman's extension to boosted models is to average the relative influence of variable $x_j$ across all the trees generated by the boosting algorithm.
\begin{figure}
\aRule
Select
\begin{itemize}
\item a loss function (\texttt{distribution})
\item the number of iterations, $T$ (\texttt{n.trees})
\item the depth of each tree, $K$ (\texttt{interaction.depth})
\item the shrinkage (or learning rate) parameter, $\lambda$ (\texttt{shrinkage})
\item the subsampling rate, $p$ (\texttt{bag.fraction})
\end{itemize}
Initialize $\hat f(\mathbf{x})$ to be a constant, $\hat f(\mathbf{x}) = \arg \min_{\rho} \sum_{i=1}^N \Psi(y_i,\rho)$ \\
For $t$ in $1,\ldots,T$ do
\begin{enumerate}
\item Compute the negative gradient as the working response
\begin{equation}
z_i = -\frac{\partial}{\partial f(\mathbf{x}_i)} \Psi(y_i,f(\mathbf{x}_i)) \mbox{\Huge $|$}_{f(\mathbf{x}_i)=\hat f(\mathbf{x}_i)}
\end{equation}
\item Randomly select $p\times N$ cases from the dataset
\item Fit a regression tree with $K$ terminal nodes, $g(\mathbf{x})=\EV(z|\mathbf{x})$. This tree is fit using only those randomly selected observations
\item Compute the optimal terminal node predictions, $\rho_1,\ldots,\rho_K$, as
\begin{equation}
\rho_k = \arg \min_{\rho} \sum_{\mathbf{x}_i\in S_k} \Psi(y_i,\hat f(\mathbf{x}_i)+\rho)
\end{equation}
where $S_k$ is the set of $\mathbf{x}$s that define terminal node $k$. Again this step uses only the randomly selected observations.
\item Update $\hat f(\mathbf{x})$ as
\begin{equation}
\hat f(\mathbf{x}) \leftarrow \hat f(\mathbf{x}) + \lambda\rho_{k(\mathbf{x})}
\end{equation}
where $k(\mathbf{x})$ indicates the index of the terminal node into which an observation with features $\mathbf{x}$ would fall.
\end{enumerate}
\aRule
\caption{Boosting as implemented in \texttt{gbm()}}
\label{fig:gbm}
\end{figure}
\section{Common user options}
This section discusses the options to gbm that most users will need to change or tune.
\subsection{Loss function}
The first and foremost choice is \texttt{distribution}. This should be easily dictated by the application. For most classification problems either \texttt{bernoulli} or \texttt{adaboost} will be appropriate, the former being recommended. For continuous outcomes the choices are \texttt{gaussian} (for minimizing squared error), \texttt{laplace} (for minimizing absolute error), and \texttt{quantile} regression (for estimating percentiles of the conditional distribution of the outcome). Censored survival outcomes require \texttt{coxph}. Count outcomes may use \texttt{poisson} although one might also consider \texttt{gaussian} or \texttt{laplace} depending on the analytical goals.
\subsection{The relationship between shrinkage and number of iterations}
The issues that most new users of gbm struggle with are the choice of \texttt{n.trees} and \texttt{shrinkage}. It is important to know that smaller values of \texttt{shrinkage} (almost) always give improved predictive performance. That is, setting \texttt{shrinkage=0.001} will almost certainly result in a model with better out-of-sample predictive performance than setting \texttt{shrinkage=0.01}. However, there are computational costs, both storage and CPU time, associated with setting \texttt{shrinkage} to be low. The model with \texttt{shrinkage=0.001} will likely require ten times as many iterations as the model with \texttt{shrinkage=0.01}, increasing storage and computation time by a factor of 10.
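This inverse relationship can be seen in a toy calculation: boosting a constant model with a shrunken residual step multiplies each residual by $(1-\lambda)$ per iteration, so the number of iterations needed to reach a fixed training error grows roughly like $1/\lambda$. A minimal Python sketch (purely illustrative, not the gbm engine):

```python
import numpy as np

def iterations_to_fit(y, shrinkage, tol=1e-3):
    """Boost a constant model toward y with a shrunken residual step;
    count iterations until the mean squared error drops below tol."""
    f = np.zeros_like(y)
    t = 0
    while ((y - f) ** 2).mean() > tol:
        f = f + shrinkage * (y - f)   # shrunken gradient (residual) step
        t += 1
    return t

y = np.array([1.0, 2.0, 3.0])
t_fast = iterations_to_fit(y, shrinkage=0.01)
t_slow = iterations_to_fit(y, shrinkage=0.001)
ratio = t_slow / t_fast               # close to 10, the shrinkage ratio
```

Here ten times smaller shrinkage costs roughly ten times the iterations, matching the factor-of-10 rule of thumb above.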
Figure~\ref{fig:shrinkViters} shows the relationship between predictive performance, the number of iterations, and the shrinkage parameter. Note that the optimal number of iterations scales roughly with the inverse ratio of the shrinkage parameters: dividing the shrinkage by ten multiplies the optimal number of iterations by about ten. It is generally the case that for small shrinkage parameters, 0.001 for example, there is a fairly long plateau in which predictive performance is at its best. My rule of thumb is to set \texttt{shrinkage} as small as possible while still being able to fit the model in a reasonable amount of time and storage. I usually aim for 3,000 to 10,000 iterations with shrinkage rates between 0.01 and 0.001.
\begin{figure}[ht]
\begin{center}
\includegraphics[width=5in]{shrinkage-v-iterations}
\end{center}
\caption{Out-of-sample predictive performance by number of iterations and shrinkage. Smaller values of the shrinkage parameter offer improved predictive performance, but with decreasing marginal improvement.}
\label{fig:shrinkViters}
\end{figure}
\subsection{Estimating the optimal number of iterations}
gbm offers three methods for estimating the optimal number of iterations after the gbm model has been fit: an independent test set (\texttt{test}), out-of-bag estimation (\texttt{OOB}), and $v$-fold cross validation (\texttt{cv}). The function \texttt{gbm.perf} computes the iteration estimate. Like Friedman's MART software, the independent test set method uses a single holdout test set to select the optimal number of iterations. If \texttt{train.fraction} is set to be less than 1, then only the \textit{first} \texttt{train.fraction}$\times$\texttt{nrow(data)} observations will be used to fit the model. Note that if the data are sorted in a systematic way (such as cases for which $y=1$ come first), then the data should be shuffled before running gbm. Those observations not used in the model fit can be used to get an unbiased estimate of the optimal number of iterations.
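The holdout logic can be sketched end-to-end: fit the boosting model on the first fraction of the (shuffled) rows, track the held-out error after every iteration, and take the iteration at which it bottoms out. The following is an illustrative Python mock-up of that selection rule, with hand-rolled stumps standing in for gbm's trees:

```python
import numpy as np

def fit_stump(x, z):
    """Least-squares depth-1 tree: one split, leaf values = mean of z."""
    order = np.argsort(x)
    xs, zs = x[order], z[order]
    best = (np.inf, xs[0] - 1.0, zs.mean(), zs.mean())
    for i in range(1, len(xs)):
        left, right = zs[:i].mean(), zs[i:].mean()
        sse = ((zs[:i] - left) ** 2).sum() + ((zs[i:] - right) ** 2).sum()
        if sse < best[0]:
            best = (sse, (xs[i - 1] + xs[i]) / 2.0, left, right)
    _, split, left, right = best
    return lambda q: np.where(q <= split, left, right)

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 200)            # random order, i.e. already shuffled
y = np.sin(2.0 * np.pi * x) + rng.normal(0.0, 0.3, 200)

train_fraction = 0.5                      # first half trains, rest is held out
n_train = int(train_fraction * len(x))
xt, yt = x[:n_train], y[:n_train]
xv, yv = x[n_train:], y[n_train:]

shrinkage, n_trees = 0.1, 200
ft = np.full(n_train, yt.mean())
fv = np.full(len(xv), yt.mean())
valid_err = []
for _ in range(n_trees):
    g = fit_stump(xt, yt - ft)            # fit stump to training residuals
    ft = ft + shrinkage * g(xt)
    fv = fv + shrinkage * g(xv)           # track the held-out predictions too
    valid_err.append(((yv - fv) ** 2).mean())

best_iter = int(np.argmin(valid_err)) + 1  # iteration minimizing held-out error
```

In gbm itself this selection is what \texttt{gbm.perf(...,method="test")} performs; the sketch only illustrates why shuffled data and a held-out block suffice to locate the minimum.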
The downside of this method is that a considerable number of observations are used to estimate the single regularization parameter (number of iterations) leaving a reduced dataset for estimating the entire multivariate model structure. Use \texttt{gbm.perf(...,method="test")} to obtain an estimate of the optimal number of iterations using the held out test set.
If \texttt{bag.fraction} is set to be greater than 0 (0.5 is recommended), gbm computes an out-of-bag estimate of the improvement in predictive performance. It evaluates the reduction in deviance on those observations not used in selecting the next regression tree. The out-of-bag estimator underestimates the reduction in deviance. As a result, it is almost always too conservative in its selection of the optimal number of iterations. The motivation behind this method was to avoid having to set aside a large independent dataset, which reduces the information available for learning the model structure. Use \texttt{gbm.perf(...,method="OOB")} to obtain the OOB estimate.
Lastly, gbm offers $v$-fold cross validation for estimating the optimal number of iterations. If \texttt{cv.folds=5} is specified when fitting the gbm model, then gbm will do 5-fold cross validation. gbm will fit five gbm models in order to compute the cross validation error estimate and then will fit a sixth and final gbm model with \texttt{n.trees} iterations using all of the data. The returned model object will have a component labeled \texttt{cv.error}. Note that \texttt{gbm.more} will do additional gbm iterations but will not add to the \texttt{cv.error} component. Use \texttt{gbm.perf(...,method="cv")} to obtain the cross validation estimate.
\begin{figure}[ht]
\begin{center}
\includegraphics[width=5in]{oobperf2}
\end{center}
\caption{Out-of-sample predictive performance of four methods of selecting the optimal number of iterations. The vertical axis plots performance relative to the best.
The boxplots indicate relative performance across thirteen real datasets from the UCI repository. See \texttt{demo(OOB-reps)}.}
\label{fig:oobperf}
\end{figure}
Figure~\ref{fig:oobperf} compares the three methods for estimating the optimal number of iterations across 13 datasets. The boxplots show the methods' performance relative to the best method on that dataset. For most datasets the methods perform similarly; however, 5-fold cross validation is consistently the best of them. OOB, using a 33\% test set, and using a 20\% test set all have datasets for which they perform considerably worse than the best method. My recommendation is to use 5- or 10-fold cross validation if you can afford the computing time. Otherwise you may choose among the other options, knowing that OOB is conservative.
\section{Available distributions}
This section gives some of the mathematical detail for each of the distribution options that gbm offers. The gbm engine written in C++ has access to a C++ class for each of these distributions. Each class contains methods for computing the associated deviance, initial value, the gradient, and the constants to predict in each terminal node.
In the equations shown below, for non-zero offset terms, replace $f(\mathbf{x}_i)$ with $o_i + f(\mathbf{x}_i)$.
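These four ingredients fit together mechanically: for each distribution, the tabulated gradient is (up to a positive weight factor) the negative derivative of the deviance with respect to $f(\mathbf{x}_i)$. A small Python check of this relationship for the weighted Gaussian case (illustrative only; the actual classes are in C++):

```python
import numpy as np

# The four ingredients a distribution class supplies, written out for the
# weighted Gaussian case.
def deviance(y, f, w):
    return np.sum(w * (y - f) ** 2) / np.sum(w)

def init_value(y, w, o):
    return np.sum(w * (y - o)) / np.sum(w)      # best constant fit

def gradient(y, f):
    return y - f                                 # working response z_i

def node_estimate(y, f, w, members):
    m = members                                  # indices falling in the node
    return np.sum(w[m] * (y[m] - f[m])) / np.sum(w[m])

rng = np.random.default_rng(2)
y = rng.normal(size=5)
f = rng.normal(size=5)
w = rng.uniform(1.0, 2.0, size=5)

# Numerical check: -d(deviance)/df_i equals (2 w_i / sum(w)) * z_i,
# i.e. the tabulated gradient up to a positive factor.
eps = 1e-6
f_plus = f.copy()
f_plus[0] += eps
num_grad = -(deviance(y, f_plus, w) - deviance(y, f, w)) / eps
scaled_z = 2.0 * w[0] / w.sum() * gradient(y, f)[0]
```

The same finite-difference check applies to each table that follows, with the appropriate per-observation scaling.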
\subsection{Gaussian}
\begin{tabular}{ll}
Deviance & $\displaystyle \frac{1}{\sum w_i} \sum w_i(y_i-f(\mathbf{x}_i))^2$ \\
Initial value & $\displaystyle f(\mathbf{x})=\frac{\sum w_i(y_i-o_i)}{\sum w_i}$ \\
Gradient & $z_i=y_i - f(\mathbf{x}_i)$ \\
Terminal node estimates & $\displaystyle \frac{\sum w_i(y_i-f(\mathbf{x}_i))}{\sum w_i}$
\end{tabular}
\subsection{AdaBoost}
\begin{tabular}{ll}
Deviance & $\displaystyle \frac{1}{\sum w_i} \sum w_i\exp(-(2y_i-1)f(\mathbf{x}_i))$ \\
Initial value & $\displaystyle \frac{1}{2}\log\frac{\sum y_iw_ie^{-o_i}}{\sum (1-y_i)w_ie^{o_i}}$ \\
Gradient & $\displaystyle z_i= (2y_i-1)\exp(-(2y_i-1)f(\mathbf{x}_i))$ \\
Terminal node estimates & $\displaystyle \frac{\sum (2y_i-1)w_i\exp(-(2y_i-1)f(\mathbf{x}_i))} {\sum w_i\exp(-(2y_i-1)f(\mathbf{x}_i))}$
\end{tabular}
\subsection{Bernoulli}
\begin{tabular}{ll}
Deviance & $\displaystyle -2\frac{1}{\sum w_i} \sum w_i(y_if(\mathbf{x}_i)-\log(1+\exp(f(\mathbf{x}_i))))$ \\
Initial value & $\displaystyle \log\frac{\sum w_iy_i}{\sum w_i(1-y_i)}$ \\
Gradient & $\displaystyle z_i=y_i-\frac{1}{1+\exp(-f(\mathbf{x}_i))}$ \\
Terminal node estimates & $\displaystyle \frac{\sum w_i(y_i-p_i)}{\sum w_ip_i(1-p_i)}$ \\
& where $\displaystyle p_i = \frac{1}{1+\exp(-f(\mathbf{x}_i))}$ \\
\end{tabular}
Notes:
\begin{itemize}
\item For non-zero offset terms, the computation of the initial value requires Newton-Raphson. Initialize $f_0=0$ and iterate $\displaystyle f_0 \leftarrow f_0 + \frac{\sum w_i(y_i-p_i)}{\sum w_ip_i(1-p_i)}$ where $\displaystyle p_i = \frac{1}{1+\exp(-(o_i+f_0))}$.
\end{itemize}
\subsection{Laplace}
\begin{tabular}{ll}
Deviance & $\frac{1}{\sum w_i} \sum w_i|y_i-f(\mathbf{x}_i)|$ \\
Initial value & $\mbox{median}_w(y)$ \\
Gradient & $z_i=\mbox{sign}(y_i-f(\mathbf{x}_i))$ \\
Terminal node estimates & $\mbox{median}_w(z)$
\end{tabular}
Notes:
\begin{itemize}
\item $\mbox{median}_w(y)$ denotes the weighted median, defined as the solution to the equation $\frac{\sum w_iI(y_i\leq m)}{\sum w_i}=\frac{1}{2}$
\item \texttt{gbm()} currently does not implement the weighted median and issues a warning when the user uses weighted data with \texttt{distribution="laplace"}.
\end{itemize}
\subsection{Quantile regression}
Contributed by Brian Kriegler (see \cite{Kriegler:2010}).
\begin{tabular}{ll}
Deviance & $\frac{1}{\sum w_i} \left(\alpha\sum_{y_i>f(\mathbf{x}_i)} w_i(y_i-f(\mathbf{x}_i))\right. +$ \\
& \hspace{0.5in}$\left.(1-\alpha)\sum_{y_i\leq f(\mathbf{x}_i)} w_i(f(\mathbf{x}_i)-y_i)\right)$ \\
Initial value & $\mathrm{quantile}^{(\alpha)}_w(y)$ \\
Gradient & $z_i=\alpha I(y_i>f(\mathbf{x}_i))-(1-\alpha)I(y_i\leq f(\mathbf{x}_i))$ \\
Terminal node estimates & $\mathrm{quantile}^{(\alpha)}_w(z)$
\end{tabular}
Notes:
\begin{itemize}
\item $\mathrm{quantile}^{(\alpha)}_w(y)$ denotes the weighted quantile, defined as the solution to the equation $\frac{\sum w_iI(y_i\leq q)}{\sum w_i}=\alpha$
\item \texttt{gbm()} currently does not implement the weighted quantile and issues a warning when the user uses weighted data with \texttt{distribution=list(name="quantile")}.
\end{itemize}
\subsection{Cox Proportional Hazards}
\begin{tabular}{ll}
Deviance & $-2\sum w_i(\delta_i(f(\mathbf{x}_i)-\log(R_i/w_i)))$\\
Gradient & $\displaystyle z_i=\delta_i - \sum_j \delta_j \frac{w_jI(t_i\geq t_j)e^{f(\mathbf{x}_i)}} {\sum_k w_kI(t_k\geq t_j)e^{f(\mathbf{x}_k)}}$ \\
Initial value & 0 \\
Terminal node estimates & Newton-Raphson algorithm
\end{tabular}
\begin{enumerate}
\item Initialize the terminal node predictions to 0, $\mathgbf{\rho}=0$
\item Let $\displaystyle p_i^{(k)}=\frac{\sum_j I(k(j)=k)I(t_j\geq t_i)e^{f(\mathbf{x}_j)+\rho_k}} {\sum_j I(t_j\geq t_i)e^{f(\mathbf{x}_j)+\rho_{k(j)}}}$
\item Let $g_k=\sum w_i\delta_i\left(I(k(i)=k)-p_i^{(k)}\right)$
\item Let $\mathbf{H}$ be a $K\times K$ matrix
\begin{enumerate}
\item Set diagonal elements $H_{mm}=\sum w_i\delta_i p_i^{(m)}\left(1-p_i^{(m)}\right)$
\item Set off diagonal elements $H_{mn}=-\sum w_i\delta_i p_i^{(m)}p_i^{(n)}$
\end{enumerate}
\item Newton-Raphson update $\mathgbf{\rho} \leftarrow \mathgbf{\rho} - \mathbf{H}^{-1}\mathbf{g}$
\item Return to step 2 until convergence
\end{enumerate}
Notes:
\begin{itemize}
\item $t_i$ is the survival time and $\delta_i$ is the death indicator.
\item $R_i$ denotes the hazard for the risk set, $R_i=\sum_{j=1}^N w_jI(t_j\geq t_i)e^{f(\mathbf{x}_j)}$
\item $k(i)$ indexes the terminal node of observation $i$
\item For speed, \texttt{gbm()} does only one step of the Newton-Raphson algorithm rather than iterating to convergence. There is no appreciable loss of accuracy since the next boosting iteration will simply correct for the prior iteration's inadequacy.
\item \texttt{gbm()} initially sorts the data by survival time. Doing this reduces the computation of the risk set from $O(n^2)$ to $O(n)$ at the cost of a single up front sort on survival time. After the model is fit, the data are then put back in their original order.
\end{itemize}
\subsection{Poisson}
\begin{tabular}{ll}
Deviance & $\displaystyle -2\frac{1}{\sum w_i} \sum w_i(y_if(\mathbf{x}_i)-\exp(f(\mathbf{x}_i)))$ \\
Initial value & $\displaystyle f(\mathbf{x})= \log\left(\frac{\sum w_iy_i}{\sum w_ie^{o_i}}\right)$ \\
Gradient & $z_i=y_i - \exp(f(\mathbf{x}_i))$ \\
Terminal node estimates & $\displaystyle \log\frac{\sum w_iy_i}{\sum w_i\exp(f(\mathbf{x}_i))}$
\end{tabular}
The Poisson class includes special safeguards so that the most extreme predicted values are $e^{-19}$ and $e^{+19}$. This behavior is consistent with \texttt{glm()}.
\subsection{Pairwise}
This distribution implements ranking measures following the \emph{LambdaMART} algorithm \cite{Burges:2010}. Instances belong to \emph{groups}; all pairs of items with different labels, belonging to the same group, are used for training. In \emph{Information Retrieval} applications, groups correspond to user queries, and items to (feature vectors of) documents in the associated match set to be ranked.
For consistency with typical usage, our goal is to \emph{maximize} one of the \emph{utility} functions listed below. Consider a group with instances $x_1, \dots, x_n$, ordered such that $f(x_1) \geq f(x_2) \geq \dots \geq f(x_n)$; i.e., the \emph{rank} of $x_i$ is $i$, where smaller ranks are preferable. Let $P$ be the set of all ordered pairs such that $y_i > y_j$.
\begin{enumerate}
\item[{\bf Concordance:}] Fraction of concordant (i.e., correctly ordered) pairs. For the special case of binary labels, this is equivalent to the Area under the ROC Curve.
$$\left\{ \begin{array}{l l}\frac{\|\{(i,j)\in P | f(x_i)>f(x_j)\}\|}{\|P\|} & P \neq \emptyset\\ 0 & \mbox{otherwise.} \end{array}\right.
$$
\item[{\bf MRR:}] Mean reciprocal rank of the highest-ranked positive instance (it is assumed $y_i\in\{0,1\}$):
$$\left\{ \begin{array}{l l}\frac{1}{\min\{1 \leq i \leq n |y_i=1\}} & \exists i: \, 1 \leq i \leq n, y_i=1\\ 0 & \mbox{otherwise.}\end{array}\right.$$
\item[{\bf MAP:}] Mean average precision, a generalization of MRR to multiple positive instances:
$$\left\{ \begin{array}{l l} \frac{\sum_{1\leq i\leq n | y_i=1} \|\{1\leq j\leq i |y_j=1\}\|\,/\,i}{\|\{1\leq i\leq n | y_i=1\}\|} & \exists i: \, 1 \leq i \leq n, y_i=1\\ 0 & \mbox{otherwise.}\end{array}\right.$$
\item[{\bf nDCG:}] Normalized discounted cumulative gain:
$$\frac{\sum_{1\leq i\leq n} y_i/\log_2(i+1)}{\sum_{1\leq i\leq n} y'_i/\log_2(i+1)},$$
where $y'_1, \dots, y'_n$ is a reordering of $y_1, \dots,y_n$ with $y'_1 \geq y'_2 \geq \dots \geq y'_n$.
\end{enumerate}
The generalization to multiple (possibly weighted) groups is straightforward. Sometimes a cut-off rank $k$ is given for \emph{MRR} and \emph{nDCG}, in which case we replace the outer index $n$ by $\min(n,k)$.
The initial value for $f(x_i)$ is always zero. We derive the gradient from a cost function whose gradient locally approximates the gradient of the IR measure for a fixed ranking:
\begin{eqnarray*}
\Phi & = & \sum_{(i,j) \in P} \Phi_{ij}\\
& = & \sum_{(i,j) \in P} |\Delta Z_{ij}| \log \left( 1 + e^{-(f(x_i) - f(x_j))}\right),
\end{eqnarray*}
where $|\Delta Z_{ij}|$ is the absolute utility difference when swapping the ranks of $i$ and $j$, while leaving all other instances the same.
Define
\begin{eqnarray*}
\lambda_{ij} & = & \frac{\partial\Phi_{ij}}{\partial f(x_i)}\\
& = & - |\Delta Z_{ij}| \frac{1}{1 + e^{f(x_i) - f(x_j)}}\\
& = & - |\Delta Z_{ij}| \, \rho_{ij},
\end{eqnarray*}
with
$$ \rho_{ij} = - \frac{\lambda_{ij}}{|\Delta Z_{ij}|} = \frac{1}{1 + e^{f(x_i) - f(x_j)}}.$$
For the gradient of $\Phi$ with respect to $f(x_i)$, define
\begin{eqnarray*}
\lambda_i & = & \frac{\partial \Phi}{\partial f(x_i)}\\
& = & \sum_{j|(i,j) \in P} \lambda_{ij} - \sum_{j|(j,i) \in P} \lambda_{ji}\\
& = & - \sum_{j|(i,j) \in P} |\Delta Z_{ij}| \, \rho_{ij}\\
& & \mbox{} + \sum_{j|(j,i) \in P} |\Delta Z_{ji}| \, \rho_{ji}.
\end{eqnarray*}
The second derivative is
\begin{eqnarray*}
\gamma_i & \stackrel{\mathrm{def}}{=} & \frac{\partial^2\Phi}{\partial f(x_i)^2}\\
& = & \sum_{j|(i,j) \in P} |\Delta Z_{ij}| \, \rho_{ij} \, (1-\rho_{ij})\\
& & \mbox{} + \sum_{j|(j,i) \in P} |\Delta Z_{ji}| \, \rho_{ji} \, (1-\rho_{ji}).
\end{eqnarray*}
Now consider again all groups with associated weights. For a given terminal node, let $i$ range over all contained instances. Then its estimate is
$$-\frac{\sum_i v_i\lambda_{i}}{\sum_i v_i \gamma_i},$$
where $v_i=w(\mbox{\em group}(i))/\|\{(j,k)\in\mbox{\em group}(i)\}\|.$
In each iteration, instances are reranked according to the preliminary scores $f(x_i)$ to determine the $|\Delta Z_{ij}|$. Note that in order to avoid ranking bias, we break ties by adding a small amount of random noise.
\bibliography{gbm}
\end{document}
e>‚Úû×§_$%™•½ˆ  Ñhtý€Òè¥Ñ§wéŸ|?<¿ûù£)#U&ZåYô|Œtš'¹‰ •'¦ÖÑs}Ž?¹Þ¶õß]³Û›TdžÝ^ñ¦yæ7ži\~Ù}yþ52IZE{­’ZU,å=sž®¾qLNƒ|Ï2qzéPñÅv îë.×±=9”ŠV‘yy®QѼJRSÀURÕJÝ ÀÎúæä^wºŒ-þ|JÔÉ$FÞðëµw¼AOH”±NuÆçé*ª“ºÐ·Wp—ReÑÞIUˆeÈÂ÷ròd¿î`Ê=ÿØåElG?\eá8Œ¯~:#•Çÿ=UÆ®G­'ž¼ŒÃi´3ó5,GÝrWÌ]ÄíÝv¯²¤Î@a6+|¼ö‡É}ÀÛæå†°ø‹ Ž©Žï‡n• ¶oxic§‰‡Ëä;ÿÝâ̸|v ÉnŸƒoŸÏ"½Š¦÷*’ï+ô}N¾ß›ª¼³jU b€°—Báž2>3sãÂaô/¤:æ|Ï>›/<ÁÁ™ì hV¤eü¯¡“ã†ãÝ9p¾;ßípú¶¥ól ²R™‚ÝÂÔ~º0qsuLÜák{þ:Ü3ŒÓýêÁ™Á €*÷CNÇÈÒ³ñek7ŒrT˜àzvl–Ñ~KëɃԃm‘Kß]&Tü{š§.9¡ÛL FýÃÛþJç §tÿ£ç¶iÀ Ž÷/j±[1Gt!]¢W ñ*¬úîÒ2øáNÅ‚Ud'>Ø ¯1z˜¸A(O†®z2>Œ®¾` ¾7™0_® ûÄt?L[Bl|ÊÁ[Ìx¦”ÆàùAcÂ}ºâþ¯Þ÷€Ý óŽh?rLŸíwvß]J ÊåuìÞ,Z¦‹Øžµz¾?Œ®Ùô¼¡»Õd ãµez=ˆÇ"¨Îî„™õråâd8‚ !Eíö*Ƭ‹·ù*2.—ÖSØÂà¿W?‰¤ÑÙÆ£¸]¬Ódô …CëOç9Ñdù¢*Æ]v˜ó ,|.1š³*ß·è5œ ×@Ùö4ŒpÅO&½IkL#­Cj†#á¶xÌ6‚! Äc¢ú?y42:>ŽC‡”ba8µ*Žîâ.w)MÍØAŽ÷]kO-wbÑŸõž?Û ÛYðv¸¶vDiR6ˆšKDðçqß}Æ'ÆU>M’Ã7ÂÝO,qȤlíRm؆ç)UÃ÷=æ û]¸à“WlJ†™ëÆ~?>ýIö"ÞSûRLÓí2ʆ‰ÒOá.þ¯sœ™:­E#Ã]µ îp ™Í@M”bEØUJ2„©Èÿ¸«€Ë»¦³ýO2íAãï 1ƒà¼þá°2§w3iŽ*œ²HOgÊ&À$MZ¡íOžލ‚uIdhw×`š ê,EövÖ‹¸Ëu\ØO'·ŠÔ™òš; Ž®Ýª½Éây™ª |;Óµ-‡ëÌØ‹7Y/‹¯g+«¬#L1â€ø±dÁä Ch  ¥¡N1ƒÎ4K ¡Pg 0Z/Á4¸R)hã>‰“ —›2G¨B+þr6ý±ñL©‹®´´¸Ïd]AVs7aêBX±‹i9.£ Kq%6ˆ:ÂÒ=w¥³˜œ‚ ¿[eç;+œó²:+t»ælµáI1¬¾Iœ‹a1QºFjŽW3? 
(QrTsŸg±€ób»BûŠ µÝ:$Œ1hrª,~b§ZÞ:Ë áÀÛª›ÝR¬íz`izn8ûU`'h“dØU³Îf ¸§‘ «1‚ÖŒÛÓeëûqQ&UU Ê’gwuv— i§?ÁÚN,Tb”Rn‚•³ïÝ6&U•˜ùìÛ”„¡\RFÔiª$“¥s» „ä‚{¬¹òÚžsiZ68-pªPv¶)[ϲÁdo“ë÷xÀÒˆl8’ª†ªïT§ ŽÿŽàr2`µñÙ¿„³AS‹©h¡¤[2:[¼‘—k˧QjÐ7-’š{Yàµ<î¡…Ú z…KÕA€W[Ï„[áÃ+ƒ~åßaY[’U ¡ãŒ‚¹‰h0ïŒn ¾\†Œ°Q×o»„§YNÜÝ—“ñjL¿KgÌâ6Gº/x0óøîß8Í †)‚¢­1¾a2…äÿ h°–w9ÙÙy¬ ±ãðe€Çlæ}õKgaÎÂ/–{ê¥ñ Þ4ºdç°2ù$ow”uíëÄh±áä_uû¯„NT–ã§ÐŒÇxÍá¾àZGà2KžÞ°˜¤*ž· E’BE[…(ôcš*h³ïÑÉüFWóÛym¯ ^Íð­¤oƒUiÎ+i¶ˆZAG€ùÔªöt¡“´(g½èåÿ\P›–PyR¨zãVŽgz½öì]›úч* A€:eüæå_µHà¶Ü’§I]åspËó·çwÿ¹i1Ï endstream endobj 14 0 obj << /Length 1261 /Filter /FlateDecode >> stream xÚÍXmoÛ6þî_Áo“±ˆåñøº"ÀÐa)¶ÊyÀ€¬¼Xq´ÅV&©íÖ_ߣ(ْͼ,’~",OwÇçyîhÁÖL°×3qÏú× h  rkÓh8xÇ.6³ð`’;Ô¬.ØåìÍìÕböâL:æ¹7Ò°Åå~'(.°ÅŠgßm˶\^—‹ùÛÅ÷L!GëÉ“‘>üž¿8CyÊ%¥Â²3»Œf“jî5…Ö½ÿUhM´™ wV 6'|.”9yŽˆY[…Ue¿Ïsi³"þXÆwÕ¶i—Û¹tY{Òeå Gù°¬ôse%­ËN㲬×óœÈ6å¶SÐoÑ…ÍÊsEnràŠ"é] ì?“ÏH/™£ß6Úý˜p‚k0!me0šýÐUM°ÜrOëž•½;ÚJ€?¤q ‰DAr© zç‡Pöe›Uÿ¤RçΩIæeÂa,½¤r;®Ò8FϵõãÃà]ú¹4–;*INqymâû³¹Ã¬ªSÑ“µuƒŸ6yêvt´Z!åÊp#v ë£ÿ*,&.vX^Ƈ‹Ä×plvNVU0Ùé Ë P-=%G;d_§˜ùÞè<×BdßT››wmÑïª'Û¶X/Ûò}`Yÿj]/Weyײ™lÂìÜØZÕ–Ûu|PÍ ÞeÄÙ"Q /HÓË‘NÓõGð1 ƒI`ÒŽà32:í]©qñ#vx)¨JÁ•ÕäxÞÙ}ÝUîÛÅH©¥¡ø 0¥‘+¡‚RŸ¿lE/‰\‰Y:Ó “ÄÑîÀ¯ÙOƒvO¿7øRŠr>™k“”(¬õø‘šÔ4ŠÔ}5="['g@š‹ Õ³3?Y$:‚=_Ÿ¡HC¡:üIÅ­³Œ-j×Éo:âBN"1¸»Lˆ3Q˜Dz0²ApÔ¨.¡‰6*‹ë:ÜèHB€p´m¯¿Ž;R… mVõÓ˜L ðYÙFµ\FY­‹5éfSV} ÙT„®Šë“$·hbqr(Ê:9Ò€ð`‚¾o¦ã Nbt7u±*/Ú ùÇÁiËîõ6eÇH0u—²ŒTW›ƒŽuQ…Öô~N¢¹¬Ëe[4‰4iT´d¼º›ñfÐÁÉæÒk®§ Ó ø*pÕ¬ë9d}ϵY‘UÑ\LšpÓ7ñU¦üè IõZɵ¶ àh°ggÇ“ÌæVGÇ ŒÖM3Bhà 1B!ŒFЃq6ÇNÒòÑ4ùKâK‚øD—À%øÿ8¼*ß5S­ÿ§¦ÓÂRp×V>í}FßÇ óÈQ!—Rf_¦§`z¿ å:ÝÆ¥yÆø$žHk,ê.yGt×öš&p$ù.h*)?Çyµ<˜Â1+š¶ÜôUV]&Ih1;d>ñE=*M ®¥sš›Ñ¸Ïô~| ¶¢äÞF‰U‡®ÏûÏ…'¡Ø¿/÷è @©Î=ÞÊšaܳ†ÿµ¤¥S‹¹í±aóñ?cRÈ.:©(ÓáºrV®ßÕ=Í€®ÛJépû§ñ«,V›åö‹&’îõÁµ³WC»ƶëuU—íÕæÎœ¨Ì@w» C: Í=Jäd%þ ¹”Z endstream endobj 23 0 obj << /Length 2613 /Filter /FlateDecode >> stream xÚåYYܸ~÷¯èG5àÖò%ðÃ:Xo² d€ð:ÝÍîVVÇDR{ÖûëSERçpzÆŽíÄÉK‹d—Èb_"›Ó†l¾FÂóåͳo^1½ÉÓ\2¹¹9nW)UÙFÂ3ÏÔææ°yü}ûææ‡o^q:#ܱ”f’åžìèÉû‰4p”ûÿg"ˆ'dFÂS š_#{È”'Ó&t»£„äExšö#M“ª¨§^Ê×øz¦R©Õf'SÓ%§jN–J)#¬Ò O)½Çêr%é‚Õ˜HX–ofÜ|aZ¦Jov4Í@|ŽèývÇeòûS2•B]“(òÍl¯Ÿ ¥Ó j¥ôɇ݆CwTª¨z©òO®ßAp ¢R42‘Å=ÁµÎáXñµ+}8.¤%•ÜP– 
$rŽœ<¸>¨“ýø‰*Ï{"¾¸ËÇûý#"!–òIŠ×,p<æk±À•ìUª@‹Àa&Ôx‘Ì™*Ùì딎/¯¨ïñÇáz××ôN2¹q B,jÄòµÈUÊø`ù$žl¸ÚøâfÇ)lÎ"™N HÓÿu«ybAüœ'wEYúѱÙU²¿t~ÞÔøÌÜÍêCQŸü²íú¢2½¨Ž1EKžªQu_0x GÝe¿e \ÇÍú³é=à(‘jÆ@lZÿ·†?¦4â > ’‡aåÒãàuTO€Aþ ²'À ÞLÌ\Ç$ý91 HÙfvÊëk˜”¥|:n†IòAL¢ùô.Ó»ŒƒÏîÒJó“(F=ÙÂÞ,ÁáõNj’»Kˆt0 ŽñÈCGˆm?m¦©lß{°S–%í–"šÐä4Žº®@ø`Œ'•‡•ƒ-;¿`ºîRY?¼s­MS1éøKº0Utþ^ÆO—zß×¹+ús¸‡8x,zëß©Q>— ߺ{Û6¤;¹J3b@í\„$ÀZ$·^¢¶ížG¤!üÌæ†@³º 2•ç¸mž˜úà÷Ÿc6Lû³­üè-2úÞ;[Z¸¨ÃyGÔtÖßm…LLy™Þ­¹ SQ¿:€”Ç—-›ÎÅ 5“&ÎPs6=¥Û]&€ƒ^Lkþ/Û¶Më‡øö€ï*iÝßm½lÝŠñ¾5ÀŽS*éLu[ÚðJ,H1î5¥?Ƽ¥€Z hÞv¶õ²Àtþ¶,' zî+ií£ÏC ƒ'³5èÜ·àÀ;ÿˆxkжóÑÛ„8]Ì8’H™:ös°éAi‚¸!1b‹˜ø€jÀÞQbO d n‘a ƒf•daúGaÛ!/û1’Aí¸Ëö¿|ˆ?‹œDOñ4etži­ÂKõ„ñ/è†á²E%>¼ðŠÔ±©ÖÙ¢<("û@.Iռػ–;#vëÈU¼FóŠðµ8+Â×}æ3°ý!6‹ó˜¥†£¯cVžE4æ½)ÕCº{¶hüŒ%w5¸è0EÄ Ôø•%£*Tižÿ'2V7õîv º¦,ßûuÀñ00Ý/~‚ξ©XÃJÕ´êPw!a -WÁ•²¿”=€±È³äÛ“q˜¥Â(6ïMX¼mC½`bÃJ¡ 4­ãN±ê,y-ÞøŽô”@¶Pß{ªæÒ†.mk]íýÂ\-Xt­hÔßôÑUöI’¸{ØË¦ `kCÝÄ|]Ú»Pn 0rFçÞòoXÛ‡e¡žÃ!R†O-(ø½Mw–1\gYògÛ»t!rp?¯U°†<õ8´‹ZÌ¡åEì4’ ú9|óS`!Êõ¹÷:0Ê”¬œnt-Õµ³šÕWç˜ëØq ¡tô xÁyüs°ûÖš.lƒ¨+ÉIJòG³)+9•­17壻ã¹S¦ØbvüÕg¦Ù‡Øí1Ú:Þ4ŽäP–BއWzjÿ£ùƒxRþ ¿†üaÙ(†4:ÏÀmèæ‡V8@Êÿª?>U|ÙT­¹x’–äZKÑ£gîû»k}L¨ˆE¾°1ë`rš¢Ë”î3ŸLK§'(ÆÞÍ´€xe& aWHdÊøxúC`Å?ÄžVÈwQ “s9²Ø÷9­ õq`Ý-ÿ—˜:©$$¸ñ÷é‡ÌÎKidæñÁU‚÷Åõ !üËK#¤ß/”æcZ··ó`çÂ_s)CŒ´!O9›©!àW†ögb’éVÙÕ<’~2šˆ ?e¸Ì£äÅ$¥¹Šõ‘W=>.²I$‰cˆà|3ÛéãÏ“©&³"‹Ä`G3]sÙG¤TŒû´‘MtÊ {Ì¢>C2°ª)‘¹o4éXÁJu@ŸÓ Ì»³mc96I¤¾/—UDñQ.ØJä"•š$P#¹f,¹Þßâ¯ÞÞú%S6®j€Åþì[ª8žaÃ|®ëÚû•™ï`1%îJë 3(´†dT¹Ov œèÏÈ/T!/ÖýŸ/˜Ö/Û¦ò#W½e®ÍÛ®…‡“ScJ`@ÁðUÑvýslóAÍׇ—j‡a0ÂÒ³ïb×7¯û(„/6e­&ì=oYº³Žã‚to¿’B¬T‚Ѭ¢t}Éœ€ã§&œ”SJz%XYwA0’Êä/P«×” ÉœdöàË•ð°qÝô~Л_†‚f…Wqãß1û}s‰Ç/_žûeƒrÝËÜq8ß7·9kùˆÐ(~#%W¿¿ ›ŒI‘qíز,<ó®?@B€ðITþb³Ãݘcó™<ôÁ1Ëçn÷Åú3¾ 6½l¼Ü\ñÕÙÐÖ@®VõÞmÛ¼-måî"1 ¦kIß"éзM׷ͦîU™Xðpvr Ï“­mkÊâ7€Ö\'Ýq*’?6x„·ÔÐCw (¤†š'm1|ÁÁ“w±–K…  Œ°¤»œNwta6}=À©ñ}iº@àRxedXöµ.Ž.˜ ø¥0؇V?Zi[8Ã墬ÌP‹®ô(z‹-*¸â¯¾}äÃÞdJÁ_Ðû¨.¡·†o{²Ö¼†¯YîýÖZôÜŒAêÐË¢Û;L†'¦[Mš#k‡ÎÙá²÷ð”%xg‰sc\y鉛.?L•§¦W­ü´;{¥Ö~Z„ç«âti­£¡*ª’og[³ÇÙ9¼ÕƒÝ, "’äC¨bspý€Ÿ®*B·¢#Ú˜"Ã|-}ü¯‡xPu|áa¹¶ö0lßĤZTAÝc1@ødJ.ÂójVfv0}0Wסœ¬¹ó›„?oMÛû‹&œW>?ØÒ“…h8ßÌiò7Žc<øW™7%æ>‚Ë¡£"¬8aaÕtáƒ}ºjʘáÏ€æ* r‡Á¨Ì ¤NþT¯ö¾LPCÕ4¥JTæîBÈØ¡ÇÃ4ßÝ<ûk¼M endstream endobj 27 0 obj << /Length 2707 
/Filter /FlateDecode >> stream xÚåM³Û¶ñî_¡é‰šF ‚_íøLíÖ=t:“7ÓCœÎà‰xjŠTøáçç_ß]ì‚"õ ØNÓ¦^Dr±X,ö{WÉæ°I6|‘ðóÛ»_¿–妊«\æ›»‡HÓ¸ÈËMž±L³Í]½ù>~œtoêéû®ßîd^DM7 _Ák)¢·‰o“,Ùþp÷ç¯_§bA­Jc•K8Ê‘ybŒd“ÇU‘–ˆ¡â²Ì6;«J– ÐIãT›Òï·;\<îúIœ…?óÂX–¬–Eæq>øÊãDå!¾Vg=ã ŽCYLQF/yËò*RÄI*7E¬ I{þ8.‘åù¥„öD–wE\Uåš§b±UÆ¥(ýÕ^а„²2ά®¥t%ö²P?O{jAFÅRäk)%¡ãÊ8W³âBªÍâ*û%[|–bó bŸ ¼ˆ UÁŽv8\ ¨@‘Õf·Àbïv —ÒÍ¡ëíx<Ñç¾ë{3œ+êÚzpêßI™Æ S€,Ø=ͽ›§í.MÓhìð©"ØkëI7UãhÛCìÈWJ­©ÜMo¶;%óHÓK½ßføe»i •Ç-p£·°ôÄ<ŸæÃhÚš¶év+"~·§sßá®÷øÃLþZLãÈçÝëÁî öÐë“qÇuý;º{ˆëa:Ì0š…–D¶¥çk{˜ðø.b¤°m™F¿”ÙéÜTB¡h©·¦>i&€æ%“Ä;3@†é~í8Íg †®Ô£°G¹;»7ƒ÷Õ%«È›JáfȃJ38 ä„I‘iº3Zø'!»Q¢ñ(å-϶3\Ñw÷Ó0°7°ŠÁvÌÈ#l ¬Æh@ZòÌFªï‡®+Z%=«GÚ_% æšàšˆ=ÓÓ&Zy˜Ú=n@•ȯ¸,hI‚Æ=ÃÑÙË£“DMXãQ!™jTm •‰ï•ÓêŒÇl žïh}ïN†o†vnƒP4Sg·uVï-µŒÎ÷,\á«¢3_„ ר3æãÑ 9Ü<5£n x’;Å`êÉm†÷}w:O£—-F{2 ³²RÑwfKzÔvØOÃ`PÌ¥Š†îÄ„ºÞ4C›S縯-†„½f帀¥J )flˆX°/…¡æ dd½qþœŠè½pZ:ÛLQt€¶9Îð“[ƒè"š†yq7‚•±ït.âo$f `²\ʙυ‡4ÉÑLû–(ÀW¯GôßT:ßÜy¥3È'¢izÔd¼›%‚ä9ÿž‰ýÁ’”Ó"#÷ÁKLšì€Õó<¶ï˜gHí%<9šÚmsV͇sÓõNm`h÷06fÆYR-:M¢ù/¦øÞGâröx0Ú†ƒ™¤ á¨×Q—(|œc1„oĭ˹¤L’‹U/–í ïÚE̪¥‹ ”t‚$[$åÅzo;âæ§-Ø2X}–&Ñ›–PϺ߅ìx´û©Ñ=kûž¶ƒ" gYÎí»†‚¤r¶!XíÎàZö#».qŽ5kFOð…µÔ[›¨<÷=Ecpâ©¡ÈdØçRÇU<ÆoöôN¬­D×ó¹y¢sàËÖ vƒj×èYÁúîÄT9â‡Nxô*‚wŒƒ6Ï=³0íYÕm×î@ Ž]$u°aÔïÚCLsæ@Ê/ø†Ñž”cl¯ï3ïõu´†‚Ñ5Wß É$#gÎXÝ»¯#-P Ä7t|R t¯¡ÐG+ êšÞjÛ.¦çþhIa’9±›ðö¯”{¥\~‘†ðí¤ArÇ”Q­G=˜“d!DôMÓœyw{¶Á‹úÝ‚„®É¾˜¹ƒ)ˆÈ[®0-•/X#¢¯RíèdÛE‡+"“<–®IªbYrGò÷@]]ŒP>rYýËÕ½ºªîW4¨º¿v$‡6C^pTj‘”;iÁñ¿ÿbÙ—] Zý–¯n4p2‹d¦û1È[RˆçÌ=k³KN ýw@fþÄ›Ìò¥Ìn5pŒ÷ÀI2.òâç˜ÝM)þæ1̵JÅ"]M0*OsÞÅYš@OeG%Øý¾s¡*‰4ö9Uu‰ø\Wñ1¤¥Ï@A0…ÒÄ·ª@•C®µ+6§àdÑ+q±„ >¥›zÏäÌ–2Áru™hdZ^%‚¹€+Ó"zš¤µ,Ôñ3ÑÇ  Á*8Ÿ3ù0¦•CÝ2¦ò‹ÌûW‘Foè’ÓÞ››ú Ì‡oá4‡+˜Èð ³=i§j¬Ó ˜Z þUpvR”ÿË‚ä ‰+‘1Ï]9„[3´Dl7º"kYÏ#–ÔÆÅŠY7¼Ì-ù¢»ròuÝÕÎyoÞ[Ä¥ÃAhä.ëfÝ)wKEÆ|Ù²(…/,JG*ãÔ“uÅS®Ö‹4'(*z^½FŽ2ѱ‡Ê¤æý=(\}”§dc©“Qï+ÒÖc@-Ul!¢·"Uq_ʉ’ûD^xÞˆ0Íô2Ú… Ø©T¹¢Ä45•VPÿãF ðêS‹áSïÆ`¼µåup„Ñ .`Ÿ<¸×VÍ…«#izO]ûY™m÷ÍT»9…ÌhÆ#³[õ–®k»h”ÀJœÈ> ¤u‹Qî7èì*ãþÑYÙ@Ùv#›_?ñfÍK® dÃå À%ÿeø}ö¸ÝãÒvðV;Þž b+¦Á(¼ðø®Ö#†ÑœéÍœ É­prƒuIIÁ!]&ž*‘s®Â%šÒ$—૮Äí¥Çì7'„ÎGV¢¨õ‰ûÓÖË1æV¤#£ÐiÑú²ˆ’4N—½ÕÿkÙ«~¥‹qÙûœ+¸Ù¢”•Y§RÌ8·ªf‘Äyö‹TÍê3«æì?[5™ÈõªY}VÕ\B7š­”"’PÙŒÿž,KæoŸh„䇴v´È„#ƒ„–·×µåÖŸgŽ.æ-†OCʪXˆì6YÅI2óíBº¤?2ði~œü¨ 
¾DBÜÌU½\Tõ…‡,`ðÛ}VÚ‰¼Æò` Ðàæešœ Ì3´pÄE$<TجA¾V,sS?âwTëÙC}6fä²Ì Ùþ”"!—r¾Æ%Ö.bºÈ!M¤îÿ*•ñöj…óêîÅ?ÐÌ  endstream endobj 30 0 obj << /Length 2667 /Filter /FlateDecode >> stream xڽ˒ã¶ñ¾_¡ò%TeE/ŒË‡lʛć8åLÅ©Ê&ŽÄÑ0¦HIíÎø’_O?ŠÔ`^µÙ\D°4ýÕ~•­~ÿ&óÏwWo¾~/ݪH‹\æ««›•P*µ¹[實R™ÕÕnõ÷dÛwðù¸6&)›zWŽu×®E’®7ZÊäÏki“ª¿éúCÙn«õFæyRü¼†‡Kªaä×O·U»þÇÕ÷_¿Wb~ªËS§$ÐDç}ȤáU Ú\ªsÖ„JÿeÓÌA69ÂÃ&@z}ÝTsÐŒX8eµ…Iea` ÒÂøKªÇ[\m’]µí«r¨Û=¾çÉ¡ì÷u[6<{ë¦×pË{nIUÏ›Êv·„>ä‚Ô"Ú<Ó:1qŠD`’¿4ð à¦"ñ4UÙ·žn“ôåX |c©s¸©]Þx×6+’¶ý ÚVÃPöusÏ€qoKä&¾Ž·~ÐâÁ§þz÷<ÑÝøçq¬È1|©Çª'€|g\ru[úëáízc2пñ…É–”?Â*‘¥Y®¯¾E^ÙDÄô¸á&¦þ&‚J‚Uذ CLŠEŠÀ|H¸æÃœN 9‰ö*F¬HµpsõFŒ"Ëø ` V$Ì;`”Öy²@猱Âç,iž-Š6r´K¥˜ø´”:ÐPŽa8¢¤& ÊTqa3qiIXzÆÿ¨u[Ìd%­FGdeR]Øçd51R¬7J)‘Rú åp‘âI e©ÑLBˆ$”1ê¹^ÓníV̉Û7BšTY ¼³©s£‘©¹ÂTò×u¡`;;Q…[íN[r¶ôzb„Ãát=” <$­Ã¥*ñ~íTÒ×Õ\lÏ Éd–Ix †AúîHŠÔ ÕŽ¡Ì5 #ëØMü¶ÆzËû¾ÜÕÙÿÈvÇ42É/›}׃O=Ô%mpìTj8ÿ€ƒ ©×…}“þ)t५_Nm~<Ì¡ÃîN€gä÷¾:6å¶: u6oèž~ñJQ$ÒÊ¡òK¼Ô~¢¨Ìèy}I¿V`7wèÅ´¼` Î cugg­LþPñ¢›îDþC{«"àxK†°r·ƒ¨Œé(Ò$<Ú£ PŽ ï †ÚS øH`· 7÷¬ôþ9D@ Êä2ù u„ÔN†ûÒ[°÷վǤÅZ©’ïP™bvZè47“Ëû%Ê2+În!ŠF¥ÎNHPÒyòMÜ{:p12Í¥×úFm$ØÍlÙM”®Â<$Ëd‘uÁ.Oç3$‚lIš;h›Z=yGNZx³Är™¼…LJ†ô°Hõd¹Im¦—JÍvÆc8º7o-*XKÔÉeÞ»¹àÝ”÷n¿ýÓk%ÀÉÑ {1H¬º($À»¡&G÷¨3óâú]Õe ª'ÁYÞœZï!{KÊ£×þ;²ÖÎ/:TàHéwCJ4lÜÎ'OUr’óµGîüà· è ~ý ”ÆÃ£Îjô&‹¬Â[v¸µtkàãâTºõW˜Æiíó ÜÇÁE“äàõ|Ñùä.8ÏN· ­CÌš ÏÍßH•*Hÿþ¿À1ý[¿tá+l* îÕ¹gâß<ñ )¬rL”„) [)³t<è³A¥´F/®¸À¦~’ƒlB/-.¡SÉÆlÑKÜÔ]”ø¼Ð/:Î]‡|ƒð×1¾ÍslSàó4±MkŽÞ@ÚýsŒy2-2KíÔë˜7Cy‘Øå™ø¢,\ y ±oÀD ô‹aUR.Pþ•J‘ÉWJE<'@gó¸XÀ''ÿŠ GCŽl^$Î ñeô,ÔKÕ\ÿodä^##“ÑÊ ½Nì1®Àe•zRì‹HîÅ>9¼L0a”ó("V•J§ÍÔ¡¹,m Íq Ï,¦ÄbÚ³Ú(X ?9ø¬ïŠòJ• Œ’M]ùÌêñy‘³SúÓ}Uanü#g„`nO‡£Gƒttmå'ŽØ¡!Dºú¼cì)XUƒÏQhA<Ýç•;&œiò³¡‹Lè0qO j¦kˆ…¿1Ý o¸%„}®:~£L ,í©7Û0Üûƒnø 4aÙ ™*¨ Ù†ðЂ=SÛ¥cÝ~ˆ]ЉQ˜¦seÐñ§U˜ÞC^€”hÁe_ðÈ2Èʰdðû»‰„ôîÝ=@Lc_o}å5?aW‘ð9ÚiŽdFï³§ÓõBŽR@aTr=ÈÐXá‡ð©,æM¡I…ãmÙò¸9ñ¤ïžaáYýŽZhã$Ì=LôÒÝ?ÜŒUëk©ú#õ)rÎÀZ_§‘h•Ï )AåÈ·õT=’¸;¿ ( ¾ LÝùZIr¢ÓÔ;5x¤ÉI=yÁ,‘‹Tf̘¦y˸öU |kê_ˆëˆ'܂ك—LÕ ~³ ¦Ú’RÎÿ¬•Ãr_fó}ï¨rä–%7޼mJ0p´‰ª;7%“Ç<‚½ˆ©;vA•6IuGµ*©ÉOØ!"¸×¼ü!3ÙcÀòYÚLM~ v°éP6ޝdÅkßyÊ« ƒÂáRR8ÚÚ€ý<£íÈ‚"96+)àCïm%T±Åç§~¸¾˜G–X?œb‰¼„ìÂM]°ÏËoÝ ü/Êo¹l[hó¡kAÞl3.Kn}o›f}ªÝ“ÒÂlýž %$__B(‡zëû:HN º§¸ }ÑB¦Rª¥V-¨¦Va»hK’Óið 
slCëy¢Öž:‰Ú×Ú?VM‰žB‰Äwë5§ÊòxM¿.ú…"$@Ší‡F»Šµé0 9¿«ºÌñ_É}ã4?&ß×û•߉uuï‰÷î ûÏgâ¿b:—ˆñÆÒ›Oý,Je¬æû‘{‚IŒ{8²Éu9„)ÄÊÐ[ÇÖ®ñͰü"y KÒrÁ÷K‚¹µ¢¸Ì× –D´ÚØ4Sö9¥V¬Ôæq¥¶©¬Ôfje#q…N1oÄÂÙ=ÝDÓBàÄ2ü{ŠÑ’ƒƒRhðÂ›ŽŒ¼É€Åj“Cac§ØÛ0+Ißÿü÷hÙUˆX5¼ kÙ¶)Šeáå³6H£ò$4¡–54öK.›N(vlF8ºN‘?^÷lŠ4ƒeXd[Ϻ?Fh4©ÈÅ XgX°n|ªtpñÔgÆEªä£õC–fÒNvú˜JJGòËž¾‹„ð‘ óÌ]$Þ%³nv—j`¡l›]€r ÈHÇ2j¿Ü7Ä­¥‡ó‡ ù*LÜó˜ dÌ¶K¥yiPyÒþð‹ÜlQ9†kð¨ðPç+ÊŠ³_[lu¡|6yS£þ^)ÀÂóì%í9•Óód’õ°ÜWž¿,ôå¾ò“·~0ù>αq?xÏ™û#Å£±í‰Ï»ùçÆï³«ó‹Jú/ƒ÷ºTÍ;Û\ÑpO!  -VLGv+^3;—_½° ‹Ÿgö]øt‹ß¿Ò¥± “§úƱ’©4‹%ß]½ù/„ÿZn endstream endobj 34 0 obj << /Length 2094 /Filter /FlateDecode >> stream xÚÕYÛŽãÆ}Ÿ¯à#…Xí¾_l,ØðI#ñŽkàŒ(‰^‰”IÊcï×§úÂKkZíì&›¼ˆ¤Ø]]U]uNUg› gßÞàg®¿Þ¸âŒd„1¤¤Î$ÃH•=ìoì &qF‘f"kËl}óÏ›¯în>¿¥:3ÈH*³»õ4k¤•ÉîVÙ›üu¹+úÅÏw×úüVŠù4Ž —Ù’À_TøY?Q‚OæDKÙ;«°\,–Œ±|×t½ãùúX?ôUSûÿÂÂÉ‚eåL„ˆÀ]±ªº¾­înš­')XƇÁ ‘\²hI8¸J°©ß–^ûzAU~\|ooîKø…W$oýûfí¯U_¶…Õ¾ûÌëÏÈÜÇ„ CFî&ŒÄä3Nc°±bS£¾-Ë.!‹I$ ½Þ] Ü¥up—üw­ÊC¿õQ0x¦,Tç[ÿdU¶îKz ¤!€ƒÐ¿§¼D7â /¾®j»3.‘×î©T­awäÕþ¢D!»ù«۶Uýv!D^lÊ)_š]»²hëªÞxB€•N9÷îP´Å¾ã’®41> *v˜ñ>ÎôêZUŸ ã aÅ®w¢ 5û(N<ÞwÅþ°‹•òŠQHˆqþ! ˆ]å‘1ï‹ Z‡øº¸¬Ñ qfC QFýÛ¿ÖU_»êónÆø0_ÒåÿJX´¤ˆpa/ã°uB RI#ð¹}þ=e Â\E–ø„n¼ÏGh´û¢¶éß»ýÈ´D ý¯Z%ž³*Î|0Cç¯ü¥h!¢ˆÆù¾ »Ë0L1 vƆ¼AœZêPâƒFÃ2$Z† Ñed€×$ÄY ³¹d~ØwÎk8[*dŒˆä«ÙTpb3^‘4rR°”"£Í *!“ÛNø å,j‚ ¸ÛV]¼A<¯ºP^ €1w4+¯€"xïãæØUS¡ßo›.ÄU{’6 l^”«pDrß•ío `H@r€`ÔIÈߣüi}µwÑaÿb…Íb%4Ìm¹ªΞÑ@B7ÌбDãĈH&U6r¨IÜw©pœ–‹ðt¢&Z/•0 IMæ ~æ-.ºTsETà d(Ô©¹ÑúÖ\•soÓëK—s×´OÌâxú8¥/¶OP‡  ÄyèŒ~Lpΰ_Q×U3õæÖÿó¤ýæ„|\KÝ }Þ¶àÈ34~\¢‰—4g)Íxô“öëçª<~¹Ê“YÔ–»ÎžÒüOé6OçBgšSŸ 玷£ „/6øÓ×l¶®ÈÐ,ðìã¶lË4U0MOâêÄ–ÇæÀNL¯ç%1°5àJ„ÒßÛšÀÝ4ëÄŽ:ñg9g”_8¡Ð×RI]ú§  í¨  SQngãÂo“;2† Ðg8ÿ˦°í¼× ƒ:]_üÝÑÿ ÆìG”¼:f¿9ϕÙb½‚¹¶ªz‹ ÀS¢èЦ  ’L÷ƒoÅ¡`óG üœz¾É$CÔ*ÇâTá§Õ´ ósŽS¥)w+]³üSíĵ(GÅÒ‡}³ÇÒž]˜8ýâ.cþ’.0¢$¹ðÑ >Ô¶…õrp+pË0P®c^hÅ{ngßžšxÒ̘ɫŠRHt[žs<àû¿üÝßú/i&¼ã8*sM\æÚ‰îà¯ñOÛêÁžõmýcQ‘ª–9GÔ@Îp*×=ú¹fÎåøº,úc;|vŒ\`¿9arɸdŒÖG«gsÜô\»JvèÑaÞð•›(åºÄsßLJÉO¿“S¬× C’ðñå¶Ú€U^ ¥9ç"ÿÊ;¶éúñ˜t8H­ö‡]¹g¬®ã V …nôňH¦F}s¿·1tö@âLð/sGKh×5©Œ$€£þ º † endstream endobj 40 0 obj << /Length 2785 /Filter /FlateDecode >> stream xÚYK“㶾ﯘ[¨ª  Arˆ·¼É¦ì8•šœâ0$±–"e>v<þõéF7(J‹YOå"W?¿n€òáø þúNòÿ÷ï¾ûhªe„6eþðxxPZ 
[VeaE®‹‡ÇýÃ2½Ù*•Ù‡þ|î»ÍVÛ"›G?P«¿LMß›ÿ>þý»yõP‹ºÌKÜL>ls%ª\Ñ6§f„%eÁŸÊü×aߌ»y=Ž[™M'Oô¸wèL=ŸÎL8¹‰Hç~œˆ†|ñ.ÏMÛ±ó~Ï;÷DÙmò*;¹îè‘oäT)Q,p¢iMsç †j 3ò(™UU±‚„býØ#©å0w$`R/ªºŽja )³_¤2Êaê:sÝž¨‡~ð$’o›úfç©ß°ÞËrµ¿*¤°FÃ9áPî44Oó+Ü”9XÜÄÉb³­f[áù㩟Û=ö´Ù‚Ú˜]ïÆ¦}¡}³ÛE‚Áµ(¤½Õçä¦`]Á À µÉÌÐp—KÛì2ç›Rf7•f3TѺÄoÝ86¨(šMã—¡jýy¤žGçj6 ·ÚÉ+p÷jÑΓ:°mª=SÄ™}j78Ò€‹W·»½{ê‘ë¯÷4…Ðv9}´0WÝb2ô—¡½½G‡!Càçz×5Mw¤þàw(<„©ïö~/ÐPv¥K˜²¥Áš© s?d7fë6 æiן1(óRØ “ªŒ'¸Á§œQ‚««(ïÑA˜7.凯sõÃ_d!È4î|nºæÜüäÄþøë ‡í©ã‡¡`ºzŸ:]¢–‹º[wiÝÎ'N×5à¾?=¨FC¨×êV5kŽtUdîiìÛyòÔ[±´Õµ¥`Æà›ÕßòÔÁ* ¾ ýEî°Ï85gpôxÊ$¶°ûÎwhÜeLâ×¢ÈX2™1%Z}ß`À¸–Fn0!L¹[c£õQ p$ ÇðÝØ½ã‚q¾4_6ßòÑ_Ö,•ÄR„TÇà›¤³”¥0²ŽVØõ¿]N [å¥P€ÃWÄ*5&§™TKg\}W×evF7u8üBH)oÉÊeçKߌc3E¹ƒk'îxâ“;χ6@B312BJ5Õ=P´cO¹L46û$b•¨dõ–*„*õ7ñJå"—æMñPëeÞÞ_jÀ‚3ľ:¢8€x×Ë„ÈP½Ç=™Ì¢[U€%ÁÞwÉ4çd2£ÎKp˜6Àýxj.DäÓPBLø}Þè<󾣑ñ44ÝçM™gî–~0ÖáôùŒ¿¼Á@èü8³™(i)À§€A-jÃ|ùÁ½Vèl—ù€uÕ:¯kUAŽçàˆªŠ%‹Z’´:ÿLy£¸4 %Œ²ÄŒPÉ ˆÚùxlyççf:Q ±—àU8ÑØ‡­¾ö-­õp˜ï_‘fÞz/(6ábµ0z ²‡;¦œÌÀž×\R¿É>q®Å¢‘Ys&ßë‡É-n¸”ƒyŸ»©ÏL%Ck<»¶ †IŒSdLj#Ù*™¿…s¬ »N®ÝÆtQAAeoá­H ’äà™í3ÒH´#`)Ø¡‡çqOÔ €oUåz&æ€uëvµ(+ð>ÔCØj„,”“~ ©$!xž©¯ÿ³Rª”;Ä7î@ÅL^ƒ;¶äÞ9àᎈ´÷¼y9Žp½3MhV”ßö‡íèÎŒ $}¥#¤qÞܦ’eTKÊ ÿtØ*¾¡.%¡TÑIm¥”eÑòb]`WUö7vZd’XÀLUzƒ'Üã6d³Ë<9NâÁÓ”µ¢,Š[qv x´¸±Q_=ªÏ~ h _0úáŸÿ¦ÔžB•MFØ5Tºã8Ù×~ÓÊJ$)AJK~ÁXÆ#bÌ$!ÏßJ×’Ú0çbÌ—!u—!YÇ_3YC€æowö*‡X/n+vܽm>c„úp ‚~,kGæ#¤¢VG¢¹‘Hg±¶b¥„²wr6Ó’nzeXš×5û”)‰qü‰Cacÿ_—Å*V–¦»/}¡À›at§<ÞW‘Ê~Ê`¨j+ŠººÃxs+ƒr¨õ‹3h;ü³ÙÁí&ºr‚†%ß —¸äçö`®¦a¨2ƒŒ¼ý‰w¿+ Ây%×( lu¢NQ¹M€¯PÕ| ÖNAÈ=£ÂT¼³É" Õ:Ôùâá ±d±27/eu×=¸®Á4vdÒÅ îìa)êJàô“«Ü”b”îpµŽ¶ =ì¯Ãøîrv-u’܇YÞl嬨¿Ó­%݆u÷tòµJáµtçƒÕß}N‚6Á˜*C•€ÿà!¡î$Ð È‚ÿ'²j˜‹ÌR3Üz{ 6¢šËëQÍèC,Ñê<+k6޾]´íË7´1Âšå ¶Xp®[¸‰ÌÀ‰W³#u1{HìÞ$1¤2įÛÍüo!5¾_ÐcàMš‘þ/pÍ@Ù9qwkûò%Ä„‘w3vÊà5H|>5dFžs—€ð5×"•ž!ÛL¼áÄv q ¨\~z!ú0‡Ê·d<(ñ¶ž1Ø9:gX͇SWª®D ­·~¶Z ?Ç»³Í°I$rWµ§À4@Q± û¸à‰tC°Š,ó#ÏøÞ:s† ­Š¤´U-¬º{¿!…º ÿ0à¡x íK€ìóêÊ\V¬Ó*‚2Î@çÛó›-¡}°H}¢©ó8g§cš35èýÖè÷RJÞ´§%-e©uʃ{#_n´¹I„™¡”×&ê/Àô=šë,$8¾žî’To{Ö/ß?¬blDe~ü@‰ôõÒ2ÌÛ5Ìÿá«0Ýø4ÜTÀúš.„ñ.hÔ8õAö=’Þ¡qón¤c"Êë»@íbèØ¨C^NÉVÞY°â0ÑûDØ \c®suSÃãÀ)„48´é¶®Èµ1çå½-LiºÕsƒ_<FÀæ­Ã0$¼z¥êP­Ea—‹mX˜xH’öù꧘¾W<¹#B OÜ……¼»ûpMc_çÈ@€. 
ýüó÷ ~T!ÊZÝócåõ¢­ÕÍí]ÈrA­/)´ê VÛCßògÝÀß)d¼c®Ý»?”JçÒ,OQ»Ô™P¢™òFÄŒÂd¢œ†µUÙß;ÖInN²9ÄÂÂ;8–¸@þH?zùêÍËO~ƒ„Z(V)oÕ«º…‚£†¾øàG±Ú˜ÛDø#Vüt«4%½«‡"e©éQóO#Múé/ÿÂi؃"®?Lo£Üòòò´nÊÅõíëã¹>¶‚ëccD˜Ã4l9>®éb¾–†õž‘ð…à~O>(×|’zú}ëw<ë)l_«E˜Ô­òR-Â@¨ñ=âZI¸ð‹Å§ä“Œ5B]Ÿd¦Á58 îµOkÛ¸ QHFøÐÊkîÀNëÇåû£ã0  •SúøŠ?TÓf%¾€U× ïz©'Q©â[Âis+[}!‘y*n„5×½¡ÆH†Èváù)…¨«êæv “–oÄøH[‰ ¶ÜjW{MóìͤßýÉèÍ endstream endobj 44 0 obj << /Length 1431 /Filter /FlateDecode >> stream xÚ…VKoã6¾çW{¢Xõ ¤{hѦH-Ц§n´E[Dõp)*iúëw”-'B Öpf8ä¼¾a¼9mâÍwñÿ|%üǹ‘iªÜ¤eU•Úº»¿ï"•e$¿Äž÷ÆÃS'7ßw¿Âoíf{»…Áïžï“rSE•JÔæùx=5W‘”ùæ¹Þü!íirf»ËÒ\¤ßlw*+Ä/“ß Çݨ»sDggj{ðöe›”bæmwI!Œ;®Óý!p÷¨ñÆtôÔáÿ”a§cÁpÄo&¬7N{;ô#óu_oÿ|þ ¼ÚIUy¸äØ8ÛÿµÍ•Ð'j‹ß:ݶh/ÍrñB²v2#¯Ñ>~}c˜¸±À¬³vº3~61|‰e:/lwv^ükäfË sí¢çàå’{ؘ¦b?y&^­oÊDmÎèÑö'–tÚl¯[^½;¿3Iá™—jÂÃS•I8<„,䕘FºµŠ…ø þe[)|àRd€èJcmZlždâçÁ¾ÑAÛgNØUk¯™¥]`ƒót&0mϼ Y‘âmô¦ƒd˜÷Šitïm-’_â<§*5X‰ÐT#Rôh afâµ³`ªAS©\–½L¢Ș ¿±ÊMgT‘JóYá3˜Ì¤á´¡3|>ÆÏAÉ÷“Ý,§"ëY™ÂƒºdÍ0µ5Ó—±€š¬†‹~'Ȧ’!ê&«àªá*tSßC|¥8ñú´ï°% %ž›a JÃ~4ºZ*´rûÁ3ª#K9SÀ ’q=”X¼F·Ã®ƒÊ{îü…­• bÝe \Óx&p7~'*æ½Õt ä˜Ñ[( Ã+‡dŽ&2Î(nyÑcž?`ÊbÛS ,E\@X‚šZêµß­bŒ­Ñ¡*að¨ð|;20Ñ„°°Ê,”fÁŽCNï[þ/¤«Câ9)ËçÓV2†f]¸TÈž9¬]Âl0„áY,œ9M­vö_:€å Ä%vÝ:hg!À`ç`l–µF\ž!C¯]ðs:PÒ•¢©6`Á …†}™‹ÆKQ–ËY[++¤àÒ(x~!ãÒ:¸8Áðäª-¨ {Ö™ÙŠ£œy³AgËa’Ö¦àÇ„žÅ*‹ò[Ï@ý…‰ ªàâ¿Æ)ù{3U¡·'Eª¡Pç²`QªÆ®¿ëyÂæ²¼°ÈxÅëè ÅÈàW_žÄƒ zÍbî\U+†Ú÷“¨Ýâ\=^ÁÕÌùê6WÌܸ+Ì©ûÛ··TU”ªœ^àª,ØŸòFç‡ç»¯˜¾µa endstream endobj 37 0 obj << /Type /XObject /Subtype /Form /FormType 1 /PTEX.FileName (./shrinkage-v-iterations.pdf) /PTEX.PageNumber 1 /PTEX.InfoDict 45 0 R /BBox [0 0 559 432] /Resources << /ProcSet [ /PDF /Text ] /ExtGState << /R4 46 0 R >>/Font << /R10 47 0 R>> >> /Length 9034 /Filter /FlateDecode >> stream xœÅ|MÏ­;RÝüüŠ=„AÞØåïi¤…YÓWbŒ I@§A}!âï§Ö‡Ÿ½\: ZjµnŸ·Ê~üíry­òþÝ«|ÕWÁÿüï_ÿöÛï¾ý÷¿è¯ÿýOß~•B÷ÛüÿÞ¢¼¾ãüø7þø?ßþòõßÖ×xýË·úúóüïïó¿¿û¶fOÝjíkg£ÖùÕ,~ÿöë•,iÎêÔ:[Ï?o²ÅwzŒ2¾âI·øNom¯ù¤[|§÷èãë<éßé?4÷iý;ýw¯Êq»ÿüõo_ÿã§¹k¼~úÛoÏš%|ÕÚ_}}õùúé·ßþ¤üéOÿm­¯‘~ú›o¥P¹SÙ­ë¿ ›ºfݾºöµžŒµXû?zýê[ÎÓWíÞÕJmÖSΞþ˜œú+½SëhÈ~“-¾Ó#ZÍnºÅôŸÅ[|§»97ý£u¿oœ1ÌîÿV1”½~­œâ“½5Ò_õpÆ×Zkxp ¿ 
òäÝ­}¨1sý×ã«•ÒW)Ù½ƒÅøóo²ýÿ¹Mô^U‘Ýl3^9NSËêýóo~þ«þ»ü‡BÃ~q€j|™ŸÌ¯úè׿û¿õóoþæõ›ŸþÇŸŸ¡ÍŸÊi‰ñûÛŠéSøwÁnÙçz¸JÖ©¯ºFäŽüžÂùZ¯ÚvÍ öýÛ.-Ç®¶5YÕ.\pý ë®+—EY ³ÆÈs|õT´’½gŽÕ³ûÕ"ܬÙÎT´¹¸)Ï©ÙÎTô¬9`.²}­/kÊYàèÚƒµä¨/ìÉÊ~×3?KƒºœÞ°ÚÛ©-ª%þ¼zéÅ¥q¢{ÛòÄhôZ6¥…k»šÒκz Ù̃6çÂ)Nß;+ñBúþŸÞ_X¾?®K-·»Î?%”þK©;4¿þì“ÿð~Ÿíkæ6È‘ÏlÛžýúow›>û‹?Ë&c`Ÿ§Ý3-A}ï§™h¾÷S›ZÞO1¼{péըڸϋúàn1|R?·XË0>·X›“ƒx7YísIám–ŠñÃ6KÅüØdy-–»^+èn±:sšÞ[,»µè!ÜME/ÚÜdP¤ÍéÏ&KÅö1§MÅñò&Ë“d~l±<+Ü o±´…­ð‹ZÇ•sg¼Ì¹Å"«÷‹,Í‹ž,÷³v¸7X´±,sƒEÛ?l°è1-çK©¿7-Òyo°”›7óÁÔFù³²lDÚ'殕5¯zSƒ5¯ÚeHr¢Qóê.½ö¬yݲkgÝé²ÈPÔÁºwª9&u²îÓm 0%žY¿úÇ9³ÒOŠ´‘¹òª£CKÎô¿sjáºy‰ÉNžœ¤—ñÐÉ!_܇‡RžTvz&-¦èíÁmC h8^!›) o. Œ [[¯®;lšd kšA§ÒDב–¸½Mt.ÛLtn·ð÷üƒ6à1ÐÍ­×@§5³i‰NkØ>MtZËæã€&:mï²Ñ¥‰ÎªÍ(4]¥#ņÍ&tÊãšh4Àyi¢³öíd¢3ýšpšèL?®‹&:åi™&šuÅÛDW˜èvM´;ø˜è÷×Q'¿žööd²S¾&&;¿wOe²³§ím°sx®§Áæp}˜ì§aO“]q³ßo‹Ãs[³1ª×‹•Á~˲×){EÈ\ç€Ù –µNÙ#+c²gA¶:+×m¦:°¼ítJF:‡+ÔÙè¯>™è‹,t€î!6Ð)û(“}Îñ ‚yÎî{­Ê:§ì£MÆ9åõõa›s@ªz*Óœ¢k”-sˆ}x攽Ód—S¶»³œ­Ë“Öé;Ÿ{Ç&y¾<6Ǹ.N¯¨AO…òº¦¸Böê½°u¬”òaë¶WÉ›­Û^ƒ·¤ gJé‹+>=FË›+>JK3%~;KÐ T·eòª˜Zõí¬Ø6nË!Èô+·ì†›-Æ ÃMit^„§çxæ¦(vœÙ”™Ã¼¨pOgv±Ck(Å+‘®sœ÷Ñý™Í]eâ\Ì!æ\:ksü=Q+‡š¾våL¬¬p„—<ܾN×ÜÙ…޶\‚ÎÙœöaÍ¥ ‡|uzà¬r²…+Çš.›£•Ã=¡Øþ`î²qm½Üøý¦/—ÓbûsŽ«AÕÝ9®á)/—Öá1kTic@²¾gãÐmvn˜œ‘9½Ã÷®¼Mec•sc”ÿpÏ;ܧp¯âïO¼ ‘em«ênUOã:_ÙÉë|Ýó ¾Óª¢Ò×ù¶¥gr¯¡kU= Ë,§`hjÎÆÍ$e¯Ås:[7±¢þ­ŸF¼²mKu¥]"H¸¶Æ9eÕ½uÙÉCUuû‚¥ç×»¨åyÍá:ßµ¸ôÉ•¾kw:N^§¼q?HùXNƒ? 
kߤ|PÛn¬;O ôsw]Q¢êŒÁT³íiÑÏ=d‡# 1¤¡²+.½€Ìºs˜¨{kÉÌ7Úvª¿ž›m;õlª¸¦nÊXUÅ-•#Q VåäjWçççéš•À]»áš©U©>eWr9µ:+R>„d »¤oE .'Vvaѳ¾eÇâ¡“3-‰Æ+gÕöu î ̲¾žÙ–QëZÜÃw¾”¹1põQnº69áÍun‹aÏ6`˜Q[×ý#6Ýœ~[ÐÜcÚÆm¯œhaÖ±YPÈq¢ŒÜ[þ[^Éuéb yN¶îØÂí5غ},sˆ2]7¹ØÌp­˜~ ¥\ UÎÜÀ ξ$§sÎÜ\òÂsá"µ³e0º(iÊo À(kݼ“`×\žÓCT?—‚¤tÜcL‰P×Ò*XƒM\óVº;[¹›¶nƒ±E3 }«D“¹@Žœ*š8{Sѵ•n©¸ˆWïþ¦ëU*Æ-”g]~r®bààç'®e«­¶Êyšp·í°ÍÍ«·ÛnóæØÜo8õÕ µD5žyЊÛÍJ‹N,2헞Ɠ­XœÌ-ÊX†®[èü¾“„û:¿÷-ƒ®€7 VàÆ…vž§ §qÀTç³I)‚îxxÄ–fêðùÒpÍŠk±‚[à¤Ëâ1‰Ë ;N I•¶Žôu§¹ÎJ¯Ílm6Vš{XýhÜQãëÔ~åÂŽ›ƒû7·§¸@E*ªç¼ÿ.¢·y©Y=ýß\èõ¦ÓN…O›4ühE.ô§„ÎE•+ݦ¼ua£ .d³†‡hæñéš}ݘ̷âÅS±±fsƼ—|Xbî‰p9°Æ¨¿5•7°œ‘»åFð –»àöj—yj|Œ5\Ž64­ÞrxÃhð=i.ÆÀIYF¬™Ã—D½ð”娷±ae<ÓI™xˆ÷ e¸¨!M’—­šm³âÊý³-Ô\S4T°«(ë h“è"v%6©]~¨áf/î$•L²¹6É‚6 ,Ê›¡EË¥ ] ò”ÇVÈT]nFÀóµ8ïØ ì”|ز1-·Í¶Zj«/¶myLu/jaZµ-2°)«t êoiãË}ÛuºÃM®’‰¹çq¢tpn<‚ŽÓôD]†¯ ¬y†pz#ÞÊ“åÁÓP=C›]LYdUÛì·½ê#}ׯ°§0X¡tŽÄÞSåéJÓhÄs°<0lMé¬?ï.d=Ô:Ü\äc¹ Îí­ÚÀ®UÉúšîoZÈíÐûmw”ŸÎokwµò¢"™­r‹Ö-nÑž-0 a2)¹e權¥^8hiYtÕì…K¯õÐ^é ÍdA¯°-XœÎ?È0õ~ÑˬHÝNd˜znË"ÃÔ‡àƒàv˜¢|T;wy Ô½›uÀ¶¨{iÞ:¯)`ˆ¥ö´HuÙµ©nSö]4Wë[Ðj¯dç°4ŽDd·F Oo­á@ƒÝÜfuÀnš¹å}xŠô(\æ„Ó¬8"¬lŽ{0h§PS‘Øè¨sڭ鸠䀌ë°w@¶¨e훃 _ÃÁÕïJ£µî'‹ÜÒ,ã¶&w–sÅ­h…(·‡ÜÊÑ<¤b]oçäÑ–ôpÓºÝJÞâ<˜ÓœW|Û¨™·”În6ø‡]šAÿ$íš­ŠÙ©yªZôƒ.‚¶y2¤æÜÚ/^H5tq†Á£<ÖðâÕ@ͪß9£ˆu×>1©Á$½“}HÃ(¶µwÆ|¥,0¬ÖØo:úù8}Á¥á\’‘†Ó{¯íIÓé½™K´¢€ž ò‘è€u;XFÁ0·— Y O6Íh`äõ}‡çIYé$V?dnºvÙú £Ñ€èAZÜ pÿ$or½ð%r½{ê$ë³íÝË}ÇícJ’Iöî-)Pö‘ý쓤TÛw'Ö×YºË¢»›²Àù>éí¦Ù]þž8|®ö{òØiˆYRi›gêq´CÇ $[všû L7[v9õ´=(«+2 ÑE]¶Æ}1’#²g“}>eìmÈayÒ‡;¾uEP¤Ñvß0Ýè›vÐ[Ôv¼"Þ¢¶c ·éÊ¦ì‘ØôdÓ¨+Ú ‹Ëµ¯¯éÅ’VjgšWߦ›_þ@ßtaŸ ¤¾éÁ¦ìsd¥såúnOá²£¶y7eºì"›Õ·SëÛÞ:ðÛþAGw¸U9´A5'𥠿AX?Ÿ ´¦ãR¥}æ"¥}‘“®‘ç'ÎÁ ÖArÉÀ¤Ñ~ùÀ`V•°¤¨lN›!E¨a¨hÍE+®gžŽ/>èåI(µãW¥ zÚë TpÇ«EЬtør5 ã(z¶£‚J×¾9¸*ú› ô9»RL9¢Û6Ãvþ×÷i³“ODò§Èˆå_Lÿ£0ïX«e¿rŸ•-V~_¸˜¾B+=¡‡u7WœÇòqHN/Úþ=mîÀC¶8ž®¸Åe·Ä§|ËR8Ïî—{&WsšgWãFŠ+8‚ÜpD;+ü Íÿ`†IDèkqçõÏ`žºŸà qÃËNüå†W¹¡Eâ†g3çIngŠ$qÃãácÅ ÷éÐ"sÃ=²¸á<ÍçŠn¸3>*/3ÃŒ§¹áˆ[úe‡•*n¸š‹¹ÜpvŲ¸aL•Ø]qÃÀÀÅÖŠ® ¯ÖiIÜp&¿ÞÌp-Í<³˜á‡q73\¶¾/ /™ý4/\Öm‰xaDÐC¾Ä0¬Ç”BÌ0Œ²–©áGE¢¸a¤« UpûBl1̾sˆ.Fà^H!¾Še…ÊoòÉ£Ú"…(c(Î'g .>Ic6T)Ä#²A SÇèë0W\4·ó&q•Oö¸8¸Xìñ[{\|4_þø-ϦX IÜVf˜s7cH³½É©•m.tHÙÄp3и3ÌüV²j8wDÜ׉yñÕQ½¢WWl„’–¡U÷ÐÆX…qÓ 
£¥ì•´ˆ¢UY*­`µZÊn‹9ãÜ)úÚœqõµî²ÆØíõf«¯°—5®7”Ϭqí ²ÆUfæŒët<Œ9ãœ×½éö׺<Ðð;6ަ€ã1¨p°Â Â<3\°äÕ7ERÇ“ŠùæŽÇTôrVä˜-æY“—ÉãNÑÔ>Øc‰\s ÇcS!îZìqÜ3{QÌJÓü¯øãðµûòÇiÜ_oöðªø`±ÇaøKìqørÙã7Uì1 ¼Øa±Ç†ÎëÍR[ÄÇlf›ÅÇô÷æùàâõæq@ˆ¾˜‹?Ç÷_þ8pùãØíþ8ö-]üqøD½üqœpéâqà|òÇáè”Ë·âq7,ÔøòÇ­˜)7}ÜŠ¹³Ç-Ì­˜¬0Õ^M&¦G`N—8Y*ºkš¸â;óÆS!†Í¦‰Ó,•š¸âQ‡«$MÌCYEŠ&®ˆA¾¼ñ¡_2ú%«Z1Ë41ƒñ?ibžÝ.T‘Žý¡\‡Bû¾”ùP´cßý*î˜÷¡[‚âóêòAw)Ü E<އBžÄÎê¸ðÙá Åü ‡\²Ëo§ë&Ñ„.ñ³:î-‡¼pwì’CÚ¹Ì1!4*Dí‹f¨Ëà¶®€Ä4À¢…ëÔm^8]£=&†©PµË/ìžÈPÃåH£®,Uéw\â¬#¯¦©'RÄŒ¤Z7>Ñ; ¦ºí¨h*3m¤Š57Pd1k(þ¥gÄX¹vЬýÒZ¤Œq\Ìbx‘¤ñfìÕ­ûdjîˆã&ÔÜñƒ®Çõj†qãÒɤ阋N&}LßHt1éãLOú¾’‰ZÒÇyÛh‰@NÙd¶d}‰$&\Auˆ`nH"— W\`”—rúV&ÊE §‰·1œò­k„‡ù …\§Ã‚M"§§eG$rÊæD"Óó"’.™²ÒI!§_avE rÊBcL 3ÀŒH½øãt;ŠI뉯Ö\Ùc†Ÿ)•ä1ÞÔ¼æ8%…˜8NÙ¬’xcúyëõÐÆu9ÄÙ¬1ƒÙXºHãŠ'gì›8㔿'†V×ìÎO -eì"¿ò0UêÖånm·Ž§/ªt=2#ª8}J͸™b³qD§—yÓåÜ®Ëïˆ&Î:Y—è,¿Í×C§l~E qÝÆéMÓ'åÈ€® Œ”*×v·riWôu·v©[bjT˜©%¨Vw/7QµTÜàÑÃéÆÞXÐÃ)ž[¡µôD~¤‡ëžO ×Rq‰pÑÃéý>EòâP ŠS=\÷›pUtä¾D¤‡+¶¯EÅGîs‹=œ.n\¾X’§<9"iÚä0P°FRL®y+Ö’è"7Wý©O‘‡Ëp¬š=ˆ¯¥â’éƒøZ=ÏñHRÈ_±‚‹ÿ4Yd„¸/LB¸¶•‚Œp*ÎUàuà¹g†9኷‚¤p~è‡X{³Î‡¥gWƪ˜šáþƒ†]Gh€‚ ÀojæÍÓºJÞ7cv èU—§Ž» ¨WíW\­q­ž¼:Øö“/ÔÜ 0ƃís›C†qZ{à'5w/ñó ©ºÀ# Ñab{˜niô®Œ»ˆYßá€Fó˜âS‡Î%rÈq6˜CNÙûZr_:Ì!GY—#&‡Ì»8_rÈ£,N˜2ï:úžräšCλNýäùZQ39ä¸O%Ì!§l–Wrˆ²|8äÀ…^,.9ä¼|˜#‡ì›ÒÃ!G½qâ£úŒ6‡œò0GìÊfkAé3Ûrà‚.Yñ“uˆ1&ƒu„k§¡Žûƒ rÔËb‰ANÙ ‰ä¨“6ƒ8P”ŸˆbÊ7?dľþ8¥ucôëT³Óä)ë[òÇq/åæóNgVJü±ïtƒŒ;8 MLJL˜£b @!ó·ŠÅ£8ÒK¶ŠDÎí2b‘Sá`ÓÈ»¸ä‘ÇÚ' r]é žš¿åÕ4¯ŽýQp­ÇìˆGæsWq›â‘©É!9Êšã%œ7Î~Y`òÈ©°k9âM,“GÞ¼­p$«+%œŠKƈG\­MÔ’GNÅm–xdþ€‰[EJ¶úð¿ uaäë›FF¡÷Ù(YdO-¼§â2½€{½¸ü[kW±õ>øÆ’@ÆáMþ¸Aqg%×ÄsÙdrÇŸ ’Ç9î»Lö˜oÕNÑÇû¼s?Λ¶¯À&ƒ/u¤ ƒÌ»øU0†¿z"9p×b‰¼µS‹œ “W¦‘ùÚ Ùø¼Àß%<g•š20tƒÎeI³¡+rôêøAòÉ83zÜ%6ªžš}[³6¾‘WfÉ7T»û½]æPÜ2ñ€‡YŽÞÍÈŠYŽÞå)“[NirËÑõdn9e[+qËÑõdn9º‰ sË·Gß“]Žîç{f—£ûñ”Ùe†-3]ìrtÃÂd—ã¾'2»ýlç&»L™ölá{šÍ6Ç(âŽÉ5£ ¿+®9ðI¥-—康‡«^±Í);ÖGl3!1Ä œdˆÞëa›cDw~…MŸZ曉ˆ(A“€CÄ/+fr8vÍ|s ÿ…ùæ”i$¾9Í¢¾&Û¸¾‹Û&Û  D_ƒl^å_×Ì7û QÍüQŽ„˜æçGFL4Ç}4ežù É6ÍÌkÖ.–™!Ù*ŸœAÌRœNf*î³*RÌ1‹|3Ì1‹î~&˜S¯‡]æ·çõËŠî™[NI£jjÁÝN%³œ²üË1›v‹ye†¯×C+çYCºWœr`“˜†&|— _\Ì)Ǽ1Óæ”‰Ñ"ýA8es­"§¾øS$Ÿü‹é>á«'æK åÿÏSîª^Æàí‡Q6–×[+l(¯G¿2˜^/". 
¯×ûz8=§x=0^/f4Š—â•Ç€ë׃á¥?|ßqÂï¼ü޾¾ÚB,†· ÊBïÚ4sdð®=o„ݵaPN?0Ñ Èéç%z7¶¾üóç–Í7 ø¸òO„E–î7KA¦ ×›b*o:Z]j_t4Pßï{ŽV¶ÇLï9Œ÷|¿ï9Ò¯1Ðä÷ç¾Uñ{ŽãM¥÷ÛFý¾çØؼï9¶C,ï{ŽÅŸ6¹¯9.ps_s,ÿØóšãôûºCï9æ¯9̦ø5Ç8ç ÆÑˆ²e~φW©zÏ1ÊØôž£ÛŒÞ÷ý¾ 䋎î°Úû¢£¿Áñ›Žæ—ð÷M}¬×ÇÑÕkÂ×ô¦›2¤Ð›Žìœhãû¦ƒÏ¸? ¹«Ö¥Ð›°>C ¼é¨hƒD½é )…Þt€)u½é€¡kÐ\ £Zá7ñ|ßtàf# Ëo:Ð!‘ûMG)OÂç H!€hƒs¡;OK ÑÙFåˆÒk)Ÿ0éªüIž}ÇÓ@N^cmBê°ÙU©¡:¾$•BXÐ6õÄ`ÝBðÞ'^‡ýæ2Ø!ÌQhÄ€€Ëd3¬$f·`% ´Ã暑v3W†ºnÐÀqµB¨ôtPA;‚ç7¨ÁóvǼõað|¸!@î¦!w  »F >n9+ ø?š-nx &Àšcºáö¹é_r«Ý4õÀˆJcž¸:ÞMTÃÓèv3Õ@Íü•©jl[õÂÐn¯û¥ôÅðz7øl¯_¨Û^6öõ†ïðC?Jz‡ø[‚ïz¹7„ݵchÄÐ~oàܵŖ¶kv/j‡+”êâSŽÚ†âŒü”£6Ÿf~ÌQ[3ì­Ç)ßômk»¿žsÔV˜¥UƒLzÎQqeOôœ£â!Ã~}¿Ï9jžsWyä=]J>‘Ð1pÉé„èø[;.’]ÅMÂí$HÇ8åJW˵V†éøû<ªE8çÇŸ€«Æ‰ì—"âªËöX§KE¹9†ç)R\5bë--œŽ?ÄãÄéô«@RœÐiø…Ó1ܬZÁP¦] ãtü­ +ˆÓéç|¤ NWk»PâtTh„Ó1ÊÍµè· Þ1~ꂚúhÄ C£X7ÏG€Ñ:$x÷' àºr•§r€×-jîŠif©ì§abGïºì*.¢Jç½E±¯®ƒüz°:þúoî‚êjÅ¥mýT“/?ê ¿”®âÁ#KH§ÐÅ׃Ñ1ØQ©¡(Ä.ÞÒ‰êè˜Îº„ÏQÖû©ÄûbBèÜÇ÷Kñ‡Ý(‚°¹”UÛa[ª_s6ãF‚åjñåǯ;˜Î²ôºƒ2KÓ뎊È/¥wíÿŽŒ_wPVþ©õß<6zßA™}Óûȯï÷uGJóóue¿æ çYükbÆà( E‹-‰s* Ž¿¦Tbpµ8ÔØÜG~y¾ïüÄà ¿Ž©ª›\½?Kfî- cmBÉFUü{±Däúzð¸wÙÂãв׃ƽGEhÜ“:ö­WXÝ\wLTÑ8¦ }Û ¾£ 4®Þ_©5÷Î/4Žå «·v’и;ƒÂâÞµ‹£$dN5f_e‰Ÿ¾ßŠœ~ÖŠ8¬µ×ƒÄqå)÷Ñ*¿oþ€ÄݼÂá)´Æo›„ÂÝZ…Á½Û$ Žý?¯ƒ{÷AÜGikºÿÆÜèU=uŸþn³8§ã—’Ã5/chñÔ¬¯›s÷׃À¹Æß>R§¸^Týpë¼w.¸~ÞH®•ñ€p´¹Þüáxø­A¸8osñ‡á Oy¿> stream xœUU{\מaav“ÝÙ`ª¼ä%(AE`,òX]>peW@—G`AE´&$&pA£ÕH­ ĶŠ]"‚ŠhXQ]QD­VÒž!—þÚYLÓ_ÿ¹¿ùÝ;ç»ß9ßwÎ% K ‚$IÛ°Oâ>‰ŒwÔhó5ºô•ys&ïDòS-øi"Š~)ÿ%Æj!?ðĺj*h­¶ð‡IPð!"ÉeûeeoÊIOMÓI]–&,suw÷øßŽO`` tͦÿžHÃ4¹é©™ÒÂG¾F›•¡ÉÔI kµé)ÒTí¦ì´\©J­Ö¨ÍaJ•V³^‘®MÏÎÎÊ—º,r•úz{ûÌßÅékòr¥ Uf®T.MФæiU9ÿ·I„}f–ZóiN®.]‘¥òóŸè)õöñ%ˆX"Ž'"ˆBA,!–Ñ„¡$܈P‚%&ö„#1…˜D„-aG0„ƒP,Â’#Љ!RF"û-¢-Š,ªE^¢u¢m–Ž–1–µVŽV唘r œ¨ljPg÷ J>¾;­Âõ„ôÃŽ>{æÔ-¾’eþ=ý0tHö§ÌïP½ãùÓu]7Nd.)“Ø@®›wë"o÷‰`·@Ó> À†ÉnI)å‰ŒÎŠŠ²oj$·ÄÛ¾ÞZ²Ñ©Ÿí­çˆm GHrCÄT¥B¬›0VÙGfXÎuî§lF½tù×;dŠ‚—&Ù¹ºCü>¶¦»öáyB*~.ØÁ÷œâyQË­˜ÇOs›Ö&LAêüm^憕ÛâÐ<´ôPZSöñßÿµô¬UidyrmJ³¬?DȈnWœ;¡o¬ëDhPqÝå(þXïÀôÌ®^_Ó6ån§þ%ÐJÒqu'šÙÚ¡>Ø)輑ϟÌWàf1sVª÷‹Rìðâ€1±_§âï=ú£7õfc¨fª « Ñ3([ÏúWÆ3Ê x+¼›÷î1§|ð¸^ðMñç_¡íN™›÷áàªxHÖŒÙùòqc ˆü B×ÚÎÆ¢¨êÔ;«$Ì›Ô)qÁSðä—> éËA`ï§\ l”`¾ïOŽˆJJ^¸0éÌ.ý™»n¶dÞ˜Úsc>þ±—úúÚÛL¿^Ÿ`€kÂX4§Ûsؘþ)d@·Nuœ½YaDÀ"˜°é'U‡úRL]°Ð Nn31‡=áýúêÖï¹¼Pá>Å 
Õ[ºhxŸ_Ç>¼&óò[ã?Gyex°½óg6ç‰FD a§O¡lÆŸ¡¡ óýõƒ"¨ÄŒB±™k“VÆè<¶§qQ/fÀÜ S| Hòž®øQ²¶9®&ÑÁ–ÃçÜq^¹r–«çŠ— å¹áanÜV£Þ‚§|OíãËØ±2Óhb¼X‹É …xj-7¿'ÈÈ7õ¼½ ò#;öµçòñzã¯.:üíâÝB¿^ÉmU5¨–ŒD>(r½:6G³-¹xm¢vœßU»¯ªêô÷GZÝÛž0_±n¹<•óZ†]殎,ÂÞŽ¼–²ÑUñ•æ‰#¯¢ ïöM0|3qâ£=­ â?i˜Ô endstream endobj 54 0 obj << /Length 1692 /Filter /FlateDecode >> stream xÚ]oÛ6ð=¿Â(P€*V?$ èK·eè°­–>­{Pl:Ö&‰™DçãßïŽGÉ’£ E€ˆ<ÞïûŽN7w›tóÓU¿o®Þ_gŦä¥ÉÌææ°RòÜ£sžI½¹ÙoþdÇz·Í vÜ&2׬·ûÓθQÌ-Aëîàú¶òµëè¤B’‡­Ö¬ª›ê¶±,"hlÕwuwwÁ§uÛ÷¶¡ƒÁ÷§?õ–om öe°Û¿n~~mÌLît“ÁK%¾»mù½í_SrÎßµÖÝþÛϟ?¾˜ š—\ `xBJÉÜ­¯jTHŽ"¸Ð¾•A4àdŠKk!Ê/Õà›çm!Ù»m¢”DÙ`‘ 澦BÚ~ aàî³0…äF›Q˜‡q5E>"$×ì‰é®wÃ€Ë moXÕÔûè<ÖÇúO0_okq8¬ïˆ4¨d¼G ÚtèÑS›DmSa¡ímpž —æô­½íƒ81Ï$û¶£ØBy·Q3ZËÌââÝŠÿ30}Y¨Ñ »Ž&>è5Ó®U9yzaºê±nÒ1+×R.u܃$J”LfÅdvSȃٷÁª®CSÙB`¨‚ÒÄåÚ×Ò~Âzç¨wäáˆ_×ïƒí\{ò!?¦Œ0“š?`ú().'Ic€ Ìö=†HXÆè¦]ÕíiAÃ)‚«¨HÀ£ÏP?ùã%buU$AçrFãÎÓŸŒl^ú[+ž¦SüwÜ÷Ö+¾–†!G¼sóÓ@&¬šxaˆYøR¬SEÞ¬€©³û™‹f2»Û˜"eÊM¶Ôïo»C—b ƒ2eÇX*ÁÍ– ¡ ?k×Ù|žŽ¡œRž5 ÂKã#!ÂÍ,È«/Í£/T6"‚–¥0ì7ç-Ýï•_áo@­r¢ÂZÛºÞ®±7<ŸŒ?ê+0‚y¤Èx!–ö©öû=jŽÈÇ<È/|W°Û“§ƒ1rÖ¹ª°¢aþíçù‘Ÿ®×çßb®³M_xŒgJóJ{‚ NÃßÔ¥v¯7)ˆûâÿ›¨miSQ.SýE÷šf‚Ëï7WØÓ˜†ô[ªòÍ®½ú÷JðT*Ð( ÌÖt)#àý§6Ûüà®~‡¿ñ(¹&3¶¯&Y‘rabQ»®ï`0Àú˜2õúšóÉ'î U{ßÄ£{˜Xê¯Çúšå‚݇ª ÆÇ¡¥ÛELL~<=¸SOàªÀNÊȾu,ÜÆØº¬gS•€(þ?7JïÅï¢Qj­¨âà ÉÞûz72«žêV÷óã2²=@f˜ ÈÔPµš*X!ÔùºÐQ¸ƒZ—y”¢Ðá g9<· ZØ7t ÜM×c#Ù$dÕ8oà´1 TÕ­qÝ{‹m©{T8–âÁúaMæCïZŠë8§)öåûOémLÖ¡ö®‡aLa­SJ³?ìZº ™òìÜ^ö¶u˜£0÷%Àjx%9sɳLŸ‹é<«@֢઀0Ï$/…X°Ò‚)ú`e©zÏpG]HÃDæ:B“£Ý2*BƒcVSc;s˜rZ¬F"Mu0G)uÇèæY &—¡ÿÐ¥.L“,äÒ[ßDh‚<Ž!±—ªŽÑ´HÝ•hXÆ R9õ•G{Àb rÚŽ÷¡i ŒÌ Š(Q!@ŒkœéƒÕ‘Ú K¤áâ¶9ûyg‚ýŠõP·ðlêÏO¸òŒö´Š!œ˜ó, ”ã, Ë˲°:žíÀõà±}œpO€ÆpÑ óÀb`ElÀn1“d"ˆƒ RVªq¸Âe…ͤ|K{Ø",è2ÍhJ\Ði¢S,KßÒÞSoƒHótäñœeš¥¨ÒäìDêßíªHÙg èë §2³;  ·ÁiÏtB~p±`[¨Z~)P1BÑ!G‚am¿`(©‚ýÓQ%©Ï$˜ÂHƒO÷9ë8&¡smO¶ËÇE!¦„G¾†qEs(TD_.p`dú2/‹r endstream endobj 51 0 obj << /Type /XObject /Subtype /Form /FormType 1 /PTEX.FileName (./oobperf2.pdf) /PTEX.PageNumber 1 /PTEX.InfoDict 55 0 R /BBox [0 0 348 215] /Resources << /ProcSet [ /PDF /Text ] /ExtGState << /R4 56 0 R >>/Font << /R10 57 0 R/R13 58 0 R>> >> /Length 2497 /Filter /FlateDecode >> stream xœ­ZKof· Ý¿ân 
´æVêµMÑM¢Mc´ët^I`O03ió÷K‰Qòg{&6‚‰MŠ’Ž)ŠÒõÇÃþpý?þùúîòñò§Âñþóå[ü5Ÿ±µvõ,ḻ¤r¢¢ªævjœ‡3vtRÅ—.?¦’¯ïŽonp2½?[ˆù¸yw!øÿ3ñḹ»üþøÃÍO—¿ÜßlîÌ5 ]Mݺ¹p|zݼ¤ÏpøÂì 8'ŸÞ^ÞýñRÎáømÿŠÿ~zÜþ;\ޝ¨ÿõ);i.ñL*ï[:³*nѤâ‚|míôÝ`H-ôµkkpÈg’Öàb“¾ Ï§½½[ÓdÁ&¢P¥êr©™Ukëˆ\û¥FÂX+œ£¿§ñ*6 {'öƒ“R2ÅF­®K‘’”‹Hd›Ali¬ì¥æì‚%mXSb{^Kb¬ =‹ ƺ¹¼ŒÖ\]fk’Á ižÑ1Z•‹´ÚâÖvY»ÊYÚi|aNæf_]ÐWEKkS?¯Š—ô©¦—Eæ$4÷Êl‚¾['ó z`NϤŠ^=—bÒv‡RÔV”BYmC\Ç nËmXü†Õ+6Z‹ï´N&¬ ÆšXœ£Ës6òÂDC^šhE–Õ—Shk»®]ä´Ž¯ÌñüÊ,ドŒ_xmÓoÆ«ÝË>B§ ÎnLŽÔ³ï–êž05YîiS¶ÀüÞyìy¦¥pVU ll2c8ú±ÚÈ^< Z`¶7Ó3 ÙØ|6–$:æ¡a%«¢¢x[úGižó**zŠö¼Qû´;Z:êàÕbìÜîʲ*¼–Ä,ƒÖ–ì¤=”½U›¢ì {è³9gzÜ‘è¼*`é€åeÀ4Ò‹™23F…”£‚&…YV)®¹´[(¬˜\e&V†oBjü‘`ñ­Ôús=4IÁLÜ]㊸,×u:x0Õ5ÄCi2!{³lþ®%Ë ¬Èeë’Ó6hÖUŒIó•5)Ê:%]4-Ê(ú¢“‚æx˜íi5çx˜ãsÈp1nt7Ñ.Àå½Ú÷I°“‚§ë‰ÂϰªZBW*]” TÙ’i•O¦| tõÈê3ëFH\j¡b\@.CÂ`W,g0ýÖ†èÔ vaV' 躧©…Í«ÚnÜÈ ¯4‡l"Æ&踤Y]EPúx’ÂPQ–œ&ô™.žSõdœ“\Z4a±'pv4ì¡0¹.Ê– ª|Ê´i"%`&6ºúL§^]ý>B¡—³€¦' ·çÅlN\î.Fâžùc³¸½oñÝ%:W–,£Û³ÐÙx¾€d~{A…6™ ‹è!“{ |¹à-Š–Ý}l4m’ãPFEP RÈþEE6—œÙ·$Uë²+Cœ  £d]EÔUð:ƒÂf’§"ûá†ÙeœÁvМ—Is¡ *ï°‹(æÂDѶ.“+t²9&l3¨é †=ýÅ ›]|nÂÀ7à”éAÅ‹‚.Œ0)žo|ê"TõÝ ã¨ðj‘Í™­ídQ¸m¯ >idÂrÒUøÉ…[n©Ì–Udº¥Î.ÛÝlxÄN: \a¦eùöELˆ,Ì´˜f¦ü|f0?•—gÆcÕ…«^˜R]Þ˜©XRGÖå+Ì`mêèe”˜ùVå ¯l/òó™ñX}à­ã¥™Á`õõUÌÔkÌya&ô[±Èú&Àö"¿33ùÕ?rx3XšeæÛ?±›¼=¯7f‚‡Æ™r0!ò­Ê^ÞBØ^äç3º¦½@]釳ð8òÁç1ÜßúpeÖ{]¹W:?líû¡ý†J/·UFOùŸº¯÷å:L=·C•zá¡¥¼&9åuibþãÖýÃÖýýõÚ#?TqïL]©,·~Hüï6ȹ+“ÿ<=ϧ­òÚË¦Ý ì¥|ªvlù‡ãd'ŸÛS Éß?gâ aG¹;èóºkñNø$ÅAhe<$©X=ñû4ÞŠ¿©¤nx¿X¶×Kä1> stream xœcd`ab`ddöuóññÐÉÌM-Ö ÊÏMÌ «ýfü!ÃôC–ù·Íﵿfÿ2`•eð™ËÌ»àû¡ïG¿äÿ¾O€™‘qãé»Îù•E™é% ¡AášÚÚ:CKKK…¤J˜Œ‚Kjqfzž‚Q–š“_›šWb­à T““™¬žSYQ¬˜’’šÒ–˜“š­à–™“YP_¦ á¬©`d``¨ $Œ¬üJsS‹òu2óÒ2ó2K*óRüsSÓrSRA¸äf–U*˜dæÁuûeæ&•+€=«à—o©à£”š^š“X„)ÃÀÀÀ¨Ä LŒŒ,ìßûø€¨|þ°ùßCæ³­äzÀ½r2σ¹<¼ }nx endstream endobj 62 0 obj << /Subtype /Type1C /Filter /FlateDecode /Length 64 0 R >> stream xœ]V TÇíq˜îV“wYÅ=€²ÈòA6²ˆ0:ÈŽ¢q‹[D â↚î¨ ›{T "aP02¢¨ä›¨?¯I çÿ4?çÿsæTwW×}õú½{oˆ2@‰D"÷¹n ÷¹V^JU†2-!&Z?9A° ##ÅØŸý³àO_ÉHJqàå’†{M¡ÇÃÊ/(±H”•Wè–”œ’Ÿ&Ÿ°ÐÒÊjòß3ö3gΔ/ÉþëÜ]™š—(On2”ª¤dµ21m¶Ü¬V©bäqªìäøTytl¬2V ‰V)—Ëç%¨’““2äÝ,åvvöÖdp˜Ÿ ^’ž*ŒNL•+äʸtUtÊÿLR5.1Æ×5)ÖMé·Ô?%.5>-!=(C¢^â8uÚ k¹Ý8{‡)åG¹S6ÔÊŸò 
ÆRó(;Ê“²§)/*ˆò¦‚)*„šJ-¤|)Wj:5Ÿr£8ÊŒ2§†QF”1%¢L(š2¥¤Ô—ÔxRVÊ€„ûŽº) íý9`ñ€jñpqxŸ¸Í@eð³Ä\â/I–œ–Ü•ôÐèýôuf“ÌeþÍ&³ïŠšä.øjÐàA‰ƒÞÁ)#¡ šã‹LQ«ó3ØÞa.½Ð$sÒïuÅÏà"#mþ£¾éùƒ‹1 þw‡ g:}êÇ;…ª]#dÒ ZÆHØ’Ö*Ln¡N±°~ã6__û}fi\¥¢Ô±x”-6 mvê’Ãh0ik/wóÓéÕÓG»#Ö6¨¾óZÍ˦ËK\wóF½Qq¯{‘i…Z±oÑHt íUcŽÁñ}ÆXÕk,Á\ŸZ+ 1:õÚ¢vÈo‡œvSÔ mÝó»Í¥B›´Óèú¡sYéDz³EÕ7†#mLM+œ:Sûx8ªÉ¼ªüAy&t¿I´’îÆ9È/¦•n<šu,sŸÅ ØÕª™)Y‰’E^ø9ùÉT “k§’ȇÓö.Žâ¿Yžž–žª^‰XRÑ¿*5b8*”qx‚ÃhìŽÝµ£Á&t}„9àíø[ñ¹ÎÜëzgl†8ÙÚ´‚Lohºy}MP± Ð€3éL'T“¾äzÕÜ |„ޝ <¡Ïz„áÙ¼´{tbŒzTuü~…Lš3ï1ƒW Ñ\×ÙØDÿzÈ‚ÙSlç?#0ºóô¥Ì¨wiZ»ÐÓ.BÝâÞIâÁÑVƒ‘%úJìçí7áA–mt¯_ð(ù ‚yèÝÛÃÌZÑB×%ä$ªý|œHãlðÓ£nÕe'䧪öªx5€c«(÷¹XØ k8ôäÛk9çžÏ®¶$ØñÖ„s𜗣`v4]Lè‘6/<Î … È#I—3On<¹­šÍkàvöܺ÷ ±í÷<§nE[·m%t9ô×G…ÝbaXo6‡•x4¶ÁY8 È–jÛŽWÝâÛ^ ‚Á,|ƒy°ÆËd3 ÀáwlˆØ}4¹Ø`Û1`^àõ\lx#8ETáÙÍ¢¦1ì!‚pl§Ï<»s;ÚvHÖÈdå¯Í]‰X爨9¼ƒ‡g“.¼Cïdôt,† í°KŸ¼èƒîrMß@ι|þÐUô…‘³ZñX®íSwÓ‚…ì‚æEÍ¢Ù³C9ퟓí]µŸEi.-oÔ+²T¯È£ j9Ty¾”•6;ttÇÍ\öfuÞÆÜµ(-Y¾È‰•–¿!ª¬Á‰M`¢ça|?.(O;Æ…‘öÜþºÔ˳Ç,vžZâq)”?U¿¢=@WN\}À&3È}ýâŒÄtÕâ쯑%ìN;˜U¸aߦSì4z×ÄV0FôðèÏ_+|€À˜%Á}H`œåËuÕ¸`slì2Å.¨ŸÐõšü}¢ Ö«ðŒ¸mï=oâ!„F_)&ͼ _¨yMÖÍÕ%H9,,l™k¤²àp†lMá·…›ÊÙ)t>Ò°F"†>»ÿ›&êÒ˜£üìCžû£òa×.jxX–”'ûliÀ…8Iq4q‘MŸºƒ¸™ÞËŽ_ŸS$;²rw ZÊ~ò4­¢nÜÜÈ´„Éà$CšŸÖ,LjýD:¿“Tl*ŒÇº{þ¤“Ó!eêûEEyß51k·¬Ù¶ ±qëv—ó€Ÿ` Ù8³|‹D¹;‰`gèŠ;è^µÁôÑÏh½šCõ+LI^ùDÑYB†™P<W1Ò y ‡£wôù{2™¡›È8Þü½µòxc¥LšåJ Äévlìýlh×5΄}Ý$ð–þEpÖ "„b‰5'ꜰ\p"· ÐKúsë÷Ögb˜Lê1½Oý)£&fÁ®UÏ‘‚þŒNWrÒŠ­ë7£M‰« ñp‡éö¬Âœ‹"=6^–š¼N½u!ûŒÞùãÙR b[.&…òé ŠÏÈñÞ€ådo^¾f~Š*y²Ò¬Éüþõ ¦¤ö–lGȱÔZtäßIdžJÚ°2%-AµdUb}”§jêÊO¼(àµ{æŸ(`ûe¤OWa;zÕ’‰4ŽÔõáeBŸd …}jIÿiñùäzÉúŸ¹´RtéeÒÎHŸþ\w§á ÷é'ôwÊb½õÂHæupµ¥Gt†˜LU}Äy ˆ‘ VZù˜ù?÷í€êNâ¾µý,jyÈH+矯Il¶YHÀœ§~Ä2·¯“ýcy(e`<.æºoÏÂ&x°ï,») ž€1ß|¢åû‰ –m‚‚tªZ+†'àÌag-8l-,öap‹.@bÃ@u‡ShH{’þT S¶"ah—XHî åtè™ä)}¡ò‡uˆ}sÛgô8_+çðsO2ø)ôÞ»“ŠWœ~¤zBÄ2æÝ˜r›·xDhÌõþ„Iàý)ôMGñs(÷\"Ñ7‘ò­Ö¥SÈ y(£Qó2æ|Ðå¢âlÀ]é:Xô¼†Úwc©GPòB%Ÿ sëßþ†î¢K±û}váA/d¼–Çú¥(×Fmucµôök;J÷–”\ºr¬±On¸.[¤ˆãmâ‰Ó{mÀvÃÕgº•´@N«i¡ ºæ‘Þ °4ܺ> stream xÚí[KsÛȾëWà–Ãñ¼©ÚCœ¬]É!•‡©Zï&! 
U)“”µñ¯ß¯1€C¢(YŽsñôÌôtýuψg—Ï>œñæ÷ÝùÙÛ÷ÚgB3¥­ÌÎ/2¡sÖgÖ8&•ÉΧÙ/¹…&ÿÓ—‘•yQÍŠO³r4VÎäÓjµ^VŸn×Õb¾ýzþ··ï¥Ï VZ’dz±ÌK%_U+|ht¾*'ôM¼¹¬F"ÿ2’./ë×*ÇÈ×e¼[\Ävë«2^\¸ÄŸjRÌb“i¹ÆÀâÛ‹Å2^”Åd$}~ï¢õ ¥;x;VL3ïÅMœÖXIޝŠ5]‰üòÓu|´øÈ…*—+6kÍ1¹r£A9¿¬æÍûeµ^—óø¢j~ÿüæM|{U4Ý“I¹ZŇëEó,ÞRãÄ('³‚>ÞÆ‰ÓÅÃÄéŽ&N¿˜øªŒ—½u£ñËÿÜû¨#võ`m `,Ó¼~èòër}…ñáùt³÷Éâú’ç—©áÖÚ—ÁæOßû|RërNË/U1Ÿ”À”¹‚žªuEKLフ ¾šÝÆ—úAÒ岘VP-d­›‹y#ð¾&±Z±Ñªy·ˆ¿7ËrZMÖ©ÑÖ ¥TT¨«- wëry]ÍkÓÃÝþZ’ƒ,±d#ÚD[\•ë¦ pUçòey3+&etN%:Î)½d̹äE æÐ¾ÿÈ M ï4QÌ;ݶù­é†g–§<5°Œk}0Z­&zRL —u¡;gsW­¯ÃãÆ•kTm§MŒ@BÞ=à äw½ItJWê5¦ј~Ћ Õkû“QS!€ÅŽÞ´ÖL4¸þ¡¸]­àvéÙ8ï83!~ö—ÆC£­KÏ‚R°k¤ z~öóùÙç3)òLdÒæËŒðŒûlr}ö˯<›âܩ೻ºåu&%£~y6Ëþ}öÏŸdÏbïEY&¥‹]þ#±LB2ál6vL;›Ý%í^RÑ]a²Tcs¡‘õŸT—X5¦­îRå <ÑiÏ”þ›6}oRÝ)½aúýÞ8ÆÿX¸¨!#¶u=•;  !Tl+ÇR*D¬ÄÞãvçLR‹ÛâPj6tÓ“Aº±!‚…ÏJ-ŽCÑH ùŒŽ²‘޶ZênËjE¿»ýf{||Hâmî‚Äèm†fÌh5„wÆ3ÛÝx§â\JÈ2±gÝA˸|Æ"ï 6S­+|è¬ýð5©e{€ ÁÄÑs§ÑO©Þ0>a¿‘½Â¨_ã±p9AhO¡Ê7)×ȃ¾&ùjCmWë ©T¹J¿q8=Šü?ö=SìKs/_b'`ä0ºÚw’/o˜† m°Ž!œŒúö¾A^…ã, Ì¡-{•mUbZ¼£$©Ðj½‡ÂЧrXÍmMNAb5× y÷wNb  ‰°ëAUùÛMÇ—»N(`ž]÷i]°'/ò¸Ž‹ÉÄ RÍóF†Þ˜z°Ð! [czn¸…ö¸ðñB$|›0Š|-·ßˆƒœGy ç1CΓñbÃw|Âw”“xÖaó0¯ˆô{Õ d¶¸L¸p¨iqÂöØÒF„Cxý(j6,ªMFÔÑ hjÅvHë[ åÐ}^,C\{ 4&‚_ÛÓÆêzÅ´ð‡¬®E! †˜VgZõ×§g5ì¯ùHA (´ Ûв;º? 
5#gCÔåQ˜¾Ãtßø–5 'qþÑ+½•##üQ—cø‡Óâ2sHf`ú™Þ™ ’@òz¹/ŽŒ…u!S8¸xéi ™û§* ™ö; ÛT.õ–Y̺Âóf]Û h·@ð»dEǾË4‚Éôùê\æX–¹ƒjpÈ0X.FÖe©þà<ãN@m½T½ìÁÍAø¿ž^=fÛÝÛ Ãr´=fè D[`PMá]¹œ/ng³êñÅ…-*Ú£B‡¼ßSFhβy‡/ÐŒKõ´Ž,òñŒ…¿% uX‰Â<µDqÒ óxÂþíaÚîHWôó{¬¶¢(÷ Ýädc)eþ&þô¡±«ø`¯=~õ©Ÿâ@ )êj‰Qv°Z2¶ I%„v ê˜ªÄ K `þ›È5x‹_dí@¡/Ü҃ÈÓȲI¡þ–—¯¶˜€/½ÅcÚ²4ÌñcîÐõ9˜ÆB»á©8ä S“,‘ñiËAùBä½,;u®ís7Ôu'¥,“Fí±¢ xÅtžNN€žB¨ŸË“dÀ`(Öžrßñ´gn̳–Kv£ÉM¢;øGž¯Û™ŽÑšj¬ÀP5.Ä]’V³={óX>̱öèìTaÁä¾uTÇs¶ll0WöƒÂÝU¹LŸýdž‡ÍêÑÁ€àØîÉWBòõLIiÓçIhÜÈtN¦= ú£±3´´Îþ÷€öõ"´§ÿß¾GúÔ9©_‡D0 8YSÄü(ßød‹òß ßÞÓe­esDy¹ ;Õ9›LwíÙdíx<Mã‘ñ"þK@ýÍEÓºmQ‰œ¢ Ýw˜5ÝÒQèúLôç[\UËVêãÜ€ ݪ¡¼[chÿ*n®V‹9{·ª%ìÕ×äÁ>P{Å·VÞõáÞõW-Õ+4ï9 ¢¡G[Dì+Íó úATL™ –è¡.´k$>5’ž½oä#×<Ê^ÕÝB8©Ǫû_ÁfÕËŸ3²?z¼×PiL=3Ö?)Þ?Èrõ9´g; ô£ÅûY "¯Q¦æÝV»— øz;àÁ<ýߨ`…²ŽHËOBëÞ ¥:ÒÊ¢œÃÊÓDüá} }eÁøÏùt8&Zv²Ïbeç ëš0B>“ªa˜+þ;†äš endstream endobj 70 0 obj << /Length 2564 /Filter /FlateDecode >> stream xÚí[ÝsÛ6÷_ÁÉ“4 ,¾q3éC{—^zNÛóÃÍ$yP"Æf"KŽ$×Iþú[¤DP EɲãÎô%¦HX,öã·?nhv‘Ñì§3ZýýáüìÙ a2b¥dÙùû 8'Z™LIM—Ùù4{5DŒs&G¿L®g“wÅøÍùÏÏ^0“YbSîEI¬6Y†€fáµ–“ùf´ÎŽáÆ/à ‘ŒgœXü×7ìì_çgŸÎ¥£dLZb$¾'€H)²wWg¯ÞÐlŠÎ(áÖd·~èUÆ(ÑVáå,ûïÙïag òm碸YÖü-ˆÆiS4 Fë,WD3†ÝV;°™ ÖûÕŠg9NfuUV£š«*B¡R lކ*Ã5µænriœ¡‚‰°"Z‹Æd¸è¬1èCbE†²kÔ•ð%±– ÆÈ½kI"ys©×”ÒÄjîŠ×«½O›Ž¬Ÿ¿¦²šCÒh[F‹zÌç„Ĩh¡SG+íh—ƒ¤Ë}HL”3<5‰—ƒ"‰àF¾œ—ër2çœóÑŸc)G“ÙM1Ε1º*¦è )ƒ“”%"áokâB¶„Oé;þKRßL°†Â½ÜNP‰hãyáñOËÉ´,æcfFë”í¢¿[ØœÞפ-)ÎÃæd÷ÉàÒhqAÏÇ9Óf´*/æ]û¢(hƲߌ¡¹t³رÜgÇj¯¶¶v,·§¸=IF)1”Å'y>6|T,¯Êym¡óêØŒ¦EøY¬ÖåÕd]¬0Öã  V1X9Ü`¿&KuÊ`¥"Ær¿Mn!<þu;ø‡Ï%uš{öBÉÆl”P¯¼ÃjÁÐÖ+‘î j:õ‚)…¢y¨V†¹qÎ Åãœ/ü!ºëKw¸Ú¢ LŠòâÒ{r1 ÃVžâ øâk b^?›DSàÅj1»Y—‹y˜p½Ø>ö§!0AðQÖ±øt3ñ¯ìfFáÒ¬r©%–e6!«ØÓ£"h2YcÐË4hl|×ÛA;òDòà$iqxŸ8¯)ɘÀØ&]_%DÖ(rÛ=ZH #‚D=K‰¡ÿö )‰Vll’ th爉–bÏÙ©ãÏ.²v†ù$†™:C$p% àa%놕Jƒ¦:@8‘†–.šëm¦Bëõ&Y_ôÁ`e ¸û±áG©®ðsñöÊYå¹Ä™%l=öÝÍrY%ïÙ—q.$zyú.@É0¬ÃýòêzV\Õ™Þ?óžïžÝº›!xè<ÜÝ*úëÉ|Þ)W«›07Þ ¡)JF~ÆÉr^Î/PÆF·—¸4^ñ°ª»u³*–›«U5l'Œ¹»ÓÉzR=/×—A+‘Åx ›`9-WëeùÖG³çOf¡¦y’Ð& ©Ó-T&aH\1åC7hç mÁغd’UÉôû *‰Ãh]Î|,–£eq±,V«m\l.I} Åëy~\„ñâúÐŒ™ú­Ó—pýÃÒ‚»üϲ,.fNiœ ºVE•Û_©7Î`ü\­äÜ#ôA!NU«iD[2!]¸º[©¶™JUûñ“O˜D©†ÂèM8N%fóœ5SYû/ž˜Ñ¸°Ü¨ê’[paÀ>© ?¡Àá îû÷©¥6í"UèJqŸ“i„Ã)­êþ·©¸b¥2TÃÙí%ðÞ"o(À:² †e0{Äep@ÌèÑw¼FÌà'³èþz+Wª`vÞ 
ìVúýD@ÃgZàÚP¦åVNáò”~u?@í}[*¥OáxððŽÇ™•èxò¯E9zÂbÞ]ñDÜE$¶¾â4Ä ÉUYÌ£¸Pƒ\Õ'‡Yìp˜%‘@8k +Y5¼Pj¸|s6I@Û]ŸæÉ a–¹LoÁv—Ö–¡?3ôy~?™‡&ò921„#Q¯_S幨º-eI ‰J÷DI²®Ðôƒ¾ô^Êû£ªÔ·Î„»1 •¾ê°C5ͦ3YêÁ„•+»CíˤƒºG}6ÐÈ¢ÝüX‘R Z-FTpÓpŸF#Zˆ¶òÁ£­9u´í"x!m % ØC¼{΃£¹+;ð8Ä©Žƒ;y?É/¢Á,»â)fÙ=þt3v¬'ŒPOñ®RMŠÙ šÔsáX?h²ÌþÉ"X‚”X²ÆÜrà¥;u“ÌÒŠ1èo’ùÀÚåS² °”ía™AM-zˆöŸ~zH aˆ`ÐG¹¯)x‡ EŒd÷wx1ËŒÏÑWvYæ}«—ñÅ­Ç1¾s¾ßóá§ÐùäªxþÄÅ{¤žtè´&†C? ,¨@Ok“Àª"\8 øsà[.®ñ F‹¥“Æ£¼ýïÉ×ÉrÚßZS°1]ÛB—Q¹ÑTÑQ·&Ã’t¨Â`YvZWÔ Ö~ÔGí½é O%©`æ~V|@zè—ÿá„ÅèÙ$˜Ù½D— 0ój±üGòe«' Lª²%ÿó”Á8»q|D¡ K4X7iß°©e@29ÈžÓ Y;,l—¨²¾‡Ãû_bEI¸pá °À±Ý­d(½¦ÊwÝ¢á[HÎ'‰Âˆ‚ˆ—‹CøÈÔ\mT¼uÀìx„jÒš£ `¹×"†­—jRÛÙÿ’ è°vDuX‹©~n[ à¶Å@n[á¶uå £ CÉÑ:{à¨uáÍôÀÑí\‚p&ûðð1ɈIuïþ1iç™°s8½LعFK°];g8Ì<Œ;ò^GŠl-å̶1¨ "èÄB·Âˆ\ aø½Ô¾Ùà.-‚¿·ëÅ<ÿcr}¹òa\à"‹¥ƒäz•6n ¬õ)e“yBGÂvÈ«\â´•äå×jÙªÔˆþ{„»^bqòÎ!êUõÚ"¼EŸVåŽÊ‹ŽÜ a9KhÖ³ÍUKj\¶¹ý 0wìB[Kîë—"7°V’¨väˆ\g‚cŠà›Òö$(ßG6!Žßnëª;m iF´Öq;VŒ¤|1—Ô5å‡ÅÌÉo•5Õˆ™w t²­©xyH¬LG%¥ce{ëIB@Œó[Ø/ì“ç!€N ÄÅ@ǪV‹ëÝ€òáYž èȇ:‡~Äÿ.!¸%ŒÙD,‹—.vsN»QmÑõ'+F„éíÄaD¹‡ÃsQu=Ï'ñ{*ÿG;?=Ð9´[¡s{è<;ïøô“ƒÆù]cS¾óÙ¿ÂMàIukÉ¢(ö1IÉÕ°YödÔ`J÷ÌËõS6sn­bƒûßS’m³4'ôöŽìJš$휂³ûÍ÷wêHzÌȲ…ýÙL.ãÆ¦nFå»ÖïÕ@âW0Ëý¯rˆ} endstream endobj 74 0 obj << /Length 3250 /Filter /FlateDecode >> stream xÚÕ[[“Û¶~ß_¡GíÔBp™Œ;Ó6qêLê¦Î¾ÙyàJ–1E*$å]ç×÷¤H ’V·I¼¤@ðàà\¾sMg›}{C'׿ßÝ|ñŠ'³”¤šëÙÝzƤ"©T3­ áBÍîV³wsInºû®kön¡(o[þâ•¢‚•À4÷â?ý„ñ œðÄt3îo<™ÛÛ…bžùé‚ ¦sFÍ»é 9˜aˆaª›ðžR¡âî&DF\%$ᬛ°ÍÚ:ò<=æíÞÉù*Ï6U™~ÜvkË[`¾mÆÒYIxÊf `]v"|OÍà‹IòG¤#:\B-1ÙcŠHe&²t¦IjDâwÅ`øHeØÚ6¦C$U³Á´—aÖh=Ð,Ó ni¸ŸõCŒ+N˜Ñ „ôÇg†0!Fœåb‚f†|½§LG¨I"¹¹žÚ.ìÒ H)B™€¿Ì 4YRó:ƒ‰2„šƒUzÅOh,@¤,(¬TñãcÉ£¢A?‚$Bw9h7ºŒP•ú™,ê2,Õ—¡—\æO#)A hw"©±ÓMD¶àRÂ9[FÒTörÑ_+°Gñt˜:  $ãâ’ëroÒA¼Û2溊ÀÍ`Öˈºƒçé›õ-iú,ï6Ïòî1óçýQ]ôîäzj§l–ÿþ6+"6ˉÉ~^o+#„ÀعºÄzB(Üq~Ö‡˜F@ׄƒa»WT4mxcÛª\¼ÍvMUzwÙï\°ÊZë¹Ñfdt@ÜÈÏ‚GŒ<æ 賓QJĈ‹tøÁ‘¯Œ’¼;ò^1òÞTkÐ…fêÈé†Ê@iõvÄ"+íkô¥ÍYE€ÁŒ„üuTom»¯ƒèÛÊãWÓÚáA'¾òŒ,+÷û#þ±õÆ–K§*4[í$šßT­m¾œ°©Õ€MJ MCfÁ‹ºÈμrŽú­¾J£(1&ZŒP"o‚lľþ˜¼UzÞy›oó¬\ŘjÀÎ\‡bGü%gù[ÙÌg¤bž—«|™µUMÎ {Á„!­¨³Š«dý6Šï –¸ZÖ+[¢iL6ôýšÕ+?¸®êÉÓ:o>ø¡Æ¶/bÒW 3¹Ä®|»4‘‘PúÌ$x¸`˜µàའß8o¡SÁNÝØÏD@÷ [yÉâ˜K n—&é5øç-€Õtdl¯£)%;à èŒcobÔ%ç4Q~äH©Vã„€«¨ÃxÍuëÅ4I@;±½>üبok‘á~ 
¼Z›cA*v*z?")ÆìÆÚXÐC¦šGˆ)È÷Fé,;‹‚Â;M–‚Ö58ñ!j;êˉëH >JW:è³OJd€¶Þæ}^V>««µŸ]Ý7¶ößæU³¯Ä@J¦Gl”Gktðr]G;IíÕmœèIÎæOƒ¬] ƒb Lˆš¦Åæ~‹"=XìXj”ˆ^¬+/”®R•Å'¼ãpgÃÂ.ø»‡kÿȉ¦é>ÌŠMUc_„ùÜòE Ñ»Â3UÁzjˆSw–"¹ ßMAq #{¹ *]•ÍR r»HaÁ7aV¶ÛÕv™g÷…õEÕà{ÿå˜ÇiËå¾Î–Ÿ:ÈrÕ䘹,¸f~\óyiŸZ?äÛQ^^UãÙÄñÀ5 >æEá_mòíʼn£Ëªî%ÉpFX…ÍwuîÇäš‹G—ÙÊþ²Ç €qœíÞ˜ÎÙÞȆF¶wÑ„dÒ‡Õ¼ÌÛ<+œå5oªºmü­·¡1{ÏüÐ=j°›Ò)åÒ)÷¤S W£èüëÊ‚#“7žNmW{°¬%*Öq†¸©tYmwû6¨Gpã=®€—DÀPãJw¸Y×Õ6æúPÈ(ÑÃô¿#‚1YúX+£åD^ Aç<²Ð ÷×#üíU1æSr(í>+ë“S2d`(k'2_‚ÿŒÕ‘ bsÐp¹)ÂÜýÎk—C2l¢^Д/E`.øšÞѹS7ŽŒ3vxâ3v¾³P\Íÿ¶nˆDçó8eÛ‡Â?py6ŒCö.!Ï„{XÎúçÞºñ.«£@.0±ƒÁ}Cœûl‰ìðÑ'/û7¹.`éæµªze}BÿÅ+ Ù™£ÄC•MR¨3ÒÏ´kÍØ(ƒMþp+¾ó¦©N(Í \Idè2|m?æ™CÏæ 1SH H±’T©7ßÜÝürƒHBglÆu $@¢4Õ³åöæÝOt¶‚‡ "À^ÝÔíŒÓY‹Ù7ÿ £~ÿ€¥É™›¡‚041\Ž2áQÂÅö%02§û†Úõ…$„”=¯û%ž•|Ëç$Ÿò¢?öÉç§hò™$êúæ÷:n%‘„m|¾3dç)ˆRšß´{- h’D:—É©nŒ}Ú$ #=lï¸ýî_ÀBM ‡3rÕeä¯}öH¢êÞF‡’²Ï±[5-G&5³Ôã`ÀK^úKQm"¾&Ñ l^:IuužCWÙI€¨9åçìB_ãWøØlÉdCDK;‡‚è)v¤4á"ù½ò§k]–ت"¨ ñ׋”©ãþ 5M{;+ƒ‹±¤îˆ& Ë~[g«<œ±\v€_OìD<«-¤b',ìä Ë) Šö Fp3Ðu$þTwm³åw,Mˆ;«ÆÒ;˜ÃÖ?çJz %Ú6s-EÆ)=8LI"P‡’pÆÿh°"/Á 7Ⱦžñ”©Î⊟,9—]õ´|¦ö?;ÆŸ2©ÉâO¤ƒÅ& áR¸s>Ù}ÜùòGƒP±òÁ2Çw(|mªçË"k›—Ëb¿²áW×üYúøŽ#øb¶¶›½+9VM¨Ÿ*ÿ¸}ð5—îj.EM†ìS[ÛmàeWÛEàT!å¸>YåËÖ®&)EhÒgu,¹0š°Ã'<6rþuÔ}ÖùW;àf¾ã9 uâ&%âгñ옩¿ÄW”D)=ZêH)%(7ÇðÙÒCæVywjÑÖ,«²É›Öö=­vð1ÑqgÉR¢þÐqSœëË@FÀT¿Ïx¡( ’áØUŠÉ°RÌòú1oldêæD‡Á¾/CÁþ`Gu~½ßjÁAl >upcuV~ð}€ä­Í >·áѺ*Š ç>† ¬kí€Hä¨8¤½«€åí=^VÙ¿²º!CJ í‹…¬6/'ZN𓾹Eµ¨ù;öjXQȯ›kb§GíÔlæNì˜Í7~¼ë¾ŒXfRjݵ™­ö»&¦Ä„èCƒù+O5+ ÏÏ”8ðMk…Ãmão»/Òh¿?±5ŠßxSó" ÆY4/‚Ói‡]‰0©*7^,4cYh’à@“mÃݦ†¼Àfï 8¾o:ðÎ5<•:ËK  "Õ Š4¦^ è˜×%¼¼ <)èü-ŠÎ‚µÙ†±„п?ÚQ¶ÛxŠmUä”iϵ‡`´E×  8[•yiêÀá ;©ýÝ/{ +w4²nb§‚Á;è¦k›µ`à~ß»^¶Uíì¢6¡.:c„Öür?p!¼19Ù€ ¦æYÖ ²%jþ¡? 
Ÿé÷æëRÁ+w+âK­H(èO#xªz+—Ÿü€7<¼kqÕO;váöM¶±Ø5£ðþ>ØTÝc„žžðŽ`Åøáã˧|›ÿC&‰ÑÜéRG©:ê¾]w¼‚T`,ýû6/òöS̬ y"\M b½/—¾W59øÊdåï{'rFü‰É?Pn+[û)™¿xKt·AŒp—÷hùîUBù/Óiñ=î!ëtœ=²è!/v‘“m˜6ó/ñ¢ýÅt—¯ü`,g‚P’Lr½x£Qr1\îaêÛš^nøE°[ßéH ÔêI××e7ÑÏ…1×燧[ÜWI6ÒŸN ý|¤­?þäuûâÿ×}}3½Øfñ$CÈß°Ý‘Éo÷Y‹ÛÅÌó‰%kÌœÈS$¾Ý3áB>âl$â‹aÚV­£- ¨¦’Kº\ni`i¤¿¼‰~ß›°g¼<>XŒtSƒ-¤+.bÂŒ-¹X_1€W ,ûØØ«¡YÛÏoCJÝÿ'‚o?’é­ý‡xYÀ(‹þ/‚>fºú©ÿTwi1> stream xÚÝZësÛ6ÿî¿‚¥i…âýH¦7í¥—No&w­ë™Þ\ÛŒLËtdÉ%¥8é_»H$Ëwv&sŸøZ,‹Ýß>@Z, Z|FG׿^œ}õZѧ¹..® «ð¶ÐÊ.TqqYü:yµ^Í×Íe¹šW/¦¿_ü½þÕknc '† øâG½žZ1iÊù¦^¯¦3¡èd}…W6™wü¦ÜN6áÛoTÑšT_â“Ц©æ›åÇ0ˆ«¦º¨ïʺiÉt&ó¬›@¹¹®E{7÷j^—Ë8kÙV(=È7cŒ8Wç¥bò¶^•ÍÇp¿,߆áË%×møÖ]«?¶õû)[.«~%ð~³îFLÙ¤ ÷ß6U¾nW°”Hx¿žãè¾ o_m›÷ø ߨ„ À e6Ô²!ŒX'’²°Ô^ ¢ÐÄa‘”IÂ)/fšPm廫ÈÒ è,aZvû6àE4‚°ÝÞÖ/o²l¬U6,#R†gæQDh×ü˜á  “ý,78èŽ ·\I4¨ãËý™8ê,¼÷L³ÅŒç¢jëÌTÀF,;¢ZN÷—œÐŒëÄ Rˇ¤6‰Ô7Y© 礇¦[¼óØð·‹³?μ¤+¸»µ²Ð,Qºb~{öëï´¸„à‡D8[Ü{ÒÛÂ: À}]?Ÿý)™Œ[C¤‚7ÌÁE4çŒ$¡–?dA½Ø‡¡‹)2ê`y²ƒÀŽPÁŽ(ÙÛ‘Žz"í>~}`8ëíôef8¸8SfÂOå)étæ´³“5àGs_·ar‚çÀC!\1ƒ+¿ð<ÞœŸEr¡ 罓½©ªg\œžÞ5ksV.9`üê]øŒPŠWløéº^\Wíf†XÖ"%7“ê2|½nëMÝa¯WíƒMxò‘a¿´áZ¶íö¸ìoÝÜ?fíajƈt*ñÜd{3ÁpD¦6¶”¡ºgEY4#•wÀc[-‰„çHÀ2â ‡Þ¨Yƒ)Ð__dbg„9ííÉÕÑ žIðXÐâLnô@®‘ƒ d¬PŠ Q鈃kE˜_ÂÀÁ“{^Rmcd»­W¹ÀRQvã Ç.gúâ2c)轇Ö&œXæN`Ò£Dn1°süô€ö1‹êTP×§„¢HôuN-€JÔš‘¡¥¸¥¨Ù]B‰r¤ºÄ^A-ÝçÈI™5{ð,'eFÃì ’æ™9w —ÝF½ÌÒ& ¨Ç‚Èc+ìDvQÀ#ˆ_'Œ†(cù>¸Žùì‰àc„`E"LäñæÛƒ;îHÁ]HÈc‚7å¢ /ïò[Há1–vR†×‹jDËúÏ2¦÷ðÎ'Òp…àn|& ×[ä¹]nê»eÇuøñu½š%z4Ô Ú dP¿?Z¤¹Q—þ2¢††aÆîœ'ZÏ ®R³0ø€N 0~gB‰BîêÌØ‡< œ+{=Æ žx»°v„ NÀ„¼á5æÒ#‹ÕÉ"D‡ St!BŸZT< ßd™˜Çàïƒ þ4ø{<©~þæò[GÔ.^¦0§ÂºF¡XA¢XH£‰¢öXhfž¥Þ’Á ÁR ø „ÕO°éæ)®~þ°Ëž'î¾Ë¥€6ÍxÕñÀ;ˆ…Ÿ"ôòGñxþÀË©•íÅ]Æ-S§^ ‰‰´»1¸¬¾{õýÑÀ+©Dq¡ÿX7·A±ÂÆÏeÝÎ×ÛÖG›îÝ|cg9”BNe½Ê…CqМØ¢”§AoÔM8ô>wM›‹~S98_T÷r½È- &Ó ¾óž¯·Dƒ†J’_kŽz”*²—û °3Î'lÐ.Iª/P¶q#DÙ+Må¾}€Q˜ŠB"Àvæ«ÂaÍ•ˆÔñ¢ôkÿ? 
Æ=Á˜íE>¯½ˆ¤YÊ%÷§Í‡¡¦ @ È:g 1êR¥§= aÍÍØ‘–§ŠHªàÆmâ¬÷€™Un–XzÒ4ÓÉ’-Ú-€v{ɲ [dD'¿$/ð¢ÃÅt——áeN$n¼ÏìËt@©Ô¸ÃFh‰‘h„²+NºÓƒ2ySù³zµ¯×W™…HŽÕ8êvŒqã ‚4PMD¹Hf;¼ôÑ}½¹Î EN<§‘ȸH¬àE=˜”üoR™D*~@*ýh©^Œö/›Ã 7„ã§]êã`< !+”ÄVÔ<šÌ…o$+·×oàÊúö~KÛ øÁ¹k1´õ[> stream xÚíZ[sÛÆ~ׯÀ#8 6{¿ØÓ™&™8­g’I-=´uü‹T²âöÏ÷ì\’’«±Ý¿Àâp÷ì¹|ç; ál•áìç3®?\œ}÷‚êÌ #©Ì..3ÂRRgR(D™È.ªìu~׌W‹7/¿{ÁÈŽ¬1ˆj‘„#)¸ý3Dq&‘QL[Qˆ ‰¸ s6ï¼X´ºAJólGêOa.¾«"F‚HÐ?,ˆqB7"° ¤¶ŠQ‘TL“”bg?]œýëŒÀ*8#• qÓq°"Ùrsöú Î*xù2ÈÝ9ÑMF•3ÎÖÙùÙß¼…#õ·s1Ê)¿hÊÔ9â~›$±M¦“Ì?û“Hã°?>RJ!LT¶#•RŒÉ섌ï8'ã1ÐéÞÒdß~ |¦”zˆý¤D”’Ø~‘Û¹À~›iÍ‚Rš“Š±·µc°„/3C˜o\†Í«XHª{oœ˜ˆ!BØ$óG˜ÄÀO†q7‰ :+02Fo$–‚iÄ÷½,79ŸZn'bÿSºUûоäöÅOíKEûz—Ü—¢4½¯ÈÃàJÐJYd1HËV/šå]¿(cùxUû›U_VMÝ.¨ÎG;ÂóîÒ¿ùìï|¹w}=Ü@¤è¼^Ža¢. °5¦çfÛÕQ À‹}«íÎÂVs«Í°GBºE¹™X‰!§¦µÚ·^ûª†]ò¶N¬^À6¤±ÚœAƒ`BÂU&…Î$‚7(¾NZOóÈ%8¹Š"ø' GŠÂ1ÈÕH:ƒÜH§i.Í‘fÒ«2é[B2ì3z×ájÁ®¼Âœ2‹Ó™Åµ´\¿ü=¡Na‹ø-¢aÊ-R) Xÿ8´hž§¦Ñ’|gš“è–æLšIà·„‹Ô>‘ÐY=4üÕH`;,öÔ„¸‘;lÎè“îÐàÔOçv? ÏŸ7I—p!¿(—Ĵ夫N¦PA C ª `IðF…!ûõrî=0ÃSå’4æ Ê&~Êuzî:~Ìu_*gãˆÓÚvª9áGš“BIX‘ÆöÍW8FêéÃ(‹%ýèXŠ%³Eåî¢Ï’d æ0'óÉå$/&þ;ÔËEAò®]¼ò$·ªûæý¸M9ÂÕ’]?Þ )ˆã1HlI"v¦÷ùÌD ¡í%!'ªú2éSõ³Aú}4k™ *®@^‰ˆ"1Z¼‘XìH šì ͘öÑdÐ#Î{Óù sq»¬þ¿ š{Mš]ŽÂ_}ض‡˜ÔÄ7h„á¦öˆˆ;(ä_ëéJ„äÓ!¡xPAOVPöihò£õ¬ËHй„Õ‰?w÷4zíÙá¾'#ª$ ]-„|à'8€ ‘Ï¿r€‰|ö¸È?ÎäCÂ_?:Ì7/Ê{«Iö„Úí™d'$æ5j{–f1 0^³ÀbÓøµ³ôänÙdòe× pÿT®Ê¦µ·:/×k?¶ê»Û›Áú3='9 ;Ó[6åXW~ðÎñžfuå´(8¥í¢›6L8RëǺß4m¹éÎØÚJÁ­_­ªÃܺSqBÜç|¸ Å[_õe» Œl¢r¯S8ìt°7`"ûrÓÔ•jÚa,Ûeí¶È¹%†m {ãàEêal6`š Œ?©@ts(8œ?ÞÄâ»±éÛÅo‰ ßh/Id„Y: ^Ó åûäù¼¦&u<9‹A6GíÇvÎH¡jÓ3°ˆ†p=F §žw÷ƒEÌ(ÂB t…å¸ÑÔç6Ú)ò/Ž)¦™± Lw|ž<®Û*}=U½»«ºOcƒŒÆ[Ø{ÔFgÙ†Ó_ÉÁ/‹wI@ä˜ìWZÉöU¿ОÝÞ¤*suð4ý?Š# qFwKöŒVÐRm ™:sŸs&ëËäçÙ¨¼œnW ²P‚‚H™_§ Ž0× ¾Á#×INf|CƳü>D>ƒÁ:Ç1p}ªT'K&QÐ<ªOàšµèŽÔË¥-Wþ©²Ý~×Úš„Í}Uð¯ËÞ‚?¶ß¿ Ð\»ÚR…WËe×WM»òcçý÷5¸éëuc+bÿÁ _§N’Åòøt&ž¦ç•{=oÁm íüµª}ůÃðUð¦†Z­éœ‚«ÿV)Ü@QtkÜn³ý%Yö A`鯮iJ'ïm”uMe‡¤¥)×.Vìû·M9@¼qH7O·gbö@ˆÄqð¶¯Ëëðŵqa Tä­ýÕ‡Àqª„öÞ¿6[âSnºÛÔ7^Чê6þ¾íš¡FÞ€)Ž¡ØSÚK°‚e‡¢ôª¾„Zã²ÃÖýéafľ+£¡¿&o"éìu!0δ´K«ü‡Û~å8˜ÁÄÓÌÞêe_Ž ’_·õèŸyáº.7Ö¶o"! 
êƒ9¡L´>.Ôìð"!Œ2ÀfÛ˜K?¶ŸìÇZ› J…Ns¡¦ÔñcÌ‘YW«eë¬Y¤hˆœ ‡6ß0®Ù×ÂâúäȤ`21ÚKžÁ×GCœlÛ!á·T_dsøßÀÆCÙ6ú&ž9HDd)g®¸9X²¿S§÷@v:•œªk–˜8êIYÍ%ОDš§Se9áêóÉ𦃩¸Z‹å‚R×§k|4‹+J¤xé‘ç[¡v I·#pþ”q™Êø÷O•V)ÔxƒD~¬Ç7É䂜èNÇŠ6Ê:{Á-+Š„_Ùùoc4é2Že|Ò*bK:]‹ |#øU÷‚µQPg‚˜'¡¶Á—Ò¢UøÐ¯¾ûÂvÍlùh7½]û1?ògr©b=Ó£ x(̨Œ“]uÁ¾^é\>½^ÒsR=¨Ø„³¼˜ö*××fjÝw›JÜÕªYNuêc˜á).ÓÏt¶Ežf ^ßY/°¶Â÷S‰¥“ø½4Nš‘Ù™pI†ñû³Š€Mï¯qåd_»ŽO{3‡i[Y?ABÍŸò+\{Á9¾ŽÜ˜yq%ªs~áá±7T‚ÓŽµ7{´¦µ¼Ø–gùB¼8(|«*˜¼U½y€Å77;øõÎ"³jð•’“¥½0q¶”†T".oe²Pä‡èäb!’w—ÏÎP F_šŠeì»È½ðü¨žuIɰˆ¨ß' ¶ý=¼ÀVµ,®R«<_~Æ ë¨#A Ó…]ò[‘ÅOA=»%„‰áŠÙs¨¹ ò«VŬػ„"pfIró¯|6ˆÙÒ¢„.9ѽ9šT Px!:É+›­‰ÂA¤FÈ»+ÿbø:Ëq¢4•o­ ¬û¬{0Gi— vï©Ì‡õeÆ ùŒiw®‘‘˜)/$çíØísXúÛ0qÍ·\„Ó¤( úúÑó> ãFünå|k6pxËë¢_Çi½Ë½ð½ûËÏé‡Jݸl…A •tUWÆ·Ý*q§R‘yfÚ-Í$ÞçP&áb´¦íûæ·å :@ßÎtö˜Aym$nм]ÇÐ3ȹ óNöÇÃ'Qj¸¦ßEåz0S! : ÛAÕíšZ¨°$Üz[ ’Y/ëTž0=IUZ#Ò LQXÏ´Éh¹7¿¨¬Ú˜[!ÙééÏ Œ¤K=­_ðœ "Kä ·kä®$£e®/tZ˜§Ôú®YÆÖعúvN’„[±õ“a¦8UGçšÉËDº'¯{”î*mF¿S˜= ÎÇØcA8¸½]ß#–Œ ~Ró[l­ßÊnæ$YT·=S cÓ¦žrãô ‰®0Œ'O†Ù2‘0n^pf¦Te„{Œ»‹Ù¡§5sסCØ×4üóRúEUæŠ&©•ÒÃåú7ß+¬LÞòïLÖx4$ÎqÙÚAk ÆeÈÅ豬DMiÒÊõòw“^ôž÷oïfª?US]©•…ý â.jä5 NKpñn%ºŒì†©¢]´8þjŠ‹^#ƒX%÷àTCã»5à{ûd0Ei0bPMÆ—¬9¶Ÿï°éÇÅ—-ê¹ä9G =§=}Óú± j" jÃ#e«o»»ï6 *R‹Õf‡Ë%qG:ÐHÑŒïíBX_YtcûÛ#úlÜm“QÿÈC¾R¦4 ;Y«.¾#WĤá%ª‡ï¡k‚êÕIA¦ÙX¬½j©ê͹å°RˆOç“w8œï<Œ?‡O$ œ~#nXS7φ³½\¥ ±eÙh)=àï÷Ö ÆñKXÒTÞ¦[ó ¿B¹bú‘>jNÄ‹.<*«o\*¶P%#Ý+_Öy Äoù|k… ²OŽ~:¡P8+ÊmW.üšŒ7ž}q¿!åëõÅ7Dôº' ù`ë-©v+æ#‡'òõÛ¤NM‚Á ‡Ýê€$˜ƒM²Èk¨Ìîe†lî]]„£ +†vŠè¨”Ѹ«h_ÛûË6jÇÜÓGH¼¥†$ß9OÛÖ…ºÕÏÍÕù_(¶©ßk{nÇǵ"àÓè˜ÚÓ¿–ìåueæ_a ñ–ViŒ´Ès=äîr#÷Ân;[ÉYoó1À7ô{ÇU|œÝH vXÚ²&·.Ü™Ë+R;b˜ ,”«â,™¾ ׯ`¿zj\DùY§»$¸Ø÷8=Ÿ¸2;èÕš@"ûê ÛùÚÁòçO³_Ó¢},½–îÓ²{´X c¯`FÈÐme6¾ý*ï2YÍ÷ÛÆZ^ši=Î×7kq ¥´ çW¨Ôo2”_ Ò·y1¾e/(mA›•@ÓP!$L …ž¸äÈIÌúÌfíFlüÙ4¾äñkO&_Õ7±€wPŸþ4s0-$²z¦bG4ˆŸÿY.OΗo/1x²1Å袞K~C×ôiá^‘Ï¿€Û©*NÏ¡»N‚½ñYhkME sRSäqV[<Íãz«uÕ ¦xO°zŸGãÑÑ÷gÅý‚ñŽr&±è öˆ‹ežpª„tSŒ›FÕÃ#Ö"N“…ÀôL7mó2íÔ‹ ײñmÈô‹Œar9F_îëJé94îô—Ñãœb¢}¼[EžP7襈6¸4$>ˆnñ±vºÅX}T#êETåkŠ"7š}¡{‘¡5ÌHÕJÈ}0{>V@ývI™Ê# ·¡]÷à¡%ë ~ÇFÞàs‘ЏwF£Ïê~Wù¦h» ǚ°Qß½ÞWQÌålž/nÝv  U›äЈ—\¥³=u}wãKPbåP1SÚNì²âû7Fè ŠÒL›Žý]9šÞﲜ ¯[ê˜|!qÛ7Ÿo¬J$5³>^¼?vãÁOß z°ê˜“ÆqùAÑRsàa76úrz(=ÌÇ™”ñ­ÊMŸ…¤su–b 
Jô0ï7_ôéÔM×ÔÜ]È8&ìu5RÐ35l(`H¿LÁ±¢mÌ~*£ÓÓ•ÂZà\^'bø,ˆ-Þ}¦¥Œ£Fõ¨?†\Sͺ”ôÍ]>&èfTãcoGT Þ¬ŸŠé’] w cñÃÕz¼öíi 0/¯ô›¬ CffwW/w»é÷šý>3X¯ÖÃ1'{›¥î€Æ±IAñdªï`T}šEª'Ô=H¼.^ÿÎk•~Çu¢“¹‡cÙ‘£wcwŸì¸:7CªÝ¥nc>›NË%Xsu4“ò·ñ˜Éñ tu¦‹Ô{ëÕNuÊFá$BT?¦f«æ¢2¡-v±›ß}>%-Çøí¥W³ì¾·AZ5Yö'ÍÒU¢ÇýZÑ3¿uÇR Ä ¿g2¼àç$pÛ˜ñ° 9ÏüÅêÓŒË6Ôoð`K+žï—HŠ.%uǼ? âë¶XŽ»ŒÝÃH« -®Sù¾y9±ÚBî3#’”9ö„[†NdŽ<—¬µø„Àüiqa­•7ùB*Ø=4!X™Cõ#p‹´Ë¦šU1d.•Júq9W?rot\ñõ!<÷Ù7L°\²ÆCIà$¾«RlW ¼W9ÃØßó•ŸäXŸä)X ‰*ÏòÙ¿z{Æçøi½¿h>ž/á:ôí"y¹`ó¹fn}‘_ÌÈëõzmdš¿ ƒS@?Åvw›±*wqNÓ©nmƒK;BiDX, ¢°RøÊ–ï"‹`À_vD¼ÄkßÿXEí)})?(°C›qY4‘$¸0¥*ÂÝmz<•B’ÔSõìžacûÒfö;é¢ÌŽmÞ³*ú-Ëtø·¢ø2ïl8-—/a Zœ²zÜS°p& wlÚ…]´8뿎µ¾9öÂ×Ñ-b/~šÞ»O1!†\s¼i¨¼p€šÓ jº|ivµ¶’Å6#ž÷Ý{ gÏ*í)1™Þ~z拤¸î;á9ÈÝï’¯­¹Êûº:É&Í4Z¢È£~‹<áu‹L>{êÆ3m«G"~’Á°byQé#Ýt…¬o‹\j¸k ÅÚ6;M;^r-„ß…7Ùf 8‚"Bº®¦Ÿw<ß0»oјZlOMŠ{”%t§ˆ‹–M¾–G°,'ô.1m“ñCÇ­xŠ&RB¥1Zö‚§Ó›öeÝ^ÞiO÷É9¶ t?)i‘•I~Å)Úª!yCüãqê¡7]‡4ÿ¸”íÐõ›Ø¹nZh„¸mœ½Žou]DTBœÅs4ùÊÎ|"þâ­ú„KùÙn÷¢ÙJ©«儾$Î\Äß=Œ—~,T}”µRƒ™m¡d[C@ˆÔ÷Ò Zq,ÉìYÇ]ýD0´ËÕÎn åR}ØÈgWºæw9`kÆ€Yî[¶9²HîD'>Ϲц¹s\Kp„Ä‘|oà#Ø€KZM¥w,A¨CùT`o(³Ô€ªæ¥²q¶ß´ÁC!ìS³PD~Ó3¢¸Æ¨$Û1FsˆçßÜŸ_Aò½¹¸gÿ›¥Øj!/^çÑøä'Üò-›/¿ä•%}Ÿä©Ö0á “‰u©\9v|7ŸgÓ¦÷”_Øòè#ûL¥É8N¨‹¢t³óç|ƒ–Ô|sN´pmÕÙÓbõC^ižˆç1fR¿²ZÈE¨Øåpr¶7ðF`y`kç“é68 ±-ÚFÅ¢6Çohƒ]V]D¨çŸ‘ Oô‚’½Ù£ ŒQç#|±± 10\''$|®aqÞ¨†% ‹|[ù"ýwÂöÔ€FºˆMzzfömÅ‚9ØÆ3í 6›Ù©Ë_Db!ôŸCM_ÖUÜo7f 7;>PTÄmQY”DÒ=óFÃN¹ƒïoy·ƒøøelj¨×#ÏóáãÇNšhÎU/}ܵ4Y<<Ñ;n>Èû¦þäíŒ õº9*ÈJCEs]oe D^Ç<"®Éçw&ŠçVF÷Ž—Ê–Uýú0éTö ˆuOE¯Hks·/Ê­™žái+˘iÒë¸k‹…¡×&ªkŽü¸Új÷%/ûÐs}ƒ^‡ä†9æÊµØ­0Ô¤FHÍë_ë,ê¦g©¹Êp-pS­w¤=t·È[eüÒ|½Á}Züèãùóû…"7÷㲺ŽtLÀ^ç²O'Z:z¹ŒbcVúÒÐ>ø®ã¥b^8,(U„ž¼¯ÂÎY²èRì¾ÄT…F#¦¦ ËÔŒñõ±+ÊÔÎÅ߻ĥ9®{J wëÚi²½£³IP‡,BeT#ôõãm§œf£–£iE3j4n¿‹Ø¬SºÀ ¬—×K&’†È)èN,£ï¥x=h7mç TĆÊ%#J:Ж36Òꌭݧ3Ôôþ—à¼xÿO ͇”(ym,»…‡saæ5Ì.-ò³è=ŸW©”·ÃŠd. 
Ë¡.Œöí,ךÑïs6ŒÑåë•ä«îËN¯r»Ä|1Kf ,'¹ã ý–¾xÁ+O,Ò³Ÿm; "ˆÖã’«¯©_Ê W6{?õ‘õ1“910§Ë™ ç„÷4ÉÚ#µ ø“j©ƒÅ w±Û;LK^~AºÊá{ž2Ží $þGÞ›ÍNRNº”õõÙxTI^Âä ˜ë /ÝŽ†Î¾J%þ6C¿cé¿Á‘q}Xz81!¶a_ŸR¤qYŽ¢Q5@^“†Gïp€,†“[ôÈÒȧU^SØ  qhÞÜç¿·?{¯,ølýùhŽ[e~ƒ×°´7ŒO „&'ºbì/²Q97IÉò{„Þ¶k!÷xÒò4Ç W½¼¼½ÅF fA¨]vÇmÖ•¢é´@ú’쌸g®ZîÒ“±¦ ñe¦/å6ÈÀ+ÿFѳé3Ÿ /¦0ÏôwAÐŽø%Û´Ñ"3·Ð„V`z­ö/Û2®¹›pîH¯tÚ˜ÀÚÆãBÓ[hìZŸñ²ÞCeEÀÞ¯û§*ÔL'È#FóH°öléÉârßµ¨SbjHéN^'¨#nm2îÎ;Fm®2غ¬‡Î½€_”€L+:–A–$ÂÚZ$`ßkƒy{„®#œ2›B[g+=1û|›LÛ¢6É„ÿ¦‡ä·µI } ÝM¬Í½‹˜r’ZÙä@ÊîU¦'+ðþ¹/‘?WŠU<3Mh´ÂoK,Í;Z/8¨º˜iªÂfì8ø\ªž\ äÜ(k³”ÊzOžG³–o»á×ë&gK)øtÚFTwjx~ŸQÐÊÆ¡*Ó¶?!œ)u•ªãWŒˆ3âÍYÏMÜàîÇõÌ“e^®ÆT—akÐ6˜èÛ]\Y $—¡^úÍ<ýò©8ÀpÏ{1êç–ÒÒ–Ô$EÄW)‡î㜯ÍjeÝŽïp?\PY}ÍO•ÉÍŽýQå KyÄ KS¾Œ&ݹ:AâlrËÂÂô­¯ÃR‰þ*S•^&g“mzrÈjé± ¶œ n¾Áö_‘¤±÷HTÄ#G주‹G¶1FaT+/¨~¿h µ£ã< ìÔ?÷Úºïö+•{l n.( #"°€?RðÊq"yc ‡¢˜+ŠdÀöcP `Ð7/ÐÔ^<—S^:Kºn›>6ö!6ý<3Ìœx…üÔ¡ò‘¹WÒ8öû?©)ê/¡c1ÈL-é"¿ÍÛS¼‡Õ1G®9,éÅ ²·1Ä&°Óçt{HPøjq?!†óÚ*Þü{Fuh|ž¶¤Ô銯•Ð…½Ç‹?1Ð~êDüä¬Ã´ü!âdé'Òk—‡WýŠÊzÐ ÞˆÇ3‚›â%ìU_œÔ’•eŠ|HûD¾KO>&éžáTiZ7œî&uuE}Ê|—Õ—9ž‘G…j€—±†yÖÿ=»»ühï‹=·ñ¡žbÃJüô·FB¨Lµ)r›$£.”Ü›’¢˜w¡–'V†Çze’÷ŽCüÓ!z}zóc#t¹£˜ï‘½ãÓ5')Õï1áL¶B[ˆ2²=‘1‰ÅVèÖŒéö|k‹Ñª“Õ<ا–Í:ÏÖ‰T'Ä$)­8 +oq$g¦q|c+ÔÞþ`ÄÉ’®¯‹l ¸e"-è!KëddÙÄÇp6o1ÒÈefùîm¡j¦öB Fxº™}À?šª5,÷”òMÝ3£KÁ"»†Ðqp[¬öQ±G\`ô†SNwÞíEþƒè³æzÈ£d{‘sÁ¶à`.]g{ß9´ºtÙ1Áè”)ÇÜ©gÕ8¤>E¬Õ]PÆ2T“î,[Su#€ “úªæ|<þ|­:3ßLÖOóØÊ†ûÊÓ·—’Å€jñŽà¸»Û:ü0˜³o`ަ˜_— –ñZ›[½pG¾ƒàºJd)–Þ…¦Ÿÿ(±"íýBkð•G€ã·¹çÛm¥ªÞöV#º4úauo¾çȇ(A"8P²]élš@“ÂNŒ²ìû‹Ä©øw×ÿ³#Y!UP©o±„Ê!^‹Aåžtß¼9ùZ/zÊã¯rq4¢µ·óiŸ\5A’|°tÈ ë£N‚9‹!Ô§÷]#D‹_‰¨îâbÞÓçTºüê„Ê Æö·‚†päk¶LV8b~nV¼‚Ê™5ä§[ØpÅûªZb,Á!Ò1"·¨h]²ã.…lnYOgY›l—Ø‚Žu¶‡2mÖôyi³.§ûR‹&Uí’jÄ!Ž ¤bì¡[iWáû‰eè×7ûÓd7¬ÝÌBuO¤’´23¥…9þ\i5ôyÿ·*B»êñ¢1±/Í“²ðj7ÅPºÀÂÊ“ $)4â0¶Å˜|À{Š•b—Ùàl ¯Rî#ƒ\–;'Ý™ˆá×MAãð5Y’¾¼n÷ ;©’ýfæþ\w#ºòt ¾x»¼ ÿõâwãïïdÜ>ßú;Þ»ˆ_)`Ü´Åu!æX¨¤++UA<"îÍ>3²gà:V°¸!(Ža]àDÖü¸=`÷A{C6 Ei°Äqã°4ð̆ƒdSØ1쨬Î-½øÍà&6Wd/ùðŒ°•™Î6'îÏÕ¼§0‚©«%AYòC#—é§ æÑõ[ ›){‹æb±fGKk­ú# 5ÁððùÅ(”zjS\ºÍ©q>ÝŽ†côÔP‹×GãŽMò“8çŸh& _ÙzÊJÙ'JX¢$ç]×Êñ}¿i¿•Këe7ÕÄ;Ãüå:ƒrpÌø\Ä…{£¯ÅÇmncÉì­€V/ƒ…›|mô˜…Eêé/°'‰W>r#Z7BvxÉ„ÞK$GX?Ë`Ïÿp9BF(vÒ°V?Ëßî,«G\NeùÁ<&s …&ÈÍlŸ-‹YY—b1+ö’I¦*DnM‹tÅ”¸mâÿÙ¹ÿm‰’bWpoS—üàÓ $®SX˜Ò; @·²ƒqóÉ]gÛ 
«¨~¼¥ëp(;—=qtÉY4>]Õê{±6:-)~)Cõ¨›ºqŒ®þX>ô‚i¤Ù25Tæ/ƒZÞô©¦ƒ2Ó^j&þͰ±[m@&èë¸8îêYEY15O·Z[}wäÕåsˆªI^ nÿ‰á߸µ¼å#Iæù˜CíkhڳΑÁt/Ö)è£—ÇØ†Vj€‰û.§ QC(ª½·–“Þõ‘O16y›C d~Ô¨7ŒsîLïˆ|O8ß^H\’=§¢:¸R'¢ue$!k±‘¦ŸX­H©—W×Ü:ùô¢•„úÔ­ÝæÂÐÔý?­ö½ endstream endobj 119 0 obj << /Length1 1445 /Length2 7095 /Length3 0 /Length 8078 /Filter /FlateDecode >> stream xÚtTÔÝÖ>)]J HJ Ý1Hw#) 0Ä 14Ò%ݨ4ˆ Ý% ’Ò!)Ý¡t}¨ï½ï}ïÿ¿Ö÷­Yë7gïýì}ö>çy­º;Èn‘…Ãì\@aÀs>ÈÃrc30hCv¿ÜØ º'g(&ü€çN0âÞ' FÜãTà0€¢‹€‹ÀÅ/Ì% ¸@¡áNÂi°+Ô ÂP„Ã ÎØ ÏáNP+kÄý6ÿZ˜ÍY\BBl¿Ó {ˆÔ ¨€ÖûûÍÁv-¸9‚ðøG fQkÂA˜“ÓÍÍlïÌw²ga¸AÖMˆ3ÄÉbø50@lù36@Ûêüǯ·D¸ €{‡Ôs¾ÏpY@œ÷›´”jذò௳pqpý»Ü_Ù¿ Aa¿“Áææp{0Ì ³XBí 5Ye„;‚ †Yü‚íœá÷ù`W0ÔlvøÝ9 Ò€ïük+Àà Žˆ»¹5ç¯òÚßA®_îû |¼àËû! >PKÈý¶—3Ø@8¹@|¼þ3ðO ›‹ `5GÌ VPößÕïÝË?öýå;A݆À{îq€¿~ÿ^ßÓ˳óøþû~9µUµõ”Ÿ?û3ñ¿cRRpw€€›xOW^!€ÀýÂçŸUÔÁпºþª³„¸€º½?¦uìú˜ÿ àŸÅTá÷¬…˜ÿ&¹h~ÿáú?SýwÊÿῪüo$ÿï†d]ìì~‡™ÇÿŸ0Øjçñàž´.ˆ{¨Àïeûo¨ähU PûÿŽ* À÷BÁ¬îÉÌÎÅËäýã‡:ËBÝ!êP„¹õÊüñëü’šQ‡;C½-÷Y@àÅîõen{ÿ~8ßóòOì|/6ÄïküeCîåôÏ>d`æp‹_ºãæã€œÀØ÷Woñ¼¸îjqÿÍl' ޏOÜÏì°„;aÿºf. €úË÷—Éà´ùSÀiû·yONØß&7€Óã·ùžÌ]œœî›þM¥û†ÿeÿ~ wˆ9öÌ$Ü\$Ц"°é¼ D鯾6$Š~rþ‚›}(×Ñ%3úr9NëÝÛY¥²3\²&6ŸU¥Ïß/LýðZ­¤©òà=e§‘Ý´¢1‹™¼;E÷:{B7IP‡”¥—$E-\àÔ…¤NAØŠ%aaÕ¥Å@àÍTÑ1ïÆHüAV@1¨Iµ¥µìÃ[eÒ'Bº»ß5õf½Úk|?É™2¾ÛG¬D…%«¦Ñ)‰¶\`ǹ•.õôõÎuRµE¼Vtõ_;|Œ ’é}"Í‹Qª;—QຓöÐËLî‘äŽEWÑ"KÒy…ê±õêñŽ1-«cçûZòNÌÝ=æÜ*40*°æ'5˜ôs%¸çÊV"BC¡%ª9Å\ ð´Lu\9÷ /¾ÊÄËå"ÊìŠ4d©k"-S¦µÃ .›ý™¾H "й‘, RŸœùj"ÒÛ|ÅãÚ$À— 6õ—sez^:øpHfàKÊzRpߢÕA B‡æy•¦Í 0ìQ7·ïØ-ÆÝ] 'Z$å¦Cî@+]­Ìù÷Íkf,³3™T²ó•µaÝrµá&nJÒPR›¸² û?o9”½)Ì·<õòBYn°<"˜œM¦ ôqÄFñu%<‰ðdc»ÛI«ŸwÊ”( æR7+‘2ËX>™¾éˆ#&h™ëK;É]“+ƒ$7ÐQÍÃØ‡ž¨Šàa½ø ÆRëìPU¶Ïã£v¤?Æ’.îfWxe¼û30¯ækÝ"‹UÔëj¬QtñófœAÖÔpÅfZÇãÕ;¢õR¿,Óê³­x“@Üç ÈWxßgÛ‡}6ÈÁ|iÐØØj:En(ŸæÅ<Ÿíê¸lßÏ™vÑ3väPôC¶ÊgÐ5\F¢¥ Eî-gxGªmlÙÆf)ð'\¹ÍÅá‚a4MÉþK›–‰’·rR¢0÷ÏÈžS*¹b$òŒ<Ç ž¸5ª%}ïSr¹'̆lY~YÄ{ãŸNGüA·tÕ„Är© cz5§JÏ9ñýkµ¸Á]÷ùUÛ]@¤“ZÃn†ç"a‡ÛùGu!n~‚Ø2Eíѳ?±¸`XyøKiõà%N§!9&£ ï’핤ªÿœÔâ0͈eXî—` ÷cL]–™øjâ*߈* ff o»@ÞWÁÝ/¦¡?Å0}¤åÀ¦J×µôÖÍãлðLˆð%Û;ë`Ÿ‰,úÓô½z{ç§·ë•®G×CLK óΓÕNí…Swì·G¦±MÞ} åäÞ”$7ïrÂXÞ ;Ù†º9S‚´_\ª’ç‰Ì3öÚæïY¨ÖÊ>Ìðh¡„Içña)M·Sow¿Ù\ 1©Ï&{ÛÐÓ½ÔÃO‰É,É«@¨ƒÈCª$Ð ‡±Ö¦ÐJ| 3” ŽÃÔG:YÊÐa3|‡+fŸúÖB¸H¨ÎÇAfÍøýKg2uŸ:°¨P uæ^ûé©¢±xR<Û>`L}”ú 
_µ½!¬Íä¼°Éï­_t\eceqߤ̘®;œE?ßpøc4/Ë &ÌŠ‡So½+…’Ê£E{ *…1Ï>[1e„žQùYä|…4x7<纉b?)y¢Њš³ÅhéÁH¤G•¾UÏ~6®¤S&§ž¡~ÒN£5s±uhþ©e¬(õ;X„žGmsÑ¡ 8Þ;ÍPoñ_–©¿{ëLön]òšu¨ºì¹_k¦.„úîC’9Å+< ·„gæDaîÖ¸%©PC‘*j.~ºvAî¹Ì ùy(×Î¥Z\ w}4¯N¿¾ÞñR承̷ê.]eß"ÓºøŠu/ÅÕ äÎ+ì‹[­PÁPt€2›73V¥VÞºHƒ×Ú9Š mœÑöf‰^fî²FÙÄëxß“ƒ×Ó>Cöñíåû„ë2¨šmnŒ$/éðtpÈô— *Ú&’¢4{1qGý¶Þ‰àûµD|ž1EÎf¼Ýs&:¨ªñ•RÒñ qöÈ2ìÓv'ÀÈDCª¯|—òÄV‡õ†áÁH?ùeÚú4±èÔãð@ÿ~ÝY}¯/º¤»£Hsìš–áDÔ!uó âm»ŠâXFäîÔdï£d]kleÔ¼ÄP÷‹1dÓyóœøð0Êê·-(2””¹5ëá5T;A&¢1é–ßõGþ~¤O˜2ϰDç±µ&¤° Îyç¨È£'Yuœ7—ïFŸMú‡Sçúú“·p$÷wcÅùä <óãÈÍœÚÑ3ó~d"ò@ýÂ%yš<»kÎ>j0B“G@­¶.‚ƒ@Lžh’Ç]9¥¤í,±²ã߆æŸz¶g¿YÜûøÒ)”"j(˜q–ÿ\Ú¬¥¶Ì§E©\<ž«SÝúqiAê¶Ã¶tî ªÅn[éÙ³BÑ^œ(#}Þ¼%Þs²îžøôÊ›…Ìžp±TÁË îÂPXCñX¼Eê®2Þ¨q›ÇÀã0ÊïÜ+!›z=ÜÓ¼V±‰ËÔ^!µIö9ôÑZ#Ëߣuölâ Ô{pÓc¿''™Q¿JpO;®1H±çñd½³_>g9”_sµ˜é,Þ¢– Ì#¤OV°(ÒÕ›$ÖF?ÙÉ|*N‚%šdåáUåJVcª½ÒIsÛiÔÖ°ùI· [¢ŸŠIú|Iº N å<.×UEk ÅJRË@ÉaŒ,ZU±,0?aVÐÖU|\ ‘î:)^\šzñ-ØàÉõÅmäÄEGEj¤®lôûù¾iw¼uI£ªð‘cÏ|f™ðSRúègr cÙ](OBíOž@ .z=ƒ•äE*oqøLö´µ«Ó2§³êúÇtÛÕ,fªMH;p‘A¾P£ú¹ˆä쵇„NœàÄk}¥óZnáO`¤kuIŸ´]•–R‘ñvoÃrº^Ò ·¦.u‹¡Ò©—ÙytN‘“þÊS€¬&Íg¯¡dà‡B<ófú~_!ÕJiú¡C}+pÿ2[§W0ò#':I¬QʺofbcJ’qÙ\i¡‡*þçëiô âEæé¼öš}ΧœW1'ÐsE,ÔTûö 'ËNþUÌhñgõU¾Éa謎ÃF 1‘sT³•®Âˆ§V¦Æ§XI&ï¦âïÖ =ë0ëšKdá\Ÿ=¥Ï¿WéfÓ~?cÐ?¿ý0FŽd`8#’¡0<'60ôÌ Íô”›é‰£^sû¡a;"cÂìšÙšl£9R㪽è^(ü¾ÝÆçûitˆZ{¬a²–»J¢POØŠ7ëWgó,ÁŒòô“åï+ö“SÀI;ÖŠ°ògêÛ!ÒqÎRó<à—?$&Mk}Át·¡C§ô×dÔ~Á”Ã:Rf Q©YWm–bñ|¢!îPSxS‚RL;‚]i‹²ç†jt*w¹—Êé’aÛ©Ñžš C³5Å‘ÅÉ=F¹/iÏßcoi“ Çz{ÏKœiü ~„õ)¿§=gõT¡ib6äö½3qß«àågB•R~ÜvÇGÁCÆ‹1…éýJÖ}M.ÁDœ"™L6‡ ò··Ì&G2I;w¨?`KÁj¯ßŸ•iв°+Ès¢Ùè :9Þ¢rä…Z[ÌLïÒåGÒc¾/¹;Û³œîSŸ×܈.ôò±ç. S¿ö;.Ål ×±•~ý%aŸ(Ž*KkÞæ·ƒí„ò¥¦Žp¥Š»ÜüW3d6„Ùs¬À)éÌ´u­ù¶‘èj†ÄiÕ£¶+[NnKß兀l–fž·cq‚àµúˆª½!¬ÙånÃÒ¶”Á‘*c©Á qÒFÿ±—þÔï¼J²¹¬(ÒŒ‹~3É—ˆAÓᙯy ˆµÖ/|¹ &ÆGtkQƒÄ£¯¥>oæZÔR—O÷ŠðL5u”Éõì_m]Ã/¤©fš×߸}rÒhµmL£EE{ÊÔwxl{Àsª›Co¼ßM? 
‘vé¿Àº ¯Ïp'˜šfÁËÅI7U<à¯f¹!—Ú˜@g"˜G£4ªudõ±)¶ÇkË‹" cèÚ.QW$¢%ÓŸÆRÔ}†ýü°JÂ(êžö@ܱ5¯UN6Þ¯öšz'd|hêgx-ü…âº(Çõ½þJS¯ƒTo_^]™Ïh3þVT‡Ñ úÖ‰¤flò¶koì˜&Ø×þPû¡ƒÕB0yŽ~V=ÁlÓâ>XŠk:tÿæg¿#K-a·ÆŒü“½o‘F ®yi{[/_}•*)á|õºW2J_¬¯¯ƒwokŸi†»> on$q’Dc ˜Œ_ëO[¨¬Ù¬†_ 9º&\›-%c„ó‘BûÖàR5‘öÈËÛ¨%:*æC®ÓH73ÃÓ~٠탳½K‡j¶ÑFr³:³¬ Û¸¡V€’âAÄaýF÷îBñn諘\ÏøSëB,“—YD6Ùð ßéì *\!ÝÃÓ~dþhô³âm Qð÷™q¥³4±ö9%Úܬ]€‘ÛcAŽŸC3˜_É$$S˜šà€ ãÙâ1šô‹!OÑUggÐO<ȯqºÕ’4””ÎæÒ Þ»ŠÒ}q»Ë–ã1Þf54?zxö†”¡ ê);‚×ñÛ ZÉðÚàÕ‚íw|+øAZ¢`×° ½WlÞB.oYÞ ¬X†YÛ/uέ0*(ð«7”·)O´ …b‹fzË„ã<3KøD·Þá×fëWùÕè ¤oÉÍ)›¤bö:lnL¡òœ ´[‡qLjpÕ oû"ÜOwNXgÏ-'‡›ËGì÷(³Q‰Ô?ërß¶ä7EÖDMO”÷­L_ØLÕnFznXGéGÇv:6Cò¤¶ì-Ê·2ªŽèŸcvF²÷Âh㉊¾)ƒÃº‘ͼ¡ü±ì¿úžo*Ïè_/½5õ»ñŽõwåPG^!ÐÌ&]ª¥VaWå\ÿÈ¢ùÆþÌðåGËN3Ɖ¥KÞd U]‚âø„+¦A6Îd5š‰H´êáðÊ [ß“ƒ|3æeyl ¬çvÒÍ<;qÏCKd㌫Ÿ‘ kD›ùZÑòe«×Wì õš– ÓS7ˆœØLlp®Ö̳l&AÉ­ @€OŠõ$t1ë¤Ó7ždîÍJ$<õßöp¶$r8§ê}£=oÇ-!{γX Gøêý)«Ü€úeVMÔ_ÇÒ2w¦É˜PïŽHŠ…_xà ýÇÄ­›\1ín«—™ˆ¥¼:7QŒ l,L_þ gZ?Œ#:ç¯sŒŽßN¬,…#hŠUöø}9Øìn1-ã‘ØàÚXþ‰~ˆ#AQÉ$ÍÓ/ïr2¶pé¨]š¨ûã‹NË@ظºß@A–E|³da¥ÓþÁ¥"ýRõÁgMÄJ( u x€üÊœ¸RÒÛMêq<1Ô¨P<#Š;dH`ºŠ ì”!G¶x*ÖÈ%l}+HéÇZÖ•”6rðf‘»dýYñ“(ÂÚÏH†¬-ø¡±1³ü‚B £h¿Î»PLl2ES`èo‡cJÇ4–,?ò F>²ãJáÕk.ÁÜ€ç}vî Â ŽEi_ fü¢b ¹€¹[Æy6¶Kˆž9‡_ 9Âÿ¡°É|š1fƉ‹˜˜­Ü³ûrá€Y<„h ÎCMxs@Qúâ:b×AtÌt ’gÓãZÞŒ) ÐÌÎ=° I9lC)¥`Aœ´]Y1âÏÒ„æ&?5äíì™’7n0­»…‰éa(=m#ÿºg1|îÑNæt²ÌA|-Öŧ‹Ôˆ÷Ì»(g˜E¶ô癓êáq@üé _’ÄGeã4¦ÜPt°þ.ï¶Aö¥õÂW"mò·ß˜¿†î‡»ú•Q⟸ë`ëÔô+rYçùfŽ³æ“ž@1pÍv¦…?ž’++À×@“ú}‘åñÔüƒ,%6kè—·b†®zÉξc¦ùØ…}±ž NãZ¯;1ùâ*h'¥û8„ôcžèaƒ@ë*NBÊ”m“%$pÔ (ÕÑ÷ý…Í]‹b\‹U@Á­ÂQdÂYÚ7f|£Ž(›Ãk×&Jšq@¢ú%Øḛ́'X;×fXlTºƒõ½,ùdgL…£_¶ W†öÖ’žÎ¡J–Ì9†•9±/=¥ÎhÐz Ãû̶¥ã)ü31&]ÆXªVýŸ–tâÓ*ÔhÛ@ ·Ü}ȳ¯/¤Ôf0ÎxyÏÌ dnðbÂ}ŸÑ‹øxBúÆd Ž–¼ªhš}m}?:ʶ"7½‡§Ú2"¡ÔÌØËÆã©ñ8Ƀ ç­Ø—‹3°¶’¢†¥5?š®pÃ;~¼×h<•gêzp¬T¾Èù}M£~)‘=ÈËy[Ø‹½ËŸ‰I9U“$j,¦N!eܪR RØ8ÖK.RÚ⢯G=ÁÍ„Stöë=%×/­¹¶z+=oHurvªM¹>¥h÷s÷WÔ䞦âúÃiSýŒ2-°„B„žêO fÅNŠ!-–FM}oª²â :ØMÑÖgž¹j£Xµ#º#¶ë"&iÓ¾KN?ý™øøa¡$Çy²´™vq©{÷²üDò%¶ZÞ€§D5hõr ^.½šP‰IQ6(;ÁXY³¤?oIªö0ü2íës˜rv +FÎ]g¬œ‰\Û±g]d—ðh Ãëá$ë§Æpt³Þ“[ÇàN‹€ŒÉO%<¤×b"pD$¹ò*èé·Uøîp~غQr/q&º¼Vow–)‡eC¨„E6P2Ç8ü6¿Z„jo wŸÀ»¼—§e.lX*èv»eÒ©¨{8Y|¬“ª‡½sŽ?¹¥jš·0¸„Jj=ÆG.E½À—U(¨~PVÐX+7Ï*²a–_B^Æ?¹Æî`À!ÑV‹®=©ÔÆF@†,¬ó±DMüà2ôðs¬&H mæ\õ¨Làt3@!&õI¥{²ßC~f y 
¤ÄƒÔ½AJ•nlß\|岉˜9o[˜^Ñ÷t™†Ø´å+Ã]‰£2oóÅ‹0sáTYkNLP*§¼ÒPÌ‚@”‚ïþ\>ËŦQ†õý{%Ù¾mtÜíÖ¾"žæä©¨d¼‰“ª¯ç¢>¡%Òî)t鸶iÀLžZ±]/f­.¥œC^Û«D†Pø¬ÛyÖÙØ?y¬Ýàp’l‰xâ|Y›#PËî-}B§âÝC’Ò®,óDÞÁ~§o_ªV”¼âÅ $ Ú4ŃwVÉ#šiÊýŒÍÖ*÷ìr¬×‘â´Ä|Ú|i•á¶“ŸõòW4î öÇ´·YAŒS.—(RE´Ð§2^êtʤ=š&&´1sÙ¯X?¥üð#i>i{Ee ĺáÛš˜}’nKcž\h¦có]à¡ÛmË©þÙ€]EÝÜ”ïœçÕNý„Ú”Œ×ÙŒn^¯§ê¡³‹w ¸`:k}êøC ³A á÷¸1k&}Ã÷X^Œ*¡þvÿªË‡læÄ­1Q ^Æ{jw˜åT÷—ÒiÞ\?ìbÁ¯®çC Š'X-žË”:‰}ma€]ÛÀwLYUü^Ñ©{=%¤Œ ÷Ç_à_×ÉHëmcé]ÀùB£¿)à'ù ¾€«ÞÉIˆ\€¡v=v+ñÿçï¸ endstream endobj 121 0 obj << /Length1 1777 /Length2 11314 /Length3 0 /Length 12450 /Filter /FlateDecode >> stream xÚ·PÛ²-Œ‡à wwww‡‡,ÜÝÝÝÝÝÝ!Hp® $ÁÝå±åž½Ïýÿª÷Š*øF÷èžÝ³GÏ*(H”ÕDL팒v¶Î ,ŒÌ¼1.33#33+<…:ÈÙø·žBèè²³åýAÌhäün7r~ç)ØÙd]¬,lN^.^ff+33ÏÿíyâF® S€#@ÖÎèO!fgïá2·p~?æ>Ô&4.ú?Ã"6@G‰‘-@ÁÈÙhó~¢‰‘5@ÍÎtöø¯ÔüÎÎö¼LLnnnŒF6NŒvŽæ‚4ô7³@èttšþh hdü«3Fx €ºÈé/»š™³›‘#ðn°™mÞ#\lMŽ€÷Ãj2ò%{ í_dù¿ô€¿ïÀÂÈòŸtGÿ‘dûg°‘‰‰½‘­ÈÖ`²”$åÝéF¶¦¬ìÞã\@ÖFÆï„?+7HЍŒÞü»='G½³£Èú™þHó~˶¦bv66@[g'ø?ê9Mޯ݃é¯ÉZÙÚ¹Ùzý Ì@¶¦f4aêbϤa rpʈÿMy7Áÿc3:8˜¹¹Ø¸Y@ÐÝÄ‚éôêöÀ?,˜ß;ðñ²·³˜½7ô™ßÿÀ{9¹ÎŽ.@¯;þÁ³°LA&Îc 9ÈþŸìïf Ù_ø}øŽ wÀgæwí±˜ÿøùÏ—Þ»¼Líl­=þ¡ÿ9_&MeeEiº¿:þOTÔÎàÅÀ``å`°°p±¸Þ?|þ;‹²èï*˜ÿ •±5³°0ÿUíû5ýOÅ® €úïå üw2E»wÕÔÿˆ\—™ƒÙäýËÿ³Ôÿ ùÿSøYþo"ÿßIºX[ÿé¦þÓÿÿqÙ€¬=þ&¼‹ÖÅù}ìÞ×ÀöSµ€-­Ðäbó¿½2ÎFï‹ bkþ.fvFfö¿ì 'I;ÐTälbñ—dþ²kü±jÖ [ ²è·å=Š™ùùÞ÷ËÄêýýpz×å_.#§÷esþsŒ`àû:ýw¶&v¦ì+'ÀÈÑÑÈþ}ôïˆàÅò¾ ¦@÷?• `b´µs~¼÷ì0³s„ÿcÌ\l&™?L!“Ü?ˆÀ¤øÄÍ `Rþ½Ç©ý½ËœÉÈÚÞÂè +€ÉèüƒÀôþJØü‹òÞ>“é¿à{à¿à{³ÿ@ÎwdþÇcû>®(ïå‚þ9L–ÿ‚\&«ÿ@žw§õ_ïË¿(ïEÙü«‹÷‚lÿß ²û|oÙá?õ=Ÿ£Å?nÎw²“µ‘“Å¿Þ)ÿ:Œåý0·]Ðûaîÿ‚ïñÂÿ𝉋£ã»þ\Ë÷áÿþó‘Ý&ð«Kv&|Á–ÁÝ÷õ"øn {ÓüÐgé÷Ú¬ ÓÅú°ÎÃsÛ‰j9Ùkr•’«C,’ú–}Š¢÷¹?–¯¼v›ˆ›=Øoˆ%™Ç/½Ý‚/$yÝ.¡´ƒh¥Šñ–9ƒ)F}ì‡25V£@ñ¦j\Ir£Ä(—ä’ éVìí¯/Ï–Ç"àÑ<ÞQuî0Ußã¸Æ¡ÊÛ±‰ ú­‘¦˜I*ÇßûŸèV÷qkd|tuˆðKT€¬kàÞ9y¨3ªß(8;LæF!òs^™ëQ&š—±º¨4Íç9š¤òø½¬a­#(¿°ºq›ý˜}—"¾Ë£Êçl‘ïÁÍ„œ(öÙ†*“_÷N›®6d¯‰ßÆq >tÇÕ² ç_Jw3ÔYU!4vÔmYx¯3á–‰=5XcIª—Çëà÷&;GÀKŽQ97Î(O´IL“?WúF š'}¯Ÿ{(‰L3?úôªnŠ-ÔìC‹!úl5_>J!è?+6a2É.R¨ °ÒYÌýÁ £ƒ‰‡G:GµÁú¬¨áÚT/®úe³FBcÿeÈ`Û×ì;qÎí£åFÍCR~„¯a'µU‚“ð/jØÌxéÖ_-á½^¿+ûVÊ‹óxš~Üͤ©P%S ëG#ÌT´DPÕ íA:äªcÂ-È߈&P„Ôă.´¬1Ïzøè¿Ëó)°P"‰êB§r™y@aÞÿ¾hs+«óRÍ¡còoD$ ¤ö!K_ 
oW³çÁf)ŽxÉ<™4?cŸªê"²cëD×ÊlDÑ5`x#m¨YD£TT)íªÛƒýL«+u3‚Z6½ÄÛ<àE…H?L áoÍz /»©‹‹&x`÷!ŸiŒµkæø&‰ÈüÚm3P&ŒKÜÂ`‚NÃPqï©â"žŸÈÁke÷ëƒÿeYlÀ Sj8&T|0FŠÜÓË÷Çøá éÚW‚W8º¬‘”5Þšž`¿ÂÏB`ϨÙÞc6|ü› LkKæF ÇgHvÙð¼Z®“Íc¤oÕ(4–és*Ý¿cž+ÜZ2 H€Q|M0+’ÊœÒ 016RŸ´ßxT\!L^¸Ä>W°fçjé‘'±Df›(>Gʼ0Q0œæl‚= "7udQNܶvwfžJz‚”/»‡lÖ™ë'éC‘gìE–ØŠåŽ:š\É®‘qØ[‡’#wï~ #ÓzÄ ;z¹^®-Ô¤EÇËÖgø$m¡ÿð®0h:CYïff´Ûò©ÉÀ=£U©°eþ^ÊN3i›´Ìíi˜‹Á?ÚŠ®¶…=rîEm9Ã(§ª}Ó÷ŒIæzšÒ1ð3¹4ÍVuzq)9|HØê~Ô·x÷ \ðQ‹¿‡Uqa£Ýë ൤Ãêx8i/©Ö_gV†qW¸@\VWÎaÌÂÇÚéC~A€3€ )†é€…©?o¶­ÙìnÀ°µ™f¯;¼VòûòíÜŒstѪ e^¢!Fk{í¢¬“eÚýøÚöÁ6ãR{¤È2U¨ÞDt½z;?ý¦ó çœ]Ñ-§{›Ë™Á=YIΛo@V:b*ƒ“Ñò.DUùr^ hm~JùºêV_z²àÊ!¦…OSXðx>µÌ™áêsQÊÎ*.ˆ 1\>7yåPMqáû“¿¦Æ}ŠÝp\ÄnpºÀº®/EuEñÃr6Aé:¬/ ¥I¡i]·š9sŽÈQZ´¦P¶ÎÙ‹5Ûûx8â–!,[„Y=Gޝ8ëå¶H 5¶ßQ+:»ÁaSêÈceÕ2*â- ·Ÿë¾š®ŽdB¤ÝŽj‹ÑG0‘£ÇŸ[áô ÎðHrLQà‹ìõoþëóã*ëÛZhk°GÞ.Ø® s~C [ˆ-‹¨òí»Bþ©ÏŽÔE>ML–nKÒÒsS2¡î Ó+FNÛÀÌõâxO<Ñ׆ʼn¢Y,Üî£ðlÃÌDx¨îQ•j|¢Îðd Ù”•R•rEÙì0s¾j;¼Ñq »‡ðÐÀZuÁØTb„0nm%1UÀì‘“[†, iV]zÐ-üÈpËs„•ÅGR>¢âÖ.OaYÇ¿m9gãŒGŸ¾ÌGp.¨,0^*5¹À.Âü5‚MæS†:äâ½Ð±Ô†)¶+äu?ñ‚c˜ãc³~ z$Ï};0Pá˜&œìu+®m`wáò®ÊÆiŸ·r?‚9SC²g$„€±#1Ò{]D¿Ó+œÃ¶)abÈ)XÚhSçk<¼;óo~¨¹â¹ácJ90[ô ƒ¶ñ—úеØÝnì@Â6“!bÅUµÞv-hþÁÞH^»ÇD–Ü‘K éùO *ѶN箂¨yí¶g+Z!}5cvˆD¬Â’_¨ø%]¯Vô4Ü…hÄó#ãsˆÍnbmиŠq¼sû:¾¦0*Ñ[”Û¡k{Ùè,MAœgWµ÷<“ªõóBÙÇ‹®.Þ"mËž#Šûâõ˜øúåO·S2Rù™kÀ1ØÁæ­ ÃÔºåôœœ¯ ·È@ÞÂëžuONÈH޾äýè Ø©ÊomüWVHy˪(£ç¼']I¿Œd-ÚB¦sVÄPmfñM³&¥•aZmL‚J5ï^T~^òÀ+AôC޳»×7ùÕ¹i~O¥V×Ý: æO'}¹-÷5!ˆÚ=bþt¼BÖeœõhßàˆkÎlI¤˜FD€ÁßÇ—ªž³¶3”„”Ë0«]êüâ·$IÒ ¶/7Qø™•ø«ƒ2ñÓÍïcùuÉ}¢¯;fšßcïŠ{Ù¿Ž]²´ýúâëiš¾]ÐÁ˜3@M¬âL¬6ÐFØlûP'j ÇèÑJÐП„§ë/‘%’4M™¡èmЬ"œ<àýe.¡µ Öàò¦’h¯€eOÉõ™ÇaUÅNæôå¹E3*NUÏh(îð…Œ¸æyÛÿÅ»Ál•iG2αò6EUÓõ9Aÿú ê¼üvÒ{¨‚!Ù~Å]êäå ² %&ÆOˆ æ ʼÏs˺±<)ìþü>º‰¬½4³C¤UL²&#UÇ&7DáX8š½tÅi©3þLߌD`eY5n^ì£ÔK½ NçïE¬Ø9±¸îZ-§U¬ŸHƒõ;”2êø¨’Sþ²'½À‰+”_1” @rÍü+ ìF¸¡ ¬“TWk²[*΋,„¨&îû¤($éÍ^‘À®¬"½zÐr8*PvÍ}brw/œÉ×LïþPðÜAÏ0ñ2à±q uß—ª)ÇÒðŠíêËïËêý]–ª¥ubßßž ª–"n÷.µFÜdΊãõgгÈeÅäÁ¤a$¨)eSÂ,<N¨é‹jéÂßõ@%ôb9‘®š„I ¢‰Ô—” ð÷‘e¼ÿ&L8Î¹ÆÒ›üSœ —>€ù¼óKV‹ŒÑ7TAnäCAB¨½òwDZPuÆèŠ•žƒ¨/1eIFxíKn²n›²º–®U'ò×À‘5±<Ïœ8 ÿuÇwÌ~h'«Ââ_v0Í<$Á÷+BîÁ³i;-É-k>÷î$æÞ¶Ë Rn‘Fˆ^Àᨙd.G^ãûøqˆìÙˆk¤v[,ü–ócQ ±úz¾K£| KÁÀdêí|ÝžþjÒó„í—´­Žh`Ñ¿iÑ~cu™Ö×½‘@Ø6&ÖÒ` 
¸Þ2AQZQ1÷œ¥·¤#ÆÕ»ê„ÛYñPޝ°Ý6aÔr°®hp;>OùýÖã<`¡•À†H…, Ó¶ FLX1ý`Èà¤õ³9Áð7”î\Õ××ß'=$;’Í‹³f Ü¢íKf«!CeÔ4`Mu¤*1[ƳÂ8Ãt‰úÊmœ"ßv]ˆ+‡|¶ ‚•y¥¨¡ -ƒÖ¤ie}ŒH U× òZœ‘dž;¥À® ²Ka¼6¹‰ÅiUéïw)ª^kQU—~,ùÌ*{«LZímB9|XP Ñtæôê`Þ!5êZËÅåÅ@Ô:s‘éíV¢¹+¦ˆhŠ+[:‡©´ˆÔÇ¡ª¯r/‡gNÃ"UŠâH,u„ì˜ÔB«Æ~ÌbuUDûj-ù#Kƒ˜ª}×Ã{\¬è}• ÿ™°AGÒ.ßp@ÏÞ >)"…¢ûzlÏs]©—<¬x¾z,ýx+ùD¿®Wcç‘8ƒ½ÒìlgHÈwŸŒoro£*”ˆÅ»££X˜;ù<ŠSŠ^†fxž’òéÉ×Ñòg浊k9]%ØO¶Ð8˜P¥=r8·d‡½U×K}¼èáÈRú 4Ð;j¨hÜ£ÝãŸÈ*šE,hùñju—°£p÷u‹(×RÕ\;o_)ðåÒT#w¨ûE~ 0/xì<1ä=‹1˜Ò°âcÌ:4Ôã|ŸùRËÓ;êgŸP<è%¶.¶5Ÿ†¿ÊúE OÝ _Á<î*1lÐ(­?8œãÈW§_Càðq °èÃ}ö/ù·?Ïì’{Žä­UÐEÑP'B»;0ËÝìüzÙ˜]îq®‰ÈwÎJŠ/í&ýÈä̤T³„6áÖµvó#ä¢3…"Ð"ö-¼1¦æœ Z51˜¶,|„×[ïÑ\´`( ¤~6ѯŒ&æ5ôš*}íÓx‚Y†¡,èÿ°·q}»ð ·Ê#=ä ÁzùÀðŒ%ͤԿÛô\¼¡Wl¯_õžÃUn_â¤ùƒä±ýà¡£¹Ï!%‘vß9þY‰‚KË3é*íÄÒ·aàLÃ\Ç¡;xïä .Ví Ajóæ¹Ö¿ÙâH©{òD°D[.VþŒ•tX§Ž<"‡÷ûy ßO¿Ãx+x $Éú/à8¢–ÖY¨ÞâËü•.ͽrkøg¸Ç¤ÏàÜÜtŒlØ/]ú+¨ŒGìßî° ³q÷“Å<}g#v>àE.ÜW¯³ @z‘¡ktú>•g€öÙɦBÏF¦[)ðlâáfÔïÖé¡Σg¦Ê«2ÑÄr°ô w,‘žö·$wÌ/Ú½tƬ[€K®Ln›öá^7çÓG†3]>]Yéìl-Ä’_Ù¨ èM®’ðeH§«„òŠŒÎ…7B›M¿ŽKødÅ(G¨ÐªÒ’//˜T?Y[EB [AdˆÊò?¹Rž$…v&ý¿œ—"Ûá‘rð•Q_!þÁp½îzA8îȸ¸ Ï¡+ÊÌÊÃY¾öØØI»¥€9Ä Xób”.–(1£¼ e„S>ê‡÷#‘ƒµ[I^èÖ¥ =ÅÔ·"U57°ý\aÐqZ—Ö-—Dƒ¨<²„N`¬†u™м mÝ¢V'$„„î[¸½â²h¿‡ Fj#5æ”b¤%ïã±ÒoÊ\V„+=ª6W_Å´•tW3—¢~ã½$@ƹäEîÍÿþ` ®Éö³ UÇæ‚Ì…¬#RÞp£M ¼L¤RðÈ O8%bõ¦8ÎŽ}ÒZ¡8…uŽÓie¢äpMÈ_FI#³v˜(œX]oC-Pþ™Ý¡ÍÏá‘Ëø³²SÂjّЩKRÇIHkO1Ÿpt3 nzqe?/25‚Ô˜“Z1öÍÇT“t‹¿GAKÁQ…QDϸwàU>¡™hgvbÂÈWüåÛi{$ÑST=ÅèŸuyþÛ:­ )ŽÂ¤Åg:ýîs)~çðAK³kUÔ²<õ/§ò'ÃekÈÍè ^°NÈÄÃe–q`Ô_ö+ª’°[Y)~Ÿ12â&Ú5y)BÑÜâÏ]ÌÑ[aÅ%‰Rí`?“Ó𧇑Òò‹WéÓàQ8ùa꼂=ûS p‚©)G-ÀýݦX ù·ÍE6NGjÆÖ¹» 4¹’ª±Æ´‡d×wÊçö5Æ ÒIÑî—ͽ¯ÜáçôdYº—Rµ“+,ða{d ȵƒ4¬«RCпۑtmYăŠî¦ø Ö=ñÂMÚœvâÂR2_iHsfKøø0÷nÑ“¼Ù^+•ƒ-ý›÷ïX¼NŸer¨·±ödI(&‚ åu%(ùTT}ûF`mqõ‡Q÷¹fazöi02ZÚneV¸ÊœŒÚQÿé[óuÁªë{òûÏ5©hÚI^â,¿1Ï…äj]_ ^¯…Ës–?²3^c†‘ޱSx‰yA\™W*ãt€–øÐÔ H ø!¥5$ÓW’Â.f‡†ÂS6˜3ä阬;Ð~ªL×8ôjc,Ã7že¹]ÏNék¹ =ŸWªˆëád ú sûR'“¾~š{ÑÈÁäø’)G >Ç!®þzJKwÁs$ µ–¤·<ç¯óÓÆQ1sû٢͋p•M1oíŸ2Öå0Ѿú:{l[ð1 ÿ°·½Ïè’ß´oHæØ¥³Ý÷óplz:WŠ7ì#V{,…Ná¯Uÿí’#Ä+¥çMtˆéË@ã¹*–“UH=Ÿ¢§0è ?„PÖ[¬…ªxuôš¢’îéñD¢<([O!ùTÁPÄOAáýj ¿wúŒN︇ †I­2ÌÅäâàçWžHËg’òãˆf¬1Ȳ ax°’Dý,{´úýÙæjÖêGol%p;ý ÕœÊ?·´¢O­å}#€mT¬Û\LZî0Šÿš@s)CB`UR ž6ÂÍ»¶4QK7%+o~°û¬wBúåüÈ'±%`0š÷(äŒö £Åƒé¨Ïç4ýèäÃk…¸ý{d÷Ï­4} 
Þm5.‚00 ÊZô=ÿ²,Èc΋Ýû³ øˆ‡ö„ö~#»ŽèèJ°A¿ÕvûÔ,…y–2t¿›c\q­§É†?ž .ag âC„~Bzš0O?[ihYˆö\ÊrX`ý¶òÐÚ·Æã¥ygõ…Õáv¼5 ŸQKŒäûÉÒò&+G©h,¦›êsÕ+öʧ:ú: ŽCŠcÒ_Âò§²Ó ñ6¿‘´2¼R¸¾ÐdÕ‘k¡{ž/[,kw#yÍ7%è)3œXutæ¡æf窞ÌØ“ú‡ô²I€APSDêPQqÉ« 3FPusŠùÆÔA ¸W« ƒzcÑAÑô!”øù›©@vû–ýêÍn¢KÇ}a9!””‰1™ÔW£¼gðœ G?ƒ‹ÐÞžôåËŒPßUâþ"%> 9}1oWÛ„÷ùÏ ß n¡åŒÛõRaãN a±Ãd/pVŽÀçòh>ÈæZÖñE©n îbúdÿ$ã´UïO¼Ùšˆj×ǽy5üJ)ì­®çËU×¢¢¤"?È;K83é¹OÌ¢2DƒS:-„(ùÝ¡ F¦ÀÙò?ᆤ?šR Ýß ! º)iÒŸÙ+¼`]<aØ¢ëZyâ3eÊz?cñm[X`°™‡¸Å#$›ïZpªôÿHT<¼?ê~ÕqS½P]ý¿hæø«txÝ¿˜„Ú%wå¸L‡˜D­Á›R?ËP¿¬ßî‰ÞÏÅàš7|0Ä7²9º¹a‚ÕˆÕnÛ²åvBs“ð/ŠD‘ŠÜ—4èÙ½›ÃŸ6- ›·lÉ~Ý2H&ìt7€*ó-AGŒ¸þ’ƒ:ßÂ8'†Xæiz.ý=O ñmlˆÄ,Œ¶¼¤6À;ÝQ³=²°¡Òü0y„‘}+ÝŸÁxŒì*¬ì!´§RQæG¢ü$¢•eþ%Ã:Ahfo5L…"­”«?µZýÁÝG‡>˜’ž­dQ—‹KR7mJ›øèêfÜãܹ?ÕR<'-ü»Ãò= |TåÒ1GšÇÓKj`¤06üMáÎ"YR.À~êåY-lsOS_\YZ‹TE"f"íXhç×WXƒÏ` ûÃ\c‰ŽÉË3Q WEȦʫO: DNý’ÅÃñ%#¡ÇžÑ8½&0 À.¿iîÞ/dB–r)°òô)&˜Txº+=j3œü 7Os½ö× Ù*HK"Yeæ¥çPyTù${Az=èÁ˜HpÝ6~è“°MÚaº* Á=óHOHªUà™]Î,«5ùò 3¬4èÃý2&4F»IžöN²ý'h· ÞíªÒØé–OÝq¡"mɤ­Ô˜“÷.uÏÄ oNBH·Ì ¤_¡á#â¬bŽoòµõ ÊŽ4Ÿ '©Ê˜XÒÔl8 åŠ*›U{µÆuE眎ܵ—……ÐåÚÒäÕÙwCÂçJn–¹Å™¸äÎQ¿L-t¤©î¶}@mÓ ÜôLø ç÷%Ù–ˆu+(£³ö”´fgª7hú²HL°üj=3`P¹^XªØÞ­¡0¥—˜Uš1÷F0’=ØF²wÆjždæ¿tÊ‘éûÓ„éËh Â÷iO_7 ”Å^¹ø¹sžݾ×v_õñ#Vè‹X¹}£YŸ¼g¾}§­Fé« ÿƒ{†5Æ¶æ †éÊOz d$F gˆï3'¦™g Vl†HÙe’+ÑñEAÈØˆ§ë8@ÉŸ€5,'L,<ˆJbV¥.0çê‡|ÔIº„ã­hÓ å~;qrvVy3˜4y²A˜Å€Ý\fë/™˜rñãyÂkØc;¡­ÚC0B×nr?wgäC.ŒWã‚ý-#¹RÛ©¬·S"\¦*­ßÎ7]é¬?²yÜX§Ä¾Àå­˜?MHfÝlôÛtH;éFà|'7[4Ô×VN¿KHî^†÷fÞ@X)eº5 i J©ˆûqþÉå5ÌRì:=ᩪ5>Ć´¥È÷ÅÒq›l¨Ü(;Ýø¥H°Ý‘N HBÓÊ>ùP¯Èø‰g eðOß„z$Ö(õ–|(ÊÞ†‰»>⺤X ¶˜qû?–©/d•æ8Dú$@Ñ=Ÿ]TKJÕ,¼2a¯ZÙQRЫ-Ÿnª! 
…jùðŠÈÄöÐKàë3\^ +ëâ«bÅgW3Gæ xî2Ë¡ˆÁ‡È\x IñÒO¥§!îK¯ã‹Mæ‹t#œXöâ2M‰ÉB¤¾/"È“À¹{ëRa8y: à›Çà›Ê_æNÖ7áæüôäΚ—rÛl‹ÂÓäšGLk¨ÈX[ιRkÃs„ßÍY©®`¼ Ôß pWpµÑ­eT„:İuÙ¬eõ"ø%¥ÕSÅ8dº×/f¯Í Èy`ûU|Že5ä#LïÐ Îãh·N¦Ú ·ŒYP‚Z]b´E¢+…ô½ÎÙ( U²²ŸK$3ógÔƒ–o0É¡aW5Bbà±l`µ‘?~h+±ÄüŒžëhÞ5íÀjs“ 1ƒ›ø†‘¨ÖS˜j\'°–-6â¸5[A6žv ê!õ" ùúù·!º @b©º쥆”3j¥ýËéE‰2*e³-»¶;?µeýWœâ6N_r%Ïóî B ²u B]¡^«o-×α2ëIŠé²M3å¡‘Nê´¯âðc‘O¹ëè—¾~Q6Uˆ{ÜU<¶zg×!mE´nîÇ×w¤ýðY!/0F´®òÆÓ6a _'Y Ÿ\):oh £¸»Ü- ÄG¸$Žëßÿ¿ˆ—J’pañh0ÐÁòç+æÝ+Ýôš@¤S0å ¢Ð#eŸÚ3®Ãt´ƒ/+1dŽ´>iŽ’°+¨Ò»²'¶G3ü2/:&Fzš‹ð+^FÁg…w=S=ø½¾jì;‹­ë$Ïvûù¨+e¦Ó¶r×b W:˜ÕŒbÊ Ì;²: ‘”†[‹ì”û¼&p¤›ÜV¦ß‘×Èœ.¹$æÒíK‚Ž(^°¯~Øû[& úåÂ…RH¯îì`zAÉÞ@%ƒ 6CQGí” ·¯•":6·Î** UOû°Óp\|[»ŠJû6ÖD@`¸ 9ÖÇ[¡ü5VûpÝö«:»Žø#´áïÈ6?;ØÓBªŸ¿@HêšÇZß ,|J~Ð¥Ò†YÛ_µ­§´£G’a÷‘¶ $š¯kÀŒÈ`þ¸© «‡øå®=ëø=¬z›6·îB ƒ!”nO8×2/™O¹bç9:„îŽi'sR÷ã0!ÒOWG8…ç% -Šp7ƒ”¦„šæy×_ä«s[䛼 ÂíñLd‚|Åç¦&ÕZèh¬†Š([æÎ7<§»&2ÿ»¼4Ž1gz‹šè2gÊD.@¬Y¹ÇôWØ¢&éÎóid¼1¶ì¬Âgë9Ò1’9‚¶¢$A> ±xI?î,*ŸÜ³°ù“À™( 䆻¨{³;QeÈ¥¡j­ ?¤öKP­"®È>»þNt‰úªô4ÖZ œ¬µŸE–ºÛIñd¢ëçÁÛïà ^ãØ(©J¦Í½|ŒÂVS\— ?4=A¾`Dj¿b QãÝvTZ #)0Wº öÓVð|Ô üÀóãµ¶:¡€$sÖâA·q|¥ ¾Âß.ÆÇ<¶A(ïæ²)sÁE€ƒ±¤­ váÁp'&}<Ö'1Û™¿Sâ3Úq¦a^{‘qþS¿ëœÏÊA÷«¾6]ËTpìWFD±@¸‰Ë"et$ž¥J”CK6ª~Q0JüÜ6Î÷ˆæûß—­–ärüi\cgÈAhº )²;Ônê 6U’ôOÅDûË'FæAüw?»]Ã$Íå®B­ê÷fëè0æ€DúKX»´©LìvÓ3U¾Ds¬qiJsø†²\Ymy¶s~ã¢éO©šMa…C>¬]Ä…œðÏ(­ê(¸¿8èúõ„Oæš¼%ïc/IÁ_tú‘5³²¥rƒQj:á$ì"cPÏ0· ¾Úа £\A8 ^#»‚»~ ÚïGØQ Ã-ã»áªH¤ÿÖ͇¸Ì4ã-¢°š=“½ƒb {g¡AÅØ¶nÙ»|„ÐÝ%bšgÚzö.Ò )X¯Àó¸Â·Ùeõ´£OøxCéQÏB¯œº—¶NV?q;ú²¬ìÒáìB[Äô Èk”a ¡pX-  `»Þý>J4=}šÎáêØ8Õ¹™u5…-™€v¡‡øk¦½s©" 2Cä×q˜qTÀE5ivØ[£5Ýäç›U¼$—Xûës+.m´Pjô¸‚ï&ÄU˜N•i¯¨å<±}¸»QãeöÝøG &±ÕïZ±¶ŽEwl/æY°-é Âý(¼I;£V»;?$•7a‚"’Uš}–Ì}Þ’ƒFÈ[»ÊM97~‹¦03bëžDqE‰(ÜNçH®ßïùù®]~Š:>·R[OÔ¡†/ ÕtïŸþdëÐbà~7Ôù¼¼édw²NWE{ýD¡k¾yÔTãvGô]òYx “Ûø¦ðê[v´“=źTÆëçµï·ñ+Óõ» ‘Ö“56JÜèúwפ2)–tÙRØ·–…ñ߀ó2ˆ—õš {JÌ©Ëpj_ÀùOù>;g,+ì]ïyh{‰ô}.§„;úõ(-ÓÆRReÐYOƒ­n'·0±ñ}/¬ò´8•9fôkŽm­8w¾ÈC™’TšõJ ¸g…±¾Äh•¥§ž•q  G©*ÌÀ€ù+ïÆX­ÿIÅOµŒàìÂX÷8 tkÄî”1:Ÿ—þ¤Xû£"`¢5õ¦SDi¡¡tÈ«áù9Ïé×ÅüKy*‹§ý Ìé¾ .¦O4~1HVФÅ.»4Ô/]Ê'7Ôçk³°Voýûâ:@óìa3ÞŽ9¾«׈ã/Á8Ûä¯é뤾‚öÈñ…tMÇÇ‘8»mfUõ¾OE!çoý[7VsSM«ðgFx‹:­oiÙžº%‹Lð}Y¸W`šù~h깂zR˜‹6öâú²kpà‡é¾0k¢“ð²¥mW0‘3Šô{ˆ”­|òµÏxgN|â[²ÎÄǧ/t½9$û^1¤ðÁ 
Å3øËLA°ntã#$Žø'M½KŒ{U ô6ñMô~IYXÁmæi…ÕÛ©À]« <¸Â`¸ý€œi Gq^بa8C¥‚Û¤âxéb>‘Ûc…¿ù¨“7ÿ„/IÙ Q Õž-¶bÈOh½GfÀü“”nj§mïXrnä…-\ÃÊìªg‰etÝb‘•˜õ[¶{iȈc†ýæ†ÙzŠ/ âmê¶WŸÛç4…ÜŠúÐfþ#Á•VpÅêi×0Œe±.À,é·Q8‹7ŒuNo§ï–Ca(ó9Ê¥° âbè­ò“PÕÕc< ä ¿ù¯³åþ()~oM\*C78ÌV2L\‚œ°ôÑ)X _QØÁ ™„Õ‹w·‰±†ç÷üìH[òN|aŽn‚;‹—ñœ“²´º -¹.= d_ÊJDéý ýó­‡!tŽK4vfÔÄ~ À§a÷ï“Ö’J[&jž×˜ªõu{ÅñóG‹S§Ú ªMãÅ3•Ć^ºI^§ Æ7û°à~lN\/0ëòü‹±"ñ@;¯à4Ö,]Ö3©°gOv©kE˜F”ØÑdâÍâ+ŒÕ­¬O¸1¿2 ù²åâ­62GåyKÏ!Kñ#žØÞ(𠇋Juk¡uñÞÆÇØMXËkQ"' #Y¦Ìª½t±‡ÈÂE„³C6%õÒý´ jÆvÁ©V›@U‚•ƒIßûe¤©Q’ÉMŽƒ0?ytßl¡_ÞÊöí?ت?=ZZâÁÜ÷ªÑèÆä|ÔàjHº‹¾½> —éö><{ɯJ¾$µÒ axÆP:,îV?„1Ý—ñ¡îåMHÌã É[NÐï mçòËò ^"±„ÔÐm6­Ä]-ÆLˆÈ,<òí9ð>L[ù*3ÆBKrl2KL!kwNÀùe´<£Ñ©‚›¥Aäôôæ>(w êý?·¶68 endstream endobj 123 0 obj << /Length1 1345 /Length2 5936 /Length3 0 /Length 6850 /Filter /FlateDecode >> stream xÚwuTTm×>]Jˆ€4‡%fhN‘ép˜€a†néP”Q@@BiDiéTRBºAB¥¾Ñ§ÞçýýÖú¾uÖ:çÞ{_»î}í?/§¡‰° ã×Ä =…Á" 9@MOO[ @ qHŒ‚—×鉂ÿm à5ƒc=´Ü@Ô°pˆ'N§ñÄ!õ0hà¶ ‹`)9°´ˆ@²1X9@â„z"Àm îAÁ«†qóÃ"ÄþWo"¼€©Òã‹ áéÁœ…„ÂÑ8/4 Žpém]ÀÀ Žþ¬û@øóv°øïpzÿ „Dÿv†@¡W7Ú‰vH0ÐÔñôõ hØ/ åÁùC¼!HÄø];ÐT1 ¸ÿlЊEºyzˆx Q¿šýwÏh˜ÆÕŽöô øUŸ: ‡â.ÞOô¯ùº 1>耿E Cüjæå&zt÷‚k«ÿ ©(þÑ9Â=IŒ´¸¬wà¾P'Ñ_)LýÜà¿à_j\An7k„DÀqŠˆ7ðÄzÁƒþÓðo‰ `H¨'àwD¢)þ‰ŽSÃÈ8 `‘¾€5Ç@0úõü}²Å‘ †A£üþÿž²¨‰¥Ö-3ãõü·UUã ƒ%a1IKÒ¸CпBò¾Úhù£^ÜEýU³÷Ÿ4øsIÇÒÇàØ þ!» HŽÀÿgÊÿvùÿ1ýW”ÿìÿ]’¦ õ ð'âÿ@\‘(¿?!8úzyâVAƒ[ôCÍá,°*ûo›¶'·*hG©…Á" ‰?ôHM¤/fˆô„:ýA›?ôÜöxþÈ/ŽÛÔ@C1°_‹$&)@°Xˆnˆ8Iã6÷ýMS@TñĹ¸ÒƒKñkbbR€(Ö óKIñ¯ÀP/,—ù÷hqYÿ’¯+î ‡RL|Â@oF8WD4~­Ââ#¼Ô/&AÙ±w@f¾¨Áä½b¦&ÿã›ø‹ûã*ïG—ïÿ˜£Ý+¼¨u±`¹ØÍ¦ß—c˜¨j$F8| ¤Uø¹á¡ë%¶¬ÓÜluW sdØ©3îò9P¼&ä¾·4–µ¦%ÏWWv\ê´CÊ_³)r™ ?þ>)£ŒŒøe‚ˆ&•¬¦2Õ7*#£ï´y<ö•û·³i®ýHoï“\è˜,ÂGΧ³¬¯Sºéd­+>Ym‘bÖ¹;Œð.¶h0ž$ „4F„ 8xô¦”ê¬fŒŸtú1ì§«Ó3®°>и¦ qn'óƒx©Öœ’›MÚb׃ÄÍU;sujö¤cÏBе-¿/öôn½’‘ wó!‰­¾ø*+€ éÇ„¹Õ0Êb3Oi†Æ!ú£ï`ôŽ ]ÊX%ô¤ÄñBµ]Cþ+Æ‚(óu¦ÎöSVI…%á/"NÒ”]‡ïòÊõ¥‰N/¿jVshòOOBµMÀŒÏÕ—îpøŽIIΖ-Ξ¹¦<%=®‹'ÈÌR´ÐÃë —v/)“ËÏ¢‹åѤë`þàÅP9ŽÛ×t›º•\™u‰gÅn‡ó½4ó•ùX´³qW` {‹ðþ$£Žú²«]†e¯H¨ÁËÜOZDz5ë“F5øŒÉRÔ[g{öÐiºï_xNRÃÃj‘tJm¦Óï†?ÊËO–•òux €ÄéÍÄ‚ÓA×öžXQñ¾x‚0Òm8–ëíd$}×FÚ¼š¼±¬/9aÔ91€xVWꆬ̙ý•<‚íxh¡Á=•Ò”CHtX˜œõ™žMMô2hž׌Ì΃`ôði·ƒ;;ªsâ‘æƒº™A=Ë9 ÕӃ薯[~õƒÊtª»‘r“Û*»4îhÇV©–GÄäcmŽÊ_¥>ë¦`2gúõÅôÊuu÷ 
l`)îï‡ÈÍû‘_oÖÉd¹ñ~¸î^qç÷¾y—²·¢ñeÉ0m`DÖæv™©5öÚÕïMôRüýjcVºÛº{%Í_Í ¼&ôïªb¯Úw2®ÿhød roi-yIuœu¥‰z¹zð°wa;wj´v!GB>ëuk¾Ðûú©ž}ö"ØÀÌÊ>ÓR*g"–U&+žFwa#¼Zš1WuçœaSlWóRª®Ÿãü” 9DÿóÝ^ÞŠ¬O*sâËÛü† ~iµ·†»7?H¢DS£©í„äë ¬No\_Ú~Î.z9z0æ ?›Ù,©ßxš'›3šuY=èçŽlÏX»A/º^“À}³ÛC˜|7dçDðÔ¾¤Ëy7sEž‹“2·.R¶X'ƒ€Ýq.àžmâU.,¸µxÙ¾ãôÍ 9‘­~\XQ4úêÜ=vFì×a;QÂæk$Òïn2jkò¿ÚhG[xnßM}óî½à(å7;•)¶äÆâËÔ Q†ý«êÜï“é >½TK1ŽQt0ãq]b"úؘgi‚gØ ³ ¥ &f|l_67—!x?y|ìt?Ï”ªæX!,êÃ~iÑ4¸õGqóûŠ‘)®¨¼“qA/Sz-ÚÙ-R1øì˜ÇŸ3VZV‚ˆþÓí8Mº‡@D¡Ä@†i² vÎi«¡+ãšM>kÞü+)§ q¿F×Á$hÇØgäFLïFâÎ|êh¤©ÓÈB’ôć,+pÔ9”·2øà¡­ÏÁ-Ò*Þ7‘ïôÂvsqóOE”¿á÷æK¦µÜ~È÷Áé†*ÝmýúJõ^‡Ó·ŒYlÜAIª!ÒÀ¸Xß‚L»îJ{Ìò¾™ªþb1)Ú„ÖTfbù8á,;漌˜XqŽ¢>ýs7ྪ I¯½Y6ÐàËÓ2Ønà Õ¨òl+˜ýç4zœX{À‰gÞé'çì}ì}‹f6IsL:‚D·tªúž©öf¼Ë Ü¥Õ»_.ýnGÏøCÄxÈùÍy´íORKRM&ŠÃ)ÿ;ŒAÕ‡ 5¹5Ûµß÷?%<Æ|ÅÓ1jV–÷ž(aÿ>©‡ç«µÿùá”çŠ&mŒ~•ÁUñmD@WR«°„ïÓκoC^ Ò·ödÑëË÷˜_žq€]b²‘eÃô’À¹þ ØoŪßVµPC‚r·-ýÝÑ[îþi…`î™›ø÷çkžäí9d?x°9TRJñ$,/¶Ê¶È…CWDÈϳíXž!’IR¶b´+®R_Öb'Mxº­cŸ  )_Í=hNåÓ[M~5ÓyÛ‰˜8³ƒu] =;8\І"z‚•8›í¡êÙ¹z£ÏÔ(°Ýv`Nyq^ú­b„ɹ: eÆ´D¬ð­’¾]R»­Ñ$öÐÎÀ®ÍŸfSvp_lѶwùê ‡¥kn<¾*0[Š jŠ÷f^Ðçø¬ÜŠñGȶИ÷<|z 2@™ùúÖµ8O35˜‘ÝâÔú$|9NiªÞÏz˜ÉgÒY®ø”{œ§} „‚ââÝ×·Ï –Œk+ƒËàÐ=lÍk^°3@4{fÁ-VtxÛ¸~M}‘r˜âtrnb Î_+g鮀ÔN?.}ÞËnè<†l^€»·9mV2 rmtÅk.Oœ{-ïÚ=Fp¿¯¢âüR³üùMî—ÊHSºÜÌ<é0cã îÙ€kˆ§Vø:¹C-tô2{ukC_ö;×)ú¼¹i&p#ZöJxëDmØ9}†®X™"ÂéL;» På÷3¶ãQð‘öýüÙãæMë«Ï.½áWîQ9KjºTø­} W³àMÙü3‡ >ÑŒ.®Œìs¿ö£š)!R³@,N¨yíL-=XƳŸ×(fîlÍŒRó©|0èw=Yè¡(A³ò¨R1”7åfÕëzB*»žPpÓA õn¯&ãâfÙ5­e$wìë†þ¬ÓìK·öТ‘%Gë\­÷rñ"KÆðÝ·WEé@˜ÉòªÑ]k¾¾†AMþJ‡oŒÚè˜dÎک⟊|æëVæÂZmÓåíñÚ®SO´ç˨a¶;1º‚F'ð,BŽ62Â,œS¦Eo“Ðo±ç’Ce›ée·µó“ŸFÝ:²ê‹í÷ŠFSý.â*äpòIfÓ$/Ÿ³ÈÁLÔã&/Ÿ®·òé5©&X5~JSO«¤vš0 ¾ÝÀ¡@ßÕÁ¼ÁôŒ ‹nzÔï5o–«ïQ¾@DzÁ¥þƒ× î&Ó˜ä¿h]^(?~Ç*.|4ã–’¬ì±;º Â ;lß|Ó‘‰eͪcœ|‚ÿøeÓniÅOÉÖL\Pý¤ÖÓúÙ[À«#aŠ *Æ8ÊŽXÅ …eÓÙÝå–Gq»Üë–#‘ü¾o9‘RùåSÏžµ—L~µZø™3ŒŽð’šLnç/üêns©î²HŸÇ•¥ !Ö²Póñö‘eµo ÏnNNËQ"Іr¡µaã@[¹—Ïѽ/†ñ,ìÉï1ðÇ`·êÒ¾$ÄÄûز”õþà o ÷ÏÒ&¢eì¡1jô¡Õ‘DÃQ³ã ”ã{“Hª0¬tháhŽBЦh÷=}yQ¯ó²ÔŒwÆ ì[«E_8óÍÏ"¸«ÊçÜÅ.r$?ÈMr:8UR2^ÎÇçfI$5ËÐp7‹7åYSq\h’Ð øâ{‚ïÚX7ØF†éëpj É/ÚD‡ 24-Ê›¨ðWã—m'3ªŸ<”tàÕé©×Šy¾T»|2 c[MZPÂ_%¬š1±iÏ}h¼þ]mû>ªÈUÏ™(̈[EgäÎ ÷®S[E«~ æO*aÿQ’1Fw§u4Ìwã©“üþ‘Ò{é€Ý~UùNb—'ó £q›Äa;ÔÄ+ 
\A"ýÛÉ"d%?±9d<ÎÜz®˜ÂªƒyiúØÚo,(d«GJ¨íºÈWŠ´ÑZQ½Wðv™RUgI½8+Û¨‚æ…¨8.ÞkmÄâ^ç–‰»Ÿ&ÉWªjý‡¦†ê¸~¨&^%Ö&=°µ¥> ¤ß¤’Éj 6òf6Ö}ï²Óc¼[ÔÍ5‘qC'NV¯0|žPz¶iN⼬ˆ]Ó–èÛ°¸ âÞwG='å­vòáiV‡ŒÓT•÷R¾ŒSæ²[Ï´aÞ˜Í|g—Ê×?y‰¨´Úí T^_i6û!^g,sïª $æùè*i(ž)¹ÂÈåJ*w,vÚOwL–êÓîJµÏá$²:ÿZ_Éðõþ ı!¾ê+ØÀde ¹‰HÈ ce´·b¢Ïòžäöç¤+°Àmé8æé©b*ê¡T5ÝwÀµlŸOš¹÷öùäÛØCëúø^áèxåZ#Öå<ƒ¸N®OÅëV^ÊV*•xï{·Ó¢;(1 Ö¦ G(ú³à,™=r63C¡s<~ÇÇ’A4@Î) ¸_¸T¯Õ; Û-}m´ð$MSû=£U_¤ñ¦|!Ç™=IGì–ÐE|&jÿ9ÛÏh‘Zêúý,©s y ž‘‚ÞZ&ú*ÙV}£|s¥ F±îV¸´ñø–~ü-¹íŠå£wå\ø¡ïŽÏ—’9jjñÞミvŸg-nû˜Ð Šólކ==Ÿ­"á´ ½f—ýC ]øL) ±_­ÌIÑ ÜgŸ¶p‹"ÓʉÖÊN&š"÷„á%^œe W·ŽSù´»H”þ†ðDÜWþÒX¥æÚ©'*æ{×h}7ûTì¹õÈ2iŸ_,”µ¬ÔÝ p.hú@àS#5žU†ôâ Ý|Ke\n’çé²W6åYÀ ²ŠðªÖ¿Î`Ú›Ä]là\„]?–0{ÑÞè1Ú½¬ckfu2Ed¨Ê§‘R®e[  (neÓIïYU{Òaêl/2(ÔiJ-‹e†š^'Ä O¹«›bTŒ.¥›$I.–=v¢& &[_„åûÌ_È›nÕE‰©¹àË 8uËxoqŠÄ@„uàTº—äî¼!`·-ZÕT7jÁ]@*Ǫ̀výì{ ßDýQ4Ú0{A¢÷FQvU¦Goãé¬d·BjG¢¬ÁŒl域qÁc¼ ìdkÞb3ŽS“š)ó Ô{n™°_YܬPƒ*)‚Ô hà·Í—¦ ]Ó}±‚'+[R£$_b¬[yB,ƒ‘¶Må+ã³zKE²ëuh£Ä{¢O«E‡ÍâWæüüBÅ«Ù{:Hæ-d „ð .ùÞ˜Xýœ1CÐMê!ì˜F {)öå«ëúõ ÊŽnYœ/Šz~§™`»<Ï…ßìÉí¨Y+·]é½¹fÒye²½'Œz ¯d/+Òe©è±¿ú£¹p—ýÖ¶ïïùûßE+DÒ”#&4_qº½²û³©Ø˜õZÔaSÛ¢8?OzA<¨qÊ«£÷»ò@Ñ7Vê"]šþsùg9ä œ«±ë¢šw©‚ïª'§Ï÷¨öt¢Kò`A¯Gñ•l1øKw®I¢d™«¾$ óÉhtÏ'\ö„îò_÷(`¦^ØiÓ»}RâÍñ¶ï4‹úz;ˆÏð ]Cî?…ÂýK*`“íI83†_$îǘ @»,Ÿû½ éÙ™“ÞÊù.@-zbRüÉÍM!½,zŸ4¶SÀÓ÷²¥ÜáÊÖ¹Â*±©ðà[FÅ„–WɦÎUz}ÉDÉÀ ž»pçØ=¨sâ£ò«üÏöúÍ‘š[YôÖëÉîæÔs †ô^[²j»×¿~àúÚ­›YàÏÔøRÒ¢XpÞ‹÷šµúk%Bƒ‹hBAù§yŸô£"E‘Hø¥0sØ3i'õC£ ]'b+oÍÏóW½kÌjîCR20JïÏûÆbãDˆïQu hë.ÙuØÀ‡é†Ø>J¾˜ m¿¢>Ð×õV‘‘ùî­UêþI¤ŠËýŸJå`c£í­·>Ÿ F£—‡^%§ÓkÄ¿Bïh;®Ó*,¾¢M~^eŒèyÆv?§u,˜×XËúÎ:7¬‹ƒõ-òzš’Å`_ÝÑ2ùGDÙì¸âøJå+ýÄþÑDrã3šøP›ý<åòHîü͆A=¶ZbšEP´m sWSw,m¾È ðÒxDô^§ Ä7œÿWÐI×ù!w·¥"tæYëKñfmî@)%Jp… ÊSÇMäÌšåSlv°]Ô×=øBÚUqù²÷t™Ë΃Û]‚X5Ï¿˜üKXËþáÝ[‰ }Ùjï~¢é¬{P¬raqîú-Å Æ !z~¹r‘8ÎÀ§GÉXªyæµ`&¡5z}‰ïP8û¸Ñ^–qùyuo_AçÉÁí½òÞ·DU,WYn-…YÒzŒÙ‚3.ÙÙõ†”v¬°Ÿ¤v•ç^5‹ÌN|AEµb=Qü¦Óë&¯V^­Û·©U­ao“µÐ¹]teè‘~§¶V)¾†wdŸ}+SÃ[Ñä4ÅÅ㣞ÄëØúzùælA({êÂdWh¢)|P–’'¹ÖÏX!~ñ¡AÕšÍ^·d;×[X†Iúqú£Ì±ƒªAb/Ôã éš0œÑ’ åøÎ" Z4=T½Ê#Ž‹ÂC²N=féþ´-VµŒ¼ü+ÅKmN„V¬Ÿ†‹Ñ‚s±y3Íå-ÝCˆûäd~QA¬uØPº¶X]uþok{³ É1#d P*9ʱ4Îa±jÎï:?—iL$±ÝvóKv´NkïÙ%pG¾rÝŇÉúó0=³Ó8IäV;¨B¨wàù#ŽÏje]uBVùáìþµk’u”66Ž®bò£mo§N_¶Y¯f.D©¾X=ìø:moý`žŒ:òp)ëâ 
HÌ¢½žQ ¾•åÉæ§î_ùç÷w{Ì.ÌD×h®›<39ÎŽ5ÓCþá~5¦} ÷ðãõ»ä!º“úlÈjŽâêâÜýóI!ûú‰åšfZ½ïR3ˆ£"àúöŽÕ[í e¼Þ–oòŸõœŽ+M«vV´mIŽD=ú¹ÇX8j¿‘FvB7ù®x2à?þÏ a‘§wáßšG›ž+‚Lâ–·3¨Õxý;2:©Ÿ$U^'®),(#RßÙòÌ>NbH)G7/V=¾k=&ƈž^ÔFv+ ÅÆòÉÜyûA[çìzÀÅ®?F•Ë× u÷UØ] FüT^ý|Ú½j Oâ`ÌÆà£çån¸`àFÅKÜŸ ÏÖw¹3ÞDÈT°Üú xC |M Û(˜‚T:6lÖ¾éϪ±›;ùAì4¾ðÑóWAáH©NB9 {]O|ŽÁ‹ä࣪g\êÅ7)5©Ž W»¿¾{û®U%Ö½…b­9¾Ëô±cŠ­=%ËUö™P±]´ÛŠÏHÕýÑ´&‘2\ cÍÃìtVQ÷yp8ýñeþç­ëZ«®Ÿ]Î)»„ä+¨ƒ÷®îø­ôõçÚ¥‰o\’f™‡ £Ï •I~ܦÐübô´GRæuv]«ú“è¬à8œëcpšÖ¹›„ÚÁ.ožŽ)2Kwˆq¼Îâ0¸ŸZ"u£ÝJ„eÏg9®z0ýXák%%‡Ä bÍnEŸbÛP©œ2Ь=x`g Sêl×c!Ÿ1•ØÚL^øýŽ@Næç›ç²`Œ?³€½«hX™hÆ= ã-ni¯‚SÎ÷ä ÿ-“u1N ]³Ø¬O«œs—¶€:@{Ûp‰`¸wFwWKYEîœÒe×®;fÎÜþ&@½«õ¢™hLÙ oZ}«éÑ»«Ò#s×|9ZïR=殕ÊÕ‰‚îÏG©Ý(&´ùÈx®µ*MË"EE£AÞ™¼^ž@•H+¹­ËÆŠ¿Œ=³…B7e?ÛíQR@ï‘$ÙΚ —ÏîqÆÛ¶Ä[¶7÷$ýÃwWnËriªFÊþ|­!Äe@e‰ØŠ±yŒdÎTöS2 Ý>ÊGÎÉ¿ª°xG«EhY`4¼ þ=ô ¯À ™æj*n5w–^«ûÜÊÜsÛÂ*o/]´Ä¯å¥ÆÇÖ0Ÿñºàëð‚§o˜\¦k[†Çyé{¢ŽC8îkê/‡É?0êÛSòÄùúSÚÌëƒõ{/åC2æXØœñÙÝy÷´å²yÙñSà`e^AÕzŸ­¤¬Œ•ùÐ=Ï$gÞ•ÛnÏÒ»6wϦ¾,u¹)2»£5:Z Šÿd}/˜+/Žxëµ;·Ð‡¥=¶Y˜´GÁékévÃ-”†Ç%,3êð‡^Äg?ÎB/CϵílœŸ¯å‚kkÑ­«Î:î 2Ar59£eHŸ»ÝIÜiˆN+íqzÁÂ͇ lßèÔXò®öü†Ñ…?tãÓSôâù`ñoÙpæHf¯ÏÇ$ ™èL è:Dž$Nr‡¦> »1¸tÛ‚¿2]”îøP«¤„®D»]ßo붘ø¢!Ô(Nû%?󫀉%7[}ÉI‘L±oˆÄ7«8®×ÜNK”øz˜<N´F¤Á œEwb¾œ÷‰Æñc©~ ‡×OÄX?›M£ÂwEw*£€÷ÚúXÓC+ºÙi÷™æSê/=†ø_h\!¨å›‚yç×­qÄO¸D]qÂ7ÄMU&Ÿsd~ð»ÙL"4¸ü%š<#ñ¸~jCMº¡—Aí¼é¤ê÷Ê˃™v´ú™¾xìUŽªèOP·½ónªêLh’M:ã£i“ñqH”É»8ò!&„Zåa®|hAŠ_̉ޢ+§Õ’ãç^-ɾÿ~¶ývƒÝi1îèm1ûÏAÒt+oMÕš$_=M7Ì›Þ3ÿ(«ˆ~áMVCeiKyù‘þÛ G¾w<^ #£Ñ<ÎG?¹z7ÜåÚ΋&Â/¥(Õpîv¯6ÚHÒɳ(tåœdeíwOäÒ?å¦or3¥1Q®ÙV{7GEl8®èIÉå=ÇO9¢¶isàïq€/áÆ1ó!»œ §ØÑ -ölPIt—ªÉAuåù”ŠÜ’|¥ë–®!t máÜáŒKãwÔé{ñ™YF4æƒÉÓ¾²`‘;QÉ7OT¦^Í‘ã©ËÉCdÅ]¦áþMñ:ÕîjCL™ý]{Ki½+\þÚOmZë7}º#·UA‰ ˆïWnŒ,sÒ’µÜ‹†¸v¹:ß~rúv­Î?ƒ¤îlÉÂmÖàP©Î] ¡Ñ³Qݺã´I±@5˜œÂ$PÊí ºþeh]P®ˆ IØÆJxÅ— aóæ-„âùÏãt;éºÌeÔúý‹ôÕ6 y*©¸Ã-X8;·ã ´¾c½öÂׇšªñ̰îí$“N~jÜ7dJÅ¥ô !"k]"VUºû¼|é¼> Íí^i2ª™|v»§w†—3C½T^®ÐÞyÀP_Kʃ^V}Œ¬½ëÞ²Ñvº°ÒçOŠ›ÃKòŒ¥ÜùSíÆqÖ’§–øE¬Xë s«o^DõM\ÔßTŒ¤T–Û ¾B*–ïuÝË^ꬿm³ÞÍdl—·7Ç|ø¦ïØ:e%<Õ{ˆÆ )É&!›1°‹.»œ=#¤P5Ý/òršèmë¹²íÂÄžt3‰ C柚ÝÐ1ÙSõ¿ArRKâ~oL í; c×9»·±4*‚~”b8hm–¡j·Šû:}¿yÔe¶¯-‰1&„ö¨²# 1‰„·pŸ•ÓÝs,Ù¹Pª±TýÒwä'f­µ@õ(E“gKgêÝÁXZ¦a­‚®ÐòëB3?DIyugwŠ®>[NSRLoÌYÚ\#K÷‡]7¦*:7y@W’²fö ’Wû 
øEaÏg=ê0'c'>ñ–Æ0»æDé7¾‹V6Ù×ÝåOé·¬b†P¡ÚÍveOƒ¼AkLâ˜óF_D‰“WFä'¯·ræN`îÞÔr-­ý“¾³¸ÖŒÉ/{†…Kû Ÿ¾cøx·Ó¬„ïfS¯Wú³ oz)çŽÌ;yBå â1¾ U'‘LÉeêGw%„vÎw$i--z1H`’Ú‹®:Îi»ÓÎ1ìÖŠÛðs ÓÔ‡ÿ =šœ„ÄßfF¿• @õµCÑ%È,=Ì b)޹w•±X^'ßÝ1x(:œèÛÝi²(’Á i~5÷ÅE®O@" Ô¹Œ#e®­Cƒ0q*s{ÿöÎ~«ã`A|< ý8ñV&c›•)œ+k 1v4›Û`Òòp]|ümFÔ·¨FŽvÂàðÜðµtxp²^Jp›±ý°Çȼaãfp‡èQM÷eM)Å:Žî š+¸='ÝüóŸBqåÛC™SÊ-¸Z%»f”ßTï4j¿ßmL¯òN{ÑR°'ɼõ†–ÃfJƒ‰&fŒ?/šv\œ³‹RŸM H®’öû-!¦·=ûÖüÍË`~tñ]" î>I_Òëâƒ՚°ÉÕ1Vts—sô…¤òÎõä&¡WB‹Puz²ž4HþÂអªò¾S-=IÞ/µT9*c¬·VxÍj¤á¨öÀ.–(,~Éæ(õµÀóä›^wKnÇPnÔòWr5ݾQé_$Û+ˆÿÄ'ÛWd€$^þqY-—ª Ê×uœõðÞ~î€ëèñêuy…zÈ´œÓëÁJ(é੘%V!š×Øß8W!COÆf@i)ë»Ñò RÏÑÕeÏ¿Ÿ<§¨bspHK±Þê[i•óJ¼ÒŸÑÂ÷ Üa'v$Œó‡xFþx7Ý WyzaKGa„Ê0†~¤ƒ»Ê²« ÑÉÚ´¯î%˶ T4øn $æ#î$ÅZ¸}ïÊ“iëÝ5ž8O*tX0,›aX®ÐU ×ú•8,“ :–ËJ6“Æéoçe/‡¥†¯Ø4™8z°ÅÉ=Onü©…RõUFžv !|ÍzD Í#ˆ_âхТ>„Éä6ÆîŒË<[$úŽ¥Ô:]cÔe¹P#G™„ÁÉŠç.2^eÈ›)"R!Tæƒ=L®x›TE×4‹Ö)"9X”uÖèy%Häf†ÃËÏàÃÐz¨’m /þ{¢h¸ªÃ¶B-ÕÁM˜Ê÷z> Yq>*ߘýµÁ{“šdT~I˜IK dŽH%„|u#‹Ña87ñsöZçšå(±úÓ÷Õ¡­Y$ʃë!ëgÉR÷CXbøóe§„äõªû2?b¹;42Ö3åká¡j^/2Í®›IÙð~;Ö"Qwû¹W|Ï϶¥ >¢ª¯˘°ÝÎy_$Þ5Z€mHLMTì¤âèο´¯‰ ]wÚãUaQâ©Ðw®Ûæ¹É¿å›–`ôÑVb}µ/+–>bpã»75ÁÀ¬ÄマB@™…u@Mç{­:ÂFSà]fMR!ÅäÊr©Þ¯ K¾˜ýo«Ê•È­B¨É¾ž½æþD£¥%¤Æ›`ni˜¤hA,Þ-Žo@ªqôN®Ì0.Qà6WS­ò9<76sëqfrìR\ƒE:yÍ5÷ɇšÌæ-GcݛնÇû»Óé‘.`·ƒ‚j`¤\ÕÕMºÂûäÁQ“ï2±}¶ÍFó'ÛjÑ ö/>@ôæ ý÷lïóàP½RmÕpÚûá©oŒ’$ÊÑØ·T¸¯=óçÕÍz—²J‘©Á<Þé¼bÃ;£*f µÊg¬…Ú½tÂL󽯑^­:ÌâÑe½Ö÷/ E¾d_ò; jM­õUŒnßÕ²$P¡ygìê² tÜo½ÎJ]ö`ûvPÿ ~2ˆ/ØkP€ ­WÈ aúH3ùaŸÏËþ]«R·జIJ•ð5¤`X…׋Îû©˜&—ŸꑱJ. ®…¢lŽÎ\ݵQ4LeÏ}H—V¨'Y|}%$™MEºÌéz©‚²ñ®û5¸)æWÚR0ƒ8ÉræseæÅ¾ðAŽatµn×ÈL ¿‡ã«ÈÓê·ÈpüW²ÇËQàîkžó˜Ä¥Dbh=+ÇïìOEÙ%á‘ãv¿ùq#>ò£a@[ÔT?ËÖ0^½Çˆ$E?ªaùº³ÎýýÓŒIÁ£ˆÓr/ÑÛ_wÍ«¹Ýù)AÖü²øvãó’<§³qÜ}»P©cSûS_‘Ç#ÛW¼oàV0ŽÆx­ï)õÙÛ TŽ– Æ·xaørºÁQÑŒ/âè:-Ó˜Hú«¯Ù¬ÛdS¿·¨=#WV Ñf ÓÕ<"•Ht@ÒO _Ì’}¯% ëz0´<´ùt¼aÛåÛ¥]ptò_·ÑxžÅÓ•Säë)ó¥Sîýðé~k.=™ö9øëÖµÀ²ÛÆóÛ_ª€Òý4ñ´Yo…©!kF¦œˆ‰éïdC¾“FX[†Æ;_ô«/ø°û[éf×&ã;>’«hrH† RQ½º‰W^àžâË ¼}Ã'ꆣ3Šô©ÙfáÇÚ\L—†Náv ãÒæÂ›'á‘ }B÷ê$K©˜¥½ÉnÇÇy±êЉyË;>y¸Ó:<—v{ºGÊ(¸….£ à?<˜víæØšã„õ}Žzï54’Mbv{Z2§ø hÙj®ñžš5>]Ú˜rq¥0ôèà Eѧ´¢õÌA ‚ä“ñH ˽¶¹ÓÄf§}oÝ¢õ`KzDõÃw]jWýÓ:ô²žÎ*ÎI3¬Õ`ö[5Ö\rx÷Ìmà>Õ=›ü¶¾·Ï6VÒEÚš0²÷£•h¾w%èƒáÉs´¯k‹E£B¦ør:´jE;!ö4»ÜÆ!Ž]úͶ`ßO¸b£O) qmç›ÔúåǾ¬ž. 
XzѶ”{Öÿ¤Ó{OOõu‡¢¨Cyá©~\è¥pKkaËÎYwGóc‹4ø>+•»@²µâ›uà­ §¨¦¹,9.Í{9´Dô+•ÁŠÆªe¾ôŸÜj ,v¯-Ÿ9QÓWdQ’ëÞØ·¯&ÇŽÓ±kt%]šh)Ĩ·ÆTAbà5Áº]\#w!£‘íÞ™'™e/&m(•¾07!•Ó¼ÌT«³}¢=&™^I…¢IÙE`El`í¬ªÊéKG_«Ë;Ê”¹é5Ió¥MGÃÉCoˆÁù.… ‹QžMò¼øÔì>:ĺÒG€ÏO&üáYÉê©xziªNá.»0‰õ7cðé#¥†ÍL/*©[³†ñ'·uN±ÿ¼¦ôLK©éºâOÙ”.2æ+ôÝ›X°×*eéâ-ÞÊ5ð$3¶JÔÞµb#¼ ä—ð¼uõ)ŒóøaC•M¡Ñ SÆòÿKÈ.êëþO`í¸`]s’øüÎ÷4›C\²šË îsÖé/¹vpÉ7«ôJº–+ÂbÇYIQÆ |~@(%g_¢õ[‡Ç1Caø’*æ´‘{E¸ÊJ·ØÔGî0Wþøœ9#ïÃwË®Iüä,æFÊT¥/œ6ÿÑ^àðÅטú•wÁ—¬B“ ‚é5½[Öcds9l†ÕÛ‡Þ 4—ŠÌ…9.û­ÁAc6SÞHºc*Y›á)fj™…„«»Õ›úA§ÞRÑp×ð8<ªèä G‚ç6:T¶-ó °Ø'°Eg÷­Ã›ç†<ÙÖfþß܉š¥Žne*h¼Tò ‚Þí–ZGªÐ´g½ý|ªGz²Á©ð²f+óKüì.•˘´ÃV$×C*G6ß“{Ò@&ö²7ûV‘ö ‰º 8ý¹¢s®?Çgž{ŸN¥¥[•¾jNZ/íã„çê°G²&<Þù¸«Nˆ´µ®}oW|ÍB#’ñãõžÃPƒZÀÃåÛrÓª&m·xn54›sSyf±­®Ô« t)HqâyDy–ð楘R@ÜúË\W² Yú7;óW\æDße@z ñÎ’{/$ùsBYwƒÐpG‡øÃÄ5QŲ“#ž—ûòc8ùxrJ/ýš"Õ¬‡ø³´‹î—|¬±)ºMV¹å¸kƵ ro1®ÔÇá*2/ÕÞ¸ 2õœùÄ”ùÙfŒ1dˆw5QŒ 3f*U¾ìR+v ¬¨Ë÷{¨ã-2ÿœñœƒßËJqŒ*©K6âT©MÎëÞù'þ²ÖDIžMS¯CÒF»&>÷{˜—¿%ÞÜ h&ªX3.ս߾{8dð¸{s#ÞŸ“–ãÒõ&²ó2)¢žHª¸4€&²·ù¹¦EJCrZiùÇŽcK¾+ŠïÇZ¦ K4=u¢'j]§ïº³(yT™?=Léâñ5 /jaIÿ€Oésœýì}žßU ‹qÙ7ºÜ³\QŠÏT.ØÎ‡¥Ýi1ˆÛÕÊ~-G åÖbi½Š*>Ö!ެ5 \棒ü].ï0)“q}]nÛj„îO…Ž´·f™#Òi£œ™»G©Á—y†‚Ã6:eL1DG |6û¯i×HY˜se‹[yñƒ a èéÔ7t`òÃCåÎeläõœq¼²lÓ5Ÿ(Öº¬Ò“Ÿ R 9t'G?xRæ{°Dm¢Ü«EmFKglô„(ʺSáFh¿èï¼± Þ.M ¢°¬¿‰¤ìA—çOm=½ ·Ì2Ês0Ö\.ùqÅfLÆr¶Ãt6Ã#MÒNÂ1ÿÕ‰þq¯šé/Ïtçv5A{ùöUñ,ãW<{›:6ÅR/wÄÙ^™kðP0õÚî¶…6‰'ÿÀŒNAG™¿ž¼§ËÇàÛern=k¬N¶ÌÕQõHŽðÛ±+䕨֞,”¨¿æÐ ^mýÍ0í°eZÅ!¥“úGßuO°„ð@yÀ¥eÜ{¾ÊÞz~Ú/$õfDër9Q’S7%=455gŸ‰¹.³©L0EìF¼íûNxQ¿ã9KÜ |Ñ}Çx\Á«ßì§ìºûMÿ:·º#­åsT"KÄܦ1‡Ñœó¤„ÁÒ¶ºüú 6èsŸáu%„*ØÕy;k~ËæZ±Þ5‰W2 3Cëý¡ßGï©…BƲ¶M0“*Sç^-‡^˜dBUî¯Ã{'Ìãg¾u´äηSdùYLÊÃNÖð<Ÿzô<Ó¢ÝNâNXûIb[„ò÷Ëë÷íÖ‚¦gÌßf6ÎÈĺ’>*~¹)ôíØ>Á¼°24¯zAòÜvµd§%Fl³~5H@<¿„O†`Å•uíЄ¸Á€QßVd“ŠZ]JñÙÿ‡Â£ endstream endobj 135 0 obj << /Length1 1688 /Length2 7921 /Length3 0 /Length 9029 /Filter /FlateDecode >> stream xÚxTÓûß?ÒÒH( 8ÎÑÒ%Ý]c Œ ·Ñ Ò]‚ Ò! ©tH—„€´€€ˆúLソ{ïïÿ?çyÎÎÙ¾Ÿ×ûõ®Ï;¶36f=C>{„DGó ò%JÚ†æ‚@(Ì ±±AÑ0È_8› ‰‚"à’ÿ`(!! 4{BcˆÚ8@ÊI ŠK! 
Pâ/") xò€Ú´ù8EĦ„póFBÐ?=8Á\A qÞßêW ÁÚ ´Äã ‚ `(íý/œÒNh´›¤€€§§'?ÈÅ@:Êrñ<¡h'€Az@ì¿Rè€\!¦ÆOÄ0r‚¢þ"О $€`P0ŽÂ¨¸Ãí!HÆ;ÀP]  ëÿAÖúƒÀ øór‚ü‚ÿ1÷§ö/CPøoeŒpuÁ½¡pG€èªhñ£½Ð¼ÜþC!0ú ²Ã~‡¨(è@˜ ÿÌFBÝÐ(~ö+G_f0׬ ·WB¸ºBàhѯøB‘0æÞ½þ,® á ÷ýëä…Û;üJÃÞÝMÀ}ìQø'ý9BÐQ „˜˜¨ òñ; ür`äíù-ü crð÷uC¸0i@ü¡Ì‘/ ä ‘îß þ}"ØCÁh€Ä 'úÛ:†8üqÆÔ õX1í'þzýçÉ Óaö8Ìûoúï ¨k)˜>4âù3åÿ^_>! Ÿ„ (((øÿÛŽúgÿÐU‡; „‹¹§¿Böø³8ÿ.À¿mé 0 pþÝ耢@0æMðÿÜî¿Uþ]þËÊÿÚèÿ‘Š; ö[Îùáÿ‘ƒ\¡0ï?˜ÎuGc¦@™øSM!Œ®6ÄêîúßRu43 pGLGó ŠðEþÀ¡(¨Ä^Š;ýÑ5àÆ¿æ …Cô(è¯ ƒÑÿK†2° f‹ 0­ù[ÁÌпý*ÃÁû_Ã&$*!‘ o"L­1'Q€¯ f*í!^¿› ÀG 1*LŽþ’èWa……-„' â€þ%ú Åv ä?ñ߀½ìF× CþKWøOøø Çd €aþÝ¿1 æê†öFAþb\A¼ (4† ýíOH àøkËBÇîÿ`,Ã0—ôoºBáiÀ††ºa®î?Á Šà˜5úÛý/âø{ë£` ”Óßd€€&)„=féý Âþ·ä_å»#‘áï¹ÁÔê¯óïExAÀDó3°T¨ó«ÐöËO¾#ø«kQ æ}‘¢höÙç¾NZYª“mí_ÞJ×›+ Ÿ˜ñcà>¼õzô&'h%M-¥Ü·¡xÌ—ÿîëéG“=‚ÞqL<=jƒò(QŸ‘ÜÖ×’#Ùtæ:L¼[x»j«ð{jÛBbà (>§`Nc@Ç*ävìa?5gf]Cš?¢ˆ»…´Ìp7R7ûŽýw¡ØC÷Õ{Ã2.spyùmN_ºðIfOµØËp*:j æÜ¡¤:šÄ&»¢è¹RYj ]‘k8û Ž­øÊ·©‚þ¶Äa÷G¥ bÝ'ÃîuÑ™Üî•“»6›ÙF_Hï4'E9ñÖ¤³*²UÛcfU Ãòzpi™øu‹?µ§%¥æZØ 1OB„­·Æï\ÚÆf% Y¥ÊÀ |x¡(¦crr0]¸–NPѳQé©ãè0;¯¹Ýçåÿò¢Zav/~¶Óü¶}°;âÁzÈH3”`ö:k¯w7±I]ôü=¥ ·²ºVsÑiß?`—‹«×Ö¡*e!/17Îo"LžoD9k¤µ•¡sÓ\2L!pŠ‚ŒÊå°Œ±j¡Rlø¹Lp³Lêƒ@à“Åñãõ÷ã)j7ˆõ»¿âY'Ø!„­5ó:,œlKt84Ð" ÐÆ½„ëEóÉ cñ™•<ÅÆ6T¥Oå9Ä~‹û²}Öp”g¦Pi‹Jéœ Ø°y1ÌøÚ¢?înF×ø~ÏkŽÔm/µ†«sÜc*»AUÉ–Ð"XEc¼h«Òr?M”±e?H«ý¬™Ù=ìÐxö•DÐòBJd3ÈÍz§AçÈí[ùcŇ€[äsaÂ/µÙ%|qI"M’~yp‹4Þ,4çÇrrh$Ú„Éi.ÐìY²>mަ¬»Ru;>ï¹»LÕZ{W@cÀ|S¬òà›‡žðÒ3]±·Þ ˆÎÚÃ'{ÛV¦k·àGš‹¬–œ†7S˜ÇO˜é)k}Ú°j!~ÙR†PKõ%¼í5IGýn—£³Ýί Azy~eªsÑ K9夸 ³%~ ûì{ÊFìžÊOÚGZ”ØN‚UL*¹vŸ‹_ŽùóDÀH•.­ÅE‹Ýœ„¡w=¦!<cwr´÷îÕù_ÀäW"Y_ZTnY•»†oR¼‡–„j–ë¾›ŒcV'Žy5õêÛö/”?lÐS¦R™ÃŸ3Üå9ï—£¹‡K¨¦ˆŸçøúm|ù¢ïþ¨R ÃIí8}à…[É@P8‰NR3U(í~¹.UǼ`²2þƒT©u:ÄQ—¾èÑš»Ÿ$×]—rO*%§VfhI–7´£¤¢ãKÅ×ÆÔ¬ÛxJ©sE¸9úˆèȦämxsh‡rÈ_»Ü¢µs€m+¨_êYc—×ezÎgQÛ«¿Sóqá_‡x}K`Š~´o8h¬§^çà«9Z6‚¢#91¯nÏ?Ø÷x?f™ßòmŽ©Y‹M7_~/S-$<ù–ôŠGœÆ‹¡ÅÄiÞ~Öz'ãm¹³º=¹µd7î™Pq²ÙÛ2‹z‚(ÿ UòdùB$QëcáÔ(£uZÃôÞúÅ!-º\¡õ¤•@ñ©­I6²°›¡ª$…WI>/¨NO>ùFŒ[S>tZ:“Þ,ðÍÌ\WLÚñ¿êêŸÏE 9p(z¯¬×Ø TMUÉ[™$×ÂéÅs„ÐmÝÐŒ¼qÑ•¹}ð£B~ÆTºã]ÖC(?]¹½å‚ý>“›Lœð1Žå3þþÅUÓèÜÎÔ#D[°õ,k{ÌÇÓ+Ê°ÔøÍ&T {hè„õuk®,f£½ ÒSìµÊÃùñº5ڠ׆,ùRñ;©×îÖX†QD ¢'™bŠ2”‹ðÅL£ —™ñµcí,aeTVø#|þ¾‘ýïäó¦ÒxžÞÍ‚’Ýø^ºÁß-nÔO}˜ö-'Õ) Æ:rbC}-ÿtœ 
Àj™]Ó)¤ ÞüXi²°9:Ò *˲:Û‘›æ%Á>8±CçÃ’3œ(ÄêFztÛ”à|®“H0Xãè0ë»]Ü|“ÀpÑ‚»h#UTg‘¶ÓÙ¿é1Ç$áBâMGÛ –äí&EÿðÝì Ù™î@0g\òØ Ôªr9ÛÎGUIŽ…-ز\äɼS›6";Td ·ÃºÕÔ"ùóõ{®w¯ã”ïÏíÙsTö|ìdàki$y+„%ÀÐ`ûç1ßÉ3·t›Äu(¬Úâš¾·@‹ìÁX×Ù|,xè´a A=]­¥Vô¾Uséì´1—+ã Áã\žÙE°¹wjßH[Q^ö "*íá˜ï¶|òuÀ[&øß8ÎrÆjÇÏËІ¥ï Éíå–kÍhp©8CÌCÂz/\tLÌ»Tæ¤wýëÒ>¹Ô-ö¬p.LY ͱQ±®¿ï»†¶aÅŒ’‚ù¹† ¦Þî—‰Ðél¥yšïýŒÊ/9~ÁãÞ­#ž' …yL½)Ê à¸÷¯æq’´û÷f^ž3Òî¡çÂp®Õ YáózJõŸE*­ô¢Ÿ´&öe„ã• ¹Ò%bú‘cÓêOóÎÕÆšçT^Îù­ü7yñG.yìkf^}&éQφ=Ô:Ò·ÀºhÔãƒ#ÎÚO^sH›¿^Ňðí \’Åä‰Åp+‹ÞO·ø\ˆˆKóµR*—eý£rÓÇ‘y<’û¢k Ï9ú=ŸÎ· ƒ‘¢‘—ˆ“ñ}šžúŽË±n9Ü‚ÖTdÁü3Ñ‘ÏF»÷jI´L¢žœ½ýy}éJJ ’Z±Qïó³Û&´•ÙTé7AïÐoºu÷Å1  ’°E½ùÍBâqîNŽTq}Þº[`”€U|n/¨™WàRQ1í¼PQ@ÈúÜ.W­b\»÷8ª)~Tfªÿ‰Í'têŽÆ«–9³Bú‚na´º7°·ã?îÔý¨>Ëäû…Œ!ï+¿Nrs«Vá®(ýR»’áq„Ô²…‹Ã‰ )'q$B¸«öÍ$¿tlË´gIr¬h1:`ÄÒôH$±ƒ’œ¥x}iÚ€7¹Æ’{j4¹wƒÌ4»±M9|æ-N‡¸nÂÏLjë¦ÚS({2µFWÉ4öŽN>Y?„–~·,>–DI?[jêfþL½bqå—•vÿ«¨IèžÃ„qôîc½ ÿ‹ëI?™Ÿ1Y˜ß3Ý¢º~z=ø;¨,‰Ï%Lqþ›“¹‰yÝykxßÈø9W‰R*½ Ïþ2M’t”VÝ>w—"€ÿ“€J)û–!`AÚ g­5›Ÿå®'„ĉž®“L/ð9dsÆÕíäP#Q/_S»žôÓæxWa2@¸d]ŸæS§"“ÉÙÖc å‚VÚg¢§Vr€Šg¾žxÕ6ä¢ÂE*‡ŽSu"Ï3w6³÷J+n­½~üjìxˆüÞj×Ê„HA4SDaÂø¡`•žÃ÷7+Oyg ûòÄÎ6Úñî>G–X&.¿ñ°ý@ÒÞPû-ööYg€×…*ëÏ<†Ð¢Ý1~ÿü–QâᔦÒÛKc*#™)àšÚ¬Ä`™;/ß80ß¾q{èu9?¾À¶'~o”ÁƒVNùùù§Y>¤àæ×ò=:ìê®é¹Ë/d͘ߺ¦B>ÊÂhÉL²Lš(©í\>{ðÞåöt"ž´÷eŽŽ;)(…`·¨ôuqÿ6Z§¾¨7&ôŠ¢ÙEAÝ*‚ë¶ö%y@£aÁÖOó´yx}p,>~fü4{)·uã@"Ç5X‹|ýÎ5{Ó;?éÕѾÂô‹¦ úƒš[?wçËyRÀd/¥óŸxs¸î£¤3?I¸1Ût`—è·/|Ùä(ÖÉV­?ZÃË%Ãî+ér¶ª¿?6“ÕÈ÷‘›x(£sv㵜”1Ö÷ñm;sNY–ùÜ;O;Þ¸”z _"Ö£/±–§Qjݤñ=×[.­ I^3$ÐìÙ?÷UN¢,êyê4O·–IÉÏJc²¶íDÎ9÷qauqaµÙ«¦Á=;4ç_US´)ᚦɟ ~~vžîv¦_ýMíáü‰ ½œÛ ÷Dz µÞ ›ŠL[p÷@¿bþùüy¾ªæœžSÝzËf\õA›ÝÈ@%Âc#ZŸ•I‡›Û7¥§ÒRb®ƒnØŠ`¾••g48Ú¿¯-³ÚúÜÄ'&XxTH¹®#äâP:E¸kˆ ?  
Ø­?ñXÛí< Ð\4õÍÑžZ–›-Ó>±ú*iJ'EMéq/!Œ"Ïl] /Tñ^Òͨ3ªx~½5u)¹È¶µêÓÇ6½¦;—HéJÆó"îåñMËŽÛãiçû[„?8‚·{Û_RÍ‘uIùß#?{¤ð <Ã)9‰¯•; ß ²ÈaÒŒtŒÀ-ëÛ)ì.Û&§’É1ªLLs€ÎYØú5ýlõ5ý?¼Ý8qž“EæÊø½÷ÿ÷™’¶m;zE0O€r'É;ÑâHv˜Ä™Y‹žì8Üñ즺ʶ¤\7:e.ßÐXwÜOæ,¤äjYô¡BW\6ÈÅôÊ¥¨ ì5öY5GÈ«CÆbÚršniã’W³:‹´ö΋¨Y2aŠY’Ìû ôŸÓ&Ðf©ö4<ÊâµÇ:›ø„™ö1|û¬Nb\[ž DÐ÷¦Èm›£)²§÷ {jŽŽožvþãž'ÂêìS&s¦r¯Þ3q9ı^g=r‰õ“—,Ùo“£ŸÅŒöÍëÆI)ZëÇë–% íäV$® q‡·á°6­ª÷“íþ̳ÑŇü°}Ot/ƒµkÕ6ØŸO‚F_ ·è£v$0WnîÍÀ»ÝÉš†LûN›8VÑÅvÇ7¿jÛ­°Ù’ž¸IëçÁ¥ØGßí.¯±Ë©¤ÓÈÿQ¦+_Ȧ¨¼ò‰]Ü0ç>ýˆ¾EÀnc¢çDÛD…f(ús”ÇEý£©„¬CšY€bü²·ñLÑXñ³5ß¹<¼¸Ç¥¤~ëì0a†(oà%Í”V´Svœù-š:®9„¨¸ ÍêäûljI°x+½FýÎPÕ(¯ƒ"µ\iMÊ͈9¼‘§â ±ñ;aX®¼+N†|¦·âÉpF5îp4Õ\ Ê”n.yê||V‚•KF±^ꊮêm†zöØŒG)£™5W„£Æ‡X¬?N|u|üCXë ÜÏ)åÆ"ócÐ<ËŸí¢Š:˜VÚ Úˆ¾©ûõóží­A·¤ ßæÕRu±ÛJišß•pol—÷ª,À3øVGÁìhÕtczçØÕ üPïèc´‡×´¤»½~"­»…· õ+#{–äòv<Ûotíæ`Õ^8ÄÔdÒ% {áÍé(ïiåPl’9Ôm¹ðd/FPåë'&kraÊ›ëºÙ“Õw…nxù\}4j^¹Î/SÛÝ?Ͼa™íã=–4ÍŠ ¬\t™M·7PØ·åì “÷MÁó6•í<ŽÜ”½J¬Þö–ßêûÙb¼vEøû/…5©°oÚ±´·¢xŒ™Ý+¾õO¾,âýQ*!Ÿ}>îâk÷ÍÝGXÝ#Ø[,~4̲‹Â߀õ±·{Ú_XOW¥ÜášÎäómÒo©Ñ¹¢0(í®'÷z»³Gx/R¿ðÌSÑq$¬ìŠVt¬&æéç4uÌÛAe—r,t K˜ž‡F'Û·Ú«Lw‚ÁD¬,Ä~ÇTNLú„ŠôžáÔP³ ‹uo¾Þ©Öà"Pv¼ýŠÇ¨û$ª¼ë°ä«=|0 _%ßÐÚOÖÍ|£“"ß ì³­†d-³çôk½xÓÀ﹄†ªtàLGf·²WÌü¨ˆñEYÖp1*]ʠå àA §Tõ¸GÂÓëì*è\o>Á2 5õí$FoQŸ¦žg°2¤Ø‘Øí˜¹„4*÷8¿xöHñƒ±³pimضšf‚«ðjD¡‡¸{A[±Ý ¨®Éyê7WI@æ‰?¸÷áŒzÑŒˆúûŽéþØ“ô"óAD•ž?I¼è‘Rl‹h•Û,²aþnm¦ð æAî]öèOÙX_¢ àC®¯”ü0u¦þ­ÒúFFvÞ{Ü’sáJŸ ed/ƒ×‰Æ‘Â,}ZDV‘åªß7M(—ƒ|¿|ª8½0[2¬òà,8v…a—_î×™zžµò]8,§¿tC¶M)ŽÅ6ëláôL P·[¼Y‘áP/ ÓýªÆ©•ÉQe{DO|ÃÙÚ‚ËüU“jJ’²'báóò°¯Qs´“·’€ÌFe%¡ú“mì;¯žÊ­{5-,&z›•KE K½ŒwÅZAéÙè¾Ñçóõ•µÇÈ[ã¢à ¢fïcgŽŸè‘¯Ø—NË´ôé­¾GÕ'Wq]z¦jƒ9ÛöI§vý_Ÿw3´?úB½tb뤴_˜‰ý\pËZ´Ç¬lzžC ¹ÇðIêtjtSRëÉ—®¸×¬fY½©·2iBð@#<¯é]žfËã ïŠr­Òê‘!eº¾s¨‹s†4YË-—Êóö <ž¨w;ÏÓUNR¢ŽodîV}tºt|þ]º«vÑa•ØçíÈ8ü½7Nݳ =ó:d¶ÖÃ^í&¼ÏNO°èhdÈ ^ìtÿýè¢~cw®~ Î~V3Û«ƒÿì^eÏ£;M³Á…úc´Ø¹ôݸÒô†ï(iÔ•$ö¶×ûs7Ñ+„¯p1âEô~¹!Ž”LW)"9¤ßÊiéð¾ýƒ÷Ýõµ¿ÐÏ•ãÐçBbܼo×»ìžq]á°ºG/‡¹0¾pTËsriÓœ:ñ‡«ž}Kö_8úú"¢Z=Í|xBxJyCáßÊ­Ó`ïMMŸµ)ÜFeܹÞ÷6\U³t×åÀ\¤ËÅ¥!â2Ys'ö¤W½ÐÿegFûÇJî½;ƒ# >–IkiøÏ¬k¢‡õk ëºÑ›§ë¶LßÌR¶_¸sMZ“GcñÎv“Qâ$mP‰͈VtIHÄŠ´KçOÌ/bMCÄŠ°‡ˆ,øZÑ…—o¢J9?èGM4Jª¾ÿʾ¿V‡äó1¢–¥˜–(ež¨xf¹N²åZÏqÐ0Fó¶mQI]êÑÀ‹¸…9(}L¤ÿ+‰¢‰±Ì­ù.¡ AWÇŒgÌn¬§N/\Æô2¶ 
ô¸üùûÁà*õF¿€nbh~AUÌ/îˆÁËØ5)cõš©„µ<@.{ºH))É2R/twâ$ûQÊŽºjV]×d³b€PÓú¬®’“¤á½YŒ%¹J06óè‡7âésîlJ¶i1ξA¸Üh[ÁØ©3u—ŽËÖ”¤I–7ôBk–‚Lq?î ¹¿ä†%e:uù¾[{+rbTï—±QQH,æ½K’±•-ÿÉ3]îÆ?iA÷L-u{oÌÂɰ6qˆ‹REþ5£Á6,Ò{ aÛSf†Žr¼‹ë›vW´–Êï¡wgGkÓžMH—K·û ^P‚Ÿ9ÍH™N¤ÊÆoÞZ8¼Rî{²305NâX¸žbÝ£ø){ÀÀ«F:Q‰ Þ꺭u)Ã×)š‰—¨B²\ñ ¼…KO²ÐÌO±±A’kN5È«á6{ï £²gaüÞ¾ÏH¯Xæð¦Å)½CÏhxtcðdi€Ÿ3Á§¦8ž²oQ¤†³zñÂöq+·¶ëˆcéÒï+å[Úç ,Ô·%ž@b{Ós84ÕGpž&Ý$ÐS»»wóÿnCÂ-R—2”\x¥úñ£kÚhík÷¨Õù)î°!eÕ‡Uú7‰YŒZé+ Çï¼Ì⇟EÏ€Ñä°Jˆ4^ØÊ£œ'/òxµÂ]°·ál6¨x›7È­î«Õ粄¶hœ^À%ÂÐËSí°PskJ -ʲÜI¶›aÊD{|i°ÎXr‰+]<ºi³´·ÕÝ}ŒnÝ·è¹+å#7Ú»ž”ÇÁ ø®GkŒO•ÏMˆ v¼´fßð¥‰Õ¸ô¨2 PŒóc‹¿2U›IÝHJ]zn–KÊ)ÓFô|DÇšÎëbAIìó|¬‰ô¡øFˆú™xÂ+b­9í5XlŸµå„½¤d½§ Ã\U¾O‘%r”A^:?$h×i‰m†«‰eVe§o.“o•Ø¢U=ÄNo§£Ë_âôéCl+VÍ@æ ßöP@íو´ÿ˜íBànÝO¥©¦’ÿrÀ‘" endstream endobj 137 0 obj << /Length1 1392 /Length2 6303 /Length3 0 /Length 7263 /Filter /FlateDecode >> stream xÚvTÓû6H—€tÉ ‰£A”i”šlÀˆ Æè”IA@º¤$D:•îV i”ÿŒçyþ¿ç}Ïyß³s¶}®ûºës_÷wãdÓÑç—‡"ŸÀT4¿ø ¨©o,€Á"`°01'§íû s>‚¡\àHÄÿEPDÁ h ¦AcxšH îê‰Bâw„$î€Á€0,õ/"uP‚¸Á¡€¦ ŽDÀ\ˆ9‘Nž(¸-“æ__n+@HJJôÛw„¡àV  AÛÂ1­ €>Ò C{þ#÷][4Ú鎠 »»»ÄÑE‰²‘áîp´- s¡Ü`PàWÀÄö§3bNÀÀîò×GZ£Ý!(€àV0„ ÆÃ…¡Lr@_í íCü!?øCïúw¸¿Þ¿Á¿!VVHG'Âް¬á0@[åÚ  è/"ÄÁ‰ñ‡¸Aà'ÂïÊ!€Š¼.Á4ø·=+Ü í"àwøÕ¢à¯0˜[VF@‘ŽŽ0Ú…øW}JpÌ síž‚&k@º#¼ÿ¬á¨õ¯& ®N‚pgW˜šÒ_ "þfCb`)qqQqæ À<¬l…7ðt‚ý6 ý‚1øz;!kL0_¸5 óAìíqƒh”+Ì×ûþy" p+4ðfGÿ':†Yÿ9c†‚{Áí à_¯3ÃÈ ŠD8xþ‡þ{¾‚êjzºÆ |:þ·MAéx üR¢¢€¨¨$ !%øþ3Šþ· ð\ÕÖH@êO±˜[úWÁnçÏýw7x€ÆÒBbD ¸ÿ£qS°Ø ó&ôÿ­ôß.ÿ7ÿŠòÿÒø¤âêàðÛÌýÛþ˜!ŽpÏ¿Œf]Ñýk"1[€øoª!ìÏÎj pWÇÿ¶ª¡!˜=GØ`´Ì/$*ýƒÃ]Tà0¨meûG1ð‡¿6ÍŽ€é ]à¿-/0ø¿l˜õ²²Ç<>\0²üm‚a¶çŸy•VHè¯5 (Ä“ŒQ“°˜à-„ÙG(Ìã·Aq0=úÖHñ¯± ÅÁ'Ô/ô7 0aή˜‰üBÿ‘ÏÊ…¬ÝoY`Šù×ù÷ŽÃ`0+âÏ“H+é`»Êà¦Óry&wþµü…¥–ðXã®014×T–·í‚Wª£Î –ÐúI:Ÿ žLú0ñ~=ô0mO œy?¥Üµ¢°ÇÿúÃÙa» ì&Éh3$‘¾Bù-N—…Üò1îãÛ †“—!"¥ëo~´¸l˜HíŽ@Þ1ÛK&ôsùŒ–k‘ÆlKè>cSŠè¨ÇÉ®j‡~\‹„GŸ(4E1œy®<ŽNé››ëMëîJ9ÈÖè,/yFÅ@í¥Î–ñ1¾‡&.M½Ö‹KAìX± 1öC®ã3®yK‰ù‹±ìîÆ¸×j-Å@m¿þ×wŒx7ô ¿D²†Ð"ÛõËá¼äm\AÜØ‘DMßc•Qì‰p2ˆ+fÕ¡@žâBZ—Aò…¹X=eäÊ…Ƨ Ù%ߢм²S„ÊmCæéG¤F5bÉÝ9)fZ?;Úý|»¹ÛUßNq² ÕêD¢sÇý㥕g’ãÜ‘Ñb}Ô:}Ò=ñ3݇ ³v·ZW†„µø ¿}\Hs„Ÿ¿Î«61êێʤ¨Ž¶†3 ån.Îû‘ò‹åèo²//Eûlãúr±åÍ9‡¬^\N4<|e0ùt[§Pàæœ© ¯»~6Ý”\I`—“€%žör ©múÎ㋟'V¹-÷ 
ún0g8Q{XŠd¹hÇÆr™ï†ôý¨9”žü¤§Fº~Ç<±f]ÿ>¤>¿Ò$Éj—H×eeÑ ŽÏ~bú‡þ”I™n¥¯ù…š¡aÅËÝé‡X=Î{N+©7%dˆf®[á“—slãš?Ïþž8¿‹ãÊÞ‡0Ì„>¡)Z‰‰ð[›¶/Å]Ö‹ƒÆ="@UÂn#3'{÷¢T¹—˜U’ÇS`bWp6x)5ÆKŒ÷qüü4÷í§QÖ¹ÏZJÀö¾ël)(çS è¨£Ž´&òÄ`±‘²Ÿk/X*4¨ t&)ÓãÂ+ôžIΆ'0²Ø¶†uzoSœ%ã<Í_ôOˆ•@ª?÷—à–Psá-ùá*?è$÷õÖ4»^R£|™¼(´sظÀM*B‰{6ìc¨‰W÷Hå²?ŠÓ×’/°®ag6l¹ÛM8$õÜD€¼½ª”’zj›¥1’¿Y+ÙUÂë«ãa+wPº¯è$ÝxkŽîj{UÇvÏI‹ÝÇ#2+²M^Ý-ö½€Ú“øÑ#œˆ¹–•Ç5ª/•n†} ª&Õ¿>}ÌüU_–ú !µ±K’±ŠmEëyV*Û^.³ßáÖ?Ýz·Ìàk… ¯Ó­w’O º«x=¼²ÕRcïˆ}+¬ïRZ=V¼«'&<¿þ‚¹Åf-jÍ¿ QAdÑUïeš4ŒÞª0¹_¦,<òTºG]•ÜÎQ¢Ë$ä¥t4©Ö«”;ʸÑÓSÆÖ-‡¦Ö!vuµ/Ü-^sÅ#@v4U³$¢³½pöã€]SœIåes¬ DQVqè¿i8LÏOoˆ’Á)íîà &w²Æj©Ši^šlŒ]#‰DX3MhµŸÜáóó|’ïïä ±^Ø=’³”"}Pj@¤¢ZìÐKÿ.{Ì,ãùÕAì«ú¡å<ö1®ñiYªT¦„E%È ߸ñ÷ üËßB,ÔðJ™‚lÓ±ª„y-†s æI˜掽„#ûA$¥Å•^×xx–ôüAÔÄ‹GUt"S¬‹MRúüéOAâ^Æo+D|û„šU¤2¡‘X±¬1ìÛÎEŠàL:Ÿœ¤e)vÛÏ|Ÿ4º=i§Â˜@š¤ )âJ'ûÜû,„ßûžõe¶šÛ¾Ç¦$ ¥™?‘Ϲ1±¬TI3U’5‘:»¯{"*2hÉVÞÆü½ÿóÑÚª”ýµ¾Þƒ·ç y‘+ztkÛÇÔý]o7Gãåßj³‘ÏáïL P@½œ¬ÝÜ M©Ü²•dÙdyY&×ä ‹Û}R»tƒf¯qœÔIÁ%õÍ#¦·L×ïeá/j;J3õv2Õ©¶¾Êt–*°QM°ò©%“Žž&a½˜[7\mUn#'§(Ò;úŒo~¬Ë»n·NÈò­•†Ø£ÛjÈþz1ÿÖõ’»yýµ£”pÐ+KÁ·$Á×GÙ”ô,i€mÙ×wÄç²´”ÐfË©¿¡¶d΋ŸêÙÞ}!÷û¯Ú*°>81ÎõBs«1ETrèè]#MÑ% Hd;»b“•|›Åp7†?Þ#ÊÛ=þ NKcàcí+¾5Î$ b_6ïEN2W†ñ'Xú먥gèEÛW‹MGÊê}d¨á|þ2 ½), íK¾bÚžOß?1¥˜Ôg¾3=ó9Þ*÷žô¤bˆös-Ög1Oí§nih…=*}Ѱ¶øqX7¾Þ®ßbµÛd„vg¸¶ÃV(ÐUJ–ìdé)¸okql8Ö7/ç{ððFlÝ©ÂOÜe*l\:2úrd4Nµþ7×¢dÙ’IŽw åТÇÞp®hMS÷üjvÉ"rô+IB«T¿Ûò:m…·xx¥ƒpûzV¼i ˜Aëc©õ$U}‚›…\ýÌ)Ô¿ncTr|tW㘾nûRʨ?ùë»ePÇ=¡Ö­3*2–µ0ˆEØÜ·Ä’àuEQv9 ëbN–|¯BÕÍÜüµ¹¸à ˜å_…ú¼o¥VLJªfÊç3Šžöä@Æ¡˜ˆ ¾y•¾Žh-Mƒý†Å{]ìº( Tû€czí9—)ÍBÇãñÞ‰„Þƒ #üÛ—ƒ\Ò© +e|á²°ìqžQ=k5SêκWϱkâßZП1L§BYò)F·=ÅBªímœ úEES¾\ôÎ*àî¯.ñøž˜èïË6ʵϦԫúQ6ž,7ÂI™Íf‘YëTèÞm.à Eg¤<Å’û9åO³ê5};ôiQÖÌîçS¬ÎEµXû^ñeY qÐE{˜‹<†›|?V‰tëÁŽì+ÿ`ôÅ;mÔ"éÎ-›4 æÌú˜ô¿æÕ49Ðä«VB m6²{ù€ ð¾ì\{}k[xTK~}‚íþDw=™…¸aÖ#‹‹ç.oa^ÛÌsN«úLÖ›ÄÚ,º†Ü3 \$î›iÞvצÍ-6OG9 ¾hÝPë˜ã» 2Î%Ë:rnC±tëÚê dÊÂüÛ-ˆ™H&ÐS1Èa†äŽ©D`e<Á1MëYG¥5Éš –¨_Œ$}ië}Ve]ið±Þ”°z€q¯U¦ijŸî›Zô`ÕlBrì2ÌkP&i°ÏæRGèJu)¤+^çYzã%%ÓÂ{žJ?4ƒøô±Ž9Zs˜íÛ&ÎÉÅ›ö¤(/aËGn_Ëìn÷ä¨É„Èø¼ßõ¶¶|öà uQ¾NtžWaIÂog¢át µ¦t•³ÜŠ¡6`þm—ÍtßjjÌ«j7¶»H¨±Ò™ˆh$5•B‚M%š•äÁãgè÷ž®*}%¬|3=áá…ª„Išá]ƒ;âTw–¿­^>V&ÀZš¦Œl&UŽ2¨s*T.°7Ÿ@ä¤ÐµÃùø·uOÜÔ7Øê[^ûòm^:/i]˜¹ YûÙÏ”›êÇ,N¶é 
4i%ðˆ6URVkA::cIö{Â=ì̶näÜê¶›dyÜÜêèW5Œb¨ÚùÖÍÓeÕ.°¢$:¯/zÞôæàÿ¸kMÞC¿§’•ú5 j-Ã˳ÙòÁ ¥é\ÇUØJŸŸŠ÷¨v%ae(YFÑ„+‚Îøâì`ØÖFCÖ²ˆwl*,Ü”LõsÂV¤ýÍwy²óC™5{g\¢Z_¹]Ôâ§þûÊ” ížÇ×o(Ï\Óc±ò}V­Y 1^GgW'+~»¢6x¡Ó…3NÈÃf2G83‘‡íÒJ¢ÛÜüû·ŸÆ;-‰,§A^Ã~Žó€%Í$ìqþžM¤Fíç^¸‘¤>éÓÕˆKÜ–¹bȵî&9Û6‰QNAæj–û"»¶ÕwU›ÏiÛpQ³r…$±6—Wl\ Ö®[,°my÷§·~Øh7×ü,srãî×<×± ½ÑªvŸ5âÆ*“ZÿˆÝÐ^«]x¡·Â´6® ½BÕ+œÒ’ŸQ»î|`hxÖî9Â'öVˆæxRa"jXƒr0rpe´}ÚgÔSª(äÿ(”úºQ|'o`^³I7`Šêõ³rìNâ1 â/6!ÎBªùB#‡ßê=Êýª_z§Ä¨vz}Ù VOË*•{@ËCÞ™'œ¨±dÆ ¦sÆ·‰wÙI@™uEȽšŽåôO°3WžÌhýÍpŠòëøÙߟfÞê»Êë7p8v’«ÙMèä…^›¸tã÷|§ÝYšè6x¹kjd¥®Ð*÷qÅCŸìÉÙÀ›ï6¢”ŸûE°»ÉÍ’ÍÕ°t˜”çù—ö°$vV¯¦ÁÈfL¦L‚?ì÷²<¹ÝÛp‰;A ˜–ØL“’žÍBå 7­Óäf“‰´^{ÊA1 ðÀ½Ä.9Ò¿oœI«ï‘Š&O.ðkôÙìÓ–©RuÌÕŒA¡´òæÔßüPz’¡S"‘Hßûú•ϸýõRB·µHO’))íþõ‡Å ]ÅÛë ¿Fé ¬PÖ(!F”²ájäp²Ä-s-¿ùuø€1Ñ;–¾Ï/ÀCvær$…'Vɼ,:˼SÁƒ7´ƒ4pß(a9‘íúú4iÑÑ^:~J7ÚÌ‚®¯'nOž¿ ±»f©R’×\—”öÒÌõòz˜á–½‰Wœäƒµ³¹MÛd‡¶ôצ6Y, ™qËìÊü†èÓ,w¦–›7ݨñº”Qv dÆK›ÜE3ù]^jŒKý%|/j5Þfmͯ ñØ’ >ñÙuiœå@(Xí"Å4ò6 š¾—ÕÁX¼Á ¥TÅg„Óu\ŠßóÕ‹V‘TÁ$»kÅ¥ï&WùŒ¬p´¿ÌerÊu@{y 4«Ù}"'3eïs óKÓa$¶ëa!¡à*5Â>‘¨eØßkÌC®Ø…w‹ê¹Ñ§FU=9ÑP¼7Ù:òЄDv´¥ÉËTdû¹èœr˜Ê¦U|J¶””³æmÑÝ+û ‰#,ÞÿÞž¯¬y[«•«G}*&UF¢+ÃB‡¸a~Jx¯& ³Óòæ©2«ß)r`Ǻ}JW0±û„ëÓ‚ýi%vj%ñVö‡²Rx°]Ö(¼Ê¥Æ«åjÙÂù™Õ ¹ž™›çO »ð§ØÕŒdõ2!k+’%àº-Šúí2+¾k59Ët8†Lxw3™ÞÎ)¬ë›lîÕ¸æRXWú)™~OŠÏä6³Ÿâæöñð݃©_w¸"í•Ø»Ë; êS—6p´Ôé-ê^œÊZ©hâl6ÇLd~‚Þ¾ïÄLG•ÎÕõÁ Oýo•#ie„h¤£œ‰ÔÎ׺¯™é¡‹fHÜA—F£Xl>ëí¯ôö‹éf 7 0BÆâ’$éü¶µú;“Ö§Ås‡&ßÄqÞ›uâß³‹§t‹Òðx–Œ¢^%ÑRK“³VýsnQ-ç!–¬¢°Úh¶ÍC¿© ÏOIjdé¨o¢kž{»Þ{ÕoäÕLE±ÖùöùnÃåv­éÜÌy¢ÎƒmO- 5œ¥‰ä“}$rÃXÒï0'ÑÅuOæ@¶ó26d/è.B¾¨\}ˆ;Lû.åW?ÕçŽI´|”óuÜq2×J¯&ï£Vr}ù',å#óðñA€gN:+îÍ&™i6N¿n$õ `$ů­ê‘†Ce[)iøølpPK¾mÔ#v×öŒÅÇÙycsH†rÄÜò&Ö;îÄ’±ÛøŽŸNq{Ö Úq’mg²ÍÝR(¾­Ó¼öübƒU^È%šó¢,–1¶¸ÐáC?Õ·)Qy(qêÔôÑDÀÞ›)Á¬‡ò[·++&¿ÕÉtn…¾^öØå‹Qà!hÜaÝxOéáºÖÛÞl&ÐêºXÚò´à°#mˆý\«’'Zt»øD6_¦EqºÑФD×— 2ƒV_BWl·Vójùö›D§…FË;5»EƒÀclàŠÊ»‡8@˜´ÇJ½Î&Ž~Ö?ÅÑk岜o]Û—¨E›vŽ4ÑI~S¸Ö<Ò§-1ð0 -©~[(A—3O\ÙøûHŸI$±ÓD© ØhÜ’Xåw@™3U—ÿмçÅ*n`3áÈFBoqãt–FKó⠞ IO½ºŒ$o´c¢Ç–±¥[­p¡(Æø'¹‹¡wù†ÆÒ‘¾Ï–ù7wÞ9û±¢nmîÕ”êôŸzYÜ…±SŸµ³x¦zJ—§Õ¸³Z¹¿»ÍX*ù^Kº²‹;KÍ©?HïÜu®ó Ø­Z!$ïþT•hÀo®¢–íþð”Iþ3Xöë^xúðiÇ™¡TÊéÄÝiÖRjU ø=‚/ò¡=gø*¾ä&ƒ4B*èèÊ^mÀ‡™Ú³œtd»f•· ï±_Ü.ßÇzGÚ7²oº&§`Òüâ ó[£é…çRÝ×—ô³4:guBBÖgµ Þ¥¦ŒeÇT%—¤0ð½^ðWùaÚïfmè;V“n®êõ¬BJ¾̶¥ „ 
ÿ(*†SˆŽh뼘èöt|ÏHYJ+o·{ûågø½¶ "w»‡¸;ŠÌôáìZ‘)þ½âÍ´À¤é´e—ûõ*D\)¸‘&¸L#º¡¯:¶9f8Žŀ ‰IâöçÓlÇB_DÔ3žã5|¸E¼<#&™=ôe­ð‚Ù›`UGÁçÓð6ò;DåfL¨ª.éÇÃ5ÓŽx(Ö-Ç#÷Ü×¼bSIDrÚ‰0áN> stream xÚ´T”ß6N—€tƒƒÒ€ÒÝH ÃCÌÀ0tw# t‰ % ‚Hw7’ÒHwH |cÜû»÷þÿk}ßšµÞyϳŸ½ÏÞç<ÏĘ̈©Ã)e3ËàN^.1€ŒšŽ¡0€‡‡Ÿ‹‡‡™Y‚°ÿ…ñ˜õÀp' *ö8ˆ@b²@’§ƒ”í¼ü^!1^a1迈0¸@豨q”aP°³ ÌÁ±²F ·ù×+€Äà~ò; e†C@@(@ ˆ°Û#wí:0Œpÿ¯¬âÖ„ƒ7·««+ÐÞ‰ ·’d{p… ¬Ú`'0Ülø50@hþ33@×âô×Y"\p0 ØA@`¨2Ãj†›t”T`è²êÂÀß³ðrñþ»Üßì_… ÐßÉ@fC VKˆ !¯Ê…pC<¡¿ˆ@;'2è„ØÍ‘„ßòRZ rÀ¿ã9à„—Äî×ˆÜ¿Ê OYj!³·CNx¿ú“…ÀÁ ä±»sÿ¹Y[(Ìêùwa ZXþÂÂÙûâè V’ýKABxÿ`V`@GTHH@v€Ý@ÖÜ¿Êëº;€yÁÈ ¼=`Kä`oˆ%ù‡çétpg°·çþ{…ÇË °€€s°Š÷Ou$ ¶ü³F^>â0æAjÀóë÷ï·—HyYÀ vîÿÐß/·¢¼¬‘¦6ÇŸ‰ÿ“–†¹<9yœ¢‚¼^>äCX„àýße4¿mðü“«µ„Dÿt‹<¦uìòW¬ÍÁøïZê0¤jÁÖDnÂ#ÈB>xÿŸ¥þ;åÿOῪüßDþ¿ É;ÛÙý³þŽÿÂ@{ˆû_R´Î¤Ô`H@ÿ—ªþcZ5°ÄÙþ£J ÒRP+¤˜9y¸xþà'yˆØB‚Yÿ‘ÌüÅ/«ÙA `M˜ä×·™ÅÃó?1¤¿@¶Èï‡R—¿C`¤}þ{_9(fñËg|‚B tÇãAʉOPàÉ‹4¤Øí·’Ü\P™@Îè °„Áñ~]+/€Ûÿ…þ„ævÿñ!18¶["þæÿ ÿ¹éáÈ^¹‘Ì_þü7ƇÄ샀ÿF‚ö¨³Ó¿¤ó¸àämüBþkR3ެø[Ècø×ú÷ç vƒðf&a §A6•A åR´®œß±—›Âc ;Ã,Sï<­U±SÆ¥_Y”Põ'iN†ŒNzѲï_ ¹™´¦û/$*f!Pä:W¥9³{/OZ­À 0«aá¤ðUâƒ8-úû¯Œ1Œ¿ÑŸü,ÜÆ¿Qºž{Óä´a$º°äý bL¤EDKe6í E22-#ú MˆbHàÆÉκJ/†oÐ"!1çÒ QÔ—î«Æ1)}óó=é]IüÇïUÚË„JBH©É<”3û㻩ÉãÒ•kPX¤È&Ä>¤þ`²€n_9ÁKGÇÓAÍcȧ³é—‚„¡ÆÃ5î+W×7Œð±ôˆ3þôpÇzÇ,1HØ×MТ{„!%¬ -÷ü¾¸¦õ¦£p…êï‹S .C~;@C¦ŒîCªþ½6V™¬ƒHÇCEBu«ˆ¥Ö”X‡z ÈÐËZ¥‚²£F³:sª•ø^ZBóà¨Ü¯Ö'ë&2Ô¯IH¨dÚ¬O¥*íl úOJ»ˆßbÓíè§a–=F‘Û0^V3»RT;ëo*ä‘.›Ï:d.™×È7ûf`ô›Â¾ ¬Uo õ½ÃR(×YâùfrkÊýídäXž–æ½jºŒ;ä)-è‹d˜0×úž´y“vZ…çÂ÷"?9ÔÊ{:d\Õ¢”¾$}óÜÑØ®ŸiÃFÅiÕler§L×çÛx4z8.2<¡PÂJøPÓ0›Cá±]æ¶|‹‹jžÌy­®øš„ò Eß 0ÿ± û‡H¢y_”Ÿ“´êÉ„'··Ë“Óîžå³î专Æ}®Z¿äfrØ Îpêü{ #s cùL>Þ5X.mÔÏЃ±¤Ü°J«÷áì>›boÕÅ™#06ùY zÛÓ)Ü1±y6ŸÑ*ß—0Š„§·Õ>fõ€Kò |hBŸ£I ÝdV—4H–÷!0?–ö¸HLÖg Ø“A8µ ÄGÝ+ËXÕM·ï­Ÿ,–§ »Çu1ß7Yçº|1bòãóÅ“B æ|_s´¤!¦lƒ5ŸÚ:3ò­™=ePlÐtÉÁá<Ô3f¥Þ w/Z‚U'¢ç*êùÌܬ6@!¦ª_½Ìp…¥A»þ¼Ù‹í/c¨&:¿â–ÒëãeWÁ~æ*šÍ…s–} Rßâ^I¹ÑbðyŒ÷Ûá·~z÷-ÈÍ[ý[EwæÎì@"ÈÅxÃ4ä(˜ëø!Úå-Ãô™iáQÊ]ÚäÁÓT 1ÿUãY¾W4‰@’ÛÒÙb#þ]?J^~q!k^tò[àsОöÀTÈÁz:ÕˆÚÒÃãøY„Jí=²ê+¦0ÃËew„ãqßQóÛƒŒ¨õAœã}®^Ë£æó/AñÊdÒÄã®>òvCä-{ÌÂâ‹Ôc$¡—”˜·2X´*\+ç+öô2þ n¿ô¬øiŒj­,ä}2) Áz““\¥((Ð1+°ÉA ˆö,ŸœIr€¨>øpѧœf_اjßS啦h°fto½øb[÷•PÇQ…»£(úé½CŒLbÖ8N²~1T!qqî Ž”6Ù—7b•”U;²Ð3y„Ø#̾NѺ 
––,éQ±^W|<Ýœºý˜=š.ÌÂKl9AaÕ%›•Tó±w¦^«g–£[ÌoKxÇ×. ÊæR<ˆ£~6±BX#øI¶Ö5‚}}çÄjǯỆíVk`) ]R~0è@6âåÓ#ª™‰°ÏXláZ0ö?_¾!˜­–ÅbW]öÑËuÊʪ—T¼+ïŽ*óö[èK§ü1-{?¼w/ëÇ&âœönÍ¿»hg'x«£y`—qÒ@THî±øCÂv›= Û~ŒrÑ Ž+…#ÿPõî^ñòàºK™UCÃUÁ!IzuÛʉߟƼ(1jš7NSþÚ¨¥B•É®š$R¶\5žCýœ'º<±‡üúvvÎ Û«§\¾â•îòwÛƒÒ׉|ÉciõSÑ]¡‡%" 0dôn¹™w¾ 4ôŽû±7 óÌœ§èZž»\@b+‚m™2zHÊ!ؘØeÞ._chÊí9ÝÏ`Á‘|´žª6` T"7ÀH§x*ˆ³Å" â¸à¢sâ=`ýOS¦-НiT ³¨5*v£^‘ü€-ÔÓýÉèZe*Õ7­ÑäÒúxº´ñ‚±ÍwÉ›‡(ã1~ºó¤%NY¢Ö0GºŒ7zÀG+×@2ÑT7Sìm%:3رå;ý$#ßí¼ÀG”ï‹IÝtÔŽóú-hsNZ9¦ß~¶ áŠæ§©éQIÏZkyŒ4¡=>[ßC1¢ÞEÿd®Þí€]¹:3½\Àˆz\ž|$ˆïò1 ‚ÿÌ—fÒN^Þ,!th8@¹+¿Çì;kñ`3­ªXžqøùâ ¶·wA„£ÅýAÚ„bwóÜ þ#äÊéIåG—4w›&åm=]E:+¸[åýpÙÀMò }¢‚Ó[i“¿VÕ=wÖ*ïë¸AÎ%zã8¹Àüä0\aÝ8rö`õÄGÿ¨oŸ•ßB¼¨4T6B‰01 )ÿpçØÌ]˜7³O˜•—êËdÍÂ>Ðã›mPÇ””=U½° ìwNUÆóaî~í%p+£ÍýuÑ` uSékÒ ç=¥Q¼­O³AœØõ És´¼µafF–ݬѲÄà~K‘¬I»«-‹ÀŸjÚiœ[Ox å/¡hôäé( (¸Žî8ï·c™±Gß™¯1T`–º–HH:JÞ˜<•Ιé ªŽ%`ÖtÚ;߯mM¬múèn»ñÎàU>FWR„}Ce.›g’£¹³Å%,íÛJˆÜoŒb¹3˜ÍpHRh„qÇ€`í¥—^ŸýlLóbw^>‰û±›ÅÛãFh>>?¤Z—†…HÈ5¢+èPÇ/É|CºÄ òO×3 %‹DÈûX=3#ùqåyzæl=íÆ<&·âå×ŒÍ =¿zе,iµù<ά%bÜyqW3(Œ_aµ£ë9&åàóÚ¡@6ç:Fz̧ñ‘÷˜úæ%U¼õ§Ojg‹‹×Xqr¹¨Å~Ä1U÷B¢4ç0ß| ˆTD5úXþëáYÙ—qÒ½ |Á6)µÛGí‹#he#:¨:ÖmËÉ9•ñÅ—ÈzGNYè0QÎÓïU³5~«ÃEj_1üÞ˜ÝU*V~ÐpHý Ø’ , ‘Þaîš-¸Îy¹mĸ@>=ûàrãû.“£Ü_%R€ZÓÛ| m„‚y.¯Ó‹ש:iűNñmÉzdöÓ@›·.£T½žû绋žs›œ¡¶ŠD˜å‹7úƒywÇb |îQ’¥ŽÅ¬˜ó ‡¢4X˜cô¬¶M˜Ò'W¦µ0í±{ ÆÜ>t2¾|óø•Ç›è{̶¸j5Kž15Ó(TtAÅ_}7Ÿƒ]k£èž]Ò7œ§Íhx}®|ºæß”ØÒ²ä.?·»Í{XDp¨Î=E{lg`°cgíú†FW­f¥¶Í”œ>#¥Ï´ûäØŒ¦ø®%m<ýÜîÅÃaF£éÛö IN ZìwJ½¤ÊëœÚRìšÏk›ÃköØúæÚ']0^‚ßä|º¹aò¸6 NnŽ-}ªÿó¼8ý6´Jü‚¼_·SLU™b_w]iñŒ§)¡Âÿç–̃®ô5ÙîæÍ“ÂÃǾQ³í<.«Ú4Ưꪢ~žÏv²EsÔ‹y]½.„Òù'"p*ib܇Ñyó}œñ¼t2³Ã {p ¯¶f<àòíž6 r¦úŠ‘EÂþ¬¹VÞ±jc3zœ•eì—ŽdCÆÛí¦ÃaSaDk (Ü3î””qsuã+ åâ†PnÓYÚWƒ5t²A…;=mJn¡WÓ«oâÛ¢X5™Ðc¤B ”t7 ÓŠ²pÑÒzþp„‡Ø4–ÊÌ>‚³4Ó¬æQ…í(Ó6Ȫ¥ŸrS«¹šp~ÆC]8ƒÈ{Ī´.™ Ó  ñKG„ªF`TôS2^:œí”R 1."úsµ(b¢9(| X Ýù“ëü~ …‡YZ0Õu«dAc•ØO÷xUbÈ6Xéͼü›ƒÖY“ˆšzÁÑÆ-è]*Ùq?­•~Õ*©J¼8Ã-ÑQãû»í%púÍïx™ .æ×þúèÚÏ„5µH¾ ܤ|ÒÆŒÄU» Èmœa˜ð@ ¦ ¬š0Å`}ÖºòIZBêrð³,fŒû“ÎúÙâš%fÎñQA)Þ.±ÏzÜÙã,2¯¥ÚuÁ‡¤>ø=>jòøOÇŠþ!äì3—Ù«1ßq¸ñ}¸ïÌ»$Ú¸óhlx%}Ȱd­FYûéúL?¼º8á3P0!Ž$äÆ2ÞŸÍŸ?ÍMpu™üäìC©P<·madGt!Ú<ÉÄ!û2ÓLªž›«ÙiT©bûÓÔý“{~Yz¿–ñêr†|”KÇ›Jµ•Rk+©n=äõ+õ æjk»Š|u&s•uÛ¦fk©%JMTgî­Ê"™€éuàïhŠÚPåo^ŒéFMáÙÕÓP‹¬5²QÎnÀo¶uK Ñ 
Ø!“•˜¼Ì@€zÞ¦Ô_äÕ9ÔATÛ;$ÝáuyúåºõÃ[¢7í§yeIàmåou”;ÔÂÑv±FÙjc'f«„¥Zußh¦;”§gª ¾Ðn)OìÀö³Ü Þã¨OˆkŠâýžÐÜÍ^ï:ô1 ßpÐuÊòF~åPÛ2¾·.4¹4“IÕ`µ[ÓµÉð z¦Èƒ@û—#œt6Jp¨_,?­ÎÐî}š³±!†IÓ7ñ’øëŒ0qãJL””Ýç®Çl"íþ‡°ôæoÄ~¸³Í¬l{ŠlV›­ž‹£Ÿ²ê^>“óÑõ­)f=»=š1#y\ñJ÷ªye¹/© ça;nÔsj ˆŸ“ÎÆ2çê‹Ü†vf¼pãZËUŽf ž{’ñ¦Ó$i¼3±örK„÷ÍÁå +)="ö õvÞ³Ä0ˆÙæ7 uU™WDPɽ¯Ö5”Ásž³u¥gÊ„'•œÓU»¯Þ¸UèÉþ,fAKyåppŽãpBØQ±Bc£Á©.X|‰‰ªW!¯Ð´šÑ˜Ð¼{›ŸÁî«2wÝ‘jÒm»ÆU¾wF°Lo—kP¥Zõ‚ß¾6~I4˜¼­ÅZÉ8 |ßcPŠ(I?k<2†§ªÐiû×™ ÔÌtjÝóÌyÃø`„™Ÿdž‚‹2Ð{oŠSº#h }!ånõíudöÆŸÁ·Ýçk`¸šð'¿™»·m{Ã2üNk(NbÞ™ÞülîP¬×ºE"õ¤èÅk>üºBDùù÷í~¤òÁjê³8ÊðºÌ£XˆÛk5ˆ¬Ò;[G·4C«qoéhqð³¤õŠÓ[E—lâ)¯ÞøA„i ÷DüÈÒàÒk FsâîÕÕõ[¡ $’ ·Öƒß}•ã ö'?¸­«ª4mÖ›<×^ÑÞk–•9u5#µ¦<ýÌ]îÀVäVJ—*êeƒ¶Ù>ºþüâÞKôW%,0ªCÖ[þFI,à„]Cf2ÃîŨDx€ F#Þ"÷>ÿó©¯*4Ž]éÈ­ íõé@~É#©UéÏvm£žëNvI‡ŽxÏ…«C¶}c4fqàµQN’»X­õby­4¿¯Ô¨ixo÷£Ù!M[µhßäEvÙ+³É–˜%ÔJhíÑBZŒó³_¢ ×*ro÷`}¤¿-áN¨Ž‘[…£™ ä$§È•øêEæ«t ,§'OBUï˼y4jŠç*}‚s¿ó£E |xÚë¢Ú3¥G÷»`¶lž¯i¯i'ã™fþ†=8l$(sŽ‹³¸øûK»Ê®æÞã‘ì;Tr²[ñ!e´N…üÑŽç8ÍG CýŇg¦~0Pæó•X±Ñquuõ7èCwøVë¦Õxf9§­Ø|œ©—-Ÿ„/³øƒV£È\ƒ6Mä÷ƒ“Y ƒ±4¼jùÎmEÁB²Üý%œÁœUìPß3˰ fMe-ââ¹@º7¯»æPÃ/ hxh:=¾I™Ñq]OPî÷ÒàéK=ʶ˜Ø¨0Y3•̉µ7ÄéœNàk£±nÝaìmEí“0Å ¼Äð1¯ayCX’–\‰‡ÝDñG…ÚÝ+ÁÅd_Øâz‘sÅj–ÙÈqåøåUåO\|¦~WÜo%¢¢@f¶ÏSÅè? 
;õÈÓüÉ.ðÒ­~·£hµµ wg†ßþ¨Qèý9¾i“]7˜¼áìR$i§ªïúNSÝ1ã»ÂB´3ß)îíÛý;‡§ó±øJUÖr%v•Z”«îÑãÍŸhÿLoI_7À¤ä¨¡¥@É(eŽ&–ï˜Ôæ=õ(ÆÃ“íC£\±ªŒ<ö,7<öæ¸#‘¼Ú:˜ßÀ¿×¥’t¨e}f‡ËåŽ'ÆÙJ‘ÌVpMaÜÑÐ÷$_åeY>¾CÉÐW×îAWÍ­ÊGB¬’uI«o7V/ÚbÚmß­V,ç7/šô2œÒú»-Úì„ÎÎí%\ØR §J˜b†Ü O‹ÓZ3@fas N%â®#»tÌý©Œ ÚRvÀÝžìQåÓ}sÞƒ#ªOööÙܯ¶_iÇ  (Ø¡ ž rvVúrÈá½¼DgüN„d`ŸIÓc4®ì9’ʦQS _õÀ]Š;b5»T£Ú™j9]+Þ«3f9‘5¼ÒöUá¨=—è¦];Ê¡ô%b]’,óƒÍ[µ#!*[f·ü]Ã&û&K {©!]ʈ˜ï‰Ï¦Ý– ™3›…Œ0OÜG¿˜H…?yιóé¦!ÞoÝ[ÒJ¸ù}Ásý1_¼Y ‚Âô]Ö»²C[“Ó×cßfß@®,Nâ)+µ“*P=dðÍÕùñ)–úsUÓW:C/šDŒ¶ïº;![´â=ÐG¶G Ôj‘ØM3/œµŒi=ïkXc65^3Ò’¤‘ Âh%9‹?œãr…±É.@mAéÂd.%kÎÐj=³¥¶¨åt³ïOT©…_¾Ë²Œˆ•)‹¨ŸÑ'Óµ8¼ÏþÅ@ýQít»ÿc<+wiÊfŒJ-O%Š‚Þj›Êç ôô¡/º%ÒŸ¢"uÐ5h‹å7är»D–Xü©Ê^]¶X«ÚOìqj:ˆÅ¶k³ÊG¦Ó]v² y¶Švý,Žöº§E^–s½“/¹jàO;u®Uå+PˆaýzÃ'ø¶C{ò ¡Û§˜¾õš¥¼à;ƒÙ›víw¯à6 4çÏûI3°åĶ_Ò+Ê JÅ<ûˆ{m@2òÝ-٘ɦ¢ŸåWØÐyËë'ŽIïTFN‘¿s ×õ—§Nê—|:³›QØúWMÌ ¹E€­YúØêkk²”Ù³ÁmT6eô÷k®|ŒÌ‚¾²ð •$Zr™åÌO«ã.†û^ö%έ‰OŸº¢?¢]½k˜‹Ÿüñ" Â95´UJü °"MéAø¾÷;×Õ÷k‚#éLëƒìó4Οçв}ŸHÝ_Í~fDuaüšVMÛ×B†÷L7z±_ÍØàüð†é§bázGqÐ>$‘­Ûaó s‚‘±@Kå¢nËž<ÃåõØ!ðÙÝE<IâŽßÑ‹¸Ë6»å†˜Vii¬ Ã*ˆkäXƒIa`tfÒ:t—~šá-ýJÞäý¥ŸQ̬‹7ÂjÊÚíõS‚ëð"úû9oŒ0.* ì"Ò—cž×z“Ã1( á5ib(Ö\š Y­X jÁ™®Ýs<Ÿ*?­Ë‹¦ïq‚q;Ö’Ödå+¬G7}<1¡µ-nbc<ú-ú*þ—µ8}_Ñø#Í„æ m©s³Æ{îUG”œò¡fÜ#öá7Pà›ëÀ‰¾ãÓæ'õ0gkšgSuÓz*MäÞ»?ny‡á¹ð)Zfx8ÃTÞÅ!: ïïcÌàÈ帥í>>9K¦ßÙ+B/~Kñ57ÈÖúù;±U]‰›ù¤±º0®tQy®÷øœœ)‰ÍÁêä›6/w>'¨ŠÅîKJc ¨ò\¯]¼IÔ—°i¶ ÷=qáô"Ï5rŸß‰*µp¼öNŠ¡°˜ ?ªtH j~2“P¨5Ç+—·Nó-×_ÔÈñþye·X‹^V¯õ™Å <øžUeRÁì ¿Šç§„¢”‡Ƹ×9üØá _Ûhw¤ì¹rÆw?v]ê± 8ýH'ÈÊlqLƒGæ¸b/ÆUyħYþ*I‘ endstream endobj 141 0 obj << /Length1 1959 /Length2 15165 /Length3 0 /Length 16368 /Filter /FlateDecode >> stream xÚöTœÛ² ãîî4îîîîîNãNã X–àNpwwwwOp×Ç>çܳ÷½ÿ?Æ{£ÇèþfÕ¬Z³ÖªZýQ‘©j0‰Y:›¥@Ll̬ü %M96V++3++;•¦-Èø?v*m ›»­³ÿ?n@3ЇMÒ ôATrvÈ{8Ø8lÜül<ü¬¬vVV¾ÿ!:»ñ$ÍI[7 ÅǾû°üçp휽œüþYÙ:YZýU†¥‡ ‹–“­«PNò?œÂß6k ÀÅÊÊÊËÎ º€Þ6,- éãü—“í/óG ~.Î.«2€¶VÀ?w3O äæ ðû§ã#66€¥­`´¶uBø;û‡hõoüqþn¶ÞÖöc°þõùï“ÑG‡Y:;9øüMÿ׳hÈêê(J0ü§äÿ:ÅŽ~L\&v.V;'€çã!àçQ5³ýŽÄÊ9Y9øþ-÷cŸþG²çz€ö?Bøß¹”?: ý»Ñ Y¹X->¾Øþ?·û¿Bþÿuù_Yþ_ýÿ*’öppø—Ÿöß„ÿ¿™£­ƒÏëú˜%çYpú¿Tà¿GW hiëáø½r ³is²þèh&6NfVÎÛmÝ¥m½–ª¶ ›wÍ¿íZÍ›ƒ­PÕÙÝö¯æ#Š•õÿø>†ÌÂþãqÿh͹€3ô¿×•r²p¶ükØØ¹¸fnnf>gý¸~lSi ôþW3X˜œA!€VÎn,7€Eì/Ó¿€EâoÄ 
`‘ü/âa°Èü8,r#N‹ü߈À¢ø7úÈ©ô_ÄË`Qÿ}dÑø}dÑü/âûˆ3û}h1ÿñX,þ‹þÚ7ËÀ¥ÀÿÂmaù÷ÉýMøaõ7ü ÙþÃû!ÊúðC•Í!çÇ~Ùø¸Ø|\3>lÿŒÿnÿø¡ÜáðCºãßðc`Yþ‘êãŠ`qþ{±îÇŸÈ?ÜR]þ?¤¹ý~èpÿü(üï¢?ˆäåü÷‡LÀ™žÿØ“º÷?à‡.ŸÀ¾Ëüõºý;÷ÿêN 7·«ú_×ÈGëþþ×ÿè ´@X^p¶³« k{¨#ôbÚŸ`çD¸Š¾ÓÙ“Â÷~LëeçÚX)e·ÝN#<9AvQ¸É<þ~ÔÃM `:˜¥Úë‘Â{þFrÓö‰ÖÜ}LC„B KnoêJöj/FNݺIvp ²?RbëùżØOosšcúNÉâ–èådÍ_.šÒž²2L!ixľ%xééßa5Ì?Ñü«‡§Œ5Ý“³fÓ G30T5®ýÙ¦-àG¿„ºÅÚ2Š·ï©‡‚ýˆµ[t0Y™Ukå·õ¢Ê"R‘¡F_>“œR„hçùsOžÉðÙæ`§Ž=³´Ñ°4a$°œ‚UŸáöZ>e;/°]ôéŒ~1‘‰ì[@è‰úQm†ä©Hîb©ñ”ÂT\„—k}غjÝõçN.™Ò›½Í2’Cõ_ÍuGŸ¶µš}mÞ%0z«„ˆÄó}©þmûž‘âu¡“úþ™X«zÌ*>±,`2Ç5ÆØ‚ñ³hy²_dí(wOàù¦H´¼FgMi_­ç-eRím-9åeÊW„ÜtÒŸÚËjñ"§`ëGpì`WeåÜ¥ùàÅ£¦5y5m`;$»y)|·û½ÕãJBŰVÌæW³Ce»A­È2$³H§%¿÷>åÌ‘+æb¬<ý,M˜É¥*×Òw]¨e¯þ¡‡Äè“á$€AÃÂÀgÕ§¥œ8ž×õ$ ×ùêsª õ¶ìô ÞF÷MK€ €¹ßŒ”Gwp† Â/V)Í÷Pv±Ss5ùÖ®0Y$·ï™x;Æa^ѦâWá³'8 »‹EÝqÈ(¦Úð€ª~{ñ[ÙĢ´½;ux$a…­æMn!ø5qe3PÙ¦"šŠõÜt%B²b‹!"ÖõìrÅ I žåòN·>»8S ¬ ÉK߲ʖã¼GûKÚ9%-×<Š?s’úýY¬é’ŒA¶÷䑟;uà ž+½Ðbà UâëOT8ß„…°Þ  >GQm&:MêІbqyÈálRÏlXæÉÁç[éTµÊŒR ;N8ØIo!̯ŽK~@k£@ÔOk«Ü–üªG>¢N2}=¶ÙÒu—#¯Ý¥:DO%Ÿ¶9k|¯ÕæáÁøeC‘LA˜«99B‚R”'±ÎškQWåÈ] P²0í¹J‰Ïý”&SRiÈb3íäf´à]ËGže‹k?ÕOùøs¯y°óβdJ{ÛèWCIž^±ð}Ñ×Ή ˆQS~kB5ëôüC3XòVÊy3E¸$ý%å`i[&ÆŽa ˜O,_ Íχü[–š˜× Ëåt¡m6ãBjKzX77¬ˆF0š{ã-øgxôX†k£?ý#ž8åá5_7O—úºß´…Œ´PpyÇ&ö¶c°«WœäÐÁ\‡Ä×8ÍvK»ò¡o KÍÎYd_K‹è|˜1á¦O~¾Ï[+;hó_ž„É"®=Ñæc=‘NÛQ2{ŒÇ”XÅžy°tÛCïÒËñíáH—$0&'¥}^/õI()­\}r/ w— Ãýâ$á"‚U zx˜{qµ(È-Ò.øÞx°1اº:*Gó¦.e³Ö2ùLBÔ îDær–`ÛÚùÕÁU–&?A8‚ýé%¥®fELeÒ!1Œ½ P™O‘øƒdÐJ8šîÞE꜋Öò, ºz>6ÛS žJåXFkçcQZ6²úÓ £»ì`u•Çô$=á¿èÕ б]á…« $£V±°¿ñ4°Ó÷[KNXŒSg-µÅ‘à ̆Ñóå}k›d“Xã<²ö°ƒ«§4W¿²è™ÂÒXÓqa”óV…Ø þÔKr“i'KnöðUÏ»õÚ€[dÕ™Õ[û–þÚtÆŸüN)iÄ‹˜[þSý°AêFõ¦îRǦ®li¼Ý×H [n)§Ówµöè@qŸ\BªÊ+@ü ³¿½àøë/j—kÏÏœjüƒ ß½Úoœ$æ=;ï@Î`?jCøSÍj‘}BH añõðxÏÚîû5Ý““•8¸–£]˜¡mñ=.…BN#"0’´—Hñ{BÂ]AÇÓŽQÁà˾9Ê\/‘ðùH(3׌£”ì,äÀP±®­‚Š~rçRÑ©S˜¯ó.Û•d‡rÃ&:žcgÔ»ÂJ©1I"»§(4νK~ª¸ª7,RçÀÆ~²&Ó[ˆ=åkUŠT«™HfÑž·¬Ýþ·ÞÃð&~¸]ù¨ƒO*wy²F$wë’%v[„iCß\èeÔ®ê=B'¼º3AͯBײ/Öa_¡ÞÙ2Øh R–˜hZ)ž.5x¼BÝXøò‰œ×Íú¼TÓ ”ç°W¯ý9¹cZzä©Àí7B»ý4~âVQD­–¼ÈjБ1-¦dôÃj^÷@¤ÁÛˆä‹móCÚÁ…“–ðÓù§Ü5ª‘¦WÚé.M‘xüдqf,d36E(æ:¯†1[³ðϲeoÇ÷1L¦ÖB¸°¿¬Öé4k¤dƒž#prRX¡Î"Ò„Öˆ³\iÝî|¥ñqu(dÁ7S2zn½!” 9€óŸ×Š{]ýXí¼‰6ü§vĶð™’ûD¼öDÌg*{PýüÑÑã‘JjYÊÖÒhÒhÙUá·îg«ç 
xÚmTÁn£0½óÞC¥öÆ6’*ŠdrضjªÕ^pºH‰A@ýûõŒCf»êô¿™yã'æîÇën¦êö`gñ#govh/}egæç¾‹îîò¶ºœ­Ÿ­­m=Oìµo«Ù½Ùæ[׌ž¼uÕéRÛ‰õ=IÛÆú°ûwû{VÇQðÙáÒœÆÆÍ8ß›ñäIßž3d_ƒ “~Ù~hZ÷ÄÄ#çÜ W›ö c Ñü*…Í'qÇÆÕýU;€ºHHV7ÕxýÂwuö÷É»Ïa´ç­;¶ÑzÍæoþpûOÔøÍ_úÚöû`÷_¥ù£Ý¥ëNd0m6¬¶G_ÑÏÿ¼?[6ÿvÆçý³³Lâ·ºª¶¶C·¯l¿w6Zs¾aë²ÜDÖÕÿ%!ãpœ¨™§ò%¼b•l¢µËÜc€Ã¤ ¥¤ÀÈ ¤ÀPÀP«[ ßuªŸñ©_õgß_•ñxû4Ž$Oˇú<X^\NB8 ë\;c®‚šBbMx¹ ùy˜%ÆPÈ 3jok:E q:¹Œ/d4ˆ8ð€Q§4ÈA2="\¤ÂšYˆ+ÀË‹ÔÏsä(Äè5$ YŒ—Èú rŠÀ‘€ƒ~ì+A¿\HÈ•ÐWr߯{ÇNøxoËøŠ‡û• ¿”$¿TL~©ù¥òK¥ä—ÊÈ/¥È/¥ƒ†p˜1ðsòQä£*ÉGÍÉG-ÈG-ÉG“zA>ê„|Ô)ù¨3òQ/ÉG½"µ"µ&µ!uN>ê‚|Ô%ùh8ùhùh$ùhbòÅ,n~á†üá°nË£ºô½ß+¸´p]À¢hœ½íµ®í \ˆÓ†¯—2ú ¯M„Ç endstream endobj 166 0 obj << /Producer (pdfTeX-1.40.26) /Creator (TeX) /CreationDate (D:20240626083250-04'00') /ModDate (D:20240626083250-04'00') /Trapped /False /PTEX.Fullbanner (This is pdfTeX, Version 3.141592653-2.6-1.40.26 (TeX Live 2024) kpathsea version 6.4.0) >> endobj 163 0 obj << /Type /ObjStm /N 6 /First 42 /Length 256 /Filter /FlateDecode >> stream xÚ‘KK1…÷ùwï"ÏIS(]8âB-ˆÆŠˆ‹Ø í€4%Iþ{ï䪸P ®ÎÜÇùÎ £-¤éÀHPÚ µi Ñ¡âÄj¶X0îßøeÚWÆï¯µ•SS2~Jœ&À×Oþjµ>ëWÞKÁÚúE,›<jʘd0òûc.µß… Ú0~> ©Ðó8uW`.hÕ§‡ý¸ICÄkTk-—?º ÛXïÓã-Ã:Güš^Ðì×ãPà¹Aê&Š*eIæMt›½ü®‰`ˆÞ‘X¢[šÍNÒõ_ôíºS Ù}9ñ×¶»HœøÅÙ‡ÞÒ–â;w>çm‡² endstream endobj 167 0 obj << /Type /XRef /Index [0 168] /Size 168 /W [1 3 1] /Root 165 0 R /Info 166 0 R /ID [<5AEB7C730C87B47BA59BDF8BD446B9D5> <5AEB7C730C87B47BA59BDF8BD446B9D5>] /Length 477 /Filter /FlateDecode >> stream xÚ%ÒËoÌQÆñ÷9ç׆iK«£Æ C«EÕ­.­Þ´tÜoíPê’Ø°‰0B"º±Bì,$MØÙ¸l4]º”e%Ò…ÿÀF2"¨:ÏcóÉ÷=çü.s13›uf ƒY…±N“arŠ#ƒä(¢Æ7<2¦+<¼inJc’̇µŽë­ÇjRëþ¦#¤’, a—†´Q»~SµŒ,‚]¹¯±†¤ÈbRK–4ìîÕððÑ?:·”dÈ ²öàƒ6²¤öØk 1 which are not possible. } \references{ J.H. Friedman and B.E. Popescu (2005). 
\dQuote{Predictive Learning via Rule Ensembles.} Section 8.1 } \seealso{ \code{\link{gbm}}, \code{\link{gbm.object}} } \author{ Greg Ridgeway \email{gregridgeway@gmail.com} } \keyword{methods} gbm/man/gbmCrossVal.Rd0000644000176200001440000000633514547111627014322 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/gbmCrossVal.R \name{gbmCrossVal} \alias{gbmCrossVal} \alias{gbmCrossValModelBuild} \alias{gbmDoFold} \alias{gbmCrossValErr} \alias{gbmCrossValPredictions} \title{Cross-validate a gbm} \usage{ gbmCrossVal( cv.folds, nTrain, n.cores, class.stratify.cv, data, x, y, offset, distribution, w, var.monotone, n.trees, interaction.depth, n.minobsinnode, shrinkage, bag.fraction, var.names, response.name, group ) gbmCrossValErr(cv.models, cv.folds, cv.group, nTrain, n.trees) gbmCrossValPredictions( cv.models, cv.folds, cv.group, best.iter.cv, distribution, data, y ) gbmCrossValModelBuild( cv.folds, cv.group, n.cores, i.train, x, y, offset, distribution, w, var.monotone, n.trees, interaction.depth, n.minobsinnode, shrinkage, bag.fraction, var.names, response.name, group ) gbmDoFold( X, i.train, x, y, offset, distribution, w, var.monotone, n.trees, interaction.depth, n.minobsinnode, shrinkage, bag.fraction, cv.group, var.names, response.name, group, s ) } \arguments{ \item{cv.folds}{The number of cross-validation folds.} \item{nTrain}{The number of training samples.} \item{n.cores}{The number of cores to use.} \item{class.stratify.cv}{Whether or not stratified cross-validation samples are used.} \item{data}{The data.} \item{x}{The model matrix.} \item{y}{The response variable.} \item{offset}{The offset.} \item{distribution}{The type of loss function. See \code{\link{gbm}}.} \item{w}{Observation weights.} \item{var.monotone}{See \code{\link{gbm}}.} \item{n.trees}{The number of trees to fit.} \item{interaction.depth}{The degree of allowed interactions. 
See \code{\link{gbm}}.} \item{n.minobsinnode}{See \code{\link{gbm}}.} \item{shrinkage}{See \code{\link{gbm}}.} \item{bag.fraction}{See \code{\link{gbm}}.} \item{var.names}{See \code{\link{gbm}}.} \item{response.name}{See \code{\link{gbm}}.} \item{group}{Used when \code{distribution = "pairwise"}. See \code{\link{gbm}}.} \item{cv.models}{A list containing the models for each fold.} \item{cv.group}{A vector indicating the cross-validation fold for each member of the training set.} \item{best.iter.cv}{The iteration with lowest cross-validation error.} \item{i.train}{Items in the training set.} \item{X}{Index (cross-validation fold) on which to subset.} \item{s}{Random seed.} } \value{ A list containing the cross-validation error and predictions. } \description{ Functions for cross-validating gbm. These functions are used internally and are not intended for end-user direct usage. } \details{ These functions are not intended for end-user direct usage, but are used internally by \code{gbm}. } \references{ J.H. Friedman (2001). "Greedy Function Approximation: A Gradient Boosting Machine," Annals of Statistics 29(5):1189-1232. L. Breiman (2001). \url{https://www.stat.berkeley.edu/users/breiman/randomforest2001.pdf}. } \seealso{ \code{\link{gbm}} } \author{ Greg Ridgeway \email{gregridgeway@gmail.com} } \keyword{models} gbm/man/pretty.gbm.tree.Rd0000644000176200001440000000336314547111627015127 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/pretty.gbm.tree.R \name{pretty.gbm.tree} \alias{pretty.gbm.tree} \title{Print gbm tree components} \usage{ \method{pretty}{gbm.tree}(object, i.tree = 1) } \arguments{ \item{object}{a \code{\link{gbm.object}} initially fit using \code{\link{gbm}}} \item{i.tree}{the index of the tree component to extract from \code{object} and display} } \value{ \code{pretty.gbm.tree} returns a data frame. Each row corresponds to a node in the tree. 
Columns indicate \item{SplitVar}{index of which variable is used to split. -1 indicates a terminal node.} \item{SplitCodePred}{if the split variable is continuous then this component is the split point. If the split variable is categorical then this component contains the index of \code{object$c.split} that describes the categorical split. If the node is a terminal node then this is the prediction.} \item{LeftNode}{the index of the row corresponding to the left node.} \item{RightNode}{the index of the row corresponding to the right node.} \item{ErrorReduction}{the reduction in the loss function as a result of splitting this node.} \item{Weight}{the total weight of observations in the node. If weights are all equal to 1 then this is the number of observations in the node.} } \description{ \code{gbm} stores the collection of trees used to construct the model in a compact matrix structure. This function extracts the information from a single tree and displays it in a slightly more readable form. This function is mostly for debugging purposes and to satisfy some users' curiosity. 
} \seealso{ \code{\link{gbm}}, \code{\link{gbm.object}} } \author{ Greg Ridgeway \email{gregridgeway@gmail.com} } \keyword{print} gbm/man/gbm.roc.area.Rd0000644000176200001440000000356014547111627014333 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/ir.measures.R \name{gbm.roc.area} \alias{gbm.roc.area} \alias{gbm.conc} \alias{ir.measure.conc} \alias{ir.measure.auc} \alias{ir.measure.mrr} \alias{ir.measure.map} \alias{ir.measure.ndcg} \alias{perf.pairwise} \title{Compute Information Retrieval measures.} \usage{ gbm.roc.area(obs, pred) gbm.conc(x) ir.measure.conc(y.f, max.rank = 0) ir.measure.auc(y.f, max.rank = 0) ir.measure.mrr(y.f, max.rank) ir.measure.map(y.f, max.rank = 0) ir.measure.ndcg(y.f, max.rank) perf.pairwise(y, f, group, metric = "ndcg", w = NULL, max.rank = 0) } \arguments{ \item{obs}{Observed value.} \item{pred}{Predicted value.} \item{x}{Numeric vector.} \item{y, y.f, f, w, group, max.rank}{Used internally.} \item{metric}{What type of performance measure to compute.} } \value{ The requested performance measure. } \description{ Functions to compute Information Retrieval measures for pairwise loss for a single group. The function returns the respective metric, or a negative value if it is undefined for the given group. } \details{ For simplicity, we have no special handling for ties; instead, we break ties randomly. This is slightly inaccurate for individual groups, but should have only a small effect on the overall measure. \code{gbm.conc} computes the concordance index: Fraction of all pairs (i,j) with i<j, x[i] != x[j], such that x[j] < x[i]. \item{cv.folds}{Number of cross-validation folds to perform. If \code{cv.folds}>1 then \code{gbm}, in addition to the usual fit, will perform a cross-validation, calculate an estimate of generalization error returned in \code{cv.error}.} \item{keep.data}{a logical variable indicating whether to keep the data and an index of the data stored with the object.
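The pairwise concordance computation described above can be illustrated in a few lines of base R. This is a simplified sketch, not the package's internal implementation, and the function name \code{conc_index} is ours:

```r
# Fraction of all pairs (i, j) with i < j and x[i] != x[j]
# such that x[j] < x[i] -- the concordance index described above.
conc_index <- function(x) {
  n <- length(x)
  concordant <- 0
  total <- 0
  for (i in seq_len(n - 1)) {
    for (j in (i + 1):n) {
      if (x[i] != x[j]) {
        total <- total + 1
        if (x[j] < x[i]) concordant <- concordant + 1
      }
    }
  }
  concordant / total
}

# For binary outcomes, sorting the observations by decreasing prediction
# and taking the concordance of the sorted labels yields the ROC area:
obs  <- c(1, 0, 1, 0, 0)
pred <- c(0.9, 0.8, 0.4, 0.3, 0.2)
conc_index(obs[order(-pred)])  # 5/6: five of six label-discordant pairs agree
```

The quadratic double loop is for clarity only; the package's C++ routine is far faster.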
Keeping the data and index makes subsequent calls to \code{\link{gbm.more}} faster at the cost of storing an extra copy of the dataset.} \item{verbose}{Logical indicating whether or not to print out progress and performance indicators (\code{TRUE}). If this option is left unspecified for \code{gbm.more}, then it uses \code{verbose} from \code{object}. Default is \code{FALSE}.} \item{class.stratify.cv}{Logical indicating whether or not the cross-validation should be stratified by class. Defaults to \code{TRUE} for \code{distribution = "multinomial"} and is only implemented for \code{"multinomial"} and \code{"bernoulli"}. The purpose of stratifying the cross-validation is to help avoid situations in which training sets do not contain all classes.} \item{n.cores}{The number of CPU cores to use. The cross-validation loop will attempt to send different CV folds off to different cores. If \code{n.cores} is not specified by the user, it is guessed using the \code{detectCores} function in the \code{parallel} package. Note that the documentation for \code{detectCores} makes clear that it is not failsafe and could return a spurious number of available cores.} } \value{ A \code{\link{gbm.object}} object. } \description{ Fits generalized boosted regression models. For technical details, see the vignette: \code{utils::browseVignettes("gbm")}. } \details{ \code{gbm.fit} provides the link between R and the C++ gbm engine. \code{gbm} is a front-end to \code{gbm.fit} that uses the familiar R modeling formulas. However, \code{\link[stats]{model.frame}} is very slow if there are many predictor variables. For power-users with many variables use \code{gbm.fit}. For general practice \code{gbm} is preferable. This package implements the generalized boosted modeling framework. Boosting is the process of iteratively adding basis functions in a greedy fashion so that each additional basis function further reduces the selected loss function.
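The greedy, stagewise fitting just described can be sketched in base R for squared-error loss with single-split "stump" base learners. This is a toy illustration of the principle only, not the package's C++ implementation; the function name \code{boost_stumps} and its defaults are ours:

```r
# Boosting sketch: repeatedly fit a one-split stump to the current
# residuals (the negative gradient of squared-error loss) and add a
# shrunken copy of it to the ensemble prediction.
boost_stumps <- function(x, y, n.trees = 50, shrinkage = 0.1) {
  f <- rep(mean(y), length(y))        # initial fit: the "intercept"
  for (m in seq_len(n.trees)) {
    r <- y - f                        # residuals = negative gradient
    cuts <- sort(unique(x))
    best <- NULL; best.sse <- Inf
    for (cpt in cuts[-length(cuts)]) {  # exhaustive single-split search
      left <- x <= cpt
      pred <- ifelse(left, mean(r[left]), mean(r[!left]))
      sse  <- sum((r - pred)^2)
      if (sse < best.sse) { best.sse <- sse; best <- pred }
    }
    f <- f + shrinkage * best         # shrunken stagewise update
  }
  f
}

# Each added stump reduces the loss; a step function is recovered exactly.
x <- seq(0, 1, length.out = 40)
y <- as.numeric(x > 0.5)
f <- boost_stumps(x, y, n.trees = 100, shrinkage = 0.5)
mean((y - f)^2)  # essentially zero
```

The shrinkage parameter plays the same role as \code{shrinkage} in \code{gbm}: smaller values need more trees but usually generalize better.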
This implementation closely follows Friedman's Gradient Boosting Machine (Friedman, 2001). In addition to many of the features documented in the Gradient Boosting Machine, \code{gbm} offers additional features including the out-of-bag estimator for the optimal number of iterations, the ability to store and manipulate the resulting \code{gbm} object, and a variety of other loss functions that had not previously had associated boosting algorithms, including the Cox partial likelihood for censored data, the poisson likelihood for count outcomes, and a gradient boosting implementation to minimize the AdaBoost exponential loss function. This gbm package is no longer under further development. Consider https://github.com/gbm-developers/gbm3 for the latest version. } \examples{ # # A least squares regression example # # Simulate data set.seed(101) # for reproducibility N <- 1000 X1 <- runif(N) X2 <- 2 * runif(N) X3 <- ordered(sample(letters[1:4], N, replace = TRUE), levels = letters[4:1]) X4 <- factor(sample(letters[1:6], N, replace = TRUE)) X5 <- factor(sample(letters[1:3], N, replace = TRUE)) X6 <- 3 * runif(N) mu <- c(-1, 0, 1, 2)[as.numeric(X3)] SNR <- 10 # signal-to-noise ratio Y <- X1 ^ 1.5 + 2 * (X2 ^ 0.5) + mu sigma <- sqrt(var(Y) / SNR) Y <- Y + rnorm(N, 0, sigma) X1[sample(1:N,size=500)] <- NA # introduce some missing values X4[sample(1:N,size=300)] <- NA # introduce some missing values data <- data.frame(Y, X1, X2, X3, X4, X5, X6) # Fit a GBM set.seed(102) # for reproducibility gbm1 <- gbm(Y ~ ., data = data, var.monotone = c(0, 0, 0, 0, 0, 0), distribution = "gaussian", n.trees = 100, shrinkage = 0.1, interaction.depth = 3, bag.fraction = 0.5, train.fraction = 0.5, n.minobsinnode = 10, cv.folds = 5, keep.data = TRUE, verbose = FALSE, n.cores = 1) # Check performance using the out-of-bag (OOB) error; the OOB error typically # underestimates the optimal number of iterations best.iter <- gbm.perf(gbm1, method = "OOB") print(best.iter) # Check performance using 
the 50\% heldout test set best.iter <- gbm.perf(gbm1, method = "test") print(best.iter) # Check performance using 5-fold cross-validation best.iter <- gbm.perf(gbm1, method = "cv") print(best.iter) # Plot relative influence of each variable par(mfrow = c(1, 2)) summary(gbm1, n.trees = 1) # using first tree summary(gbm1, n.trees = best.iter) # using estimated best number of trees # Compactly print the first and last trees for curiosity print(pretty.gbm.tree(gbm1, i.tree = 1)) print(pretty.gbm.tree(gbm1, i.tree = gbm1$n.trees)) # Simulate new data set.seed(103) # for reproducibility N <- 1000 X1 <- runif(N) X2 <- 2 * runif(N) X3 <- ordered(sample(letters[1:4], N, replace = TRUE)) X4 <- factor(sample(letters[1:6], N, replace = TRUE)) X5 <- factor(sample(letters[1:3], N, replace = TRUE)) X6 <- 3 * runif(N) mu <- c(-1, 0, 1, 2)[as.numeric(X3)] Y <- X1 ^ 1.5 + 2 * (X2 ^ 0.5) + mu + rnorm(N, 0, sigma) data2 <- data.frame(Y, X1, X2, X3, X4, X5, X6) # Predict on the new data using the "best" number of trees; by default, # predictions will be on the link scale Yhat <- predict(gbm1, newdata = data2, n.trees = best.iter, type = "link") # least squares error print(sum((data2$Y - Yhat)^2)) # Construct univariate partial dependence plots plot(gbm1, i.var = 1, n.trees = best.iter) plot(gbm1, i.var = 2, n.trees = best.iter) plot(gbm1, i.var = "X3", n.trees = best.iter) # can use index or name # Construct bivariate partial dependence plots plot(gbm1, i.var = 1:2, n.trees = best.iter) plot(gbm1, i.var = c("X2", "X3"), n.trees = best.iter) plot(gbm1, i.var = 3:4, n.trees = best.iter) # Construct trivariate partial dependence plots plot(gbm1, i.var = c(1, 2, 6), n.trees = best.iter, continuous.resolution = 20) plot(gbm1, i.var = 1:3, n.trees = best.iter) plot(gbm1, i.var = 2:4, n.trees = best.iter) plot(gbm1, i.var = 3:5, n.trees = best.iter) # Add more (i.e., 100) boosting iterations to the ensemble gbm2 <- gbm.more(gbm1, n.new.trees = 100, verbose = FALSE) } \references{ Y. 
Freund and R.E. Schapire (1997) \dQuote{A decision-theoretic generalization of on-line learning and an application to boosting,} \emph{Journal of Computer and System Sciences,} 55(1):119-139. G. Ridgeway (1999). \dQuote{The state of boosting,} \emph{Computing Science and Statistics} 31:172-181. J.H. Friedman, T. Hastie, R. Tibshirani (2000). \dQuote{Additive Logistic Regression: a Statistical View of Boosting,} \emph{Annals of Statistics} 28(2):337-374. J.H. Friedman (2001). \dQuote{Greedy Function Approximation: A Gradient Boosting Machine,} \emph{Annals of Statistics} 29(5):1189-1232. J.H. Friedman (2002). \dQuote{Stochastic Gradient Boosting,} \emph{Computational Statistics and Data Analysis} 38(4):367-378. B. Kriegler (2007). Cost-Sensitive Stochastic Gradient Boosting Within a Quantitative Regression Framework. Ph.D. Dissertation. University of California at Los Angeles, Los Angeles, CA, USA. Advisor(s) Richard A. Berk. \url{https://dl.acm.org/doi/book/10.5555/1354603}. C. Burges (2010). \dQuote{From RankNet to LambdaRank to LambdaMART: An Overview,} Microsoft Research Technical Report MSR-TR-2010-82. } \seealso{ \code{\link{gbm.object}}, \code{\link{gbm.perf}}, \code{\link{plot.gbm}}, \code{\link{predict.gbm}}, \code{\link{summary.gbm}}, and \code{\link{pretty.gbm.tree}}. 
} \author{ Greg Ridgeway \email{gregridgeway@gmail.com} Quantile regression code developed by Brian Kriegler \email{bk@stat.ucla.edu} t-distribution and multinomial code developed by Harry Southworth and Daniel Edwards Pairwise code developed by Stefan Schroedl \email{schroedl@a9.com} } gbm/man/gbm.object.Rd0000644000176200001440000000474114547111627014111 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/gbm.object.R \name{gbm.object} \alias{gbm.object} \title{Generalized Boosted Regression Model Object} \value{ \item{initF}{The "intercept" term, the initial predicted value to which trees make adjustments.} \item{fit}{A vector containing the fitted values on the scale of the regression function (e.g. log-odds scale for bernoulli, log scale for poisson).} \item{train.error}{A vector of length equal to the number of fitted trees containing the value of the loss function for each boosting iteration evaluated on the training data.} \item{valid.error}{A vector of length equal to the number of fitted trees containing the value of the loss function for each boosting iteration evaluated on the validation data.} \item{cv.error}{If \code{cv.folds} < 2 this component is \code{NULL}. Otherwise, this component is a vector of length equal to the number of fitted trees containing a cross-validated estimate of the loss function for each boosting iteration.} \item{oobag.improve}{A vector of length equal to the number of fitted trees containing an out-of-bag estimate of the marginal reduction in the expected value of the loss function. The out-of-bag estimate uses only the training data and is useful for estimating the optimal number of boosting iterations. See \code{\link{gbm.perf}}.} \item{trees}{A list containing the tree structures. The components are best viewed using \code{\link{pretty.gbm.tree}}.} \item{c.splits}{A list of all the categorical splits in the collection of trees.
If the \code{trees[[i]]} component of a \code{gbm} object describes a categorical split then the splitting value will refer to a component of \code{c.splits}. That component of \code{c.splits} will be a vector of length equal to the number of levels in the categorical split variable. -1 indicates left, +1 indicates right, and 0 indicates that the level was not present in the training data.} \item{cv.fitted}{If cross-validation was performed, the cross-validation predicted values on the scale of the linear predictor. That is, the fitted values from the i-th CV-fold, for the model having been trained on the data in all other folds.} } \description{ These are objects representing fitted \code{gbm}s. } \section{Structure}{ The following components must be included in a legitimate \code{gbm} object. } \seealso{ \code{\link{gbm}} } \author{ Greg Ridgeway \email{gregridgeway@gmail.com} } \keyword{methods} gbm/man/basehaz.gbm.Rd0000644000176200001440000000357714547111627014266 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/basehaz.gbm.R \name{basehaz.gbm} \alias{basehaz.gbm} \title{Baseline hazard function} \usage{ basehaz.gbm(t, delta, f.x, t.eval = NULL, smooth = FALSE, cumulative = TRUE) } \arguments{ \item{t}{The survival times.} \item{delta}{The censoring indicator.} \item{f.x}{The predicted values of the regression model on the log hazard scale.} \item{t.eval}{Values at which the baseline hazard will be evaluated.} \item{smooth}{If \code{TRUE} \code{basehaz.gbm} will smooth the estimated baseline hazard using Friedman's super smoother \code{\link{supsmu}}.} \item{cumulative}{If \code{TRUE} the cumulative survival function will be computed.} } \value{ A vector of length equal to the length of t (or of length \code{t.eval} if \code{t.eval} is not \code{NULL}) containing the baseline hazard evaluated at t (or at \code{t.eval} if \code{t.eval} is not \code{NULL}). 
If \code{cumulative} is set to \code{TRUE} then the returned vector evaluates the cumulative hazard function at those values. } \description{ Computes the Breslow estimator of the baseline hazard function for a proportional hazard regression model. } \details{ The proportional hazard model assumes h(t|x)=lambda(t)*exp(f(x)). \code{\link{gbm}} can estimate the f(x) component via partial likelihood. After estimating f(x), \code{basehaz.gbm} can compute a nonparametric estimate of lambda(t). } \references{ N. Breslow (1972). "Discussion of `Regression Models and Life-Tables' by D.R. Cox," Journal of the Royal Statistical Society, Series B, 34(2):216-217. N. Breslow (1974). "Covariance analysis of censored survival data," Biometrics 30:89-99. } \seealso{ \code{\link[survival]{survfit}}, \code{\link{gbm}} } \author{ Greg Ridgeway \email{gregridgeway@gmail.com} } \keyword{methods} \keyword{survival} gbm/man/quantile.rug.Rd0000644000176200001440000000147414547111627014515 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/calibrate.plot.R \name{quantile.rug} \alias{quantile.rug} \title{Quantile rug plot} \usage{ \method{quantile}{rug}(x, prob = 0:10/10, ...) } \arguments{ \item{x}{A numeric vector.} \item{prob}{The quantiles of x to mark on the x-axis.} \item{...}{Additional optional arguments to be passed on to \code{\link[graphics]{rug}}} } \value{ No return values. } \description{ Marks the quantiles on the axes of the current plot. } \examples{ x <- rnorm(100) y <- rnorm(100) plot(x, y) quantile.rug(x) } \seealso{ \code{\link[graphics:plot.default]{plot}}, \code{\link[stats]{quantile}}, \code{\link[base]{jitter}}, \code{\link[graphics]{rug}}. } \author{ Greg Ridgeway \email{gregridgeway@gmail.com}.
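The Breslow estimator described above admits a compact base-R sketch: at each event time the cumulative baseline hazard increases by 1 divided by the total risk score, sum(exp(f(x))), of the subjects still at risk. This is a bare-bones illustration (no smoothing, no \code{t.eval} grid, and it assumes no tied event times); the function name \code{breslow_cumhaz} is ours:

```r
# Breslow estimator of the cumulative baseline hazard H0(t), given
# survival times t, event indicator delta, and log-hazard predictions f.x.
breslow_cumhaz <- function(t, delta, f.x) {
  ord   <- order(t)
  t     <- t[ord]; delta <- delta[ord]
  risk  <- exp(f.x[ord])
  # reverse cumulative sum: total risk score of subjects with t[j] >= t[i]
  at.risk <- rev(cumsum(rev(risk)))
  # increments occur only at event times (delta == 1)
  data.frame(time = t, cumhaz = cumsum(delta / at.risk))
}

# With f.x = 0 (no covariate effect) this reduces to the Nelson-Aalen
# estimator: increments 1/4, 1/3, 1/2, 1/1 over four uncensored times.
fit <- breslow_cumhaz(t = c(2, 5, 3, 7), delta = c(1, 1, 1, 1), f.x = rep(0, 4))
```

In practice \code{f.x} would be the predictions of a \code{coxph}-style or boosted model on the log-hazard scale, as in the documentation above.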
} \keyword{aplot} gbm/man/gbm.more.Rd0000644000176200001440000001204414547111627013600 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/gbm.more.R \name{gbm.more} \alias{gbm.more} \title{Generalized Boosted Regression Modeling (GBM)} \usage{ gbm.more( object, n.new.trees = 100, data = NULL, weights = NULL, offset = NULL, verbose = NULL ) } \arguments{ \item{object}{A \code{\link{gbm.object}} object created from an initial call to \code{\link{gbm}}.} \item{n.new.trees}{Integer specifying the number of additional trees to add to \code{object}. Default is 100.} \item{data}{An optional data frame containing the variables in the model. By default the variables are taken from \code{environment(formula)}, typically the environment from which \code{gbm} is called. If \code{keep.data=TRUE} in the initial call to \code{gbm} then \code{gbm} stores a copy with the object. If \code{keep.data=FALSE} then subsequent calls to \code{\link{gbm.more}} must resupply the same dataset. It becomes the user's responsibility to resupply the same data at this point.} \item{weights}{An optional vector of weights to be used in the fitting process. Must be positive but do not need to be normalized. If \code{keep.data=FALSE} in the initial call to \code{gbm} then it is the user's responsibility to resupply the weights to \code{\link{gbm.more}}.} \item{offset}{A vector of offset values.} \item{verbose}{Logical indicating whether or not to print out progress and performance indicators (\code{TRUE}). If this option is left unspecified for \code{gbm.more}, then it uses \code{verbose} from \code{object}. Default is \code{FALSE}.} } \value{ A \code{\link{gbm.object}} object. } \description{ Adds additional trees to a \code{\link{gbm.object}} object. 
} \examples{ # # A least squares regression example # # Simulate data set.seed(101) # for reproducibility N <- 1000 X1 <- runif(N) X2 <- 2 * runif(N) X3 <- ordered(sample(letters[1:4], N, replace = TRUE), levels = letters[4:1]) X4 <- factor(sample(letters[1:6], N, replace = TRUE)) X5 <- factor(sample(letters[1:3], N, replace = TRUE)) X6 <- 3 * runif(N) mu <- c(-1, 0, 1, 2)[as.numeric(X3)] SNR <- 10 # signal-to-noise ratio Y <- X1 ^ 1.5 + 2 * (X2 ^ 0.5) + mu sigma <- sqrt(var(Y) / SNR) Y <- Y + rnorm(N, 0, sigma) X1[sample(1:N,size=500)] <- NA # introduce some missing values X4[sample(1:N,size=300)] <- NA # introduce some missing values data <- data.frame(Y, X1, X2, X3, X4, X5, X6) # Fit a GBM set.seed(102) # for reproducibility gbm1 <- gbm(Y ~ ., data = data, var.monotone = c(0, 0, 0, 0, 0, 0), distribution = "gaussian", n.trees = 100, shrinkage = 0.1, interaction.depth = 3, bag.fraction = 0.5, train.fraction = 0.5, n.minobsinnode = 10, cv.folds = 5, keep.data = TRUE, verbose = FALSE, n.cores = 1) # Check performance using the out-of-bag (OOB) error; the OOB error typically # underestimates the optimal number of iterations best.iter <- gbm.perf(gbm1, method = "OOB") print(best.iter) # Check performance using the 50\% heldout test set best.iter <- gbm.perf(gbm1, method = "test") print(best.iter) # Check performance using 5-fold cross-validation best.iter <- gbm.perf(gbm1, method = "cv") print(best.iter) # Plot relative influence of each variable par(mfrow = c(1, 2)) summary(gbm1, n.trees = 1) # using first tree summary(gbm1, n.trees = best.iter) # using estimated best number of trees # Compactly print the first and last trees for curiosity print(pretty.gbm.tree(gbm1, i.tree = 1)) print(pretty.gbm.tree(gbm1, i.tree = gbm1$n.trees)) # Simulate new data set.seed(103) # for reproducibility N <- 1000 X1 <- runif(N) X2 <- 2 * runif(N) X3 <- ordered(sample(letters[1:4], N, replace = TRUE)) X4 <- factor(sample(letters[1:6], N, replace = TRUE)) X5 <- 
factor(sample(letters[1:3], N, replace = TRUE)) X6 <- 3 * runif(N) mu <- c(-1, 0, 1, 2)[as.numeric(X3)] Y <- X1 ^ 1.5 + 2 * (X2 ^ 0.5) + mu + rnorm(N, 0, sigma) data2 <- data.frame(Y, X1, X2, X3, X4, X5, X6) # Predict on the new data using the "best" number of trees; by default, # predictions will be on the link scale Yhat <- predict(gbm1, newdata = data2, n.trees = best.iter, type = "link") # least squares error print(sum((data2$Y - Yhat)^2)) # Construct univariate partial dependence plots plot(gbm1, i.var = 1, n.trees = best.iter) plot(gbm1, i.var = 2, n.trees = best.iter) plot(gbm1, i.var = "X3", n.trees = best.iter) # can use index or name # Construct bivariate partial dependence plots plot(gbm1, i.var = 1:2, n.trees = best.iter) plot(gbm1, i.var = c("X2", "X3"), n.trees = best.iter) plot(gbm1, i.var = 3:4, n.trees = best.iter) # Construct trivariate partial dependence plots plot(gbm1, i.var = c(1, 2, 6), n.trees = best.iter, continuous.resolution = 20) plot(gbm1, i.var = 1:3, n.trees = best.iter) plot(gbm1, i.var = 2:4, n.trees = best.iter) plot(gbm1, i.var = 3:5, n.trees = best.iter) # Add more (i.e., 100) boosting iterations to the ensemble gbm2 <- gbm.more(gbm1, n.new.trees = 100, verbose = FALSE) } gbm/man/test.gbm.Rd0000644000176200001440000000204614547111627013616 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/test.gbm.R \name{test.gbm} \alias{test.gbm} \alias{validate.gbm} \alias{test.relative.influence} \title{Test the \code{gbm} package.} \usage{ test.gbm() } \value{ An object of class \code{RUnitTestData}. See the help for \code{RUnit} for details. } \description{ Run tests on \code{gbm} functions to perform logical checks and reproducibility. } \details{ The function uses functionality in the \code{RUnit} package. 
A fairly small validation suite is executed that checks to see that relative influence identifies sensible variables from simulated data, and that predictions from GBMs with Gaussian, Cox or binomial distributions are sensible. } \note{ The test suite is not comprehensive. } \examples{ # Uncomment the following lines to run - commented out to make CRAN happy #library(RUnit) #val <- validate.gbm() #printHTMLProtocol(val, "gbmReport.html") } \seealso{ \code{\link{gbm}} } \author{ Harry Southworth } \keyword{models} gbm/man/summary.gbm.Rd0000644000176200001440000000550014547111627014332 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/print.gbm.R \name{summary.gbm} \alias{summary.gbm} \title{Summary of a gbm object} \usage{ \method{summary}{gbm}( object, cBars = length(object$var.names), n.trees = object$n.trees, plotit = TRUE, order = TRUE, method = relative.influence, normalize = TRUE, ... ) } \arguments{ \item{object}{a \code{gbm} object created from an initial call to \code{\link{gbm}}.} \item{cBars}{the number of bars to plot. If \code{order=TRUE} then only the variables with the \code{cBars} largest relative influence will appear in the barplot. If \code{order=FALSE} then the first \code{cBars} variables will appear in the plot. In either case, the function will return the relative influence of all of the variables.} \item{n.trees}{the number of trees used to generate the plot. Only the first \code{n.trees} trees will be used.} \item{plotit}{an indicator as to whether the plot is generated.} \item{order}{an indicator as to whether the plotted and/or returned relative influences are sorted.} \item{method}{The function used to compute the relative influence. \code{\link{relative.influence}} is the default and is the same as that described in Friedman (2001). The other current (and experimental) choice is \code{\link{permutation.test.gbm}}.
This method randomly permutes each predictor variable one at a time and computes the associated reduction in predictive performance. This is similar to the variable importance measures Breiman uses for random forests, but \code{gbm} currently computes it using the entire training dataset (not the out-of-bag observations).} \item{normalize}{if \code{FALSE} then \code{summary.gbm} returns the unnormalized influence.} \item{...}{other arguments passed to the plot function.} } \value{ Returns a data frame where the first component is the variable name and the second is the computed relative influence, normalized to sum to 100. } \description{ Computes the relative influence of each variable in the gbm object. } \details{ For \code{distribution="gaussian"} this returns exactly the reduction of squared error attributable to each variable. For other loss functions this returns the reduction attributable to each variable in sum of squared error in predicting the gradient on each iteration. It describes the relative influence of each variable in reducing the loss function. See the references below for exact details on the computation. } \references{ J.H. Friedman (2001). "Greedy Function Approximation: A Gradient Boosting Machine," Annals of Statistics 29(5):1189-1232. L. Breiman (2001). \url{https://www.stat.berkeley.edu/users/breiman/randomforest2001.pdf}.
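The permutation idea just described is generic: permute one predictor at a time and record the resulting increase in loss. The base-R sketch below uses \code{lm()} as a stand-in model for illustration; it is not gbm's internal \code{permutation.test.gbm}, and the name \code{perm_importance} is ours:

```r
# Permutation importance: for each predictor, shuffle that column,
# re-predict, and report the increase in loss over the unpermuted fit.
perm_importance <- function(fit, data, y,
                            loss = function(y, p) mean((y - p)^2)) {
  base <- loss(y, predict(fit, newdata = data))
  sapply(names(data), function(v) {
    d <- data
    d[[v]] <- sample(d[[v]])                    # break v's association with y
    loss(y, predict(fit, newdata = d)) - base   # rise in loss = importance
  })
}

# The informative predictor x1 should show much larger importance than x2.
set.seed(42)
d <- data.frame(x1 = rnorm(200), x2 = rnorm(200))
y <- 3 * d$x1 + rnorm(200, sd = 0.1)
imp <- perm_importance(lm(y ~ x1 + x2, data = d), d, y)
```

Because the permutation is random, results vary slightly across runs; averaging over several permutations per variable stabilizes the estimate.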
} \seealso{ \code{\link{gbm}} } \author{ Greg Ridgeway \email{gregridgeway@gmail.com} } \keyword{hplot} gbm/man/gbm-internals.Rd0000644000176200001440000000257614547111627014635 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/gbm-internals.R \name{guessDist} \alias{guessDist} \alias{getStratify} \alias{getCVgroup} \alias{checkMissing} \alias{checkID} \alias{checkWeights} \alias{checkOffset} \alias{getVarNames} \alias{gbmCluster} \title{gbm internal functions} \usage{ guessDist(y) getCVgroup(distribution, class.stratify.cv, y, i.train, cv.folds, group) getStratify(strat, d) checkMissing(x, y) checkID(id) checkWeights(w, n) checkOffset(o, y) getVarNames(x) gbmCluster(n) } \arguments{ \item{y}{The response variable.} \item{class.stratify.cv}{Whether or not to stratify, if provided by the user.} \item{i.train}{Computed internally by \code{gbm}.} \item{cv.folds}{The number of cross-validation folds.} \item{group}{The group, if using \code{distribution = "pairwise"}.} \item{strat}{Whether or not to stratify.} \item{d, distribution}{The distribution, either specified by the user or implied.} \item{x}{The design matrix.} \item{w}{The weights.} \item{n}{The number of cores to use in the cluster.} \item{id}{The interaction depth.} \item{o}{The offset.} } \description{ Helper functions for preprocessing data prior to building a \code{"gbm"} object. } \details{ These are functions used internally by \code{gbm} and not intended for direct use by the user. } gbm/man/relative.influence.Rd0000644000176200001440000000437214547111627015661 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/relative.influence.R \name{relative.influence} \alias{relative.influence} \alias{permutation.test.gbm} \alias{gbm.loss} \title{Methods for estimating relative influence} \usage{ relative.influence(object, n.trees, scale. = FALSE, sort.
= FALSE) permutation.test.gbm(object, n.trees) gbm.loss(y, f, w, offset, dist, baseline, group = NULL, max.rank = NULL) } \arguments{ \item{object}{a \code{gbm} object created from an initial call to \code{\link{gbm}}.} \item{n.trees}{the number of trees to use for computations. If not provided, the function will guess: if a test set was used in fitting, the number of trees resulting in the lowest test set error will be used; otherwise, if cross-validation was performed, the number of trees resulting in the lowest cross-validation error will be used; otherwise, all trees will be used.} \item{scale.}{whether or not the result should be scaled. Defaults to \code{FALSE}.} \item{sort.}{whether or not the results should be (reverse) sorted. Defaults to \code{FALSE}.} \item{y, f, w, offset, dist, baseline}{For \code{gbm.loss}: these components are the outcome, predicted value, observation weight, offset, distribution, and comparison loss function, respectively.} \item{group, max.rank}{Used internally when \code{distribution = "pairwise"}.} } \value{ By default, returns an unprocessed vector of estimated relative influences. If the \code{scale.} and \code{sort.} arguments are used, returns a processed version of the same. } \description{ Helper functions for computing the relative influence of each variable in the gbm object. } \details{ This is not intended for end-user use. These functions offer the different methods for computing the relative influence in \code{\link{summary.gbm}}. \code{gbm.loss} is a helper function for \code{permutation.test.gbm}. } \references{ J.H. Friedman (2001). "Greedy Function Approximation: A Gradient Boosting Machine," Annals of Statistics 29(5):1189-1232. L. Breiman (2001). \url{https://www.stat.berkeley.edu/users/breiman/randomforest2001.pdf}. 
} \seealso{ \code{\link{summary.gbm}} } \author{ Greg Ridgeway \email{gregridgeway@gmail.com} } \keyword{hplot} gbm/man/gbm-package.Rd0000644000176200001440000000372714562475545014251 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/gbm-package.R \docType{package} \name{gbm-package} \alias{gbm-package} \title{Generalized Boosted Regression Models (GBMs)} \description{ This package implements extensions to Freund and Schapire's AdaBoost algorithm and J. Friedman's gradient boosting machine. Includes regression methods for least squares, absolute loss, logistic, Poisson, Cox proportional hazards partial likelihood, multinomial, t-distribution, AdaBoost exponential loss, Learning to Rank, and Huberized hinge loss. This gbm package is no longer under further development. Consider https://github.com/gbm-developers/gbm3 for the latest version. } \details{ Further information is available in vignette: \code{browseVignettes(package = "gbm")} } \references{ Y. Freund and R.E. Schapire (1997) \dQuote{A decision-theoretic generalization of on-line learning and an application to boosting,} \emph{Journal of Computer and System Sciences,} 55(1):119-139. G. Ridgeway (1999). \dQuote{The state of boosting,} \emph{Computing Science and Statistics} 31:172-181. J.H. Friedman, T. Hastie, R. Tibshirani (2000). \dQuote{Additive Logistic Regression: a Statistical View of Boosting,} \emph{Annals of Statistics} 28(2):337-374. J.H. Friedman (2001). \dQuote{Greedy Function Approximation: A Gradient Boosting Machine,} \emph{Annals of Statistics} 29(5):1189-1232. J.H. Friedman (2002). \dQuote{Stochastic Gradient Boosting,} \emph{Computational Statistics and Data Analysis} 38(4):367-378. The \href{https://jerryfriedman.su.domains/R-MART.html}{MART} website. 
} \seealso{ Useful links: \itemize{ \item \url{https://github.com/gbm-developers/gbm} \item Report bugs at \url{https://github.com/gbm-developers/gbm/issues} } } \author{ Greg Ridgeway \email{gridge@upenn.edu} with contributions by Daniel Edwards, Brian Kriegler, Stefan Schroedl, Harry Southworth, and Brandon Greenwell } \keyword{package} gbm/man/print.gbm.Rd0000644000176200001440000000435314547111627013776 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/print.gbm.R \name{print.gbm} \alias{print.gbm} \alias{show.gbm} \title{Print model summary} \usage{ \method{print}{gbm}(x, ...) show.gbm(x, ...) } \arguments{ \item{x}{an object of class \code{gbm}.} \item{\dots}{arguments passed to \code{print.default}.} } \description{ Display basic information about a \code{gbm} object. } \details{ Prints some information about the model object. In particular, this method prints the call to \code{gbm()}, the type of loss function that was used, and the total number of iterations. If cross-validation was performed, the 'best' number of trees as estimated by cross-validation error is displayed. If a test set was used, the 'best' number of trees as estimated by the test set error is displayed. The number of available predictors, and the number of those having non-zero influence on predictions is given (which might be interesting in data mining applications). If multinomial, bernoulli or adaboost was used, the confusion matrix and prediction accuracy are printed (objects being allocated to the class with highest probability for multinomial and bernoulli). These classifications are performed on the entire training data using the model with the 'best' number of trees as described above, or the maximum number of trees if the 'best' cannot be computed. If the 'distribution' was specified as gaussian, laplace, quantile or t-distribution, a summary of the residuals is displayed. 
The residuals are for the training data with the model at the 'best' number of trees, as described above, or the maximum number of trees if the 'best' cannot be computed. } \examples{ data(iris) iris.mod <- gbm(Species ~ ., distribution="multinomial", data=iris, n.trees=2000, shrinkage=0.01, cv.folds=5, verbose=FALSE, n.cores=1) iris.mod #data(lung) #lung.mod <- gbm(Surv(time, status) ~ ., distribution="coxph", data=lung, # n.trees=2000, shrinkage=0.01, cv.folds=5,verbose =FALSE) #lung.mod } \seealso{ \code{\link{gbm}} } \author{ Harry Southworth, Daniel Edwards } \keyword{models} \keyword{nonlinear} \keyword{nonparametric} \keyword{survival} gbm/man/calibrate.plot.Rd0000644000176200001440000000550614547111627015002 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/calibrate.plot.R \name{calibrate.plot} \alias{calibrate.plot} \title{Calibration plot} \usage{ calibrate.plot( y, p, distribution = "bernoulli", replace = TRUE, line.par = list(col = "black"), shade.col = "lightyellow", shade.density = NULL, rug.par = list(side = 1), xlab = "Predicted value", ylab = "Observed average", xlim = NULL, ylim = NULL, knots = NULL, df = 6, ... ) } \arguments{ \item{y}{The outcome 0-1 variable.} \item{p}{The predictions estimating E(y|x).} \item{distribution}{The loss function used in creating \code{p}. \code{bernoulli} and \code{poisson} are currently the only special options. All others default to squared error assuming \code{gaussian}.} \item{replace}{Determines whether this plot will replace or overlay the current plot. \code{replace=FALSE} is useful for comparing the calibration of several methods.} \item{line.par}{Graphics parameters for the line.} \item{shade.col}{Color for shading the 2 SE region. 
\code{shade.col=NA} implies no 2 SE region.} \item{shade.density}{The \code{density} parameter for \code{\link{polygon}}.} \item{rug.par}{Graphics parameters passed to \code{\link{rug}}.} \item{xlab}{x-axis label corresponding to the predicted values.} \item{ylab}{y-axis label corresponding to the observed average.} \item{xlim, ylim}{x- and y-axis limits. If not specified, the function will select limits.} \item{knots, df}{These parameters are passed directly to \code{\link[splines]{ns}} for constructing a natural spline smoother for the calibration curve.} \item{...}{Additional optional arguments to be passed on to \code{\link[graphics:plot.default]{plot}}.} } \value{ No return values. } \description{ An experimental diagnostic tool that plots the fitted values versus the actual average values. Currently only available when \code{distribution = "bernoulli"}. } \details{ Uses natural splines to estimate E(y|p). Well-calibrated predictions imply that E(y|p) = p. The plot also includes a pointwise 95\% confidence band. } \examples{ # Don't want R CMD check to think there is a dependency on rpart # so comment out the example #library(rpart) #data(kyphosis) #y <- as.numeric(kyphosis$Kyphosis)-1 #x <- kyphosis$Age #glm1 <- glm(y~poly(x,2),family=binomial) #p <- predict(glm1,type="response") #calibrate.plot(y, p, xlim=c(0,0.6), ylim=c(0,0.6)) } \references{ J.F. Yates (1982). "External correspondence: decomposition of the mean probability score," Organisational Behaviour and Human Performance 30:132-156. D.J. Spiegelhalter (1986). "Probabilistic Prediction in Patient Management and Clinical Trials," Statistics in Medicine 5:421-433. 
} \author{ Greg Ridgeway \email{gregridgeway@gmail.com} } \keyword{hplot} gbm/man/gbm.perf.Rd0000644000176200001440000000371514547111627013577 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/gbm.perf.R \name{gbm.perf} \alias{gbm.perf} \title{GBM performance} \usage{ gbm.perf(object, plot.it = TRUE, oobag.curve = FALSE, overlay = TRUE, method) } \arguments{ \item{object}{A \code{\link{gbm.object}} created from an initial call to \code{\link{gbm}}.} \item{plot.it}{An indicator of whether or not to plot the performance measures. Setting \code{plot.it = TRUE} creates two plots. The first plot shows \code{object$train.error} (in black) and \code{object$valid.error} (in red) versus the iteration number. The scale of the error measurement, shown on the left vertical axis, depends on the \code{distribution} argument used in the initial call to \code{\link{gbm}}.} \item{oobag.curve}{Indicates whether to plot the out-of-bag performance measures in a second plot.} \item{overlay}{If \code{TRUE} and \code{oobag.curve = TRUE}, then a right y-axis is added to the training and test error plot and the estimated cumulative improvement in the loss function is plotted versus the iteration number.} \item{method}{Indicates the method used to estimate the optimal number of boosting iterations. \code{method = "OOB"} computes the out-of-bag estimate and \code{method = "test"} uses the test (or validation) dataset to compute an out-of-sample estimate. \code{method = "cv"} extracts the optimal number of iterations using cross-validation if \code{gbm} was called with \code{cv.folds} > 1.} } \value{ \code{gbm.perf} returns the estimated optimal number of iterations. The method of computation depends on the \code{method} argument. 
} \description{ Estimates the optimal number of boosting iterations for a \code{gbm} object and optionally plots various performance measures } \seealso{ \code{\link{gbm}}, \code{\link{gbm.object}} } \author{ Greg Ridgeway \email{gregridgeway@gmail.com} } \keyword{nonlinear} \keyword{nonparametric} \keyword{survival} \keyword{tree} gbm/DESCRIPTION0000644000176200001440000000503414637453022012535 0ustar liggesusersPackage: gbm Version: 2.2.2 Title: Generalized Boosted Regression Models Authors@R: c( person("Greg", "Ridgeway", email = "gridge@upenn.edu", role = c("aut", "cre"), comment = c(ORCID = "0000-0001-6911-0804")), person("Daniel", "Edwards", email = "unknown@unknown.com", role = "ctb"), person("Brian", "Kriegler", email = "bkriegler@econone.com", role = "ctb"), person("Stefan", "Schroedl", email = "stefan@atomwise.com", role = "ctb"), person("Harry", "Southworth", email = "harry@dataclarityconsulting.co.uk", role = "ctb"), person("Brandon", "Greenwell", email = "greenwell.brandon@gmail.com", role = "ctb", comment = c(ORCID = "0000-0002-8120-0084")), person("Bradley", "Boehmke", email = "bradleyboehmke@gmail.com", role = "ctb", comment = c(ORCID = "0000-0002-3611-8516")), person("Jay", "Cunningham", email = "james@notbadafterall.com", role = "ctb"), person("GBM", "Developers", role = "aut", comment = "https://github.com/gbm-developers") ) Depends: R (>= 2.9.0) Imports: lattice, parallel, survival Suggests: covr, gridExtra, knitr, pdp, RUnit, splines, tinytest, vip, viridis Description: An implementation of extensions to Freund and Schapire's AdaBoost algorithm and Friedman's gradient boosting machine. Includes regression methods for least squares, absolute loss, t-distribution loss, quantile regression, logistic, multinomial logistic, Poisson, Cox proportional hazards partial likelihood, AdaBoost exponential loss, Huberized hinge loss, and Learning to Rank measures (LambdaMart). Originally developed by Greg Ridgeway. 
Newer version available at github.com/gbm-developers/gbm3. License: GPL (>= 2) | file LICENSE URL: https://github.com/gbm-developers/gbm BugReports: https://github.com/gbm-developers/gbm/issues Encoding: UTF-8 RoxygenNote: 7.3.1 VignetteBuilder: knitr NeedsCompilation: yes Packaged: 2024-06-26 12:33:00 UTC; greg_ Author: Greg Ridgeway [aut, cre] (<https://orcid.org/0000-0001-6911-0804>), Daniel Edwards [ctb], Brian Kriegler [ctb], Stefan Schroedl [ctb], Harry Southworth [ctb], Brandon Greenwell [ctb] (<https://orcid.org/0000-0002-8120-0084>), Bradley Boehmke [ctb] (<https://orcid.org/0000-0002-3611-8516>), Jay Cunningham [ctb], GBM Developers [aut] (https://github.com/gbm-developers) Maintainer: Greg Ridgeway <gridge@upenn.edu> Repository: CRAN Date/Publication: 2024-06-28 06:20:02 UTC
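As a quick orientation to the functions documented above, here is a minimal usage sketch. It is not part of the package sources: the dataset (`mtcars`) and all tuning values are arbitrary illustrations, and it assumes the gbm package is installed.

```r
# Fit a small gbm model, pick the CV-optimal iteration count with
# gbm.perf(), and compute relative influence as documented above.
library(gbm)

set.seed(1)
fit <- gbm(mpg ~ ., data = mtcars, distribution = "gaussian",
           n.trees = 500, shrinkage = 0.05, cv.folds = 5,
           verbose = FALSE, n.cores = 1)

# Optimal number of iterations estimated from the cross-validation error
best.iter <- gbm.perf(fit, method = "cv", plot.it = FALSE)

# Relative influence of each predictor at that iteration count,
# reverse-sorted as described in relative.influence.Rd
ri <- relative.influence(fit, n.trees = best.iter, sort. = TRUE)
print(ri)
```

`print(fit)` (see `print.gbm` above) reports the same "best" iteration in its text summary, since a gaussian fit with cross-validation was used.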