tokenizers/inst/CITATION

citHeader(paste0(
  "To cite the tokenizers package in publications, please cite the ",
  "paper in the Journal of Open Source Software:"
))

citEntry(
  entry = "Article",
  title = "Fast, Consistent Tokenization of Natural Language Text",
  author = personList(as.person("Lincoln A. Mullen"),
                      as.person("Kenneth Benoit"),
                      as.person("Os Keyes"),
                      as.person("Dmitry Selivanov"),
                      as.person("Jeffrey Arnold")),
  journal = "Journal of Open Source Software",
  year = "2018",
  volume = "3",
  issue = "23",
  pages = "655",
  url = "https://doi.org/10.21105/joss.00655",
  doi = "10.21105/joss.00655",
  textVersion = paste('Lincoln A. Mullen et al.,',
                      '"Fast, Consistent Tokenization of Natural Language',
                      'Text," Journal of Open Source Software 3, no. 23',
                      '(2018): 655, https://doi.org/10.21105/joss.00655.')
)

tokenizers/inst/doc/introduction-to-tokenizers.Rmd

---
title: "Introduction to the tokenizers Package"
author: "Lincoln Mullen"
output: rmarkdown::html_vignette
vignette: >
  %\VignetteIndexEntry{Introduction to the tokenizers Package}
  %\VignetteEngine{knitr::rmarkdown}
  %\VignetteEncoding{UTF-8}
---

```{r setup, include = FALSE}
knitr::opts_chunk$set(
  collapse = TRUE,
  comment = "#>"
)
```

## Package overview

In natural language processing, tokenization is the process of breaking human-readable text into machine-readable components. The most obvious way to tokenize a text is to split the text into words. But there are many other ways to tokenize a text, the most useful of which are provided by this package.

The tokenizers in this package have a consistent interface. They all take either a character vector of any length, or a list where each element is a character vector of length one; the idea is that each element comprises a text. Each function then returns a list of the same length as the input vector, in which each element contains the tokens generated from the corresponding text. If the input character vector or list is named, the names are preserved, so that they can serve as document identifiers.

Using the following sample text, the rest of this vignette demonstrates the different kinds of tokenizers in this package.

```{r}
library(tokenizers)
options(max.print = 25)

james <- paste0(
  "The question thus becomes a verbal one\n",
  "again; and our knowledge of all these early stages of thought and feeling\n",
  "is in any case so conjectural and imperfect that farther discussion would\n",
  "not be worth while.\n",
  "\n",
  "Religion, therefore, as I now ask you arbitrarily to take it, shall mean\n",
  "for us _the feelings, acts, and experiences of individual men in their\n",
  "solitude, so far as they apprehend themselves to stand in relation to\n",
  "whatever they may consider the divine_. Since the relation may be either\n",
  "moral, physical, or ritual, it is evident that out of religion in the\n",
  "sense in which we take it, theologies, philosophies, and ecclesiastical\n",
  "organizations may secondarily grow.\n"
)
```

## Character and character-shingle tokenizers

The character tokenizer splits texts into individual characters.

```{r}
tokenize_characters(james)[[1]]
```

You can also tokenize into character-based shingles.
```{r}
tokenize_character_shingles(james, n = 3, n_min = 3,
                            strip_non_alphanum = FALSE)[[1]][1:20]
```

## Word and word-stem tokenizers

The word tokenizer splits texts into words.

```{r}
tokenize_words(james)
```

Word stemming is provided by the [SnowballC](https://cran.r-project.org/package=SnowballC) package.

```{r}
tokenize_word_stems(james)
```

You can also provide a vector of stopwords, which will be omitted. The [stopwords package](https://github.com/quanteda/stopwords), which contains stopwords for many languages from several sources, is recommended. This argument also works with the n-gram and skip n-gram tokenizers.

```{r}
library(stopwords)
tokenize_words(james, stopwords = stopwords::stopwords("en"))
```

An alternative word tokenizer often used in NLP, one that preserves punctuation and separates common English contractions, is the Penn Treebank tokenizer.

```{r}
tokenize_ptb(james)
```

## N-gram and skip n-gram tokenizers

An n-gram is a contiguous sequence of words containing at least `n_min` words and at most `n` words. This function will generate all such combinations of n-grams, omitting stopwords if desired.

```{r}
tokenize_ngrams(james, n = 5, n_min = 2,
                stopwords = stopwords::stopwords("en"))
```

A skip n-gram is like an n-gram in that it takes the `n` and `n_min` parameters. But in addition to contiguous sequences of words, it also returns sequences that skip words, with gaps of between `0` and `k` words. This function generates all such sequences, again omitting stopwords if desired. Note that the number of tokens returned can be very large.

```{r}
tokenize_skip_ngrams(james, n = 5, n_min = 2, k = 2,
                     stopwords = stopwords::stopwords("en"))
```

## Tweet tokenizer

Tokenizing tweets requires special attention, since usernames (`@whoever`) and hashtags (`#hashtag`) use special characters that might otherwise be stripped away.

```{r}
tokenize_tweets("Welcome, @user, to the tokenizers package. #rstats #forever")
```

## Sentence and paragraph tokenizers

Sometimes it is desirable to split texts into sentences or paragraphs prior to tokenizing into other forms.

```{r, collapse=FALSE}
tokenize_sentences(james)
tokenize_paragraphs(james)
```

## Text chunking

When one has a very long document, sometimes it is desirable to split the document into smaller chunks, each with the same length. This function chunks a document and gives each of the chunks an ID to show their order. These chunks can then be further tokenized.

```{r}
chunks <- chunk_text(mobydick, chunk_size = 100, doc_id = "mobydick")
length(chunks)
chunks[5:6]
tokenize_words(chunks[5:6])
```

## Counting words, characters, sentences

The package also offers functions for counting words, characters, and sentences in a format which works nicely with the rest of the functions.

```{r}
count_words(mobydick)
count_characters(mobydick)
count_sentences(mobydick)
```

tokenizers/inst/doc/tif-and-tokenizers.html

The Text Interchange Formats and the tokenizers Package

Lincoln Mullen

The Text Interchange Formats are a set of standards defined at an rOpenSci-sponsored meeting in London in 2017. The formats allow R text analysis packages to target defined inputs and outputs for corpora, tokens, and document-term matrices. By adhering to these recommendations, R packages can buy into an interoperable ecosystem.

The TIF recommendations are still a draft, but the tokenizers package implements the recommendation to accept both of the corpus formats and to output one of the recommended tokens formats.

Consider these two recommended forms of a corpus. One (corpus_c) is a named character vector; the other (corpus_d) is a data frame. Both include a document ID and the full text for each item. The data frame format allows for the use of other metadata fields besides the document ID, whereas the character vector format does not. Using the coercion functions in the tif package, one could switch back and forth between these formats; a base-R sketch of that round trip appears after the code below. The tokenizers package also supports a corpus formatted as a named list where each element is a character vector of length one (corpus_l), though this is not a part of the draft TIF standards.

# Named list
(corpus_l <- list(man_comes_around = "There's a man goin' 'round takin' names",
                  wont_back_down = "Well I won't back down, no I won't back down",
                  bird_on_a_wire = "Like a bird on a wire"))
#> $man_comes_around
#> [1] "There's a man goin' 'round takin' names"
#> 
#> $wont_back_down
#> [1] "Well I won't back down, no I won't back down"
#> 
#> $bird_on_a_wire
#> [1] "Like a bird on a wire"

# Named character vector
(corpus_c <- unlist(corpus_l))
#>                               man_comes_around 
#>      "There's a man goin' 'round takin' names" 
#>                                 wont_back_down 
#> "Well I won't back down, no I won't back down" 
#>                                 bird_on_a_wire 
#>                        "Like a bird on a wire"

# Data frame
(corpus_d <- data.frame(doc_id = names(corpus_c), text = unname(corpus_c),
                        stringsAsFactors = FALSE))
#>             doc_id                                         text
#> 1 man_comes_around      There's a man goin' 'round takin' names
#> 2   wont_back_down Well I won't back down, no I won't back down
#> 3   bird_on_a_wire                        Like a bird on a wire
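
As promised above, here is a minimal base-R sketch of switching between the two corpus formats. The tif package provides dedicated coercion helpers for this; their exact names are not reproduced here, so plain subsetting stands in for them, and only the objects defined above are assumed.

# Rebuild the named character vector from the data frame corpus
corpus_v <- corpus_d$text
names(corpus_v) <- corpus_d$doc_id
identical(corpus_v, corpus_c)
#> [1] TRUE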

All of the tokenizers in this package can accept any of those formats and will return an identical output for each.

library(tokenizers)

tokens_l <- tokenize_ngrams(corpus_l, n = 2)
tokens_c <- tokenize_ngrams(corpus_c, n = 2)
tokens_d <- tokenize_ngrams(corpus_d, n = 2)

# Are all these identical?
all(identical(tokens_l, tokens_c),
    identical(tokens_c, tokens_d),
    identical(tokens_l, tokens_d))
#> [1] TRUE

The output of all of the tokenizers is a named list, where each element of the list corresponds to a document in the corpus. The names of the list are the document IDs, and the elements are character vectors containing the tokens.

tokens_l
#> $man_comes_around
#> [1] "there's a"   "a man"       "man goin"    "goin round"  "round takin"
#> [6] "takin names"
#> 
#> $wont_back_down
#> [1] "well i"     "i won't"    "won't back" "back down"  "down no"   
#> [6] "no i"       "i won't"    "won't back" "back down" 
#> 
#> $bird_on_a_wire
#> [1] "like a"  "a bird"  "bird on" "on a"    "a wire"

This format can be coerced to a data frame of document IDs and tokens, one row per token, using the coercion functions in the tif package. That tokens data frame would look like this.

#>              doc_id       token
#> 1  man_comes_around   there's a
#> 2  man_comes_around       a man
#> 3  man_comes_around    man goin
#> 4  man_comes_around  goin round
#> 5  man_comes_around round takin
#> 6  man_comes_around takin names
#> 7    wont_back_down      well i
#> 8    wont_back_down     i won't
#> 9    wont_back_down  won't back
#> 10   wont_back_down   back down
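
For comparison, a small base-R sketch that builds the same shape of tokens data frame directly from tokens_l; the doc_id and token column names follow the draft TIF output shown above, and nothing beyond base R is assumed.

tokens_df <- data.frame(doc_id = rep(names(tokens_l), lengths(tokens_l)),
                        token = unlist(tokens_l, use.names = FALSE),
                        stringsAsFactors = FALSE)  # one row per token
head(tokens_df, 3)
#>             doc_id     token
#> 1 man_comes_around there's a
#> 2 man_comes_around     a man
#> 3 man_comes_around  man goin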
tokenizers/inst/doc/introduction-to-tokenizers.html

Introduction to the tokenizers Package

Lincoln Mullen

Package overview

In natural language processing, tokenization is the process of breaking human-readable text into machine-readable components. The most obvious way to tokenize a text is to split the text into words. But there are many other ways to tokenize a text, the most useful of which are provided by this package.

The tokenizers in this package have a consistent interface. They all take either a character vector of any length, or a list where each element is a character vector of length one; the idea is that each element comprises a text. Each function then returns a list of the same length as the input vector, in which each element contains the tokens generated from the corresponding text. If the input character vector or list is named, the names are preserved, so that they can serve as document identifiers.
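
As a quick sketch of that name-preserving behavior (the document names doc1 and doc2 are invented here for illustration):

tokenize_words(c(doc1 = "One sentence here.", doc2 = "And another one."))
#> $doc1
#> [1] "one"      "sentence" "here"
#>
#> $doc2
#> [1] "and"     "another" "one"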

Using the following sample text, the rest of this vignette demonstrates the different kinds of tokenizers in this package.

library(tokenizers)
options(max.print = 25)

james <- paste0(
  "The question thus becomes a verbal one\n",
  "again; and our knowledge of all these early stages of thought and feeling\n",
  "is in any case so conjectural and imperfect that farther discussion would\n",
  "not be worth while.\n",
  "\n",
  "Religion, therefore, as I now ask you arbitrarily to take it, shall mean\n",
  "for us _the feelings, acts, and experiences of individual men in their\n",
  "solitude, so far as they apprehend themselves to stand in relation to\n",
  "whatever they may consider the divine_. Since the relation may be either\n",
  "moral, physical, or ritual, it is evident that out of religion in the\n",
  "sense in which we take it, theologies, philosophies, and ecclesiastical\n",
  "organizations may secondarily grow.\n"
)

Character and character-shingle tokenizers

The character tokenizer splits texts into individual characters.

tokenize_characters(james)[[1]] 
#>  [1] "t" "h" "e" "q" "u" "e" "s" "t" "i" "o" "n" "t" "h" "u" "s" "b" "e"
#> [18] "c" "o" "m" "e" "s" "a" "v" "e"
#>  [ reached getOption("max.print") -- omitted 517 entries ]

You can also tokenize into character-based shingles.

tokenize_character_shingles(james, n = 3, n_min = 3, 
                            strip_non_alphanum = FALSE)[[1]][1:20]
#>  [1] "the" "he " "e q" " qu" "que" "ues" "est" "sti" "tio" "ion" "on "
#> [12] "n t" " th" "thu" "hus" "us " "s b" " be" "bec" "eco"

Word and word-stem tokenizers

The word tokenizer splits texts into words.

tokenize_words(james)
#> [[1]]
#>  [1] "the"       "question"  "thus"      "becomes"   "a"        
#>  [6] "verbal"    "one"       "again"     "and"       "our"      
#> [11] "knowledge" "of"        "all"       "these"     "early"    
#> [16] "stages"    "of"        "thought"   "and"       "feeling"  
#> [21] "is"        "in"        "any"       "case"      "so"       
#>  [ reached getOption("max.print") -- omitted 87 entries ]

Word stemming is provided by the SnowballC package.

tokenize_word_stems(james)
#> [[1]]
#>  [1] "the"      "question" "thus"     "becom"    "a"        "verbal"  
#>  [7] "one"      "again"    "and"      "our"      "knowledg" "of"      
#> [13] "all"      "these"    "earli"    "stage"    "of"       "thought" 
#> [19] "and"      "feel"     "is"       "in"       "ani"      "case"    
#> [25] "so"      
#>  [ reached getOption("max.print") -- omitted 87 entries ]

You can also provide a vector of stopwords which will be omitted. The stopwords package, which contains stopwords for many languages from several sources, is recommended. This argument also works with the n-gram and skip n-gram tokenizers.

library(stopwords)
tokenize_words(james, stopwords = stopwords::stopwords("en"))
#> [[1]]
#>  [1] "question"    "thus"        "becomes"     "verbal"      "one"        
#>  [6] "knowledge"   "early"       "stages"      "thought"     "feeling"    
#> [11] "case"        "conjectural" "imperfect"   "farther"     "discussion" 
#> [16] "worth"       "religion"    "therefore"   "now"         "ask"        
#> [21] "arbitrarily" "take"        "shall"       "mean"        "us"         
#>  [ reached getOption("max.print") -- omitted 33 entries ]

An alternative word tokenizer often used in NLP, one that preserves punctuation and separates common English contractions, is the Penn Treebank tokenizer.

tokenize_ptb(james)
#> [[1]]
#>  [1] "The"       "question"  "thus"      "becomes"   "a"        
#>  [6] "verbal"    "one"       "again"     ";"         "and"      
#> [11] "our"       "knowledge" "of"        "all"       "these"    
#> [16] "early"     "stages"    "of"        "thought"   "and"      
#> [21] "feeling"   "is"        "in"        "any"       "case"     
#>  [ reached getOption("max.print") -- omitted 101 entries ]

N-gram and skip n-gram tokenizers

An n-gram is a contiguous sequence of words containing at least n_min words and at most n words. This function will generate all such combinations of n-grams, omitting stopwords if desired.

tokenize_ngrams(james, n = 5, n_min = 2,
                stopwords = stopwords::stopwords("en"))
#> [[1]]
#>  [1] "question thus"                         
#>  [2] "question thus becomes"                 
#>  [3] "question thus becomes verbal"          
#>  [4] "question thus becomes verbal one"      
#>  [5] "thus becomes"                          
#>  [6] "thus becomes verbal"                   
#>  [7] "thus becomes verbal one"               
#>  [8] "thus becomes verbal one knowledge"     
#>  [9] "becomes verbal"                        
#> [10] "becomes verbal one"                    
#> [11] "becomes verbal one knowledge"          
#> [12] "becomes verbal one knowledge early"    
#> [13] "verbal one"                            
#> [14] "verbal one knowledge"                  
#> [15] "verbal one knowledge early"            
#> [16] "verbal one knowledge early stages"     
#> [17] "one knowledge"                         
#> [18] "one knowledge early"                   
#> [19] "one knowledge early stages"            
#> [20] "one knowledge early stages thought"    
#> [21] "knowledge early"                       
#> [22] "knowledge early stages"                
#> [23] "knowledge early stages thought"        
#> [24] "knowledge early stages thought feeling"
#> [25] "early stages"                          
#>  [ reached getOption("max.print") -- omitted 197 entries ]

A skip n-gram is like an n-gram in that it takes the n and n_min parameters. But in addition to contiguous sequences of words, it also returns sequences that skip words, with gaps of between 0 and k words. This function generates all such sequences, again omitting stopwords if desired. Note that the number of tokens returned can be very large.

tokenize_skip_ngrams(james, n = 5, n_min = 2, k = 2,
                     stopwords = stopwords::stopwords("en"))
#> [[1]]
#>  [1] "question thus"                    
#>  [2] "question becomes"                 
#>  [3] "question verbal"                  
#>  [4] "question thus becomes"            
#>  [5] "question thus verbal"             
#>  [6] "question thus one"                
#>  [7] "question becomes verbal"          
#>  [8] "question becomes one"             
#>  [9] "question becomes knowledge"       
#> [10] "question verbal one"              
#> [11] "question verbal knowledge"        
#> [12] "question verbal early"            
#> [13] "question thus becomes verbal"     
#> [14] "question thus becomes one"        
#> [15] "question thus becomes knowledge"  
#> [16] "question thus verbal one"         
#> [17] "question thus verbal knowledge"   
#> [18] "question thus verbal early"       
#> [19] "question thus one knowledge"      
#> [20] "question thus one early"          
#> [21] "question thus one stages"         
#> [22] "question becomes verbal one"      
#> [23] "question becomes verbal knowledge"
#> [24] "question becomes verbal early"    
#> [25] "question becomes one knowledge"   
#>  [ reached getOption("max.print") -- omitted 6083 entries ]

Tweet tokenizer

Tokenizing tweets requires special attention, since usernames (@whoever) and hashtags (#hashtag) use special characters that might otherwise be stripped away.

tokenize_tweets("Welcome, @user, to the tokenizers package. #rstats #forever")
#> [[1]]
#> [1] "welcome"    "@user"      "to"         "the"        "tokenizers"
#> [6] "package"    "#rstats"    "#forever"

Sentence and paragraph tokenizers

Sometimes it is desirable to split texts into sentences or paragraphs prior to tokenizing into other forms.

tokenize_sentences(james) 
#> [[1]]
#> [1] "The question thus becomes a verbal one again; and our knowledge of all these early stages of thought and feeling is in any case so conjectural and imperfect that farther discussion would not be worth while."                                               
#> [2] "Religion, therefore, as I now ask you arbitrarily to take it, shall mean for us _the feelings, acts, and experiences of individual men in their solitude, so far as they apprehend themselves to stand in relation to whatever they may consider the divine_."
#> [3] "Since the relation may be either moral, physical, or ritual, it is evident that out of religion in the sense in which we take it, theologies, philosophies, and ecclesiastical organizations may secondarily grow."
tokenize_paragraphs(james)
#> [[1]]
#> [1] "The question thus becomes a verbal one again; and our knowledge of all these early stages of thought and feeling is in any case so conjectural and imperfect that farther discussion would not be worth while."                                                                                                                                                                                                                                                                   
#> [2] "Religion, therefore, as I now ask you arbitrarily to take it, shall mean for us _the feelings, acts, and experiences of individual men in their solitude, so far as they apprehend themselves to stand in relation to whatever they may consider the divine_. Since the relation may be either moral, physical, or ritual, it is evident that out of religion in the sense in which we take it, theologies, philosophies, and ecclesiastical organizations may secondarily grow. "

Text chunking

When one has a very long document, sometimes it is desirable to split the document into smaller chunks, each with the same length. This function chunks a document and gives each of the chunks an ID to show their order. These chunks can then be further tokenized.

chunks <- chunk_text(mobydick, chunk_size = 100, doc_id = "mobydick")
length(chunks)
#> [1] 2195
chunks[5:6]
#> $`mobydick-0005`
#> [1] "of a poor devil of a sub sub appears to have gone through the long vaticans and street stalls of the earth picking up whatever random allusions to whales he could anyways find in any book whatsoever sacred or profane therefore you must not in every case at least take the higgledy piggledy whale statements however authentic in these extracts for veritable gospel cetology far from it as touching the ancient authors generally as well as the poets here appearing these extracts are solely valuable or entertaining as affording a glancing bird's eye view of what has been promiscuously said"
#> 
#> $`mobydick-0006`
#> [1] "thought fancied and sung of leviathan by many nations and generations including our own so fare thee well poor devil of a sub sub whose commentator i am thou belongest to that hopeless sallow tribe which no wine of this world will ever warm and for whom even pale sherry would be too rosy strong but with whom one sometimes loves to sit and feel poor devilish too and grow convivial upon tears and say to them bluntly with full eyes and empty glasses and in not altogether unpleasant sadness give it up sub subs for by how much the"
tokenize_words(chunks[5:6])
#> $`mobydick-0005`
#>  [1] "of"       "a"        "poor"     "devil"    "of"       "a"       
#>  [7] "sub"      "sub"      "appears"  "to"       "have"     "gone"    
#> [13] "through"  "the"      "long"     "vaticans" "and"      "street"  
#> [19] "stalls"   "of"       "the"      "earth"    "picking"  "up"      
#> [25] "whatever"
#>  [ reached getOption("max.print") -- omitted 75 entries ]
#> 
#> $`mobydick-0006`
#>  [1] "thought"     "fancied"     "and"         "sung"        "of"         
#>  [6] "leviathan"   "by"          "many"        "nations"     "and"        
#> [11] "generations" "including"   "our"         "own"         "so"         
#> [16] "fare"        "thee"        "well"        "poor"        "devil"      
#> [21] "of"          "a"           "sub"         "sub"         "whose"      
#>  [ reached getOption("max.print") -- omitted 75 entries ]

Counting words, characters, sentences

The package also offers functions for counting words, characters, and sentences in a format which works nicely with the rest of the functions.

count_words(mobydick)
#> mobydick 
#>   219415
count_characters(mobydick)
#> mobydick 
#>  1235185
count_sentences(mobydick)
#> mobydick 
#>    29076
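
Because each counting function returns a named vector keyed by document ID, per-document statistics line up naturally. A small sketch, using only the values shown above:

data.frame(doc_id = names(count_words(mobydick)),
           words = count_words(mobydick),
           characters = count_characters(mobydick),
           sentences = count_sentences(mobydick),
           row.names = NULL)
#>     doc_id  words characters sentences
#> 1 mobydick 219415    1235185     29076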
tokenizers/inst/doc/tif-and-tokenizers.Rmd

---
title: "The Text Interchange Formats and the tokenizers Package"
author: "Lincoln Mullen"
output: rmarkdown::html_vignette
vignette: >
  %\VignetteIndexEntry{The Text Interchange Formats and the tokenizers Package}
  %\VignetteEngine{knitr::rmarkdown}
  %\VignetteEncoding{UTF-8}
---

```{r setup, include = FALSE}
knitr::opts_chunk$set(
  collapse = TRUE,
  comment = "#>"
)
```

The [Text Interchange Formats](https://github.com/ropensci/tif) are a set of standards defined at an [rOpenSci](https://ropensci.org/)-sponsored [meeting in London](http://textworkshop17.ropensci.org/) in 2017. The formats allow R text analysis packages to target defined inputs and outputs for corpora, tokens, and document-term matrices. By adhering to these recommendations, R packages can buy into an interoperable ecosystem.

The TIF recommendations are still a draft, but the tokenizers package implements the recommendation to accept both of the corpus formats and to output one of the recommended tokens formats.

Consider these two recommended forms of a corpus. One (`corpus_c`) is a named character vector; the other (`corpus_d`) is a data frame. Both include a document ID and the full text for each item. The data frame format allows for the use of other metadata fields besides the document ID, whereas the character vector format does not. Using the coercion functions in the tif package, one could switch back and forth between these formats. The tokenizers package also supports a corpus formatted as a named list where each element is a character vector of length one (`corpus_l`), though this is not a part of the draft TIF standards.

```{r}
# Named list
(corpus_l <- list(man_comes_around = "There's a man goin' 'round takin' names",
                  wont_back_down = "Well I won't back down, no I won't back down",
                  bird_on_a_wire = "Like a bird on a wire"))

# Named character vector
(corpus_c <- unlist(corpus_l))

# Data frame
(corpus_d <- data.frame(doc_id = names(corpus_c), text = unname(corpus_c),
                        stringsAsFactors = FALSE))
```

All of the tokenizers in this package can accept any of those formats and will return an identical output for each.

```{r}
library(tokenizers)

tokens_l <- tokenize_ngrams(corpus_l, n = 2)
tokens_c <- tokenize_ngrams(corpus_c, n = 2)
tokens_d <- tokenize_ngrams(corpus_d, n = 2)

# Are all these identical?
all(identical(tokens_l, tokens_c),
    identical(tokens_c, tokens_d),
    identical(tokens_l, tokens_d))
```

The output of all of the tokenizers is a named list, where each element of the list corresponds to a document in the corpus. The names of the list are the document IDs, and the elements are character vectors containing the tokens.

```{r}
tokens_l
```

This format can be coerced to a data frame of document IDs and tokens, one row per token, using the coercion functions in the tif package. That tokens data frame would look like this.
```{r, echo=FALSE}
sample_tokens_df <- structure(list(doc_id = c("man_comes_around", "man_comes_around",
  "man_comes_around", "man_comes_around", "man_comes_around", "man_comes_around",
  "wont_back_down", "wont_back_down", "wont_back_down", "wont_back_down",
  "wont_back_down", "wont_back_down", "wont_back_down", "wont_back_down",
  "wont_back_down", "bird_on_a_wire", "bird_on_a_wire", "bird_on_a_wire",
  "bird_on_a_wire", "bird_on_a_wire"), token = c("there's a", "a man",
  "man goin", "goin round", "round takin", "takin names", "well i", "i won't",
  "won't back", "back down", "down no", "no i", "i won't", "won't back",
  "back down", "like a", "a bird", "bird on", "on a", "a wire")),
  .Names = c("doc_id", "token"), row.names = c(NA, -20L), class = "data.frame")
head(sample_tokens_df, 10)
```

tokenizers/inst/doc/introduction-to-tokenizers.R

## ----setup, include = FALSE----------------------------------------------
knitr::opts_chunk$set(
  collapse = TRUE,
  comment = "#>"
)

## ------------------------------------------------------------------------
library(tokenizers)
options(max.print = 25)

james <- paste0(
  "The question thus becomes a verbal one\n",
  "again; and our knowledge of all these early stages of thought and feeling\n",
  "is in any case so conjectural and imperfect that farther discussion would\n",
  "not be worth while.\n",
  "\n",
  "Religion, therefore, as I now ask you arbitrarily to take it, shall mean\n",
  "for us _the feelings, acts, and experiences of individual men in their\n",
  "solitude, so far as they apprehend themselves to stand in relation to\n",
  "whatever they may consider the divine_. Since the relation may be either\n",
  "moral, physical, or ritual, it is evident that out of religion in the\n",
  "sense in which we take it, theologies, philosophies, and ecclesiastical\n",
  "organizations may secondarily grow.\n"
)

## ------------------------------------------------------------------------
tokenize_characters(james)[[1]]

## ------------------------------------------------------------------------
tokenize_character_shingles(james, n = 3, n_min = 3,
                            strip_non_alphanum = FALSE)[[1]][1:20]

## ------------------------------------------------------------------------
tokenize_words(james)

## ------------------------------------------------------------------------
tokenize_word_stems(james)

## ------------------------------------------------------------------------
library(stopwords)
tokenize_words(james, stopwords = stopwords::stopwords("en"))

## ------------------------------------------------------------------------
tokenize_ptb(james)

## ------------------------------------------------------------------------
tokenize_ngrams(james, n = 5, n_min = 2,
                stopwords = stopwords::stopwords("en"))

## ------------------------------------------------------------------------
tokenize_skip_ngrams(james, n = 5, n_min = 2, k = 2,
                     stopwords = stopwords::stopwords("en"))

## ------------------------------------------------------------------------
tokenize_tweets("Welcome, @user, to the tokenizers package. #rstats #forever")

## ---- collapse=FALSE------------------------------------------------------
tokenize_sentences(james)
tokenize_paragraphs(james)

## ------------------------------------------------------------------------
chunks <- chunk_text(mobydick, chunk_size = 100, doc_id = "mobydick")
length(chunks)
chunks[5:6]
tokenize_words(chunks[5:6])

## ------------------------------------------------------------------------
count_words(mobydick)
count_characters(mobydick)
count_sentences(mobydick)

tokenizers/inst/doc/tif-and-tokenizers.R

## ----setup, include = FALSE----------------------------------------------
knitr::opts_chunk$set(
  collapse = TRUE,
  comment = "#>"
)

## ------------------------------------------------------------------------
# Named list
(corpus_l <- list(man_comes_around = "There's a man goin' 'round takin' names",
                  wont_back_down = "Well I won't back down, no I won't back down",
                  bird_on_a_wire = "Like a bird on a wire"))

# Named character vector
(corpus_c <- unlist(corpus_l))

# Data frame
(corpus_d <- data.frame(doc_id = names(corpus_c), text = unname(corpus_c),
                        stringsAsFactors = FALSE))

## ------------------------------------------------------------------------
library(tokenizers)

tokens_l <- tokenize_ngrams(corpus_l, n = 2)
tokens_c <- tokenize_ngrams(corpus_c, n = 2)
tokens_d <- tokenize_ngrams(corpus_d, n = 2)

# Are all these identical?
all(identical(tokens_l, tokens_c),
    identical(tokens_c, tokens_d),
    identical(tokens_l, tokens_d))

## ------------------------------------------------------------------------
tokens_l

## ---- echo=FALSE----------------------------------------------------------
sample_tokens_df <- structure(list(doc_id = c("man_comes_around", "man_comes_around",
  "man_comes_around", "man_comes_around", "man_comes_around", "man_comes_around",
  "wont_back_down", "wont_back_down", "wont_back_down", "wont_back_down",
  "wont_back_down", "wont_back_down", "wont_back_down", "wont_back_down",
  "wont_back_down", "bird_on_a_wire", "bird_on_a_wire", "bird_on_a_wire",
  "bird_on_a_wire", "bird_on_a_wire"), token = c("there's a", "a man",
  "man goin", "goin round", "round takin", "takin names", "well i", "i won't",
  "won't back", "back down", "down no", "no i", "i won't", "won't back",
  "back down", "like a", "a bird", "bird on", "on a", "a wire")),
  .Names = c("doc_id", "token"), row.names = c(NA, -20L), class = "data.frame")
head(sample_tokens_df, 10)

tokenizers/tests/testthat.R

library(testthat)
library(tokenizers)

test_check("tokenizers")

tokenizers/tests/testthat/test-encoding.R

context("Encodings")

test_that("Encodings work on Windows", {
  input <- "César Moreira Nuñez"
  reference <- c("césar", "moreira", "nuñez")
  reference_enc <- c("UTF-8", "unknown", "UTF-8")
  output_n1 <- tokenize_ngrams(input, n = 1, simplify = TRUE)
  output_words <- tokenize_words(input, simplify = TRUE)
  output_skip <- tokenize_skip_ngrams(input, n = 1, k = 0, simplify = TRUE)
  expect_equal(output_n1, reference)
  expect_equal(output_words, reference)
  expect_equal(output_skip, reference)
  expect_equal(Encoding(output_n1), reference_enc)
  expect_equal(Encoding(output_words), reference_enc)
  expect_equal(Encoding(output_skip), reference_enc)
})

tokenizers/tests/testthat/test-utils.R

context("Utils")

test_that("Inputs are verified correct", {
  expect_silent(check_input(letters))
  expect_silent(check_input(list(a = "a", b = "b")))
  expect_error(check_input(1:10))
  expect_error(check_input(list(a = "a", b = letters)))
  expect_error(check_input(list(a = "a", b = 2)))
})

test_that("Stopwords are removed", {
  expect_equal(remove_stopwords(letters[1:5], stopwords = c("d", "e")),
               letters[1:3])
})

tokenizers/tests/testthat/test-shingles.R

context("Shingle tokenizers")

test_that("Character shingle tokenizer works as expected", {
  out_l <- tokenize_character_shingles(docs_l, n = 3, n_min = 2)
  out_c <- tokenize_character_shingles(docs_c, n = 3, n_min = 2)
  out_1 <- tokenize_character_shingles(docs_c[1], n = 3, n_min = 2,
                                       simplify = TRUE)
  expect_is(out_l, "list")
  expect_is(out_l[[1]], "character")
  expect_is(out_c, "list")
  expect_is(out_c[[1]], "character")
  expect_is(out_1, "character")
  expect_identical(out_l, out_c)
  expect_identical(out_l[[1]], out_1)
  expect_identical(out_c[[1]], out_1)
  expect_named(out_l, names(docs_l))
  expect_named(out_c, names(docs_c))
  expect_error(tokenize_ngrams(bad_list))
})

test_that("Character shingle tokenizer produces correct output", {
  phrase <- c("Remember who commended thy yellow stockings",
              "And wished to see thee cross-gartered.")
  names(phrase) <- c("Malvolio 1", "Malvolio 2")
  out_d <- tokenize_character_shingles(phrase)
  out_asis <- tokenize_character_shingles(phrase, lowercase = FALSE,
                                          strip_non_alphanum = FALSE)
  expect_identical(out_d[[1]][1:12],
                   c("rem", "eme", "mem", "emb", "mbe", "ber", "erw", "rwh",
                     "who", "hoc", "oco", "com"))
  expect_identical(out_asis[[2]][1:15],
                   c("And", "nd ", "d w", " wi", "wis", "ish", "she", "hed",
                     "ed ", "d t", " to", "to ", "o s", " se", "see"))
})

test_that("Character shingle tokenizer consistently produces NAs where appropriate", {
  test <- c("This is a text", NA, "So is this")
  names(test) <- letters[1:3]
  out <- tokenize_character_shingles(test)
  expect_true(is.na(out$b))
})

tokenizers/tests/testthat/moby-ch2.txt

CHAPTER 2. The Carpet-Bag.

I stuffed a shirt or two into my old carpet-bag, tucked it under my arm, and started for Cape Horn and the Pacific. Quitting the good city of old Manhatto, I duly arrived in New Bedford. It was a Saturday night in December. Much was I disappointed upon learning that the little packet for Nantucket had already sailed, and that no way of reaching that place would offer, till the following Monday.

As most young candidates for the pains and penalties of whaling stop at this same New Bedford, thence to embark on their voyage, it may as well be related that I, for one, had no idea of so doing. For my mind was made up to sail in no other than a Nantucket craft, because there was a fine, boisterous something about everything connected with that famous old island, which amazingly pleased me. Besides though New Bedford has of late been gradually monopolising the business of whaling, and though in this matter poor old Nantucket is now much behind her, yet Nantucket was her great original--the Tyre of this Carthage;--the place where the first dead American whale was stranded. Where else but from Nantucket did those aboriginal whalemen, the Red-Men, first sally out in canoes to give chase to the Leviathan?
And where but from Nantucket, too, did that first adventurous little sloop put forth, partly laden with imported cobblestones--so goes the story--to throw at the whales, in order to discover when they were nigh enough to risk a harpoon from the bowsprit?

Now having a night, a day, and still another night following before me in New Bedford, ere I could embark for my destined port, it became a matter of concernment where I was to eat and sleep meanwhile. It was a very dubious-looking, nay, a very dark and dismal night, bitingly cold and cheerless. I knew no one in the place. With anxious grapnels I had sounded my pocket, and only brought up a few pieces of silver,--So, wherever you go, Ishmael, said I to myself, as I stood in the middle of a dreary street shouldering my bag, and comparing the gloom towards the north with the darkness towards the south--wherever in your wisdom you may conclude to lodge for the night, my dear Ishmael, be sure to inquire the price, and don't be too particular.

With halting steps I paced the streets, and passed the sign of "The Crossed Harpoons"--but it looked too expensive and jolly there. Further on, from the bright red windows of the "Sword-Fish Inn," there came such fervent rays, that it seemed to have melted the packed snow and ice from before the house, for everywhere else the congealed frost lay ten inches thick in a hard, asphaltic pavement,--rather weary for me, when I struck my foot against the flinty projections, because from hard, remorseless service the soles of my boots were in a most miserable plight. Too expensive and jolly, again thought I, pausing one moment to watch the broad glare in the street, and hear the sounds of the tinkling glasses within. But go on, Ishmael, said I at last; don't you hear? get away from before the door; your patched boots are stopping the way. So on I went. I now by instinct followed the streets that took me waterward, for there, doubtless, were the cheapest, if not the cheeriest inns.

Such dreary streets! blocks of blackness, not houses, on either hand, and here and there a candle, like a candle moving about in a tomb. At this hour of the night, of the last day of the week, that quarter of the town proved all but deserted. But presently I came to a smoky light proceeding from a low, wide building, the door of which stood invitingly open. It had a careless look, as if it were meant for the uses of the public; so, entering, the first thing I did was to stumble over an ash-box in the porch. Ha! thought I, ha, as the flying particles almost choked me, are these ashes from that destroyed city, Gomorrah? But "The Crossed Harpoons," and "The Sword-Fish?"--this, then must needs be the sign of "The Trap." However, I picked myself up and hearing a loud voice within, pushed on and opened a second, interior door.

It seemed the great Black Parliament sitting in Tophet. A hundred black faces turned round in their rows to peer; and beyond, a black Angel of Doom was beating a book in a pulpit. It was a negro church; and the preacher's text was about the blackness of darkness, and the weeping and wailing and teeth-gnashing there. Ha, Ishmael, muttered I, backing out, Wretched entertainment at the sign of 'The Trap!'

Moving on, I at last came to a dim sort of light not far from the docks, and heard a forlorn creaking in the air; and looking up, saw a swinging sign over the door with a white painting upon it, faintly representing a tall straight jet of misty spray, and these words underneath--"The Spouter Inn:--Peter Coffin."
Coffin?--Spouter?--Rather ominous in that particular connexion, thought I. But it is a common name in Nantucket, they say, and I suppose this Peter here is an emigrant from there. As the light looked so dim, and the place, for the time, looked quiet enough, and the dilapidated little wooden house itself looked as if it might have been carted here from the ruins of some burnt district, and as the swinging sign had a poverty-stricken sort of creak to it, I thought that here was the very spot for cheap lodgings, and the best of pea coffee.

It was a queer sort of place--a gable-ended old house, one side palsied as it were, and leaning over sadly. It stood on a sharp bleak corner, where that tempestuous wind Euroclydon kept up a worse howling than ever it did about poor Paul's tossed craft. Euroclydon, nevertheless, is a mighty pleasant zephyr to any one in-doors, with his feet on the hob quietly toasting for bed. "In judging of that tempestuous wind called Euroclydon," says an old writer--of whose works I possess the only copy extant--"it maketh a marvellous difference, whether thou lookest out at it from a glass window where the frost is all on the outside, or whether thou observest it from that sashless window, where the frost is on both sides, and of which the wight Death is the only glazier." True enough, thought I, as this passage occurred to my mind--old black-letter, thou reasonest well. Yes, these eyes are windows, and this body of mine is the house. What a pity they didn't stop up the chinks and the crannies though, and thrust in a little lint here and there. But it's too late to make any improvements now. The universe is finished; the copestone is on, and the chips were carted off a million years ago. Poor Lazarus there, chattering his teeth against the curbstone for his pillow, and shaking off his tatters with his shiverings, he might plug up both ears with rags, and put a corn-cob into his mouth, and yet that would not keep out the tempestuous Euroclydon. Euroclydon! says old Dives, in his red silken wrapper--(he had a redder one afterwards) pooh, pooh! What a fine frosty night; how Orion glitters; what northern lights! Let them talk of their oriental summer climes of everlasting conservatories; give me the privilege of making my own summer with my own coals.

But what thinks Lazarus? Can he warm his blue hands by holding them up to the grand northern lights? Would not Lazarus rather be in Sumatra than here? Would he not far rather lay him down lengthwise along the line of the equator; yea, ye gods! go down to the fiery pit itself, in order to keep out this frost?

Now, that Lazarus should lie stranded there on the curbstone before the door of Dives, this is more wonderful than that an iceberg should be moored to one of the Moluccas. Yet Dives himself, he too lives like a Czar in an ice palace made of frozen sighs, and being a president of a temperance society, he only drinks the tepid tears of orphans.

But no more of this blubbering now, we are going a-whaling, and there is plenty of that yet to come. Let us scrape the ice from our frosted feet, and see what sort of a place this "Spouter" may be.
tokenizers/tests/testthat/test-wordcount.R

context("Word counts")

test_that("Word counts work on lists and character vectors", {
  out_l <- count_sentences(docs_l)
  out_c <- count_sentences(docs_c)
  expect_identical(out_l, out_c)

  out_l <- count_words(docs_l)
  out_c <- count_words(docs_c)
  expect_identical(out_l, out_c)

  out_l <- count_characters(docs_l)
  out_c <- count_characters(docs_c)
  expect_identical(out_l, out_c)

  expect_named(out_l, names(docs_l))
  expect_named(out_c, names(docs_c))
})

test_that("Word counts give correct results", {
  input <- "This input has 10 words; doesn't it? Well---sure does."
  expect_equal(10, count_words(input))
  expect_equal(2, count_sentences(input))
  expect_equal(nchar(input), count_characters(input))
})

tokenizers/tests/testthat/test-basic.R

context("Basic tokenizers")

test_that("Character tokenizer works as expected", {
  out_l <- tokenize_characters(docs_l)
  out_c <- tokenize_characters(docs_c)
  out_1 <- tokenize_characters(docs_c[1], simplify = TRUE)
  expect_is(out_l, "list")
  expect_is(out_l[[1]], "character")
  expect_is(out_c, "list")
  expect_is(out_c[[1]], "character")
  expect_is(out_1, "character")
  expect_identical(out_l, out_c)
  expect_identical(out_l[[1]], out_1)
  expect_identical(out_c[[1]], out_1)
  expect_named(out_l, names(docs_l))
  expect_named(out_c, names(docs_c))
  expect_error(tokenize_characters(bad_list))
})

test_that("Character tokenizer produces correct output", {
  # skip_on_os("windows")
  out_1 <- tokenize_characters(docs_c[1], simplify = TRUE)
  expected <- c("c", "h", "a", "p", "t")
  expect_identical(head(out_1, 5), expected)
})

test_that("Word tokenizer works as expected", {
  out_l <- tokenize_words(docs_l)
  out_c <- tokenize_words(docs_c)
  out_1 <- tokenize_words(docs_c[1], simplify = TRUE)
  expect_is(out_l, "list")
  expect_is(out_l[[1]], "character")
  expect_is(out_c, "list")
  expect_is(out_c[[1]], "character")
  expect_is(out_1, "character")
  expect_identical(out_l, out_c)
  expect_identical(out_l[[1]], out_1)
  expect_identical(out_c[[1]], out_1)
  expect_named(out_l, names(docs_l))
  expect_named(out_c, names(docs_c))
  expect_error(tokenize_words(bad_list))
})

test_that("Word tokenizer produces correct output", {
  # skip_on_os("windows")
  out_1 <- tokenize_words(docs_c[1], simplify = TRUE)
  expected <- c("chapter", "1", "loomings", "call", "me")
  expect_identical(head(out_1, 5), expected)
})

test_that("Word tokenizer removes stop words", {
  test <- "Now is the time for every good person"
  test_l <- list(test, test)
  stopwords <- c("is", "the", "for")
  expected <- c("now", "time", "every", "good", "person")
  expected_l <- list(expected, expected)
  expect_equal(tokenize_words(test, simplify = TRUE, stopwords = stopwords),
               expected)
  expect_equal(tokenize_words(test_l, stopwords = stopwords), expected_l)
})

test_that("Word tokenizer can remove punctuation or numbers", {
  test_punct <- "This sentence ... has punctuation, doesn't it?"
  out_punct <- c("this", "sentence", ".", ".", ".", "has", "punctuation",
                 ",", "doesn't", "it", "?")
  test_num <- "In 1968 the GDP was 1.2 trillion."
out_num_f <- c("in", "1968", "the", "gdp", "was", "1.2", "trillion") out_num_t <- c("in", "the", "gdp", "was", "trillion") expect_equal(tokenize_words(test_punct, simplify = TRUE, strip_punct = FALSE), out_punct) expect_equal(tokenize_words(test_num, simplify = TRUE, strip_numeric = FALSE), out_num_f) expect_equal(tokenize_words(test_num, simplify = TRUE, strip_numeric = TRUE), out_num_t) }) test_that("Sentence tokenizer works as expected", { out_l <- tokenize_sentences(docs_l) out_c <- tokenize_sentences(docs_c) out_1 <- tokenize_sentences(docs_c[1], simplify = TRUE) out_1_lc <- tokenize_sentences(docs_c[1], lowercase = TRUE, simplify = TRUE) out_1_pc <- tokenize_sentences(docs_c[1], strip_punct = TRUE, simplify = TRUE) expect_is(out_l, "list") expect_is(out_l[[1]], "character") expect_is(out_c, "list") expect_is(out_c[[1]], "character") expect_is(out_1, "character") expect_identical(out_l, out_c) expect_identical(out_l[[1]], out_1) expect_identical(out_c[[1]], out_1) expect_named(out_l, names(docs_l)) expect_named(out_c, names(docs_c)) expect_error(tokenize_sentences(bad_list)) }) test_that("Sentence tokenizer produces correct output", { # skip_on_os("windows") out_1 <- tokenize_sentences(docs_c[1], simplify = TRUE) out_1_lc <- tokenize_sentences(docs_c[1], lowercase = TRUE, simplify = TRUE) out_1_pc <- tokenize_sentences(docs_c[1], strip_punct = TRUE, simplify = TRUE) expected <- c("CHAPTER 1.", "Loomings.", "Call me Ishmael.") expected_pc <- c("CHAPTER 1", "Loomings", "Call me Ishmael") expect_identical(head(out_1, 3), expected) expect_identical(head(out_1_lc, 3), tolower(expected)) expect_identical(head(out_1_pc, 3), expected_pc) }) test_that("Line tokenizer works as expected", { out_l <- tokenize_lines(docs_l) out_c <- tokenize_lines(docs_c) out_1 <- tokenize_lines(docs_c[1], simplify = TRUE) expect_is(out_l, "list") expect_is(out_l[[1]], "character") expect_is(out_c, "list") expect_is(out_c[[1]], "character") expect_is(out_1, "character") expect_identical(out_l, out_c) expect_identical(out_l[[1]], out_1) expect_identical(out_c[[1]], out_1) expect_named(out_l, names(docs_l)) expect_named(out_c, names(docs_c)) expect_error(tokenize_lines(bad_list)) }) test_that("Sentence tokenizer produces correct output", { # skip_on_os("windows") out_1 <- tokenize_lines(docs_c[1], simplify = TRUE) expected <- c("CHAPTER 1. Loomings.", "Call me Ishmael. 
Some years ago--never mind how long precisely--having") expect_identical(head(out_1, 2), expected) }) test_that("Paragraph tokenizer works as expected", { out_l <- tokenize_paragraphs(docs_l) out_c <- tokenize_paragraphs(docs_c) out_1 <- tokenize_paragraphs(docs_c[1], simplify = TRUE) expect_is(out_l, "list") expect_is(out_l[[1]], "character") expect_is(out_c, "list") expect_is(out_c[[1]], "character") expect_is(out_1, "character") expect_identical(out_l, out_c) expect_identical(out_l[[1]], out_1) expect_identical(out_c[[1]], out_1) expect_named(out_l, names(docs_l)) expect_named(out_c, names(docs_c)) expect_error(tokenize_paragraphs(bad_list)) }) test_that("Paragraph tokenizer produces correct output", { # skip_on_os("windows") out_1 <- tokenize_paragraphs(docs_c[1], simplify = TRUE) expected <- c("There now is your insular city of the Manhattoes") expect_true(grepl(expected, out_1[3])) }) test_that("Regex tokenizer works as expected", { out_l <- tokenize_regex(docs_l, pattern = "[[:punct:]\n]") out_c <- tokenize_regex(docs_c, pattern = "[[:punct:]\n]") out_1 <- tokenize_regex(docs_c[1], pattern = "[[:punct:]\n]", simplify = TRUE) expect_is(out_l, "list") expect_is(out_l[[1]], "character") expect_is(out_c, "list") expect_is(out_c[[1]], "character") expect_is(out_1, "character") expect_identical(out_l, out_c) expect_identical(out_l[[1]], out_1) expect_identical(out_c[[1]], out_1) expect_named(out_l, names(docs_l)) expect_named(out_c, names(docs_c)) expect_error(tokenize_paragraphs(bad_list)) }) test_that("Regex tokenizer produces correct output", { # skip_on_os("windows") out_1 <- tokenize_regex(docs_c[1], pattern = "[[:punct:]\n]", simplify = TRUE) expected <- c("CHAPTER 1", " Loomings", "Call me Ishmael", " Some years ago", "never mind how long precisely") expect_identical(head(out_1, 5), expected) })tokenizers/tests/testthat/test-tif.R0000644000176200001440000000440313256545214017343 0ustar liggesuserscontext("Text Interchange Format") test_that("Can detect a TIF compliant data.frame", { expect_true(is_corpus_df(docs_df)) bad_df <- docs_df bad_df$doc_id <- NULL expect_error(is_corpus_df(bad_df)) }) test_that("Can coerce a TIF compliant data.frame to a character vector", { output <- docs_df$text names(output) <- docs_df$doc_id expect_identical(corpus_df_as_corpus_vector(docs_df), output) }) test_that("Different methods produce identical output", { expect_identical(tokenize_words(docs_c), tokenize_words(docs_df)) expect_identical(tokenize_words(docs_l), tokenize_words(docs_df)) expect_identical(tokenize_characters(docs_c), tokenize_characters(docs_df)) expect_identical(tokenize_characters(docs_l), tokenize_characters(docs_df)) expect_identical(tokenize_sentences(docs_c), tokenize_sentences(docs_df)) expect_identical(tokenize_sentences(docs_l), tokenize_sentences(docs_df)) expect_identical(tokenize_lines(docs_c), tokenize_lines(docs_df)) expect_identical(tokenize_lines(docs_l), tokenize_lines(docs_df)) expect_identical(tokenize_paragraphs(docs_c), tokenize_paragraphs(docs_df)) expect_identical(tokenize_paragraphs(docs_l), tokenize_paragraphs(docs_df)) expect_identical(tokenize_regex(docs_c), tokenize_regex(docs_df)) expect_identical(tokenize_regex(docs_l), tokenize_regex(docs_df)) expect_identical(tokenize_tweets(docs_c), tokenize_tweets(docs_df)) expect_identical(tokenize_tweets(docs_l), tokenize_tweets(docs_df)) expect_identical(tokenize_ngrams(docs_c), tokenize_ngrams(docs_df)) expect_identical(tokenize_ngrams(docs_l), tokenize_ngrams(docs_df)) 
  expect_identical(tokenize_skip_ngrams(docs_c), tokenize_skip_ngrams(docs_df))
  expect_identical(tokenize_skip_ngrams(docs_l), tokenize_skip_ngrams(docs_df))
  expect_identical(tokenize_ptb(docs_c), tokenize_ptb(docs_df))
  expect_identical(tokenize_ptb(docs_l), tokenize_ptb(docs_df))
  expect_identical(tokenize_character_shingles(docs_c),
                   tokenize_character_shingles(docs_df))
  expect_identical(tokenize_character_shingles(docs_l),
                   tokenize_character_shingles(docs_df))
  expect_identical(tokenize_word_stems(docs_c), tokenize_word_stems(docs_df))
  expect_identical(tokenize_word_stems(docs_l), tokenize_word_stems(docs_df))
})

tokenizers/tests/testthat/test-ptb.R

context("PTB tokenizer")

test_that("PTB tokenizer works as expected", {
  out_l <- tokenize_ptb(docs_l)
  out_c <- tokenize_ptb(docs_c)
  out_1 <- tokenize_ptb(docs_c[1], simplify = TRUE)
  expect_is(out_l, "list")
  expect_is(out_l[[1]], "character")
  expect_is(out_c, "list")
  expect_is(out_c[[1]], "character")
  expect_is(out_1, "character")
  expect_identical(out_l, out_c)
  expect_identical(out_l[[1]], out_1)
  expect_identical(out_c[[1]], out_1)
  expect_named(out_l, names(docs_l))
  expect_named(out_c, names(docs_c))
  expect_error(tokenize_ptb(bad_list))
})

test_that("PTB tokenizer produces correct output", {
  sents <- c(paste0("Good muffins cost $3.88\nin New York. ",
                    "Please buy me\\ntwo of them.\\nThanks."),
             "They'll save and invest more.",
             "hi, my name can't hello,")
  expected <- list(c("Good", "muffins", "cost", "$", "3.88", "in", "New",
                     "York.", "Please", "buy", "me\\ntwo", "of",
                     "them.\\nThanks", "."),
                   c("They", "'ll", "save", "and", "invest", "more", "."),
                   c("hi", ",", "my", "name", "ca", "n't", "hello", ","))
  expect_identical(tokenize_ptb(sents), expected)
  expect_identical(tokenize_ptb("This can't work.", lowercase = TRUE,
                                simplify = TRUE),
                   c("this", "ca", "n't", "work", "."))
})

tokenizers/tests/testthat/test-tokenize_tweets.R

context("Tweet tokenizer")

test_that("tweet tokenizer works correctly with case", {
  txt <- c(t1 = "Try this: tokenizers at @rOpenSci https://twitter.com/search?q=ropensci&src=typd",
           t2 = "#rstats awesome Package! @rOpenSci",
           t3 = "one two three Four #FIVE")
  out_tw1 <- tokenize_tweets(txt, lowercase = TRUE)
  expect_identical(out_tw1$t2, c("#rstats", "awesome", "package", "@rOpenSci"))
  expect_identical(out_tw1$t3, c("one", "two", "three", "four", "#FIVE"))
  out_tw2 <- tokenize_tweets(txt, lowercase = FALSE)
  expect_identical(out_tw2$t2, c("#rstats", "awesome", "Package", "@rOpenSci"))
  expect_identical(out_tw2$t3, c("one", "two", "three", "Four", "#FIVE"))
})

test_that("tweet tokenizer works correctly with strip_punctuation", {
  txt <- c(t1 = "Try this: tokenizers at @rOpenSci https://twitter.com/search?q=ropensci&src=typd",
           t2 = "#rstats awesome Package! @rOpenSci",
           t3 = "one two three Four #FIVE")
  out_tw1 <- tokenize_tweets(txt, strip_punct = TRUE, lowercase = TRUE)
  expect_identical(out_tw1$t2, c("#rstats", "awesome", "package", "@rOpenSci"))
  expect_identical(out_tw1$t3, c("one", "two", "three", "four", "#FIVE"))
  out_tw2 <- tokenize_tweets(txt, strip_punct = FALSE, lowercase = TRUE)
  expect_identical(
    out_tw2$t1,
    c("try", "this", ":", "tokenizers", "at", "@rOpenSci",
      "https://twitter.com/search?q=ropensci&src=typd")
  )
})

test_that("tweet tokenizer works correctly with strip_url", {
  txt <- c(t1 = "Tokenizers at @rOpenSci https://twitter.com/search?q=ropensci&src=typd")
  out_tw1 <- tokenize_tweets(txt, strip_punct = TRUE, strip_url = FALSE)
  expect_identical(
    out_tw1$t1,
    c("tokenizers", "at", "@rOpenSci",
      "https://twitter.com/search?q=ropensci&src=typd")
  )
  out_tw2 <- tokenize_tweets(txt, strip_punct = TRUE, strip_url = TRUE)
  expect_identical(
    out_tw2$t1,
    c("tokenizers", "at", "@rOpenSci")
  )
})

test_that("names are preserved with tweet tokenizer", {
  expect_equal(
    names(tokenize_tweets(c(t1 = "Larry, moe, and curly",
                            t2 = "@ropensci #rstats"))),
    c("t1", "t2")
  )
  expect_equal(
    names(tokenize_tweets(c("Larry, moe, and curly", "@ropensci #rstats"))),
    NULL
  )
})

test_that("punctuation as part of tweets can be preserved", {
  txt <- c(t1 = "We love #rstats!", t2 = "@rOpenSci: See you at UseR!")
  expect_equal(
    tokenize_tweets(txt, strip_punct = FALSE, lowercase = FALSE),
    list(t1 = c("We", "love", "#rstats", "!"),
         t2 = c("@rOpenSci", ":", "See", "you", "at", "UseR", "!"))
  )
  expect_equal(
    tokenize_tweets(txt, strip_punct = TRUE, lowercase = FALSE),
    list(t1 = c("We", "love", "#rstats"),
         t2 = c("@rOpenSci", "See", "you", "at", "UseR"))
  )
})

tokenizers/tests/testthat/test-chunking.R

context("Document chunking")

test_that("Document chunking works on lists and character vectors", {
  chunk_size <- 10
  out_l <- chunk_text(docs_l, chunk_size = chunk_size)
  out_c <- chunk_text(docs_c, chunk_size = chunk_size)
  expect_is(out_l, "list")
  expect_is(out_l[[1]], "character")
  expect_is(out_c, "list")
  expect_is(out_c[[1]], "character")
  expect_identical(out_l, out_c)
  expect_identical(out_l[[1]], out_c[[1]])
  expect_identical(out_c[[1]], out_c[[1]])
  expect_named(out_l, names(out_c))
  expect_named(out_c, names(out_l))
  expect_error(chunk_text(bad_list))
})

test_that("Document chunking splits documents apart correctly", {
  test_doc <- "This is a sentence with exactly eight words. Here's two. And now here are ten words in a great sentence. And five or six left over."
  out <- chunk_text(test_doc, chunk_size = 10, doc_id = "test")
  out_wc <- count_words(out)
  test_wc <- c(10L, 10L, 6L)
  names(test_wc) <- c("test-1", "test-2", "test-3")
  expect_named(out, names(test_wc))
  expect_identical(out_wc, test_wc)
  out_short <- chunk_text("This is a short text")
  expect_equal(count_words(out_short[[1]]), 5)
  expect_named(out_short, NULL)
})

tokenizers/tests/testthat/moby-ch1.txt

CHAPTER 1. Loomings.

Call me Ishmael. Some years ago--never mind how long precisely--having little or no money in my purse, and nothing particular to interest me on shore, I thought I would sail about a little and see the watery part of the world. It is a way I have of driving off the spleen and regulating the circulation.
Whenever I find myself growing grim about the mouth; whenever it is a damp, drizzly November in my soul; whenever I find myself involuntarily pausing before coffin warehouses, and bringing up the rear of every funeral I meet; and especially whenever my hypos get such an upper hand of me, that it requires a strong moral principle to prevent me from deliberately stepping into the street, and methodically knocking people's hats off--then, I account it high time to get to sea as soon as I can. This is my substitute for pistol and ball. With a philosophical flourish Cato throws himself upon his sword; I quietly take to the ship. There is nothing surprising in this. If they but knew it, almost all men in their degree, some time or other, cherish very nearly the same feelings towards the ocean with me. There now is your insular city of the Manhattoes, belted round by wharves as Indian isles by coral reefs--commerce surrounds it with her surf. Right and left, the streets take you waterward. Its extreme downtown is the battery, where that noble mole is washed by waves, and cooled by breezes, which a few hours previous were out of sight of land. Look at the crowds of water-gazers there. Circumambulate the city of a dreamy Sabbath afternoon. Go from Corlears Hook to Coenties Slip, and from thence, by Whitehall, northward. What do you see?--Posted like silent sentinels all around the town, stand thousands upon thousands of mortal men fixed in ocean reveries. Some leaning against the spiles; some seated upon the pier-heads; some looking over the bulwarks of ships from China; some high aloft in the rigging, as if striving to get a still better seaward peep. But these are all landsmen; of week days pent up in lath and plaster--tied to counters, nailed to benches, clinched to desks. How then is this? Are the green fields gone? What do they here? But look! here come more crowds, pacing straight for the water, and seemingly bound for a dive. Strange! Nothing will content them but the extremest limit of the land; loitering under the shady lee of yonder warehouses will not suffice. No. They must get just as nigh the water as they possibly can without falling in. And there they stand--miles of them--leagues. Inlanders all, they come from lanes and alleys, streets and avenues--north, east, south, and west. Yet here they all unite. Tell me, does the magnetic virtue of the needles of the compasses of all those ships attract them thither? Once more. Say you are in the country; in some high land of lakes. Take almost any path you please, and ten to one it carries you down in a dale, and leaves you there by a pool in the stream. There is magic in it. Let the most absent-minded of men be plunged in his deepest reveries--stand that man on his legs, set his feet a-going, and he will infallibly lead you to water, if water there be in all that region. Should you ever be athirst in the great American desert, try this experiment, if your caravan happen to be supplied with a metaphysical professor. Yes, as every one knows, meditation and water are wedded for ever. But here is an artist. He desires to paint you the dreamiest, shadiest, quietest, most enchanting bit of romantic landscape in all the valley of the Saco. What is the chief element he employs? There stand his trees, each with a hollow trunk, as if a hermit and a crucifix were within; and here sleeps his meadow, and there sleep his cattle; and up from yonder cottage goes a sleepy smoke. 
Deep into distant woodlands winds a mazy way, reaching to overlapping spurs of mountains bathed in their hill-side blue. But though the picture lies thus tranced, and though this pine-tree shakes down its sighs like leaves upon this shepherd's head, yet all were vain, unless the shepherd's eye were fixed upon the magic stream before him. Go visit the Prairies in June, when for scores on scores of miles you wade knee-deep among Tiger-lilies--what is the one charm wanting?--Water--there is not a drop of water there! Were Niagara but a cataract of sand, would you travel your thousand miles to see it? Why did the poor poet of Tennessee, upon suddenly receiving two handfuls of silver, deliberate whether to buy him a coat, which he sadly needed, or invest his money in a pedestrian trip to Rockaway Beach? Why is almost every robust healthy boy with a robust healthy soul in him, at some time or other crazy to go to sea? Why upon your first voyage as a passenger, did you yourself feel such a mystical vibration, when first told that you and your ship were now out of sight of land? Why did the old Persians hold the sea holy? Why did the Greeks give it a separate deity, and own brother of Jove? Surely all this is not without meaning. And still deeper the meaning of that story of Narcissus, who because he could not grasp the tormenting, mild image he saw in the fountain, plunged into it and was drowned. But that same image, we ourselves see in all rivers and oceans. It is the image of the ungraspable phantom of life; and this is the key to it all. Now, when I say that I am in the habit of going to sea whenever I begin to grow hazy about the eyes, and begin to be over conscious of my lungs, I do not mean to have it inferred that I ever go to sea as a passenger. For to go as a passenger you must needs have a purse, and a purse is but a rag unless you have something in it. Besides, passengers get sea-sick--grow quarrelsome--don't sleep of nights--do not enjoy themselves much, as a general thing;--no, I never go as a passenger; nor, though I am something of a salt, do I ever go to sea as a Commodore, or a Captain, or a Cook. I abandon the glory and distinction of such offices to those who like them. For my part, I abominate all honourable respectable toils, trials, and tribulations of every kind whatsoever. It is quite as much as I can do to take care of myself, without taking care of ships, barques, brigs, schooners, and what not. And as for going as cook,--though I confess there is considerable glory in that, a cook being a sort of officer on ship-board--yet, somehow, I never fancied broiling fowls;--though once broiled, judiciously buttered, and judgmatically salted and peppered, there is no one who will speak more respectfully, not to say reverentially, of a broiled fowl than I will. It is out of the idolatrous dotings of the old Egyptians upon broiled ibis and roasted river horse, that you see the mummies of those creatures in their huge bake-houses the pyramids. No, when I go to sea, I go as a simple sailor, right before the mast, plumb down into the forecastle, aloft there to the royal mast-head. True, they rather order me about some, and make me jump from spar to spar, like a grasshopper in a May meadow. And at first, this sort of thing is unpleasant enough. It touches one's sense of honour, particularly if you come of an old established family in the land, the Van Rensselaers, or Randolphs, or Hardicanutes. 
And more than all, if just previous to putting your hand into the tar-pot, you have been lording it as a country schoolmaster, making the tallest boys stand in awe of you. The transition is a keen one, I assure you, from a schoolmaster to a sailor, and requires a strong decoction of Seneca and the Stoics to enable you to grin and bear it. But even this wears off in time. What of it, if some old hunks of a sea-captain orders me to get a broom and sweep down the decks? What does that indignity amount to, weighed, I mean, in the scales of the New Testament? Do you think the archangel Gabriel thinks anything the less of me, because I promptly and respectfully obey that old hunks in that particular instance? Who ain't a slave? Tell me that. Well, then, however the old sea-captains may order me about--however they may thump and punch me about, I have the satisfaction of knowing that it is all right; that everybody else is one way or other served in much the same way--either in a physical or metaphysical point of view, that is; and so the universal thump is passed round, and all hands should rub each other's shoulder-blades, and be content. Again, I always go to sea as a sailor, because they make a point of paying me for my trouble, whereas they never pay passengers a single penny that I ever heard of. On the contrary, passengers themselves must pay. And there is all the difference in the world between paying and being paid. The act of paying is perhaps the most uncomfortable infliction that the two orchard thieves entailed upon us. But BEING PAID,--what will compare with it? The urbane activity with which a man receives money is really marvellous, considering that we so earnestly believe money to be the root of all earthly ills, and that on no account can a monied man enter heaven. Ah! how cheerfully we consign ourselves to perdition! Finally, I always go to sea as a sailor, because of the wholesome exercise and pure air of the fore-castle deck. For as in this world, head winds are far more prevalent than winds from astern (that is, if you never violate the Pythagorean maxim), so for the most part the Commodore on the quarter-deck gets his atmosphere at second hand from the sailors on the forecastle. He thinks he breathes it first; but not so. In much the same way do the commonalty lead their leaders in many other things, at the same time that the leaders little suspect it. But wherefore it was that after having repeatedly smelt the sea as a merchant sailor, I should now take it into my head to go on a whaling voyage; this the invisible police officer of the Fates, who has the constant surveillance of me, and secretly dogs me, and influences me in some unaccountable way--he can better answer than any one else. And, doubtless, my going on this whaling voyage, formed part of the grand programme of Providence that was drawn up a long time ago. It came in as a sort of brief interlude and solo between more extensive performances. I take it that this part of the bill must have run something like this: "GRAND CONTESTED ELECTION FOR THE PRESIDENCY OF THE UNITED STATES. "WHALING VOYAGE BY ONE ISHMAEL. "BLOODY BATTLE IN AFFGHANISTAN." 
Though I cannot tell why it was exactly that those stage managers, the Fates, put me down for this shabby part of a whaling voyage, when others were set down for magnificent parts in high tragedies, and short and easy parts in genteel comedies, and jolly parts in farces--though I cannot tell why this was exactly; yet, now that I recall all the circumstances, I think I can see a little into the springs and motives which being cunningly presented to me under various disguises, induced me to set about performing the part I did, besides cajoling me into the delusion that it was a choice resulting from my own unbiased freewill and discriminating judgment. Chief among these motives was the overwhelming idea of the great whale himself. Such a portentous and mysterious monster roused all my curiosity. Then the wild and distant seas where he rolled his island bulk; the undeliverable, nameless perils of the whale; these, with all the attending marvels of a thousand Patagonian sights and sounds, helped to sway me to my wish. With other men, perhaps, such things would not have been inducements; but as for me, I am tormented with an everlasting itch for things remote. I love to sail forbidden seas, and land on barbarous coasts. Not ignoring what is good, I am quick to perceive a horror, and could still be social with it--would they let me--since it is but well to be on friendly terms with all the inmates of the place one lodges in. By reason of these things, then, the whaling voyage was welcome; the great flood-gates of the wonder-world swung open, and in the wild conceits that swayed me to my purpose, two and two there floated into my inmost soul, endless processions of the whale, and, mid most of them all, one grand hooded phantom, like a snow hill in the air. tokenizers/tests/testthat/moby-ch3.txt0000644000176200001440000007776212775200571017664 0ustar liggesusersCHAPTER 3 The Spouter-Inn Entering that gable-ended Spouter-Inn, you found yourself in a wide, low, straggling entry with old-fashioned wainscots, reminding one of the bulwarks of some condemned old craft. On one side hung a very large oil painting so thoroughly besmoked, and every way defaced, that in the unequal crosslights by which you viewed it, it was only by diligent study and a series of systematic visits to it, and careful inquiry of the neighbors, that you could any way arrive at an understanding of its purpose. Such unaccountable masses of shades and shadows, that at first you almost thought some ambitious young artist, in the time of the New England hags, had endeavored to delineate chaos bewitched. But by dint of much and earnest contemplation, and oft repeated ponderings, and especially by throwing open the little window towards the back of the entry, you at last come to the conclusion that such an idea, however wild, might not be altogether unwarranted. But what most puzzled and confounded you was a long, limber, portentous, black mass of something hovering in the centre of the picture over three blue, dim, perpendicular lines floating in a nameless yeast. A boggy, soggy, squitchy picture truly, enough to drive a nervous man distracted. Yet was there a sort of indefinite, half-attained, unimaginable sublimity about it that fairly froze you to it, till you involuntarily took an oath with yourself to find out what that marvellous painting meant. 
Ever and anon a bright, but, alas, deceptive idea would dart you through.-- It's the Black Sea in a midnight gale.--It's the unnatural combat of the four primal elements.--It's a blasted heath.-- It's a Hyperborean winter scene.--It's the breaking-up of the icebound stream of Time. But at last all these fancies yielded to that one portentous something in the picture's midst. That once found out, and all the rest were plain. But stop; does it not bear a faint resemblance to a gigantic fish? even the great leviathan himself? In fact, the artist's design seemed this: a final theory of my own, partly based upon the aggregated opinions of many aged persons with whom I conversed upon the subject. The picture represents a Cape-Horner in a great hurricane; the half-foundered ship weltering there with its three dismantled masts alone visible; and an exasperated whale, purposing to spring clean over the craft, is in the enormous act of impaling himself upon the three mast-heads. The opposite wall of this entry was hung all over with a heathenish array of monstrous clubs and spears. Some were thickly set with glittering teeth resembling ivory saws; others were tufted with knots of human hair; and one was sickle-shaped, with a vast handle sweeping round like the segment made in the new-mown grass by a long-armed mower. You shuddered as you gazed, and wondered what monstrous cannibal and savage could ever have gone a death-harvesting with such a hacking, horrifying implement. Mixed with these were rusty old whaling lances and harpoons all broken and deformed. Some were storied weapons. With this once long lance, now wildly elbowed, fifty years ago did Nathan Swain kill fifteen whales between a sunrise and a sunset. And that harpoon--so like a corkscrew now--was flung in Javan seas, and run away with by a whale, years afterwards slain off the Cape of Blanco. The original iron entered nigh the tail, and, like a restless needle sojourning in the body of a man, travelled full forty feet, and at last was found imbedded in the hump. Crossing this dusky entry, and on through yon low-arched way-- cut through what in old times must have been a great central chimney with fireplaces all round--you enter the public room. A still duskier place is this, with such low ponderous beams above, and such old wrinkled planks beneath, that you would almost fancy you trod some old craft's cockpits, especially of such a howling night, when this corner-anchored old ark rocked so furiously. On one side stood a long, low, shelf-like table covered with cracked glass cases, filled with dusty rarities gathered from this wide world's remotest nooks. Projecting from the further angle of the room stands a dark-looking den--the bar--a rude attempt at a right whale's head. Be that how it may, there stands the vast arched bone of the whale's jaw, so wide, a coach might almost drive beneath it. Within are shabby shelves, ranged round with old decanters, bottles, flasks; and in those jaws of swift destruction, like another cursed Jonah (by which name indeed they called him), bustles a little withered old man, who, for their money, dearly sells the sailors deliriums and death. Abominable are the tumblers into which he pours his poison. Though true cylinders without--within, the villanous green goggling glasses deceitfully tapered downwards to a cheating bottom. Parallel meridians rudely pecked into the glass, surround these footpads' goblets. 
Fill to this mark, and your charge is but a penny; to this a penny more; and so on to the full glass-- the Cape Horn measure, which you may gulp down for a shilling. Upon entering the place I found a number of young seamen gathered about a table, examining by a dim light divers specimens of skrimshander. I sought the landlord, and telling him I desired to be accommodated with a room, received for answer that his house was full-- not a bed unoccupied. "But avast," he added, tapping his forehead, "you haint no objections to sharing a harpooneer's blanket, have ye? I s'pose you are goin' a-whalin', so you'd better get used to that sort of thing." I told him that I never liked to sleep two in a bed; that if I should ever do so, it would depend upon who the harpooneer might be, and that if he (the landlord) really had no other place for me, and the harpooneer was not decidedly objectionable, why rather than wander further about a strange town on so bitter a night, I would put up with the half of any decent man's blanket. "I thought so. All right; take a seat. Supper?--you want supper? Supper'll be ready directly." I sat down on an old wooden settle, carved all over like a bench on the Battery. At one end a ruminating tar was still further adorning it with his jack-knife, stooping over and diligently working away at the space between his legs. He was trying his hand at a ship under full sail, but he didn't make much headway, I thought. At last some four or five of us were summoned to our meal in an adjoining room. It was cold as Iceland-- no fire at all--the landlord said he couldn't afford it. Nothing but two dismal tallow candles, each in a winding sheet. We were fain to button up our monkey jackets, and hold to our lips cups of scalding tea with our half frozen fingers. But the fare was of the most substantial kind--not only meat and potatoes, but dumplings; good heavens! dumplings for supper! One young fellow in a green box coat, addressed himself to these dumplings in a most direful manner. "My boy," said the landlord, "you'll have the nightmare to a dead sartainty." "Landlord," I whispered, "that aint the harpooneer is it?" "Oh, no," said he, looking a sort of diabolically funny, "the harpooneer is a dark complexioned chap. He never eats dumplings, he don't-- he eats nothing but steaks, and he likes 'em rare." "The devil he does," says I. "Where is that harpooneer? Is he here?" "He'll be here afore long," was the answer. I could not help it, but I began to feel suspicious of this "dark complexioned" harpooneer. At any rate, I made up my mind that if it so turned out that we should sleep together, he must undress and get into bed before I did. Supper over, the company went back to the bar-room, when, knowing not what else to do with myself, I resolved to spend the rest of the evening as a looker on. Presently a rioting noise was heard without. Starting up, the landlord cried, "That's the Grampus's crew. I seed her reported in the offing this morning; a three years' voyage, and a full ship. Hurrah, boys; now we'll have the latest news from the Feegees." A tramping of sea boots was heard in the entry; the door was flung open, and in rolled a wild set of mariners enough. Enveloped in their shaggy watch coats, and with their heads muffled in woollen comforters, all bedarned and ragged, and their beards stiff with icicles, they seemed an eruption of bears from Labrador. They had just landed from their boat, and this was the first house they entered. 
No wonder, then, that they made a straight wake for the whale's mouth-- the bar--when the wrinkled little old Jonah, there officiating, soon poured them out brimmers all round. One complained of a bad cold in his head, upon which Jonah mixed him a pitch-like potion of gin and molasses, which he swore was a sovereign cure for all colds and catarrhs whatsoever, never mind of how long standing, or whether caught off the coast of Labrador, or on the weather side of an ice-island. The liquor soon mounted into their heads, as it generally does even with the arrantest topers newly landed from sea, and they began capering about most obstreperously. I observed, however, that one of them held somewhat aloof, and though he seemed desirous not to spoil the hilarity of his shipmates by his own sober face, yet upon the whole he refrained from making as much noise as the rest. This man interested me at once; and since the sea-gods had ordained that he should soon become my shipmate (though but a sleeping partner one, so far as this narrative is concerned), I will here venture upon a little description of him. He stood full six feet in height, with noble shoulders, and a chest like a coffer-dam. I have seldom seen such brawn in a man. His face was deeply brown and burnt, making his white teeth dazzling by the contrast; while in the deep shadows of his eyes floated some reminiscences that did not seem to give him much joy. His voice at once announced that he was a Southerner, and from his fine stature, I thought he must be one of those tall mountaineers from the Alleghanian Ridge in Virginia. When the revelry of his companions had mounted to its height, this man slipped away unobserved, and I saw no more of him till he became my comrade on the sea. In a few minutes, however, he was missed by his shipmates, and being, it seems, for some reason a huge favorite with them, they raised a cry of "Bulkington! Bulkington! where's Bulkington?" and darted out of the house in pursuit of him. It was now about nine o'clock, and the room seeming almost supernaturally quiet after these orgies, I began to congratulate myself upon a little plan that had occurred to me just previous to the entrance of the seamen. No man prefers to sleep two in a bed. In fact, you would a good deal rather not sleep with your own brother. I don't know how it is, but people like to be private when they are sleeping. And when it comes to sleeping with an unknown stranger, in a strange inn, in a strange town, and that stranger a harpooneer, then your objections indefinitely multiply. Nor was there any earthly reason why I as a sailor should sleep two in a bed, more than anybody else; for sailors no more sleep two in a bed at sea, than bachelor Kings do ashore. To be sure they all sleep together in one apartment, but you have your own hammock, and cover yourself with your own blanket, and sleep in your own skin. The more I pondered over this harpooneer, the more I abominated the thought of sleeping with him. It was fair to presume that being a harpooneer, his linen or woollen, as the case might be, would not be of the tidiest, certainly none of the finest. I began to twitch all over. Besides, it was getting late, and my decent harpooneer ought to be home and going bedwards. Suppose now, he should tumble in upon me at midnight-- how could I tell from what vile hole he had been coming? "Landlord! I've changed my mind about that harpooneer.-- I shan't sleep with him. I'll try the bench here." 
"Just as you please; I'm sorry I cant spare ye a tablecloth for a mattress, and it's a plaguy rough board here"--feeling of the knots and notches. "But wait a bit, Skrimshander; I've got a carpenter's plane there in the bar--wait, I say, and I'll make ye snug enough." So saying he procured the plane; and with his old silk handkerchief first dusting the bench, vigorously set to planing away at my bed, the while grinning like an ape. The shavings flew right and left; till at last the plane-iron came bump against an indestructible knot. The landlord was near spraining his wrist, and I told him for heaven's sake to quit--the bed was soft enough to suit me, and I did not know how all the planing in the world could make eider down of a pine plank. So gathering up the shavings with another grin, and throwing them into the great stove in the middle of the room, he went about his business, and left me in a brown study. I now took the measure of the bench, and found that it was a foot too short; but that could be mended with a chair. But it was a foot too narrow, and the other bench in the room was about four inches higher than the planed one-- so there was no yoking them. I then placed the first bench lengthwise along the only clear space against the wall, leaving a little interval between, for my back to settle down in. But I soon found that there came such a draught of cold air over me from under the sill of the window, that this plan would never do at all, especially as another current from the rickety door met the one from the window, and both together formed a series of small whirlwinds in the immediate vicinity of the spot where I had thought to spend the night. The devil fetch that harpooneer, thought I, but stop, couldn't I steal a march on him--bolt his door inside, and jump into his bed, not to be wakened by the most violent knockings? It seemed no bad idea but upon second thoughts I dismissed it. For who could tell but what the next morning, so soon as I popped out of the room, the harpooneer might be standing in the entry, all ready to knock me down! Still looking around me again, and seeing no possible chance of spending a sufferable night unless in some other person's bed, I began to think that after all I might be cherishing unwarrantable prejudices against this unknown harpooneer. Thinks I, I'll wait awhile; he must be dropping in before long. I'll have a good look at him then, and perhaps we may become jolly good bedfellows after all--there's no telling. But though the other boarders kept coming in by ones, twos, and threes, and going to bed, yet no sign of my harpooneer. "Landlord! said I, "what sort of a chap is he--does he always keep such late hours?" It was now hard upon twelve o'clock. The landlord chuckled again with his lean chuckle, and seemed to be mightily tickled at something beyond my comprehension. "No," he answered, "generally he's an early bird--airley to bed and airley to rise--yea, he's the bird what catches the worm. But to-night he went out a peddling, you see, and I don't see what on airth keeps him so late, unless, may be, he can't sell his head." "Can't sell his head?--What sort of a bamboozingly story is this you are telling me?" getting into a towering rage. "Do you pretend to say, landlord, that this harpooneer is actually engaged this blessed Saturday night, or rather Sunday morning, in peddling his head around this town?" "That's precisely it," said the landlord, "and I told him he couldn't sell it here, the market's overstocked." "With what?" shouted I. 
"With heads to be sure; ain't there too many heads in the world?" "I tell you what it is, landlord," said I quite calmly, "you'd better stop spinning that yarn to me--I'm not green." "May be not," taking out a stick and whittling a toothpick, "but I rayther guess you'll be done brown if that ere harpooneer hears you a slanderin' his head." "I'll break it for him," said I, now flying into a passion again at this unaccountable farrago of the landlord's. "It's broke a'ready," said he. "Broke," said I--"broke, do you mean?" "Sartain, and that's the very reason he can't sell it, I guess." "Landlord," said I, going up to him as cool as Mt. Hecla in a snowstorm--"landlord, stop whittling. You and I must understand one another, and that too without delay. I come to your house and want a bed; you tell me you can only give me half a one; that the other half belongs to a certain harpooneer. And about this harpooneer, whom I have not yet seen, you persist in telling me the most mystifying and exasperating stories tending to beget in me an uncomfortable feeling towards the man whom you design for my bedfellow--a sort of connexion, landlord, which is an intimate and confidential one in the highest degree. I now demand of you to speak out and tell me who and what this harpooneer is, and whether I shall be in all respects safe to spend the night with him. And in the first place, you will be so good as to unsay that story about selling his head, which if true I take to be good evidence that this harpooneer is stark mad, and I've no idea of sleeping with a madman; and you, sir, you I mean, landlord, you, sir, by trying to induce me to do so knowingly would thereby render yourself liable to a criminal prosecution." "Wall," said the landlord, fetching a long breath, "that's a purty long sarmon for a chap that rips a little now and then. But be easy, be easy, this here harpooneer I have been tellin' you of has just arrived from the south seas, where he bought up a lot of 'balmed New Zealand heads (great curios, you know), and he's sold all on 'em but one, and that one he's trying to sell to-night, cause to-morrow's Sunday, and it would not do to be sellin' human heads about the streets when folks is goin' to churches. He wanted to last Sunday, but I stopped him just as he was goin' out of the door with four heads strung on a string, for all the airth like a string of inions." This account cleared up the otherwise unaccountable mystery, and showed that the landlord, after all, had had no idea of fooling me-- but at the same time what could I think of a harpooneer who stayed out of a Saturday night clean into the holy Sabbath, engaged in such a cannibal business as selling the heads of dead idolators? "Depend upon it, landlord, that harpooneer is a dangerous man." "He pays reg'lar," was the rejoinder. "But come, it's getting dreadful late, you had better be turning flukes--it's a nice bed: Sal and me slept in that ere bed the night we were spliced. There's plenty of room for two to kick about in that bed; it's an almighty big bed that. Why, afore we give it up, Sal used to put our Sam and little Johnny in the foot of it. But I got a dreaming and sprawling about one night, and somehow, Sam got pitched on the floor, and came near breaking his arm. After that, Sal said it wouldn't do. Come along here, I'll give ye a glim in a jiffy;" and so saying he lighted a candle and held it towards me, offering to lead the way. 
But I stood irresolute; when looking at a clock in the corner, he exclaimed "I vum it's Sunday--you won't see that harpooneer to-night; he's come to anchor somewhere--come along then; do come; won't ye come?" I considered the matter a moment, and then up stairs we went, and I was ushered into a small room, cold as a clam, and furnished, sure enough, with a prodigious bed, almost big enough indeed for any four harpooneers to sleep abreast. "There," said the landlord, placing the candle on a crazy old sea chest that did double duty as a wash-stand and centre table; "there, make yourself comfortable now; and good night to ye." I turned round from eyeing the bed, but he had disappeared. Folding back the counterpane, I stooped over the bed. Though none of the most elegant, it yet stood the scrutiny tolerably well. I then glanced round the room; and besides the bedstead and centre table, could see no other furniture belonging to the place, but a rude shelf, the four walls, and a papered fireboard representing a man striking a whale. Of things not properly belonging to the room, there was a hammock lashed up, and thrown upon the floor in one corner; also a large seaman's bag, containing the harpooneer's wardrobe, no doubt in lieu of a land trunk. Likewise, there was a parcel of outlandish bone fish hooks on the shelf over the fire-place, and a tall harpoon standing at the head of the bed. But what is this on the chest? I took it up, and held it close to the light, and felt it, and smelt it, and tried every way possible to arrive at some satisfactory conclusion concerning it. I can compare it to nothing but a large door mat, ornamented at the edges with little tinkling tags something like the stained porcupine quills round an Indian moccasin. There was a hole or slit in the middle of this mat, as you see the same in South American ponchos. But could it be possible that any sober harpooneer would get into a door mat, and parade the streets of any Christian town in that sort of guise? I put it on, to try it, and it weighed me down like a hamper, being uncommonly shaggy and thick, and I thought a little damp, as though this mysterious harpooneer had been wearing it of a rainy day. I went up in it to a bit of glass stuck against the wall, and I never saw such a sight in my life. I tore myself out of it in such a hurry that I gave myself a kink in the neck. I sat down on the side of the bed, and commenced thinking about this head-peddling harpooneer, and his door mat. After thinking some time on the bed-side, I got up and took off my monkey jacket, and then stood in the middle of the room thinking. I then took off my coat, and thought a little more in my shirt sleeves. But beginning to feel very cold now, half undressed as I was, and remembering what the landlord said about the harpooneer's not coming home at all that night, it being so very late, I made no more ado, but jumped out of my pantaloons and boots, and then blowing out the light tumbled into bed, and commended myself to the care of heaven. Whether that mattress was stuffed with corncobs or broken crockery, there is no telling, but I rolled about a good deal, and could not sleep for a long time. At last I slid off into a light doze, and had pretty nearly made a good offing towards the land of Nod, when I heard a heavy footfall in the passage, and saw a glimmer of light come into the room from under the door. Lord save me, thinks I, that must be the harpooneer, the infernal head-peddler. 
But I lay perfectly still, and resolved not to say a word till spoken to. Holding a light in one hand, and that identical New Zealand head in the other, the stranger entered the room, and without looking towards the bed, placed his candle a good way off from me on the floor in one corner, and then began working away at the knotted cords of the large bag I before spoke of as being in the room. I was all eagerness to see his face, but he kept it averted for some time while employed in unlacing the bag's mouth. This accomplished, however, he turned round--when, good heavens; what a sight! Such a face! It was of a dark, purplish, yellow color, here and there stuck over with large blackish looking squares. Yes, it's just as I thought, he's a terrible bedfellow; he's been in a fight, got dreadfully cut, and here he is, just from the surgeon. But at that moment he chanced to turn his face so towards the light, that I plainly saw they could not be sticking-plasters at all, those black squares on his cheeks. They were stains of some sort or other. At first I knew not what to make of this; but soon an inkling of the truth occurred to me. I remembered a story of a white man--a whaleman too-- who, falling among the cannibals, had been tattooed by them. I concluded that this harpooneer, in the course of his distant voyages, must have met with a similar adventure. And what is it, thought I, after all! It's only his outside; a man can be honest in any sort of skin. But then, what to make of his unearthly complexion, that part of it, I mean, lying round about, and completely independent of the squares of tattooing. To be sure, it might be nothing but a good coat of tropical tanning; but I never heard of a hot sun's tanning a white man into a purplish yellow one. However, I had never been in the South Seas; and perhaps the sun there produced these extraordinary effects upon the skin. Now, while all these ideas were passing through me like lightning, this harpooneer never noticed me at all. But, after some difficulty having opened his bag, he commenced fumbling in it, and presently pulled out a sort of tomahawk, and a seal-skin wallet with the hair on. Placing these on the old chest in the middle of the room, he then took the New Zealand head--a ghastly thing enough-- and crammed it down into the bag. He now took off his hat-- a new beaver hat--when I came nigh singing out with fresh surprise. There was no hair on his head--none to speak of at least-- nothing but a small scalp-knot twisted up on his forehead. His bald purplish head now looked for all the world like a mildewed skull. Had not the stranger stood between me and the door, I would have bolted out of it quicker than ever I bolted a dinner. Even as it was, I thought something of slipping out of the window, but it was the second floor back. I am no coward, but what to make of this headpeddling purple rascal altogether passed my comprehension. Ignorance is the parent of fear, and being completely nonplussed and confounded about the stranger, I confess I was now as much afraid of him as if it was the devil himself who had thus broken into my room at the dead of night. In fact, I was so afraid of him that I was not game enough just then to address him, and demand a satisfactory answer concerning what seemed inexplicable in him. Meanwhile, he continued the business of undressing, and at last showed his chest and arms. 
As I live, these covered parts of him were checkered with the same squares as his face, his back, too, was all over the same dark squares; he seemed to have been in a Thirty Years' War, and just escaped from it with a sticking-plaster shirt. Still more, his very legs were marked, as if a parcel of dark green frogs were running up the trunks of young palms. It was now quite plain that he must be some abominable savage or other shipped aboard of a whaleman in the South Seas, and so landed in this Christian country. I quaked to think of it. A peddler of heads too--perhaps the heads of his own brothers. He might take a fancy to mine--heavens! look at that tomahawk! But there was no time for shuddering, for now the savage went about something that completely fascinated my attention, and convinced me that he must indeed be a heathen. Going to his heavy grego, or wrapall, or dreadnaught, which he had previously hung on a chair, he fumbled in the pockets, and produced at length a curious little deformed image with a hunch on its back, and exactly the color of a three days' old Congo baby. Remembering the embalmed head, at first I almost thought that this black manikin was a real baby preserved in some similar manner. But seeing that it was not at all limber, and that it glistened a good deal like polished ebony, I concluded that it must be nothing but a wooden idol, which indeed it proved to be. For now the savage goes up to the empty fire-place, and removing the papered fire-board, sets up this little hunch-backed image, like a tenpin, between the andirons. The chimney jambs and all the bricks inside were very sooty, so that I thought this fire-place made a very appropriate little shrine or chapel for his Congo idol. I now screwed my eyes hard towards the half hidden image, feeling but ill at ease meantime--to see what was next to follow. First he takes about a double handful of shavings out of his grego pocket, and places them carefully before the idol; then laying a bit of ship biscuit on top and applying the flame from the lamp, he kindled the shavings into a sacrificial blaze. Presently, after many hasty snatches into the fire, and still hastier withdrawals of his fingers (whereby he seemed to be scorching them badly), he at last succeeded in drawing out the biscuit; then blowing off the heat and ashes a little, he made a polite offer of it to the little negro. But the little devil did not seem to fancy such dry sort of fare at all; he never moved his lips. All these strange antics were accompanied by still stranger guttural noises from the devotee, who seemed to be praying in a sing-song or else singing some pagan psalmody or other, during which his face twitched about in the most unnatural manner. At last extinguishing the fire, he took the idol up very unceremoniously, and bagged it again in his grego pocket as carelessly as if he were a sportsman bagging a dead woodcock. All these queer proceedings increased my uncomfortableness, and seeing him now exhibiting strong symptoms of concluding his business operations, and jumping into bed with me, I thought it was high time, now or never, before the light was put out, to break the spell in which I had so long been bound. But the interval I spent in deliberating what to say, was a fatal one. Taking up his tomahawk from the table, he examined the head of it for an instant, and then holding it to the light, with his mouth at the handle, he puffed out great clouds of tobacco smoke. 
The next moment the light was extinguished, and this wild cannibal, tomahawk between his teeth, sprang into bed with me. I sang out, I could not help it now; and giving a sudden grunt of astonishment he began feeling me. Stammering out something, I knew not what, I rolled away from him against the wall, and then conjured him, whoever or whatever he might be, to keep quiet, and let me get up and light the lamp again. But his guttural responses satisfied me at once that he but ill comprehended my meaning. "Who-e debel you?"--he at last said--"you no speak-e, dam-me, I kill-e." And so saying the lighted tomahawk began flourishing about me in the dark. "Landlord, for God's sake, Peter Coffin!" shouted I. "Landlord! Watch! Coffin! Angels! save me!" "Speak-e! tell-ee me who-ee be, or dam-me, I kill-e!" again growled the cannibal, while his horrid flourishings of the tomahawk scattered the hot tobacco ashes about me till I thought my linen would get on fire. But thank heaven, at that moment the landlord came into the room light in hand, and leaping from the bed I ran up to him. "Don't be afraid now," said he, grinning again, "Queequeg here wouldn't harm a hair of your head." "Stop your grinning," shouted I, "and why didn't you tell me that that infernal harpooneer was a cannibal?" "I thought ye know'd it;--didn't I tell ye, he was a peddlin' heads around town?--but turn flukes again and go to sleep. Queequeg, look here--you sabbee me, I sabbee--you this man sleepe you--you sabbee?" "Me sabbee plenty"--grunted Queequeg, puffing away at his pipe and sitting up in bed. "You gettee in," he added, motioning to me with his tomahawk, and throwing the clothes to one side. He really did this in not only a civil but a really kind and charitable way. I stood looking at him a moment. For all his tattooings he was on the whole a clean, comely looking cannibal. What's all this fuss I have been making about, thought I to myself--the man's a human being just as I am: he has just as much reason to fear me, as I have to be afraid of him. Better sleep with a sober cannibal than a drunken Christian. "Landlord," said I, "tell him to stash his tomahawk there, or pipe, or whatever you call it; tell him to stop smoking, in short, and I will turn in with him. But I don't fancy having a man smoking in bed with me. It's dangerous. Besides, I ain't insured." This being told to Queequeg, he at once complied, and again politely motioned me to get into bed--rolling over to one side as much as to say-- I won't touch a leg of ye." "Good night, landlord," said I, "you may go." I turned in, and never slept better in my life. 
tokenizers/tests/testthat/test-ngrams.R0000644000176200001440000001050213252224016020033 0ustar liggesuserscontext("N-gram tokenizers") test_that("Shingled n-gram tokenizer works as expected", { stopwords <- c("chapter", "me") out_l <- tokenize_ngrams(docs_l, n = 3, n_min = 2, stopwords = stopwords) out_c <- tokenize_ngrams(docs_c, n = 3, n_min = 2, stopwords = stopwords) out_1 <- tokenize_ngrams(docs_c[1], n = 3, n_min = 2, stopwords = stopwords, simplify = TRUE) expect_is(out_l, "list") expect_is(out_l[[1]], "character") expect_is(out_c, "list") expect_is(out_c[[1]], "character") expect_is(out_1, "character") expect_identical(out_l, out_c) expect_identical(out_l[[1]], out_1) expect_identical(out_c[[1]], out_1) # test for https://github.com/lmullen/tokenizers/issues/14 expect_identical(tokenize_ngrams("one two three", n = 3, n_min = 2), tokenize_ngrams("one two three", n = 5, n_min = 2)) expect_named(out_l, names(docs_l)) expect_named(out_c, names(docs_c)) expect_error(tokenize_ngrams(bad_list)) }) test_that("Shingled n-gram tokenizer produces correct output", { # skip_on_os("windows") stopwords <- c("chapter", "me") out_1 <- tokenize_ngrams(docs_c[1], n = 3, n_min = 2, stopwords = stopwords, simplify = TRUE) expected <- c("1 loomings", "1 loomings call", "loomings call", "loomings call ishmael", "call ishmael", "call ishmael some") expect_identical(head(out_1, 6), expected) }) test_that("Shingled n-gram tokenizer consistently produces NAs where appropriate", { test <- c("This is a text", NA, "So is this") names(test) <- letters[1:3] out <- tokenize_ngrams(test) expect_true(is.na(out$b)) }) test_that("Skip n-gram tokenizer consistently produces NAs where appropriate", { test <- c("This is a text", NA, "So is this") names(test) <- letters[1:3] out <- tokenize_skip_ngrams(test) expect_true(is.na(out$b)) }) test_that("Skip n-gram tokenizer can use stopwords", { test <- c("This is a text", "So is this") names(test) <- letters[1:2] out <- tokenize_skip_ngrams(test, stopwords = "is", n = 2, n_min = 2) expect_equal(length(out$a), 3) expect_identical(out$a[1], "this a") }) test_that("Skips with values greater than k are refused", { expect_false(check_width(c(0, 4, 5), k = 2)) expect_true(check_width(c(0, 3, 5), k = 2)) expect_false(check_width(c(0, 1, 3), k = 0)) expect_true(check_width(c(0, 1, 2), k = 0)) expect_false(check_width(c(0, 10, 11, 12), k = 5)) expect_true(check_width(c(0, 6, 11, 16, 18), k = 5)) }) test_that("Combinations for skip grams are correct", { skip_pos <- get_valid_skips(2, 2) expect_is(skip_pos, "list") expect_length(skip_pos, 3) expect_identical(skip_pos, list(c(0, 1), c(0, 2), c(0, 3))) skip_pos2 <- get_valid_skips(3, 2) expect_identical(skip_pos2, list( c(0, 1, 2), c(0, 1, 3), c(0, 1, 4), c(0, 2, 3), c(0, 2, 4), c(0, 2, 5), c(0, 3, 4), c(0, 3, 5), c(0, 3, 6))) }) test_that("Skip n-gram tokenizer works as expected", { stopwords <- c("chapter", "me") out_l <- tokenize_skip_ngrams(docs_l, n = 3, k = 2) out_c <- tokenize_skip_ngrams(docs_c, n = 3, k = 2) out_1 <- tokenize_skip_ngrams(docs_c[1], n = 3, k = 2, simplify = TRUE) expect_is(out_l, "list") expect_is(out_l[[1]], "character") expect_is(out_c, "list") expect_is(out_c[[1]], "character") expect_is(out_1, "character") expect_identical(out_l, out_c) expect_identical(out_l[[1]], out_1) expect_identical(out_c[[1]], out_1) expect_named(out_l, names(docs_l)) expect_named(out_c, names(docs_c)) expect_error(tokenize_skip_ngrams(bad_list)) }) test_that("Skip n-gram tokenizer produces correct output", { out_n2_k2 <- 
tokenize_skip_ngrams(input, n = 2, n_min = 2, k = 2, simplify = TRUE) expect_equal(sort(skip2_bigrams), sort(out_n2_k2)) out_n3_k2 <- tokenize_skip_ngrams(input, n = 3, n_min = 3, k = 2, simplify = TRUE) expect_equal(sort(skip2_trigrams), sort(out_n3_k2)) }) test_that("Skip n-gram tokenizers respects stopwords", { out_1 <- tokenize_skip_ngrams("This is a sentence that is for the test.", n = 3, k = 2, stopwords = c("a", "the"), simplify = TRUE) expect_equal(length(grep("the", out_1)), 0) }) test_that("Skip n-gram tokenizer warns about large combinations", { expect_warning(get_valid_skips(n = 7, k = 2), "Input n and k will") }) tokenizers/tests/testthat/test-stem.R0000644000176200001440000000152312775200571017530 0ustar liggesuserscontext("Stem tokenizers") test_that("Word stem tokenizer works as expected", { out_l <- tokenize_word_stems(docs_l) out_c <- tokenize_word_stems(docs_c) out_1 <- tokenize_word_stems(docs_c[1], simplify = TRUE) expect_is(out_l, "list") expect_is(out_l[[1]], "character") expect_is(out_c, "list") expect_is(out_c[[1]], "character") expect_is(out_1, "character") expect_identical(out_l, out_c) expect_identical(out_l[[1]], out_1) expect_identical(out_c[[1]], out_1) expect_named(out_l, names(docs_l)) expect_named(out_c, names(docs_c)) expect_error(tokenize_word_stems(bad_list)) }) test_that("Stem tokenizer produces correct output", { # skip_on_os("windows") out_1 <- tokenize_word_stems(docs_c[1], simplify = TRUE) expected <- c("in", "my", "purs", "and", "noth") expect_identical(out_1[20:24], expected) }) tokenizers/tests/testthat/helper-data.R0000644000176200001440000000262613256545214017777 0ustar liggesuserspaths <- list.files(".", pattern = "\\.txt$", full.names = TRUE) docs_full <- lapply(paths, readLines, encoding = "UTF-8") docs_l <- lapply(docs_full, paste, collapse = "\n") # docs_l <- lapply(docs_full, enc2utf8) docs_c <- unlist(docs_l) names(docs_l) <- basename(paths) names(docs_c) <- basename(paths) docs_df <- data.frame(doc_id = names(docs_c), text = unname(docs_c), stringsAsFactors = FALSE) bad_list <- list(a = paste(letters, collapse = " "), b = letters) # Using this sample sentence only because it comes from the paper where # skip n-grams are defined. Not my favorite sentence. input <- "Insurgents killed in ongoing fighting." 
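# The expected values below can be derived by hand from the skip n-gram
# definition: with k = 2, a token may combine with any of the next 1 + k = 3
# tokens, so "insurgents" pairs with "killed", "in", and "ongoing" to form the
# first three skip bigrams. As a sanity check -- assuming the exported
# tokenize_skip_ngrams() interface used in the tests above -- the enumeration
# can be verified with:
#   out <- tokenize_skip_ngrams(input, n = 2, n_min = 2, k = 2, simplify = TRUE)
#   setequal(out, skip2_bigrams)  # should be TRUE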
bigrams <- c("insurgents killed", "killed in", "in ongoing", "ongoing fighting") skip2_bigrams <- c("insurgents killed", "insurgents in", "insurgents ongoing", "killed in", "killed ongoing", "killed fighting", "in ongoing", "in fighting", "ongoing fighting") trigrams <- c("insurgents killed in", "killed in ongoing", "in ongoing fighting") skip2_trigrams <- c("insurgents killed in", "insurgents killed ongoing", "insurgents killed fighting", "insurgents in ongoing", "insurgents in fighting", "insurgents ongoing fighting", "killed in ongoing", "killed in fighting", "killed ongoing fighting", "in ongoing fighting") tokenizers/src/0000755000176200001440000000000013257220650013240 5ustar liggesuserstokenizers/src/skip_ngrams.cpp0000644000176200001440000000436413257220650016270 0ustar liggesusers#include using namespace Rcpp; CharacterVector skip_ngrams(CharacterVector words, ListOf& skips, std::set& stopwords) { std::deque < std::string > checked_words; std::string str_holding; // Eliminate stopwords for(unsigned int i = 0; i < words.size(); i++){ if(words[i] != NA_STRING){ str_holding = as(words[i]); if(stopwords.find(str_holding) == stopwords.end()){ checked_words.push_back(str_holding); } } } str_holding.clear(); std::deque < std::string > holding; unsigned int checked_size = checked_words.size(); for(unsigned int w = 0; w < checked_size; w++) { for(unsigned int i = 0; i < skips.size(); i++){ unsigned int in_size = skips[i].size(); if(skips[i][in_size-1] + w < checked_size){ for(unsigned int j = 0; j < skips[i].size(); j++){ str_holding += " " + checked_words[skips[i][j] + w]; } if(str_holding.size()){ str_holding.erase(0,1); } holding.push_back(str_holding); str_holding.clear(); } } } if(!holding.size()){ return CharacterVector(1,NA_STRING); } CharacterVector output(holding.size()); for(unsigned int i = 0; i < holding.size(); i++){ if(holding[i].size()){ output[i] = String(holding[i], CE_UTF8); } else { output[i] = NA_STRING; } } return output; } //[[Rcpp::export]] ListOf skip_ngrams_vectorised(ListOf words, ListOf skips, CharacterVector stopwords){ // Create output object and set up for further work unsigned int input_size = words.size(); List output(input_size); // Create stopwords set std::set < std::string > checked_stopwords; for(unsigned int i = 0; i < stopwords.size(); i++){ if(stopwords[i] != NA_STRING){ checked_stopwords.insert(as(stopwords[i])); } } for(unsigned int i = 0; i < input_size; i++){ if(i % 10000 == 0){ Rcpp::checkUserInterrupt(); } output[i] = skip_ngrams(words[i], skips, checked_stopwords); } return output; } tokenizers/src/shingle_ngrams.cpp0000644000176200001440000000711013257220650016743 0ustar liggesusers#include using namespace Rcpp; // calculates size of the ngram vector inline size_t get_ngram_seq_len(int input_len, int ngram_min, int ngram_max) { int out_ngram_len_adjust = 0; for (size_t i = ngram_min - 1; i < ngram_max; i++) out_ngram_len_adjust += i; if(input_len < ngram_min) return 0; else return input_len * (ngram_max - ngram_min + 1) - out_ngram_len_adjust; } CharacterVector generate_ngrams_internal(const CharacterVector terms_raw, const int ngram_min, const int ngram_max, std::set &stopwords, // pass buffer by reference to avoid memory allocation // on each iteration std::deque &terms_filtered_buffer, const std::string ngram_delim) { // clear buffer from previous iteration result terms_filtered_buffer.clear(); std::string term; // filter out stopwords for (size_t i = 0; i < terms_raw.size(); i++) { term = as(terms_raw[i]); if(stopwords.find(term) == 
stopwords.end())
      terms_filtered_buffer.push_back(term);
  }

  int len = terms_filtered_buffer.size();
  size_t ngram_out_len = get_ngram_seq_len(len, ngram_min, std::min(ngram_max, len));
  CharacterVector result(ngram_out_len);
  std::string k_gram;
  size_t k, i = 0, j_max_observed;
  // iterates through the input vector by a window of size = n_max and builds n-grams
  // for terms ["a", "b", "c", "d"] and n_min = 1, n_max = 3
  // will build 1:3-grams in the following order
  //"a" "a_b" "a_b_c" "b" "b_c" "b_c_d" "c" "c_d" "d"
  for(size_t j = 0; j < len; j++ ) {
    k = 1;
    j_max_observed = j;
    while (k <= ngram_max && j_max_observed < len) {
      if( k == 1) {
        k_gram = terms_filtered_buffer[j_max_observed];
      } else {
        k_gram = k_gram + ngram_delim + terms_filtered_buffer[j_max_observed];
      }
      if(k >= ngram_min) {
        result[i] = String(k_gram, CE_UTF8);
        i++;
      }
      j_max_observed = j + k;
      k = k + 1;
    }
  }
  if(!result.size()){
    result.push_back(NA_STRING);
  }
  return result;
}

// [[Rcpp::export]]
ListOf<CharacterVector> generate_ngrams_batch(const ListOf<CharacterVector> documents_list,
                                              const int ngram_min,
                                              const int ngram_max,
                                              CharacterVector stopwords = CharacterVector(),
                                              const String ngram_delim = " ") {

  std::deque<std::string> terms_filtered_buffer;
  const std::string std_string_delim = ngram_delim.get_cstring();
  size_t n_docs = documents_list.size();
  List result(n_docs);
  CharacterVector terms;

  std::set<std::string> stopwords_set;
  for(size_t i = 0; i < stopwords.size(); i++){
    if(stopwords[i] != NA_STRING){
      stopwords_set.insert(as<std::string>(stopwords[i]));
    }
  }

  for (size_t i_document = 0; i_document < n_docs; i_document++) {
    if(i_document % 10000 == 0){
      Rcpp::checkUserInterrupt();
    }
    terms = documents_list[i_document];
    result[i_document] = generate_ngrams_internal(documents_list[i_document],
                                                  ngram_min, ngram_max,
                                                  stopwords_set,
                                                  terms_filtered_buffer,
                                                  std_string_delim);
  }
  return result;
}
tokenizers/src/RcppExports.cpp0000644000176200001440000000447713257220650016245 0ustar liggesusers// Generated by using Rcpp::compileAttributes() -> do not edit by hand
// Generator token: 10BE3573-1514-4C36-9D1C-5A225CD40393

#include <Rcpp.h>

using namespace Rcpp;

// generate_ngrams_batch
ListOf<CharacterVector> generate_ngrams_batch(const ListOf<CharacterVector> documents_list, const int ngram_min, const int ngram_max, CharacterVector stopwords, const String ngram_delim);
RcppExport SEXP _tokenizers_generate_ngrams_batch(SEXP documents_listSEXP, SEXP ngram_minSEXP, SEXP ngram_maxSEXP, SEXP stopwordsSEXP, SEXP ngram_delimSEXP) {
BEGIN_RCPP
    Rcpp::RObject rcpp_result_gen;
    Rcpp::RNGScope rcpp_rngScope_gen;
    Rcpp::traits::input_parameter< const ListOf<CharacterVector> >::type documents_list(documents_listSEXP);
    Rcpp::traits::input_parameter< const int >::type ngram_min(ngram_minSEXP);
    Rcpp::traits::input_parameter< const int >::type ngram_max(ngram_maxSEXP);
    Rcpp::traits::input_parameter< CharacterVector >::type stopwords(stopwordsSEXP);
    Rcpp::traits::input_parameter< const String >::type ngram_delim(ngram_delimSEXP);
    rcpp_result_gen = Rcpp::wrap(generate_ngrams_batch(documents_list, ngram_min, ngram_max, stopwords, ngram_delim));
    return rcpp_result_gen;
END_RCPP
}
// skip_ngrams_vectorised
ListOf<CharacterVector> skip_ngrams_vectorised(ListOf<CharacterVector> words, ListOf<IntegerVector> skips, CharacterVector stopwords);
RcppExport SEXP _tokenizers_skip_ngrams_vectorised(SEXP wordsSEXP, SEXP skipsSEXP, SEXP stopwordsSEXP) {
BEGIN_RCPP
    Rcpp::RObject rcpp_result_gen;
    Rcpp::RNGScope rcpp_rngScope_gen;
    Rcpp::traits::input_parameter< ListOf<CharacterVector> >::type words(wordsSEXP);
    Rcpp::traits::input_parameter< ListOf<IntegerVector> >::type skips(skipsSEXP);
    Rcpp::traits::input_parameter< CharacterVector >::type stopwords(stopwordsSEXP);
    rcpp_result_gen = 
tokenizers/src/RcppExports.cpp0000644000176200001440000000447713257220650016245 0ustar liggesusers
// Generated by using Rcpp::compileAttributes() -> do not edit by hand
// Generator token: 10BE3573-1514-4C36-9D1C-5A225CD40393

#include <Rcpp.h>

using namespace Rcpp;

// generate_ngrams_batch
ListOf<CharacterVector> generate_ngrams_batch(const ListOf<CharacterVector> documents_list, const int ngram_min, const int ngram_max, CharacterVector stopwords, const String ngram_delim);
RcppExport SEXP _tokenizers_generate_ngrams_batch(SEXP documents_listSEXP, SEXP ngram_minSEXP, SEXP ngram_maxSEXP, SEXP stopwordsSEXP, SEXP ngram_delimSEXP) {
BEGIN_RCPP
    Rcpp::RObject rcpp_result_gen;
    Rcpp::RNGScope rcpp_rngScope_gen;
    Rcpp::traits::input_parameter< const ListOf<CharacterVector> >::type documents_list(documents_listSEXP);
    Rcpp::traits::input_parameter< const int >::type ngram_min(ngram_minSEXP);
    Rcpp::traits::input_parameter< const int >::type ngram_max(ngram_maxSEXP);
    Rcpp::traits::input_parameter< CharacterVector >::type stopwords(stopwordsSEXP);
    Rcpp::traits::input_parameter< const String >::type ngram_delim(ngram_delimSEXP);
    rcpp_result_gen = Rcpp::wrap(generate_ngrams_batch(documents_list, ngram_min, ngram_max, stopwords, ngram_delim));
    return rcpp_result_gen;
END_RCPP
}
// skip_ngrams_vectorised
ListOf<CharacterVector> skip_ngrams_vectorised(ListOf<CharacterVector> words, ListOf<IntegerVector> skips, CharacterVector stopwords);
RcppExport SEXP _tokenizers_skip_ngrams_vectorised(SEXP wordsSEXP, SEXP skipsSEXP, SEXP stopwordsSEXP) {
BEGIN_RCPP
    Rcpp::RObject rcpp_result_gen;
    Rcpp::RNGScope rcpp_rngScope_gen;
    Rcpp::traits::input_parameter< ListOf<CharacterVector> >::type words(wordsSEXP);
    Rcpp::traits::input_parameter< ListOf<IntegerVector> >::type skips(skipsSEXP);
    Rcpp::traits::input_parameter< CharacterVector >::type stopwords(stopwordsSEXP);
    rcpp_result_gen = Rcpp::wrap(skip_ngrams_vectorised(words, skips, stopwords));
    return rcpp_result_gen;
END_RCPP
}

static const R_CallMethodDef CallEntries[] = {
    {"_tokenizers_generate_ngrams_batch", (DL_FUNC) &_tokenizers_generate_ngrams_batch, 5},
    {"_tokenizers_skip_ngrams_vectorised", (DL_FUNC) &_tokenizers_skip_ngrams_vectorised, 3},
    {NULL, NULL, 0}
};

RcppExport void R_init_tokenizers(DllInfo *dll) {
    R_registerRoutines(dll, NULL, CallEntries, NULL, NULL);
    R_useDynamicSymbols(dll, FALSE);
}
tokenizers/NAMESPACE0000644000176200001440000000374713256545214013676 0ustar liggesusers
# Generated by roxygen2: do not edit by hand

S3method(tokenize_character_shingles,data.frame)
S3method(tokenize_character_shingles,default)
S3method(tokenize_characters,data.frame)
S3method(tokenize_characters,default)
S3method(tokenize_lines,data.frame)
S3method(tokenize_lines,default)
S3method(tokenize_ngrams,data.frame)
S3method(tokenize_ngrams,default)
S3method(tokenize_paragraphs,data.frame)
S3method(tokenize_paragraphs,default)
S3method(tokenize_ptb,data.frame)
S3method(tokenize_ptb,default)
S3method(tokenize_regex,data.frame)
S3method(tokenize_regex,default)
S3method(tokenize_sentences,data.frame)
S3method(tokenize_sentences,default)
S3method(tokenize_skip_ngrams,data.frame)
S3method(tokenize_skip_ngrams,default)
S3method(tokenize_tweets,data.frame)
S3method(tokenize_tweets,default)
S3method(tokenize_word_stems,data.frame)
S3method(tokenize_word_stems,default)
S3method(tokenize_words,data.frame)
S3method(tokenize_words,default)
export(chunk_text)
export(count_characters)
export(count_sentences)
export(count_words)
export(tokenize_character_shingles)
export(tokenize_characters)
export(tokenize_lines)
export(tokenize_ngrams)
export(tokenize_paragraphs)
export(tokenize_ptb)
export(tokenize_regex)
export(tokenize_sentences)
export(tokenize_skip_ngrams)
export(tokenize_tweets)
export(tokenize_word_stems)
export(tokenize_words)
importFrom(Rcpp,sourceCpp)
importFrom(SnowballC,getStemLanguages)
importFrom(SnowballC,wordStem)
importFrom(stringi,stri_c)
importFrom(stringi,stri_detect_regex)
importFrom(stringi,stri_opts_regex)
importFrom(stringi,stri_replace_all_charclass)
importFrom(stringi,stri_replace_all_regex)
importFrom(stringi,stri_split_boundaries)
importFrom(stringi,stri_split_charclass)
importFrom(stringi,stri_split_fixed)
importFrom(stringi,stri_split_lines)
importFrom(stringi,stri_split_regex)
importFrom(stringi,stri_sub)
importFrom(stringi,stri_subset_charclass)
importFrom(stringi,stri_trans_tolower)
importFrom(stringi,stri_trim_both)
useDynLib(tokenizers, .registration = TRUE)
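Most of the S3 methods registered above exist to support the Text Interchange Format behavior described in the release notes that follow: each tokenizer dispatches on character vectors, lists, and corpus data frames alike. A minimal sketch of the data-frame path; the `doc_id`/`text` column names follow the TIF convention and are an assumption here rather than something shown in this file:

```r
library(tokenizers)

# A TIF-style corpus data frame (doc_id/text columns assumed).
corpus <- data.frame(
  doc_id = c("doc1", "doc2"),
  text   = c("A first document.", "A second, longer document."),
  stringsAsFactors = FALSE
)

# Dispatches to tokenize_words.data.frame(); per the release notes below,
# the result is a named list of token vectors, one element per document.
tokenize_words(corpus)
```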
tokenizers/NEWS.md0000644000176200001440000000445113257217764013563 0ustar liggesusers
# tokenizers 0.2.1

- Add citation information to JOSS paper.

# tokenizers 0.2.0

## Features

- Add the `tokenize_ptb()` function for Penn Treebank tokenizations (@jrnold) (#12).
- Add a function `chunk_text()` to split long documents into pieces (#30).
- New functions to count words, characters, and sentences without tokenization (#36).
- New function `tokenize_tweets()` preserves usernames, hashtags, and URLs (@kbenoit) (#44).
- The `stopwords()` function has been removed in favor of using the **stopwords** package (#46).
- The package now complies with the basic recommendations of the **Text Interchange Format**. All tokenization functions are now methods. This enables them to take corpus inputs as either TIF-compliant named character vectors, named lists, or data frames. All outputs are still named lists of tokens, but these can be easily coerced to data frames of tokens using the `tif` package. (#49)
- Add a new vignette "The Text Interchange Formats and the tokenizers Package" (#49).

## Bug fixes and performance improvements

- `tokenize_skip_ngrams` has been improved to generate unigrams and bigrams, according to the skip definition (#24).
- C++98 has replaced the C++11 code used for n-gram generation, widening the range of compilers `tokenizers` supports (@ironholds) (#26).
- `tokenize_skip_ngrams` now supports stopwords (#31).
- If tokenizers fail to generate tokens for a particular entry, they return `NA` consistently (#33).
- Keyboard interrupt checks have been added to Rcpp-backed functions to enable users to terminate them before completion (#37).
- `tokenize_words()` gains arguments to preserve or strip punctuation and numbers (#48); see the sketch after the data listing below.
- `tokenize_skip_ngrams()` and `tokenize_ngrams()` now return properly marked UTF-8 strings on Windows (@patperry) (#58).

# tokenizers 0.1.4

- Add the `tokenize_character_shingles()` tokenizer.
- Improvements to documentation.

# tokenizers 0.1.3

- Add vignette.
- Improvements to n-gram tokenizers.

# tokenizers 0.1.2

- Add stopwords for several languages.
- New stopword options to `tokenize_words()` and `tokenize_word_stems()`.

# tokenizers 0.1.1

- Fix failing test in non-UTF-8 locales.

# tokenizers 0.1.0

- Initial release with tokenizers for characters, words, word stems, sentences, paragraphs, n-grams, skip n-grams, lines, and regular expressions.
tokenizers/data/0000755000176200001440000000000013070504253013357 5ustar liggesusers
tokenizers/data/mobydick.rda0000644000176200001440000136632213070504253015665 0ustar liggesusers
[binary payload omitted: mobydick.rda is a bzip2-compressed R data file holding the full text of Moby Dick used by the vignette examples; its bytes are not representable as text.]
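As flagged in the 0.2.0 bug-fix notes above, `tokenize_words()` gained arguments for handling punctuation and numbers. A hedged sketch follows; the argument names `strip_punct` and `strip_numeric` are assumed from that release's signature and should be confirmed against `?tokenize_words`:

```r
library(tokenizers)

x <- "It was the year 1851, was it not?"

tokenize_words(x, strip_punct = FALSE)   # keep punctuation as tokens (assumed arg)
tokenize_words(x, strip_numeric = TRUE)  # drop purely numeric tokens (assumed arg)
```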
#YÏ"9O'wvwl+2u+wLyx} yDٞM+Tx23Y%v O4,{7R0=T 1[#,st[3(kRzuSs(ܡ J/E#L!0C UsTM‹{NpQ.AAk{EKaԥ>=DКqQ:N: 鬥{!3r }jaiVFi;B@0Қ}rD$ 6BJ5dRd51YB!Ҡ OsMܺE 1(Pd+(d>r2r+Rj,,h)hLU(d)T^ d)KA2MHw0j Q`(u4*ٕJC!bZC‡Dh'Pߔrg>v{b"HTDskX&; ' pӨyyS`*ܰ1 5Y&Z Bh 藍X)^Ȫ`)NuqCʼ" bp/zP;qqJB^Sj~UFYLB),AKNj4vNRN)rI4zCbf`; m\q;Drcͦ: fLݵ5_MIviaQ"I0 $/qi ĺϪ/*'׷}}O\>m4xc GetrJ>zKQQƲlm5Q}/˷߻q~L (TSZk<RdB=~ yw݋mjEY+fX1 f4LuКh ]I ҕ.$`*1\8S1LR.wDH0$3"Mtp2hIjn6AKx\neQ !d0ʄaC@ēUdvۓD5Ԩ(/r)ȯ)$ ^HJ= Ċr:gEߴdu<qr.B~2}G/#ʊ=K91yI@6( H(j$=O+=lK9P_7FǨ]=ΜkϺNgk= I=Je=dJN4*a^Quϐ^PCȢ ^,GzWyGA5ƴrK$4/(?P='*aUW2} InEv!rmeC=AE<(UN'p8+T{:WyNRDB¼* WD (ͪ%rMX,?h;%įmZKp# ުMsG[њ%<{{ܾ 3 FصnvF*uhSMv:QZy?yT}2<ȴΰ"p$r5[ -8 3R OY9ErM%c1Y)(iF`%գ52Y?}}IϟDWE7mqPVɶ#:'S}}2CНŽPv/M+0\Kè >AbXT}>*$g6mdMCʝ g-V F9uG Ss!%Hڳ;茘*V f]+AQF&_Zj%yq&,&9a/: o +29́]]0୴XWf" Z(DAFVs9 1$a ԇ},Gj 37c6x9By;5.L7é}uS@D!PR@\g$Xg{Gb^;[j'篮񇻸Soɿo^R~b@9"QAOrk3̼'hSSL+ϖS_L6%XȾ_YGJr\LftxY9q}-d$SKgc#f5RT`^-Ւulm?/AxPSYYx9']J4۽=Sˌ8vfÝ=6!GO*ӳ=މ7 _k>2NskQ QvBBȭKg. LLf3D쁦P{e[C]w7'd(g틍(x-w5I*  C9{N‹u+P5  E+,7h5/[W-ETҒRsA"%ҙ5jh_H T98w"L @uP98Jw(nL)FTVsn cN11<0߂ɣ)%yu~goQdm 5NU} t¡Prn3 Ñ`pAnj)a}A;بGRhFtz3olͲV-;Qb (e12Vf Ptɿx5b}oNNh)lcV0De~6p}h]?JuGc;rFw,)Nl%cR dl&n®R>rN::[(+-:b7uEG&ҶnBGXIqRiO2c)-@W|E<&__9ylѠ2wr!?XJV:P6Q|A5C'ⲭYHFABJeF]uFcA@e@v$" DRv5ny//W p@Ĺ C|o׉khOS)KI(EsMiik3FQىzBd#PEkEEsH]翥ˆme"ԠwdȨ k͔D*SZ O8+"Ԡ3 ##٪$"ekQ$m7u?w}_ /(ai`2[{ ^} ̽6\UgڇϯRpsɟ$~WyփO$MS@Rp&2L2L_ 2k9=v=ln_@_;y8y-;RbqDlRbgC9B "wL9H*Hu QUB%%% EP4B4P J KBRHP"R(R4 @|2h(h)R!R ))( (2ZV((h(%())F JZbJP)y4"D(P$ JD!J#d (&@p@Ji!\$  5 dQCBҭ"5ABRPKHBU@-R *ViVZT)hZhB*ij( BHh*(JJ Z)i((M % B҅ %PRBЋ"jMB   )C`lihm|Fr\hs1V _GԻJZG2Jii2P엡Q-,Db)( T!S\Ё'0U>KEcv:Otu6Kݲ*gsoXz$X!D1'ef>SLBpfꋸ0EudA=+'?h>KdR&JWo۬^G: Rk|sT4j~w _ov%?(Pk-յJWSW+I' ά-yg %EBTHfn}S(UA ANT^ZĤ=]JE dS "Jú >;/c ԕ H)RE/l1}aS8S2nN132b͘Vq8-Ǐy.,8ɨI5M2o*3_SupY՜y6MW<U'mcq j1L=5=?gO,X5s2U-!P}E!Cri\aV:ݠg@>~ҥEP AP 8S#;ND!L|rK#!gs9>~i?XOHalFxuJ`ZRG{!*ZSCD`v?fc}oPi_~:~0ľv?D?t#SIM1TXh?hc$ "'R(j1I;ɚ9c]JXbb\d*i`2[mPO MYf Y0a MR@4S$d9Is!a5bٛq^8XUWZL2 S66_DdCvʅ-bf  atd}`?;'iUPIP ɮMk5Bb61bMLC5,1B*՞щ\I+ ?8dHtαRZ15:e XDD~wΚg1b1UBt)SMhMdg؄Mx%.7ĥ!J5oϐخ};a4]aaRLjї~^DHH2QTQ_Dw/y@\р~'L~YNح_?z-*6^0A)7g3 DV.0gzLsc nsDMUaHVTɽ͙5z0Z g|_ sPk}ީ? zMCTzkg?Wj5Yeȧ yx)ӹVozX0fĢZ|J"R'U~QX E͢("-⧻ÚTWmAhiZt!DGjZ<{h0ĕuT44wk/*}ğI5@U&C>xF&!?d>bY>VaL]mYz܌Y5RF$|.RD":GLF#7aƝk0Ȣ" ܢ$TxTQ;ԉI "QF"EE3mVZPQPQ3VXʢY* "0kMdB A}1X,ˇי[bM:vQ;pYf9vP(8@EdZ7ˆ4Q'[ 1nW6f2 6a^fG$h \A&"۱ǎ Ʋ .0J7]Dd/e$~^zemD8qcb8@wi{(|+b  Bܶ3b#`NX΀[ܣjj p% [Fa("k);JwL[6~yE|*ml0; oEʓ6^=^s=l>n&0YʹI^39cE5B d400JpH^R ?8>J@BPk"C3pHɿ ֛Z0Cg1M-??Hr&*`a4< rzr'xO?aǍCSr?9S1&hݠhô"D$V;(Uw2lHdfZ*4Ũ!:`ho:F!zT!`d?p; lWΗ C{OADx=ׄ<\qz2jj~[)q׬ZOx'vOGhWh\=d zVڠe OeұVi Pw$TD) N2u 3!DBAUUPR$L-h45+fPQN brAAF s +ƊFA~~„oCqWZPvݲ}! 
2fĂ>?'Ԑ  AT(BL \%2(2ZʕMC¹J{`}?߳3ݶEl}i$;zkI.s!q ILIuhzfÝJpT i47AIPZd?t=Oц C@B}>PL (Q+谅)LI f]bBD[ڡv١@y HCBBl`R!Ɵo%}ﻞMrcQ.\a FljU@vʨ FPɐݓv̯nىgO[rl HD`i EXj&2b;f<ơͰḳ`P}zݐеOxEu zs(Onu=eN_35tb/:MJrm;>9bOiW:\|x;.mS~^rc'(E"os8=6WlqSiRR8&~s%?nBtJ{ї/'Cr\\%0"C<@3 &EגK~zu'ʳmϷ6a A74' qs"b fU6k㉬+!V| tON'(Xk'rI5UmRB $RĤIǒ ڤŠ mHF0&;OP~y8='o,>`"j1'),yJ Ÿw9i4{i}6Qm.Zn(vY<:JZ nC!ZaY{كm^K^wlNx8x^4lfaVBm1+l*R2dӆ8r+!=DȧĊq*aY-LHXQ& X ܐ !qΚn2-r~Zu)OhX2$|95826u{TQ~C8DVBv{ 7iRo:S6TPTNw0~5@C0d -/=Yi{e9K6ވqě'C9}no>l2}1؂r3 K9)=;[ ŁV0"EJ~g5E^5couL&D0G,:Qt'ٝ t/YAKz;U{z0==Hч]leG@'اCOԁ$3;.?*qo\E6L\+ {VF ޜ\.m{\s,4j)u , " -1(񠫩Y Jܤ`(!E=lyNfhI\t$rhf {t  I=CXs?o>eJ.4|ߏr~֖ >Sѕ d w;@РP#nSEQ^YS-(4 0PnG'X3 _?LT9^Z5A0k0kׯ;} {2Iwս@5]ڧu j ; {rbMGF c$*“s\^sf"ΫD=\ ž=7GrnB#G gh9ig;}>~-I#"JC!lgt(¹F)IHBK)]%*֭i(bR ` (3Ʒ({qÌKϸ ͤeՕ5J724:( &9g&1rBzMpq0_>뭮}3Oa1Yg0#;Y!H,i+xrfp5xI@ #ڹv 8&#v IF"E!80D۩w6tb|:I^Tu.q'/<23cshin3vgzPE@R?q߿xI 3%02C{f0`T_J Fʨ(+ԨEi}V@5^.ad&!lkά(&r_0wލxۤ5(wAUBҵlVf6l<@& ͎ Nd/]r;5t$Ƞ(-bm };!B}}/&P w *"̊p:'x˹Ԏv'P&$6N$?zV IKJ*9*`PHeM$T3*Q@`˒D#@P. ē!+ *(1994 !E9CNHPJAT% *D4!C2%BD|e)%@P}G>q TĴh*("$>L*>1 H ;fV2jIȈ-4DFT(a6bCQ0DDI@T4AHCU5&%d&9e)Q! {w&)]BVkP+YAjLKN^ps"YyJnCcԔ1RЁХ,"ʵ:WO")22D( $<(J*Jq=] ҉3 ,KʃpGh)T$J2Erwe{7 hmFyJE v2G-/Tq~iSiQR[2#dGXdN2X oVcuF"R>a*SCzQ=zױTTTEEG:۹hbEf gВZ9H#i㾳hG94us6IƵ"'%VVA@? -(YQCPP\@X(@30bRhw (l%HLJbZ =9 f` ) ğhtaGx `r6h^?lh+mM0lo2fgYm]9wALB Pp3oP qFkjC|.'D9hlaJSo2g?ɼԲT()2ȲtQɢ ZE@ EI1-S>8DhVeHauL`x=|Us_G/yN1n-z8&0鷜BNN7.jʿ7')PRHHO/݄m' Dauv%,݈P!s3n2sq.ޠe3*k/P}mŢ-nizo0{˼E\s))#̶D ÒQA) vym~2nc8uܶdJC ZDT;3#dk3 d*C7Vi5xʬ5X3zb/ ]&.!bEۯHۅB:d2BR%BFFMѤ2;7h\mCsu\!j[EE\)4Cw59,i> :"Øgh(iCrw&q8<ϴFG*x{^ސ ڇ+QJ))BhjZuh1u.Qq3RD)(u=KD*~i戁4dR6B{k~pLi|WC [ĩGl531+:>H(V9ÀUH#4jj-``lˢ:18ߥfF"C|O5xOQ Y9=ylcIu5+3[L$wbc˦*zAgQԩT49jJF0nV@^ho4z? tIZ*0)ɠ\$!i G'u|}iMvY1 tz؎ϰw)^]ӖԘh hEs†@Ţ)={gm׬8U jahj(ŭ|g03[їD]iڑ%aTHJԨ`* (d'GeUfcghyq" R< s jiL!FP6-UVFBVՖe{LcԂH(ZcrH\阂5v,半h! BAː#[  "IDŽ j(C"1}d=Zk@BU,N,8p)a9۬5% ALh+VAII$:-#],cE! xxć=Y)AE)JU R4{0:z/̼ QQLTxL g Y\tx(!R5CMUIBPȉ 3 *HiZń!TKLTqtɪE3 -dg<R, ?>Yu71pC{{ "6"鹻Ĩc/K39''#Ib-L]KY19lDV07ytꞦMXX4, /Ybf$iͤaAL`ם\(d8p^5dbNJDSM˖aOŒ5nǶu d}

oXpE HJE01GPP`B4JRkf"j(hݖ([hIөᷥӜ,? % Uʾl~.j`q!X)6<͉эWa줄7Ƙ`AO`K mC&{[Wh<޺ p(AO<^ֻ{ޜg`ε;׉uw&"qv={ݵXOc2yq0{ڲJ^CG#//dI=3*D|<3+v}%Ցu |M*# %YX**.yGmea$/0q^^{ն$U8SYcIDLe4SHR5RIDőSD6Tկ=<8O2)fUT/p6/ ^y-)r{3àŹT^^ڽXϪEB-,\yQ_aWDr {,Q^'^Q]+]/>EǴud\k^ 8z$GxWm]msmxBuϢ}x{ތʎonaq GC묨7)>C UEy(ɬމf}OO-NeC5K{wtz^{qq> )OӶ7lmE{/o{3\ЅwcvMLBͱI7`؍j‹O׽Ivcn3Y ˷\*(",mN=ǓPkeAOQ罶#W#)t\y!DEy&[b E"j [;$UkޗMay)DdQ*DxDho7AMR/t/Ok{ϭzY4w(H^ь/x؅ťzJZ1ޯW=M̫b%<{΍iuozvn|Xɞ)*Ѵ/yBzQyYKnw yޱ]z|Ӻdžx.[;3ϯEx=$G*YϽ脵g{yFL;`Y6cz 2`|{#yȃW+]{Qі ypV]Օcݕ ʆmP2p "Qr HE%'VVfUED̴LEX LC-s%ȯs4ʪK*/"(ȭO/"#M#9ELLKTNf P*C&yFKfQ1j"2^E(0=Q%f02,Qi& aM44éQaQOKluL0DE3.~\𠊆WEo=O!* myCԕ$YNPR( cP3S'(ף]v{fAKAL{hNڐˡNǼ4vxnCVs1*Qjc5?/

xg!DA9Ljh]͸Z3{ (z* l+LOvO"B`EjCX,\ܿ c]2mTdE^<ׄĝ=>7vIHٓaA w/uY=۽Oz%sz7 ͩeyf27ɓ>ty\yu+a='pSaCY2!Vak>bp‹Kf*-ggޑe{V1v,/mW'3 taYSK6Ϣ{3UOMQR(YB8 :~@t1 lݱlHb7V2;LDAk%q?:05>C0 %\ipCD;/W'ˎB}YyUO>3"XT}Bfێ}g}yÄDy}Ffc1=e`kSt v=>oa}b`MwyH{uސ7,mA~+Hd1 b.]Ϲyv?&T>)#Jz)翟7S~KhyDz^g5VY%!N:u6aA{X elqa6MEu9Қ͞o]5tl(L{O']m3ʋy'ϓc(((붇U䒈jI1B̈AySc/" ZSrnlrQXigΛyWs 1 AEWhY7Z'ϜN==-=즤yUAxlg#qyNSxl{$D{c7|lGO"J>ك¸${.oheTE=sq)*BR<$ wafZV$I{vb.a{XlCcBb*N$ e[ L7Z֊*ƛanA\T;&3;ןxn6H򼼚SVr23Ҟg"aQRQ=9@Puܠ`%qZ*ܤȱᑼ:9NU<텡zEnh;5iy\Q$="eq7=Q>7l .d-HHM&@$+g+r%q帔:72b4z؆nAv0y:&Fag˹hn I.RxxD8=PSgܲʵ"*JtRп^%P(l< 9?1hJ"PuR|F4MfzIyrNP Bv=s ma@{hV=`"}&dG ym72JѺg^?̥VZPTV X LsdĩƖ+y줸4Xc٢U|wۛK`P.y|N}ً9em "I\lk%Rs{P6(:qi!Ū8s vƍXuz"ҝ EEƺ)e M O# fFZd!HtKYj 45b)U$4t<3bx2ϟJAl>;ŋ. \Ԅ 9 vNIgV%[O{`h̲zq2jfA>P\<'P4{X8f4yxHYdW>C.Mdϴd'+;O )JI1"AY7''qw()R'Jx!2m}:sykxg{ o7+;x Ξ{O%*9ui><@ .5A[wz$?z&;2D2m'93r/҂*Yv A~rh-AE%Yf zB$;Lu) 5'aP{>1QR9EK2_,ݱOq1-o=˯}cS;o Y52ezE˯Miy24<N8X+JVeMXܤKKAE&-(q?L0~E e]=XcG`в dojg\nL"{q:(cW+o캨jԳ2p *.Cjj#^bQO =wb!˒ԑ-i=Jn!hh^x<*h-#jnI1 gp8@_8xydP.YF JPSUCUDAѤa"VkvĦI@y¶PgDe\p\?zqI CڍӠ6gV1T3*ӌ29LӍvmkw@醟D5!н25Z'ٺQC62#ikHhd[:5,^9 n\a=<:a׵`M}4G!`a;J$;('/ì2w E,NPews myH'?C' UWtܲ.y}qh"!tj&9D!jIVPDE.Ko?B H;΋97>i&~J`T1}}rnN>k`lReTZF|oyꢠTIDt,E|ZϢB*w$>\0R`O^*U4yrNŇo,"mlmު9鈮no9 g2_Iw7ۻݬSnm\k( O w<:o0ˀAl)k hI}{jZ: Jti?hy8Wx>ו$99I+'7:̇M"=Hw0R(2!۟Ǹz;Hw4Zev~c}I5bC9Kl)8@HH4R(U RB @ @*)E*Px|{/Y:(Z^V%i "j4D?U.2Jes>Qbeq%L 7zdOq°[IYQ96M4ycl:e  x-&%?H!*G=Z:ǝ)ϥ 1R QC n> oDmC)\Ou6@ ÎH,I6蠧*9RmcC{N)e\"\"<x_H $}CR+ՠhP-F Z͘t g0?ͷJSYEĶaZ{G- es齜&(iwm `Td8mee )e\nmH331Iƴۋmu\@2S6fDp1Ƭr NfGacrd8ZdmE;]7vCZ~s="!0 >.T!46?P!hTx<[7ͽfCv<d %/oC MAQbK/vBx5 N&LjE5 +ϕ^4]n䛔NRygK'h9:J; *uLy<(#c Z24j{ZA"!%N>励pN:#pL#GD5M,"ޔ;h!*A,pZ0"kl *}8-k ϭx˴ÃL*/H$fQm?ߪkLdG/gu{w\l-)ݙZj1\$vBJQV'Dl%de4G0hvB=i`qwUYâOgѮݤA)F6$r՝+n͡#tF`@VsT" #&wff0F$ $I.jSlǯU*W HfY/̭֛ }UI/}Iko2 t’cK=*J0EU4zdv[o*5[i}o}7u+ټ-GF3$יV*4,ȐA B-2y4dNr֌tI/StwHRqvr (-x<;5*:7b!s:N1\a8ș}9Zfyvl(h{Y0猔@% K)jxc(,o/xكsN4 X(Jb`ֱ,~Q%C=!{d4l!4lt{,DJKVu+Dp聂,WMd#rK1Il]1L8z؞!kV IW-PlxzvH([nd[Wg-*Vj>0<"CRmTHl6{J?]#_#|1+[^fjg7pӭh!&b?'^Cr٢ %1ql}٧Y)3En3H֒Xj}Ihcf*Ha#>6Gó}UѦ@:26zӮSIrr4{,4ϫXY %t藼܊)pADl;Zչ_OƐϞ%Ye0c.G@AΥb(Rfsߘ.b!l^oB{ {}{u  ] 1j7(tM\c~$0.qGF@ ɍ$R'%NKgpPC,z6wtPD},vҫ߯^Be[,FkU:' )1|xyU'@dEZ" 'XGaHzlAj,>n6CsI ##ֆ#xMi EGzYG[T8V3܇!5-2{l`;θ ?9}[K 4P|s5Λ SAޭZ5=6 ָvS3m$׊ vV}#\vIFҰyA.QyOFQJ9LO]S..`$W?rBeb[T73{~ aU $NpgyX@%g\]hc#^P,@la_E_  p;Kq!2\ me #'YXI<<0e[lx0tARkN?\FΦ:$&=`hc[<  D&U25,sv!{8흆TǓE>x]ϫ5P`srt뇸֩/եǤDqHPg}PQO!kH" AIBجqrlu|lyT= 6|Jwl-.γ< Dw?*{ㅤ -EGj@ȃģ}a!=bmixss4_gO3&b5ݽ vS@QԠZf -Z"Q 6qrSbD/ێ.[ ߮{^qL$Fc:A &$Yg]YbҡFOX|Q?VCHP~f\~!El ĭyr DLBoX/jF#D",twϥ&>7r\ݖ-"!YBP+q&_*ZcRFVwHPR>0YXLԐi~ގhgH"H=Z[k9߮>: K[E+ 1-,'6_(^n_=#n[ktغxX"=^Uԙ5H"mBL $OL݂[ciA'J ՛"LKap\;\ _>_^oK$ )85#N&51=Zpt\,mQwsd1T/Y@a@ ԙVAT7]8VbCFUIj^tiR}鯧.}Yܷg6I2͢kSOmzp98(w\ɲ,,-OV8UtJ1xQtLAvA# Dei'P{(_]P:|D`z\y{ܧ-"-MP{Nt}cֳF.Oe0PlM޷Y0[Yu. R4g$+.P>dj^@^wlc~}`Zm$ZHt WXMdHQ=a~>V|áh{5LTfukbb~0]C1ru*WāJF$Exc:KfA1~4z@al[W-Ц {$FR;NDۯ|V 'm!~BҰxw*_f2,̠ZWkrq~Jb%?V \\|JZ`M9 ,Lk]g_wÂ6WbJf1Q&y/2b~4}"^QiTp=WSssg@5L:!n)Qcd-Cg[pwa}qi>c}$jdZ_~EhJD #5=nT\߇׍'G&(n!d_`gD}q.^?tX=|4IYud VvRx9:ͤ'FWp<4aPj0]$hzZ7%uŗP,ޞX<-j 0 ho忣 @ ~ izǯHp0fLZ8Cm*-gԘ3H$m! 
>d ]m2$ܾ5 i#}`BEJ]}&lTId++6KxDaaYbAdPeHWB.P5/JUUKg&"'3hC}߿p e(1|9,otÜ:_pȗ,%gc!-'0|]W:UJ MYlP۰o."R!p݂in*0qZ 튄G b͗f&@&׷]vC,#&pqGGI#/SWүd$D%HD/ /m1Nl1{*lIIUxaٻQC(gn^I- :q?T')D~OOpZ`$~SA׌ +csIRM2z!U #e OF"ݔ_f nޗP P$_P"dO7"s b pl Ir#&7JsD`w<`4DUƲڊSSU})%yG_"}zo(q\ "7xGJ&'7Ll|rCid r'f4'@dC69HU7;u4oeN0v~9A_LJhT};{qko:jVHOEkBU~{ ̦A>-#7Tߘ/uҬw2)r5dBw}Vơ1۹blnmMy/w2'©OT6_b>q'z]QQk߾Gi ;If%zMGF!/v '7{zOڵK#M).T;Ot'•3rTF-nx͛|)֕v.r2H1CDt*wuJԄuX2ttl!$8aG;#A)#wuZwf9(:\6 lNղWk}bb 5ȨHvL+ԅcPs _o| PZ{^J:JhTz(*KHʘIT hpWlLwv`!RsAfmG2!joS߄>$YٍMߗt/ QIڻBO7gHQ9PCk ~ɠCԪ aۢsNrN%X=(hAe N4ewzPlnoDoop \,{©R9~66,b<܆iw Vp_:׺ۥ#:9.[|]~ ﷛ iūpo9OouzuuҌ FӉ)nVD=_˩v/XƳtLÈ%ߝ<{t]_x* q8AzvIr_M/qS"ާI>VWiVy*y uM #&RpY XS1lۨ,D2(򾖤^u1OG9X}C]#*1#8^" ;^4O ԃ;Ft5}yHi3$qzY ab3XگZ=^ =zM:G=/n/o ū Ȃq z.;9"Kw6Mfx:լ[5/ck;jBBSp>v+mm-iMɸ֤\s:xPZHha:aha3q>4tSy#ҌaޮR-k{r]WF9=ah #J:cc@ıjG l^6cu%!r3uKi6[r]`VmŒ ,̊BTDØ+NnQʤV(ﳮ:Pw@P)B=H%̿Ib(iݐr~'PxHB\`PP "q; 3tUUaEy{`(  ) 0|8TsUOcF;چY]Ybjy~$ %rURi S$\Qܙ2B 5)K9!/=I'n(8&ĔG.S=$šr2^'CIrjb[5{_Ne)ͦiܩJ;FF6j1V:AKA C43yoQC9\w`@$j&D\|uOօW 75no]s7 ," 5D뉶kQIQi1+SQ d$6旅 NdYU+_VuE(XT-,6p(&7O"weJdI"dq`˂x@렞@yJ<02Ƣv2dQAp_SJIQ"F X&JTEխFV``Ř`)+ߑh5K%--)p04us~A{ Ir (V\?yyn( ąv6V7@[=D3ז53s~ɱu >lvN5R ,W)*E!R5S@<֩ꄓ"ET%FP"ˑ#Px\CJd8GJdk]4/{L8Ũ_|ȸѪ%$ ۬ FKu (SAɡJZE'!Cȉ8EW]vA+X}zK` Jo =$0Њ¼0UQIT.@Cdv;}[+40ǨkNg,>7w*.vÜ㦳cD$ԻbzϜe|8pCb>wkMÐ!2YKBTU-6OMl|'d!R:zuR*n" ${mjJݻRA`3$2.8p;%2<܀<8dM2[,oDj P0TdܐJ@9f3KE5ڦg雛. htX ;=!̾{=hms:"ZΝf':*@vw!xޠEU[Ԅ=E)Zzv'G0[E!Fz gꇳ!lvɬ`o*dMx ɣ{-KTִޕFoOyMqVl=Gĥqy}3dN HX xÃ@jdlˬO_,+ tͥ?Qqg@ԝBMZ "CAe.UrkXVM #iBT?,X,1PhSPa!X(((8#*AHRJD415@+Jc#@5@D@[aG-s,ጮ%'N;KcZnașFcE  yu2dj9 k () й H%P5CCHR4G/SQG! \&iP}#wQAAKFlrʊi)h&0C(ihdT!Uvz3j&7;kDGNiK+>*11D3ߗ m:S@ =ayN C͘ Ƃ" +y$;nm풭7<&5cR+RGk/n3.f!YQ԰S]eQ7 l!&4_؛W4z")*\GCm$~~}RI4`b!BdL2V$:\9Q(wiNB qԬś٘^)Ru9]P5R1A[1hٌϙr 2땶Sqw;qjo\iJpyyn+{WL=>t_kt<%zs؝⡆.yshE.c %*VOL@Ր#H/ Db6IPvKjzew.0s;a=:umF}ǔ_z|xww6٧#zIocF2|~Ӯ3e0*.L4`xBxwn 0D.;3&Xۻt2e)2dTF\5,p}=aW9Ss0H4yƊK)CJvnD6HR /}eLޡ3<ۭ4QyҺ j_DĎKbs8/073 p`I6w{?#ϟ=UV~w vRoitݷ33 > PMۻ?Kio}veu:w~ƽ\L˖cVS5.eSu]ELc(o>g2/g!'=E"P\C"J8A H)ӒiLJ*%x\huR<^u4 +!(ES xPD c3_HNU{DxT j&Bz붯PjG;/  {jkV]KJRw+>[hO<2xȒ }"TH}_,X%.rEAxP\+)=7:ϥ́ۃtLMLڋ\۟8s%TUcD+΢:][NÁ'u/&WIMOM$Dy*0[Pb9JE͓K:Oߓ'Q@_^LORĐH) JbN!E!@PPqwY b_,ͯn;-̗qO$L{{̹UWsi$Y!M> +t3ݟY&6͎aDw2--pj&樋OŠbE U `JQ <]{)>aC!. $Ӊ /tJziR`TY%9IpY5|p[5'Us5alrd_m:PP9!T4!T>sT%dQEF)ׄݵU񜝔I1ԥ,H|@qfrV^hȐޑuX."sJm:[w>mxb-cdRE+"lg8XQܼؕu7{#>N RgǒZ(  H2 *iiC%22%JJ@ %P4PD+"Ζܡf5U c QXkPn{9!dEN!M8&AH Ac̦ER%)(h)iB7%9"E9-zbtAƋj#KI[9Y…ydsBabNLċ u9D&HRlAEh|BD PRs9 eA4KG h(fh))NBrwR:)I "44} ;kRNt1ӯƘ!+>\ ʒ!+ɛJllo66$Ge1fپ1D໱yH^*,HӖdhuî7MvH",a٣dxDV0| w2WL. 
8{șPgbM77{ޠ65~ҠA #[#(> Rbߩ6\Lz)*:lnEQ`;5\Y11i8]%ZE-7d5]WviScO A؁dꭂ@Ur[^F0.R&ߕBfJ9I*h{R`Ĉu}ީpcl Ie!Fc=0gގ#b_} fDmΗqk5&2M|M=e))j!=Puu3~2x'9_O!aTCd}AttHZ Ѭ=+Eu;Wk !jq I:s86'-Y5"xw2*x઩Os* 6b*8W׍ێ,8=#yE3%W #'KC+;<%ZYDb@ȅva1S7qZA]VV]00M1/x4NRrDVN]r'8{[HҼMQEOݸzawχͲaD.*2F04 %x_-tl̶S@]־t悽hnS 0v!#H0C{Vbww CVUZDICR(OezPyl 2)ŽGd!]1AgfG-Du /3wHbDA r.q/y),l@$1@4Y[Mrwua8" fՀ|"IJhYUt$Bt.hӯg[%B4n|̽C83.OmYp>Zw؛ܫ::[ uB9\`-Y݂l(\pA %Nہq ߬l+[MB'3`ᤎ4Hf;,[s^H  Ft<{١)]r=Z|֦mbݯnxmR#rt Eϗy,..CGp7(KKO0x`۞=EMMX ~`Q "ل)H^ƢeQ!&Hb3ЄnzӋzQ 9;ч0F˯io\y-𭥜xoOJ[P"4-4Ns H%}52n׭*8 b_pQB0A)+wezYy6I'轿~Es|}y$Q15AiE7!_LltzH$qm $YHtmf]ҽ7u`׮0~{;]u!ч %]?jea~-S\~sՅN-1&0X6(-l~Ep)H;r&?7svf1*׽,u̼@B^4׉WzwG\,xa4K zk"hcW}a SA*[XvoF9sn=>|WĮ_c6,֖yShYֈ6"T% J`DN.ϬD9~]پ2!3B/;0G 5p|6gOַΚmq½63^ã4TEri[{v{9U(ᦒ\^UU3+v^si&,p\[h܂oq2%ޮUF<3ENS1AQFu,~-[}^V(Flߟ4ЎDH"$BiW-TIph Y=r0UVAj#xOܷn{֊a\!tC( 9e;Zɚ^5֚ԮA}T\DG~|~q{o,CsMS+F'ЭB4WɇE}&Iݭ[!>DtA9PЩ> 9g8~XZD9#6hA84ZHAhX?30YadiN־9z,ݸE a  DAͻ3xf6CMG|ե֡sS:4dMWȆN|c]cXƱLYH6"t"㣾H73##v +z3D@x٢a= Q9m!!&yuwn9|k3v dnMGgKٺg(eC#\[NZE\Z"XOlD|’/fܚɓZ[#iq4D|CfƖ(wcuvcrdC,AVܳr]ތC0K,Ka+O1< Y-iQٲ]]iPCy5Z𱪈/#~i$69&]]VWY GO\4eX-1tTPj| {JdX-igb$vMY2Z Uoe6<0K6QucC1 $" g}mf3(h;5-_%ЛSƫ@Fi*6İI5ގfa"_AH" "4a^2wqKiO'r^5lCq(-{wDfWG=3} yk'A,,wؼ4S%Ai:|G4#!54B!x<+_MJ7u|V"7Ϗ~dYuFk6$dn*f}Gqt@BobD/! %<^R6]42(m=EnϼY_;IXU[ϬKt~'T@$kՁsfR$lC 1ɨ搃0pҭB"|4  T&ȋ: @tt*Zc"-)ИDē>,w*V 5iQ^ژEdi'F#,Q<@.zҵMq9i.ٞ:M6Jjb]MY[l[dԙ;܊-nQHblq„! *M1l"ݔCʶEuCF q֝􍧹f@.) d""Pq#B aYm$$±rȚkwָ"3J_[UcMq;?…*IQ Gnz<&GS0eoZL4~9!Kq# 4FufmGQ?K^yhگӚ"HïDT"DL21>YDWqa?odfฆ ,b$a*&XP{xsrNߦy'lȘU_ck &oQ#Dx4н^Ɠi";Y%%N4S5dZm 4X  )I44Ms׵e+0N!̳-+$V,,8dnDZV\a $BM&+UD)Lrq`wq‚ZؾkOC?\LJB8Eie.kt͖/.w/1n"RlR֏0 N3@ud ۈEGYb3NǼ8k$mkO|Z/gBťoxnիc[:kv9597|cpukŬƮ5OxƐF+m@ #Zca3dEBpp7L*֗ z$gk g7n>pTvy9vmkRgK޻aZ/ "I^,v߁ >]铳^`u.dDe­CLE;~n?K|45qi,#vjvna Q>ޯ:AU-niSnhG{, ~Wj#T6W{) n.$2c†8F´^HHiK%*#Di2d< ٲeERZiWɑ)DͶl&.,dC(/5iialbl0Vpp{xji[+޺S]Ykы,G$ݟC!LX,b$ B{B(?L5/AUuJAB\kzN$( _G8,P W/LD9,vU*hcYXXZ^\p-9oБAǍpu+2XhEq0*FAf.P]k5Q+ܳZݡr)cfgiDo{wzmo1gV{J5zeOG::̘Tn\)}sIoo~gz~\B ZF *:0 `=\نJS1Nޏl^NLݖ<|}Ǐ%٢a\YC&LMA Q.28JG?#nvL=D|+كm]P5=]r<=$*$ _w'WK*5M{"N)blf`Ur^ub6 =]wc0B W)(2[>u[Aor)bnvs:нcI[lv&H4(J()`ͬY4𳀶AwOUr=7b=,8t\.o ޸Qm2;!l.R k YUiB\tQEYzo79au}5UA_554~ƳZİc*@AvsGc}utlgAsX`׵\o SUe,0Ɗv9ٵ1zxsjK(caOa֕[c<a^GZ]of!{HkZV5JsF =;OcyİO%cEܬKmo1Z/V8C67~[(XNxE+j{F%7;\zwߤAd"'AnN>B4ЍtwLMv{!j2(Qe:(D{0˙QaY A=8 CѦfMdϼ{ga/9AJ$:E@c{6Y0c*(i VcQMl:YwKfQ~}yć-I7s_GLY.JI[qIrDk68N1ai)Z5+K"Ւ#4KxFBJҞ,;hU8ǞkC'3rjFm,i$n4̲1 `Z$۲?k " Ⱦһh Jְz5~)osa4ZlҢ}r7rQ7M -ӲT><2! j֡z.O,q"hHbO0rz0{bxuG/RHt Hi XZjC)1d!tȳ0$V5 բ03[hm D=DZ.HMQΊ+--I_D6`o ,D $$D iˮ/Tޕ 7oMTsIT6'TYPrXQ+c+ `,J(3ku5 |[=L2ȵ@zV4>?@I ChN9WSz9\-K g'Ӭӕ!lY9lo9yzz80 jJ 1$ K_]K|e7oru/XL`x<.k\d](c$U!@ȄF o5wkfVF+HrMJa@`U ebX=P10v P1m k @I1` )DJ w@⫑u 7)kHES|}|wlT ᘰݓR&<7k=U WxAbى(J'ܑy+px-cau'0TZaxaid01]lsGדA Hۖ0;d4Q-`'o,7d aK? BB(L \ # Xb("*br#3RL$GsG(P@K "5M4 BM*U4Cd!eHD JR (ZhBhS!rBR" BR)p4UV$(" * 2i(h BiJ!KxNHЅTRhLBJ@V@ґ5@.biX\rhA@̀((/!(( ]Ā] $J9-*RB4@Т"4(0H$ !8DԔ-dBUD!BR+@Ѕ4 (RdBQKB*j^AbP) V@!Z !)3)L$(J Q)R)P(" L(iRZPJR)ZDEP)iZR A( F( x8J(RJCPRXJ+KBbhZ)ZC#! 
@( ()X((VFJ (,* -beHDR)JhS%" DZV(FA)Dhr (hh8)<[u%%*x L K HR !IE)HR+Jd !HMд%5CTPBzIPD a@ JL)2 J !TZxrJhh)5% "R(Pd"d҉T+BvVX&JDJPTH4RҔ% B99)I̙+@HU jF)rB5%* hT VC$2JhZ)ih(R @ Թ" )JAHҔ*P JR4@҅ D%- R4M";LBBC$ hB'rrV% hA2)B]ʹH*HRSB%JVBԭ R -䮠 %*"d% PIHR JMRДSIJ1 T*x2 ) RR PR=f?N.b-hrB1 !B(UR4 B*ST*R4% @ HP){E7 *P1 @HPДKTM B 2Ji_5#BPӹLj( Y yY!E-4䁸LVAMd:ràtkYXd˔Ax0V_&ӻFJgyߞʉwdST  MN_: >XՇپi0w uq*WowG (JJB$B'%ԆTCdaHT$ET*PC" #F)ݱAvkr%(&BP%"ҙd5d+JRd&@UJR&@Y#Q PQH#Cd!M)JقٍP 񋐡B42) ) (rpNHJ0R60X3m}w9Uw'.%]Szkjk,4~hZDރb~6 hgVZ4>>>ů- Yiţ&b3 ZjӖEZA(PPF ցxd>niԬ UsUAG6g[Bs㯳@+D?EFbזq\iݮ#oN I2W;͕;aߚ  }aX{Ne|tkE)xdj b"_ʕCfQX X乘2PQA&ԫ%mNa.R!ZR䎥a?t!@121v'*T ĝ9E9s"(g2*HH\JE"*%JK^ZL5t'O78 $G3`>zE\t/)u,RBcf^)eP 4V|xlo5Tر̄c!>\,L^dd(n*P^sx4 NHND D ﬨN9hɌX;Nʌ"KA 7{cf{xWcuHR[}6]4} bsίXdz^j0zo WPH-|KRCA Sv>I ߉ S/0E #fZ#Bl4yĞ|HXJgp QHUKRWÄ>p') S2FIjL3q} &T2+] }ieic}MxKsis,oh\HXXw41IHg 1 cާ9_8ο' ge/RˁhLX:ww:h<]ZmjmEB:c ^ ;H ūGZlAb$Ұ5Wit|ikg#-aͮŶ9ެ0NLfXmnjrA|СC &B#yvz8Gv`(:Bt@X%H,XAM y_o%4.1$):t24'F遗X$IխhetG@ob>ZxMkfXbwv]\sQ[j(2v.i,ik[~ \3>rW.}^zqw:31:%P'O\sZ sxθ6^nj//PuX_"]=f:T,B(&]d89>y$|;r2fYcIٴ \2H ;+xd7eQ}B(<tI]*vv)*m%yi^sdhRz `\ -ty(]b|&aBIoE.G%zy{#!PfaCHu 4$ 2BE \F 5+ΣNk!<^gʎ r㐁Ʉ+%w29HO3*h` L]l\v*f~$!>n$]R](J H`{VCS8no,GI@ȭQvPKmhel]d$Aa9[Ltǟiq>"_y=oar}G?^_~xsOܣ;/wva[*ꔗaqo-(UO=SΞRai k~xU}XAEXZ$XEA+D^Zb ee%FvGvAia1}QNC2 Rria ˹E(V"i,Hj^D|oJL>b'b`h:qP"]FfY( C@l>^xoNcd55'ͤVZw1;|k4:"Ȇj4'vR|b⦥g}i yq['@u'N0}\BB0r. 5zO(]UX3@wnh||ùNz릿CP+'X/Xp &Eh}b@') P<\SP"dYŸ B65q3{\_c|3 {i_zdɲ{A{I EH6UլִՏ),ԣ]h6 DӪ]#5fDޏ`xc#d 4'^ ݀{!>a @J:ËlU@v00tb~ZM,9u (jJ]N(G%FI&kZY@QN+9S$ x F֠?p{X}*"*` \͠^E2wysTzն\ۿٹ{pBK/*w*qrhS0SRHR,# <5kLeb|~9^C"x iJW L*)O& bÉ7% ?/Ƨ {q0{f%.t}rŮ*/F7[8&VV厦 ,M/#P5;!V18aAE$ yͫ_U\$ f=5 lC$(%F78uCkt}U74Byľ6ύ =olMP~vZ;;ý!hPoY ) ("((DJi(jH "R%&hiJR=أ,} ֿk*gܠ'pga|.X9P>LQR*|uQ`(d!Iv/q"fTCIQCw9@T~QJ $Bu MREM%!M% Y &`xAEyxGV}2O՛֌Ew7J1EEŤQĞ^F B.<>q+yzrbAPDs(=?_B3=sy w>kQQ4>$XWѻSd Ϗ[nYX~&4ʐptLCcCɐvh LeDxE@?8^yy5DrO =DͶ̫p 6#K;"|ӄ2Z=0%?Ÿ9~~qvARHyw1O2FVp_<_~:tyղ;\#~kH+SL_ɯOG塹Ugn'zcw4Nϸ|gh<ݑNӗ~.O(\)|Lx;E =tdEuq!O 0HV l%*_Ѳ3[]\sAiSfup }ONQtCrPAg6@ϣՓ'?AH/zZZ8YANOҬ/PHhq)N@0Oc.i: i@*eV|K_]ۛ!t8nHnu7$12<.5[H]q*96 q aXc SMq܏i(,`PSOtlxh1ICII9ؔȡ"7?DNBA|@ʤqp9IrPPeP}pRJҬ }>(`aVc"혇Z( A\k +?$"?Ghu3+xPo/O0҃Hy ΍CRE "2P2Rs?04N|qa<~nu( Z( %h\)hX (Llj9aR.Q"P>9 <QC(/2OAԊjL%S jbclY~wfĄIc⏭.o}ZO{ަ҉>r͢_A>sN=uRq2(,Ĭ8mĞ22B"*vc~.+$Eghꉎ=7cU@Hb 3-Qүy}k4ܸ\Enс۾\3lvM_l&';0ZhrC *J(=(zhZ旹|́*?ecCS|v:]1E2Cqi)\Fyu䛡i)|dd)$ qw)jbgEX8)''IDB{`kۼ^@4d=`S*( لSu?j~~Ð΂}P>4~>p#kʹ,`,T\pI J&`4U#Xj`nݨJ"")R+ʆ8HujTk^OXnpr,S&"l=`!:~Q6Шy /c7j؅tb*Ca@]>^\) F*C/_"^;@IʛY}ghpc![氝": -C̢{4m9{I8+f) *")J\L?Bӭ!~ x'cm^{]=-<|0O P~5Bz:("~1 JvO;"{>}q9`PSGǩ譞q_}PhSzPbm TyyIpݜʓCDS L24"`szDDe{ˑOte"Yċ JQ8UEc&"{3,2'zp]5k (>W@3eXEX@ޥ¨2dz u 4UTY {z9z20P)ibhJW$Z)NEŔFM  RIT#TKIB1+BRRP4!\dP4Hd!$51+ (T+ jL$1(L<(𼈲#Є)*(*&Ĕ8BJ7F-k-ĶimU4\yy 감$BdNE&`$iP鞹wmC6 uR+B% 2%3𿖿~o۠z}?&2tϾp6jD.59ٲSd):yYjlhKOvfC"_Kֈ `] /¸qoCߜec"!Gs|5|Wm~hxK=/I-6'+<;ga'V;_?/Wu4P}zLo:>p:`"XHD_4$ jA0D*D oU riUò 1!]IXܰ fj2s cZ>ާڪEMq 9CDoqCH|O700|} ‡Ɔ"KW=EDbՠ΋Чsϩm|][K|u*EE;[Njy/聇{ycYBU :ӆEaΩ*Hk)"¡у3TzuQM;F~#{8xiq\gQ 4QS4PL4MT_l 2k}6V0Ӄwۜd_JA57'=i0khB~\N8iB>B3 ,-i[*ׇ}E#Y)EV*%giҦ.Ɔrnqh/`kgݙ¢y&d/!Cǎ\vczaE 0\ue͢-PCfXt ]$ŪE˫&}"9w*C4_7c'߫|mpq䷽$NRf&vrQPŤuU0Ek.&<5f! 
WM~:^tE:ӦvtE- H(@I%,b[t S/`uAi7YA$6$AeEW nȄAPEXja`.* 8@Ɛ~ eҡZ@v$@/8x oS]g<85=D(Ǥ1+y畒= k]eDU~Zf9 O:uElHl*D(5>L/mZ>> 2DQ6:a(4@h,`ZI^LvjJƀvLEǩ4#_"ȏ|Ӭ0 Us֖jaQl*:dln&5%we':&-]hV6 +Ѫ$G5IRGۙl@gW@Y\8$#\%ӻEDtC ,浐2A#[P׫w.9P?yHxeeI$C-{#]l)7ڝB Q#Wx{GQTuCPdthmU穐߿ߙ⨍lv0PY(z/T톰Fv[զE=e  (b%Bh`N) Dh^\еL@K-y{[s,gD lXe:ۺ8lF$H߄ C"uۡtU~X7`gH u@旃û}W &K 6βsqw:zkSD:>0ohxe Zi OHr0>ٮEڝCBu}xvћ"{waX8ǘ~\ (=R vE QP+}>3=u`yż~S-5xGZN>ࢄI (ǒH` 'll# w tuѠp!I gR&!./̑of~ȴ1@5j-*492ĭ0渠@YR$oɶNm)))KYBK!m=i #UbMJ&A_HM2\:{ץvVM׃Ww$V6hbӌ{*ɉ6>h7$9xgK{a>2j@J(NNB$#R6"#2un7~dC ZzrfQG4]l ;.!,%{KMpkBEYz>iqhO:o"+jg7ܔs5¼;s3%}@u'Sg7"̟ ޕ{5_A֥ aKCc1Coس5a'i(|~!:IV/7I;h C:CKiTv!`Ld4eXwi˶?ޝdKڦb@5B}<>u 9*G0}H}$л ȭ|<"He`xHF&܇<'P]6!O6NA̹qX( ZiR*f|t9@/j<01K fK' ik*\lӫgso&y?֣S/@*@#D;ЦFuT~?s/ߓn} {;䏚%:kBS JYP4Bs9(5&RPBH&Fɥs0i @dj(iH Dr13R@QHHP4TQACK}4^}xY(R+RBd䔡  /,3‚ē?iOoNG:Q9F(j*d(מ8" =P?b '㡶E3'ƆeQۆ9ZAV(2 "ᓢM[mmN5&ϻ1a֬V6's5# Ci- b$8FCA]7? 2̓A 3}rO.Ð(0 ]:P@kT9X}@ k0;\"'aZ 44 $TIP( v  m MPP%IQR$ )At-P)  ($P@JPE  +fJ./^^}P$ PP(UPHOn{ů`c;7}ͷwyw}/V>}|<}owi* iw]zAԀrlʠo|=@B!&&i zSzPI1D"Bjx4Jy)%$BAi4i!@)TGh@@B&FSFѠI "M%=Цi^$:Ӑz NhUPX4o= Q-XK/?xS' (s" QEA&2ʩ'1Qq*U"j&22$(T=ˮofk)%N𕊉^)ʪm*!z 𢀧/RpAhΫ"_jXtΣ,.6{zQ1*:@ JReNWH @ǵK,CڸNE>af/VAvll.sp\Avu8^\󫶙8]TuN7~^@Gt?I ~L$|ą!}\S&Bˡ*/I\3#3aGgHA^o?(jylU"V'S5G΢Hvm(@̛/-CB7K0}  +j:1x_?D2r]olwNĽ o|Jim]:cZCQމ͑!~=wp|uHL}Nϔ4b'x>qgZ3OA@Z܎; Mx#ˣ"D GIIYFODg;wpog':{*<>t(z(^4;T# uQۥ >x|-,&LFN~ۣ-C 6=|EKsHDҋkk☍8]05-H-ͧcU6FY'FWo+ s@}f#qy'%"߱}>] "Ê;?r> nT>Q`2Gh. R@V|xC4>[o]m+9Ee^}٢؞K>Lw~z#X=DKӶhzkǘN&5QCL;^]ul˒Ǭ.nU]:}{=avo9;xV /,=oj  é78rPm%kfӝG$F5󛟾#8B"F1/HD3dOiD77U[9eb1l| _NWe-dp]tR $!c@~[ 'Ѳ _o9a:*ԯV+Wqm _xw{XнM,8kzH8kBHI(;? {pznAqHr 3'τ|:N[XzS@ c|I]LFW)PiQ'xҐLK=.lI.UچWZ;{~ 2ZP2y~ۭ}7cY.8qYu7oɻI'i{G+S:*4Đƾrt7"q5˾k (QP6ʺ|J^OA+h9פJvS9eՍU{7msC8]_P4tnT838s:]7eB;7Ƿ2Θ%.JIi%]r7C-:ג]#u_ueΧS#޿q 96zۡR?{^wq_7Gy _?}+X^SqbH6 ]Bb>5~}_׎~'?r Of<\?@/?S$/ QTgŐ s.i=}o>}uI^> aw,g~]rIXaX(a)*8mkHD},ÐC?C;5;ol{)cuĒߓ>q+.xϬ-ژ;6>`΀E?VYm"Z߱шO9Z eQV{Ske#3EB_pZOM{!T(MzAQ{߳^0F":ŰOmh'%QkDZ5Ӡ>K ڌ/'Fxw$迤*c[JxCi:Βih!1_*] z5 RWeuL֊qwF%wӯp:H@q=+h,S]E&&alp+R\7M2}N1ֻZknteOsGt)͸<$2>I<+2SF3_/ ?_so}A$'8ŏ ak|`HQe2rMD*$8z ٻߊ?~\~/z# 'IiWkATK7LP?_9=&ӰɮFJ]Ds.€vPu`'T}~6~>}0_c21B$#e] #^ `آ!`@~>o Eq-;O~V3& -ēo)Cw_h>l?=~!uR~>N6Iqua5ro|? 2l>pENؕctvB+?uud:W%~] aVQ:lO9 @ vrcUxW}`lBi(?[S|8(rA=}z2?̈&q*?XoGb olOa=-IY~ПOڇo58{oC^tnMĊO+=6kHf$9klN!*P7_F_L|zDs2$LUiuX^ei,z5dI6 v%/H)GTRvϷ߉u bEӬ""Q\ōLl?/$'j?{ο:Er9?H?}b? udh6JV[cx䱤Sy@j P;?&lF,5InE0LAY2oܱϑ W=_a5bkS ͐j?m㯟`OנOf}[gϔ>VTEgD_#n=/QZO1Fc O( mv-sot}vjx6{OfZP!KWʢ}tt̋b[~kE-K wwNvw*bXVn4ȇF9{5t"^8/BF$\aW1w6i~.S ݫ-M=V)*. vd(Q]t/²αһCBℲ$֟;m^iZޛ$ܭ|_ @wP3@HN(FBvb¼lKSQJm+B# R`Q8`$L,:8Z:Lc+{zi+1/ARc]Hfuz3C֕Lnќe Ԧx̊(Rlq0@FInmA.4@S& ;+xQxL3IjTkHdttmv.(,1G}4Ywsez!&:9; 7uބf,뢭XE5n.e geI5]&B hDo&t(9is.JAʨ.Wtf^NbBk 隋pSAsW u-ǟ3fgAШSxA\fn8i'f *Z?[e~_+{JḑEt$wKƝIvcO,kϝ/3%ò뒇V^hwfROSTG|K0gB8Yko1[Eqf!OuA?gXQu<<ؚT^}fCbmk# C0,B934{Ck~IPJۨ ω ȇk]>=WO X+b3 Gdm{^pG]di| b~~¬A/ "d8Tʅ4GQ+2)EaR)*`B3F i߲ 4\'1Z jR4p%+(n[a-5;L;(NCba*%kL~P?Xh%*cY&K_ҁgKĺ`T҈9bY.k̋0;$R^HѶ.!DjP`K1RlgtR#v؀~坭tг\ݟ*<ÇwL,E};zRbX؟SZ}2y'o,6{4~ǽv|~Qw脕|j΍Htm40X E"e: k?F`GY< P 9{mBA">଀T"ȉcxD:RD|Mlfm? p.)zݛoمg (i=PϽүN_Ne>}/W|jg?WM[eˋW"5Ky~i9uAђJ+^b!<[dsr*yR |8ӫ0cPgcC6pnjyCR "%(H$ۧQ.>Rza>f_Bϲ_Qޘ=t E'~=QygL4N,Bya&(åD!HZЇw!,XT Hcs?E:./2z8ʏm{ϠؔY͊w(;1s,浨{]W_a>yn3Yafa_m{G77% ^r4aL?N m.e@3 'JcIM tXiCHLJ(X`Bdʉ^%T+LtW%ouk0ݣi m^e=*+ 6/G@9b0R!ֽ蒞-=t~# w^[ǏW8R"FωeN%hf"h/ 1Fk1MJ`S wJqV!|aCħyߔMav=k:iExKq={nAe)rBFH>A4ݲjϴ-{qx<\|ƈoDw v`o 1[z%oI IGᾚ> 4~[檾hy<0 7}i7uVwGc 7@GLif,z1~C7/ШtoN [PiKF$ ZDiqb(Au<8\.Q WI7ؔ`6*A]Vde3HY谴QF٭;Uds0e} ca 7LbɇŚLeyus[=txdIV#未 0rSEh`sip|2tHNu 77yt ۹P9dA0: L)ĥN;zFb`x!%E3-w}FH (SFzکF_Bt8p:ޜP. 
J5_A$ 篴8E!l>NeJJz ix*7׋x>eAxCRtjx䐫>C~0û+.6*NH)̢cW}²*}^s鿌GUN6bUSE 6>mqi8Fi~X^҇Ѱm {o'c~W=A+ K؆W߉l J!G/!.\%zTI~gcvYCEh,W g9'wKw=c ur]87l+i28oP ubwȧ1SE f3xVEޗ禿ڿ} NΆ/˲cH3ѳYG C`;C>@#׉W4oHM";+.mY'Q:\xq#7bߧQ58o0"?_.ДBjr6{ezQy21H_"Z܆Ww>݆Z#Sí FTABsɉ"; kPC`l.S~JoB.,H6r]z!L]RA5`9.0܏TXְ4{>p{>~>3RJ.h8}1-JD\؟eVw!p^eE${fGIZUAREqȔF"󆑜]nmwL~gnG>UMʵ#/ꕯVM8meWJo>pkPx.qG2mfZ$(#:)f9j94A~噓tˆ w||z^~o@!n=֙_sA&gWvx'mDa*/+rؓD!=m:&93Ձ> p5 l/l-컴`6-ƏMs1a\x5l_Q;Y\%RA%r !#.Uv( -11e[)ߋY0AUX#-tc!6Iпz~'Nf<[Jz;l`Gfhs?/R076k+ fC@GB#{;Z e׆\G,(;;$R 8N7~ '~J !ef0TNS$a@j>:x5@Q2?~ zg~D2 +oEQ//g7[Wɻl܏׷ c}"+^qޟ*63#W>e߻yA3l}#tڝy~:J^WzIn+mR20 RGE6A9nzz"kb4GcJ7Z?iҮ[YP"{L;xX!'6 b@J'[횀 P`m^2w|iB-eш8S֮)s~2$ ^Z?~] ;}0 晁ru:G~|&'@a |\p/_cu~с:(d5S}fg3%1֛ƆePߊz|l']ezdefRvXI 6AC2 O)]~۳'\RqP52}>̤ͦs~s6`z h}طEsg=" n3+r|  8gWR<7u{*$x׫gZ(GDXWMX3X-/#Czk]XpSCͬ6C2.\^<|3 OZg oKwFpX}tdOTTHeeC\=||Hu7<$0tpqL4,J 1D ɷMkfJ!sgz*jR8 K $ron͏; g|77,zsM2$ /QVI`!NC iX β &~z:(ҰΓY-FяP:ݒ^MiXzO?:54t\|e0^zeT$)ǩE_X8|¼jR(dJ]1#^ƊP; .n(:ldL:E!?zӂ$r;aRcg>zǂ P?ߎ 1>f\hBOB'F9n7U|$]JQZ:qs2m_RΒκ)Y_nΙo }>aU$G D|<i:Y=y߆u%uL[._% '!B7uyާDS*@TN2_%2zdThA(\H1ey*9waAFd wX8Ϟ]Ml a[DF=#wMM^]%W0e֚uWhV?-c]|/9[o 3dN5>B_\ވoshO6ҿ|Ȣ5}!|1}JU>bzn 4MݓjW@ ԟof/8;j&w7  6ۼ yRr辶iKE߹}%+rFOÐ4G$ocae x&&wu)l ZUqf$;2,Q:kOrB؃e482ck"\SvOHek4𴮱XV\́VwYv$l*SbԠ a؝$䲠jL_g tG*gaf6]UĦ7M"OjFǍ#qf?$6 FKl}S o2KR78eEϟhO|g/_{#9[^P5m.}t3ƶzƅ;\P|LϞw~J9ty:*l+*deJ=~~D~"A:F󫠗٬wL e}̩=W9bTQn{T/=$$au=xl>%U-vyBbTOS=2Id&jWOC5.ZBU _ CʙsHSe+Q(L>\aPXǨe=i4zLVv}n7Ii} !%VW"~~f$Hn3.QO(u9 Za0j'~A߾kڃĽE!d;t{DPZu >|/!8(^EV]gG@郗|*gU4~tƾ̄ag=S7H ^KBXi҉nk({5δftOkJsʂII-C2r}!OD7pun!pq R(b'}Ic_Q7uU}0T(B$2c]&CXI+/3glbiItyd3(8YRO,?mjxf+0yPld[SlK\ABWN"]Б/s٣ ]/8_{U|y$68&E-url*9+Y'*47/8 skMJMQgfqvL?kIR#+1ܸ ]tK*ӏljq) `A'2 ud cl,}PO^Q ds=gR%hp}WIOgTas˥ K"ޓ8IZQ7$gQG"w^ږG0IG=C\fW}eF*^@9A8μlz-P&\cVuϧF?9McA5 2^ n_>N vmSZLIg}1Iۢ3wP dq19QG$Yd ?wͿ ~ z/iz+G]#"1C©+gd#i5&8XÌkxLe9B7vB\R\C'|0zW^[mQ\ѫd-gYʑ M m2 )тw!'u09 y> y]]N=b- 8Ug,zB/& ^ڤKɄIz㋇Ne E hҬ|D56ji}R&1X敱Mhʶ}s'](Hsi9znFѵ2Q#SszC% _ n#9fqB/[+#jtjWY=_?=^/ wpFfkK/g!"TQ ]rļo N'r 򕉻̷f 1IWT+mN|n8㹹 HvIHq/)N^07T>7rTC&+Qb/+Mx mo~%>2X@wGo>* VitJC7~:o/XX:w@BWL.kpG_(H~2iZK]w5bJi![ڼ/C GXYk8Q9Ars/QS^!`6RI7{*0WYwE"Q~Ӯlzdf:ǦH|HUn{|=,z)ꔱ}D)2PaA MԴDk7gK=GC!>p+1.|sa{B{G%.ܻՏf:4@ʾ[*kW7},d_-KƓWb%D2᩹}ijl;.pR|U !"Sw=1I]do#Zp|dqs}ö%( E>$,}T"q*Jk.d/}5Xy8dhtk^op_L2AtT9wYʥ0:P7-3#0֬u|wg\|fAt t&XMVTqΖWy!kH-3 :dz=jDkl׃Ň2_r2BϴS:}ƙFbJM[3| i2V"߭&o7kn~= t5tmB?B$F͂N49k<8mJ:n$*ӵY^Fr1 v8v/dX:FJ&e&M$9R9ͭe*\*.',ǘup>;eat{ e]wӾkw]O:J!=o|OkGp@>\LV ="IS)57TFu_ /uY۲RȭF{LrxI3('LMf/}\Cvm3~bcC;gAk9ZJ֎yAY_@`]ByYe_wL8հQ~xU˶̹竎JWa6qʣSNa3U_74%[i菱kڗ:XX8:(!]Xqu(N{L陫&S2i[O[A'?0<=JqЎ`8t>}҉2#!B6RZk_2Ae9Ƒ+hi(:@@Ҽ_ vؾxD0e07/FՆ"I}߁c9tA>xV7vyLHr|n6>xT[[9ZB.4~ڀž~/iDe]1tA\̈́yCfz!| ' *,MZY Pv%ҡpFgʧW@0Z},}657C{4N|2AB&|(vr!=hl.U0"|og:w\, -]'l`v>o  !Xo6o0]&gNXEBYsrc'5;}GK]ߥ*%s1<#@¸k= &ccVyEgOhִot4wBL:tjݡ&~e>YZ#UVdqӄ{o(63=wACUާAPQDk!$B 'ˉOg=kVic8ޜ?j0 0\*.zg@4)k Y-rNwoTe#뷮 UKh1 5뾧/ 7Hv>ki8e%oLKe5z%:xTW &|i>X#1LSw>wKC2%q:djT}8gkIѶqF Xd*ҝGуI#\4[ H1whؽ&h(z/{!#ъl2fk FPѤ-5Fu6'o\Iej..X:YbۨKjhE`4uSzqiY:P ɗsc^O<`SS bW[pǴ"VZwvmU8*ck7hCn?xB֡]6=wKţSy>9øV0U쒥ǍNb6ogP:o|> Oy4+t l=X 1jG= XFܲ{5̵H]HR D\4s3x[_~gg'̦jdjJ KM6__g3z t hy#lJ6h#&U1K<^~TA=|O d'CPܴ*=yVӧLa%;#zւ]Z },pAB*t!]Ӡ} ¼8{=_Q6@X6VbH T)/:!X LHFHg_lxO~ q*w{.54T jQK:H!Bt tlO _1m/9s|mъW#T!ɲ,']pv&_גX˲3o<хN#C~(\V6(d" t`K~ne&F{eN=/;B;|,(U*~" <>8V_5$^e1)l +_~_۔L/?_HqKbXSǫb{8Q.'.WnHDfxo!2V|^fã=n9F.=}8P\/D{k>>(K~oOxQuwG, yJb:vN? r/34Dp^AЅh(L V0W߹~>5W#ϗOtĤn~!4~!ʹ\0:>Ǿt3}5o! 
*m4y 2%ٰ;ڜ.ޔ3X-*L- (\]e]NˠT_սo'={y,9LlA@5%h.O)ӑog_JhxrFDA#:ˆo !Z#/]p=^YaкPQ!_ռio !yot &}*R`7IEȍggdre9+&pݣ g:e/xG/ K-^]Ar)G\bK$xvVeQL(8d}C.bTI?t&%Wk6G>X*)ln&Mtʧ%"~_[|ggc>sZ U_j>QxaKqꂹ*n#1纈7fss¦ɤqݲvI\6v1RhyfZ;Y5)#^bzĤ4BKفBCGO\񗊶Ѳ} H֘v򑘗Due Nh5ăN,sO}QEƎ`lD 7F@S}1ߌ;9/zT{֛E}cHTӿP[$ V [l 5ٱYb(|ęqx 7.4՟0ں$]ѵBHVS`qҠQ]:.(L4gsg"t|Cc'HW[ ~ 6J&ƃG_,bVЇq΢0D&C(˯͆.?b՚\AOW_lØ TlGx!/oh1QB3OnWWV36N.a5_PWR),~UYe]){:nfYvF+ NL|;5 5楋/&r'o/WT%aՆኴŽY]cڍ 6bT4ܒxH>f0>߫b T#MF]8C "r#Y]M8]E";GR]n6CHx|K44=w{@epO t2i.ALQ/uٮ!m._Ea,[q6ރ_."\PAk{uICW$<6g(: $~FmO~I01 yl&8Yľ*6)':**lZGyD~UqgοPamR/2π/} 4GҋX㥿Imc.-zV.{F6Fڽ* [aA/% )%FEhyKl؉6_a0: ޵Զי±u fn:WL^iQA4^0=ޠ~VB|x0!!m¥v:9WJ6UdD-([%a&5Wd9m ֺ#}z:\}|ƹ?@\>2_GĉB7mD C`A a-$![ѵ]~.o"9譣@Ϣ̲>ȸWYVJK/ɏQ_ 2BB&BqjjitC):T0+~ī;.p0XlɀV&^]~9雚TBFįW+ߌ"!h|桅rQS\s>e>MĹ4 %@fq1߫Ss\P-kF1EcBJx\PDE8x/șrԍtb Mm2?=zr@fQ!R\2QHU6ΘwH;Oo0ʨǰH#f7ye=\~;FNN% :Djv"qJӇ\^oCYͪ#Pm^rucD s}]s̑r6ՁL;kѹ12E3fBje$]j\eG:iml-y.u}rd'-2Yt?GhFn qD>7K *L<*.ַ"hOp ]oL-ݻ /LUB ىsLWDηK؍ ;|vR#`8]E,xoyxYdA6뵦9>;JIDROZP|V3cS3SF :GG蔅aN#  e. ۋ)b6p/s$-b >Jz-N2$MZF>V(0عJ>o#1-&_3`FS ;2ɡZJ?7 X+t5_;Bd|3%.-5ށ͋?rFN&PIZcLDg.Vo4 f;BD Vi!/ D[VGX9'>b(6Ӻ}~]?i}C+Ai8B1~Rv a=o=-wt;$;3dhrί~N)ήuw.B%܎4acXn= c)KlfM%G]||=5D0OLt:M`Ә6ۆ,WMiuH긗;C*B>֟ 3n~ցGa=5C'/fwQ͑U񟫗yq '>צ,Hr3=WADX'x~]U+WB͵adpd)'dID!X[G&(3H`Q%C phUXyZwcþ|ɗ_R%tjHaw֪W]%>"47+XpGjx$_,_^Xjw><ۣ"αO'^~ :PdL> /Up~coXK/r% 7 mT]9`b WW-C8&+ ]E &fuJ +L*`0$j>sc(|?[~8'iHE1E29ƛa?8q^Юxњy֦])e>VE!" fo/ArA+141=,?E2V/Hwi{ [Q5 4Svޗl*(ܠ*2'ӸX BbLzՈ7%sh"t)t2rê-RզhaǿW;Gޟ/p/w0z]gבe>xsK|~$J~ʟ?#7]}O>|x.ujn|Ie,ۍeD9\`>".ʚb>Di՜bY9׉^3s*u+0{l}l(ޕBVQ St|G|e>/PwR'8W!|FoԭǏD6}Q'K_N_z%%%) .z23$sLNX :0;,nm/BxVvTUyZD-9R ȨP~X}zny/骚uv= 62kd,$x@;CNKHd,T ޾#ţeB%*[ZII q%B}8"*A;O/ 9}ڮe=EFm^Bw7tÞC:\(5njJo}o_ëiBYX' f^跷81kYEcy;~5H>P;Kv1XȻsL̩_TZ,`Pg?_h;" Yh\nW-K^S(8+vMf甕| :r2l.P~79gR'_u-TCM) X$3qdSwtmC %qY|'/ /HTy tʺ{8ǭ8!/J5t!;Z/y<3PwQClj,kkyi ,)$;tX'3.FA vr(ϘK*JB^AL]?MaCcah.p*AYn %.;E8n }# ]ot+yu0wFm~N[5GMOPd6[ʄ (Zԋ*aYzNhc7jg58"},nrΫʮn3ba:A* Mޞ=~h| (MPA8ezckGOǺ衍 TyGĽH\KS2[Xfu+-=ti FGtB9Z48Ʃ';IuVTLF+ 6}]3ݰbb8ĉ8ojlML#Q,a<22qK[AЦgdjȻ'=VMC8)4#UPɉ ( g݌g% fOdy_%Kyz!]tCeXѰ PJzW =$u:X"LAYdOONB |Iqቑx0l\cl6T&OnU]2g2%#LA@>\і]vSW.)IdaNíWW%^s-MsɡRhJ^!3qs I;~9w\> +צ.!l].n &B.AµjB)#kMIؽҿ4PacC3/S /JEJ)9YY>:Ec,&OtG)紅y.(*%3L=_9\h jxc%h 4؛悬Hx_x=2W5ZxGh+T)QXVpC&W'>"{/Z) X_ Kdu71!evS!n۾\%i;#ǁ42wpա]Ux}D@k s;ߵ)4҅S ԀKE%1.p KίF|e:-W(޺Ǧnڭ4 K9wV["KϞ4Fu:BޅPfK*PR tB&b-f|ʗG-Q^uJ9-iK(kN;1M'0S [뉾|Jw~M~Uޣ7Gv~ zL:DBKz'\o: GiЄtsG\sX-`o~]E.cE J&dc+Oq!QAՆPd3Pt0cgȲ_w]:_xIyԜS֙ N5+B1lݾ/C1ceH-PwZ.6 Tj*t/=oh[>@o}ά_4wR%M<'c>z/YKznSW5s 0N?2)x,s}wOXu[8WK|S:RU>j1\W\C\wx:$ LehT#:Kix3Iws)F֒4_z<-}1CY;1s[D;^XGs/r)3ٛowYT^k wM\n(6rnPۡR;M*~WGEmyn)"Q 4`j8%AMRWm]RۅB?*5ɲ')a ޕݭL AOj\D@FGHᎿLvVMsT!<6ie1Ya%e\`]H5cֳ}W )8L]gLKzw%\L| k15;^ܧ*xI劆AW_VWn]޷qPYfƅإD޻}%_%}&㵷Iz$y]637klk) GNg 4#TXE`5ﲎ XqJwiu/-M|.+2F_ 6&Ӈ7{Qse:|F0˙"ҙdk&Fc" T'uL!E IHDD2DL~n.^i"P])c*HrBS 7K3"Ss$F9LZG4B/yw}>cwEFnLS >A;.l[rd˾Hˠp޲Td}Q" >$բ&=e#`%Y ffx]-$j}HBϙo:9V&9KU21VESv E@*WԤGxp.}ΰRݣ-bߪu. 
} ʵM{KH7 uGaQ|)*c{Ci(F\ȘIPޞ *6V&L{N\ڂ2"z\|d%$DatGtrTUSV_BWb h@Td5˽u ޖOBͪ]{\딶͓AlRwzUм7"7wgL^~C-N#?-4A+_ ,X<;CѭޤpopY`̓(淊滟kl9]<LWoL\x +G=]c X/uXu>=M$B mWPLHDJ{+-g>!")AX\-Uz(1bnn.x3unSY L+X2%?!a!AƟ2-pBC,"~]фA~hڈ%q-lBP\YGqKYȇ0iW`Cu ;[T%iQ- Tskk2w&}D8t \(.'qC>Pf"SzBxnka !JGVp g a7zs{(1_+n']^7 &.T"L*B%2Cac*K`̚zkC{ 'iBIvVkMR-pغ$x g[,n^'ѫD~H%.wPU!R35|.ɮZ;d[!J'o~#alx0 b7U Sn @XJ eQkΕ6GM /qNT3f1EC&lIx9i1 kx&8(`la|#Dc!.)-{fiDC'w>´6VM p/FȾoFQ>ԾJJl]"&5\4 AʤǜɱY^Vߪpfj~xת΃z=bef6Xc;.o]rx=SdJoC}VmRtn[ˑpDObmM!bY!eI oy^Z"M;+7y4D2x $H=?oF^z2 Ai'NQ _ (0O "") f7[qY@vT(EKI' gV{3df֫VX8k3Og7EpX"D2pY"Ȓ xjX5G^àCtr8mG}bYO뱙1N"\\79I\S;Y^C=~gͧHCB!wi /"ZfdW>ay 5Ƴ2e u)W4k]ۭ}p!!8˯^6tDנTdg{ĶrAnr|(g,pfA{1 Y  2Hno%~u|>rt4gPu0]P޳}:,T&6Y_HAneGmY#T;ֻ‹=D0I [,q_bgy񜥬,Ƨ q=4!U/wY_s6r8RGi(ؾ9X &?8)_Mj7! %,67Iɳ96+a #$>KVbe#Zь3"WXt#Mh(>;wes՗᧥y7siMj DKUBL13ƀ%vCDrG7*6vKVw];*GP>9k۶wth@vCr]7Vdv֒&1CF,E78fӤ͎7G>M@Fl1d+;K{TRPrl w"a+^77sRZF遗:v#kte_R'7r TJDم`u-/th#l_V7놏)X6֩GoRξB+\B"S\fȋJZo:B4@fty%tSOk/`11;7v!< 6ز@c.MҠBT HJzv"<ԫjqpӹnVo8d- P" tX[;L@/zR{Z-vBkwfH ߮)r,2:æ8!&Vp 4祈r<}Uh@%; r%kPTJ$#EKl=]s^I5.BAdd$Q{ZJAec?Ūb?46ME5*"w& 2/ !( ??t?q,9ՁKX!$$;j*QxQ8$$UUPU$'fUWAY%:eR+!r2rhr &a*9+$OOggd|n-[Ċ HPKT[iJ.fMYv4poc"ҘB:YgP搂Wx۱I" [T%w8].H,.Fw1.n`–,S37`'=.ak{6~Uݞo3^vJ_%2()"?3<]_?_?ν~:3 ?^~YT~ H@;-1U_ǒh %xm)5|F'X@2t WD;^C/_ܶw7UvN:H zϤP+=]?l1p^k\:x{}YT 5 .ޅiMqyxL)%ҩ(N0S(tv_Q 5!}&.ˈ!AYbg-_y.}>d2JȾag/L߯ם _1|n~ʍ-2~Iڐۊ1^TxOhO[GQrJ"IRet;)E"~ }}|?gkR \w_YwChKMO{&1Igo˹a~ '=oA~J#@zq_,~3N\qT俥&盄U|6I")ѹGaAU}kNgCoi0XS B#14*8Te-UM%r 53*DjB6{]*&&r;UE̴K+KY;L݄-9:.l=5(|3c1rEC \  )HaB&p&{lw|:Za !Pj:|r%I/Fi0b=_w鑆oN308\s4xeuɤd>zbRWxg>Ot;twhNb#ۿ E'E}BUgqQӲɺ wJ)&UAd%o1#՚1%P@IJqQ4ZO=Hw6|>fyMhvyܵiK|}a"SV,QꯜTrUImb|&7pr:gSjΡuwUm1,}Sz> Re5+rUSe"h94ӗܢcʬm(oJI] $":R{(/2%;wCn2ʭ'xUc\7 ݖ'z4fgU>J'J-UDR4u)~}8KDO\IЋx<,|׼[XNR;'Tǯ~[Zz1KwKxG') :B],V&efӲoRbqRkMN8Q|HGTTf8QYt3;>>umvS1I:0N7yAR ˺vU;}QHFz^s ^w}*0@m7H@kuZv)H=b.XOکf:W)<3{Pou(Me襝pQVUShꚋ+d\aQD"{+X>5)*+سjzJEq0k7qNn7bgΫ``lp 2cq*;7:*K]vɓ7\e^gj12 ^rul,$8|{fхĩL1~.'"Uw2e! <.}:E:~9K:rԉڐJ:-Iުc_}Lxᬷw\eT_9WWj]䙅@QrJ¥&s-}F7rDQRT")dVxMHj UnNXPa7.a{Ek} PŬ]xz!JzN$4or^ EP@S) A3rϮhjݜzw+fwwTZVd&\NIw̙ VNhm݇QGC`u nKEi3Ug=@R'_,H6Zn*.bYνrJfA 1a}.{0<3of繏.2T8.nyb1pP%k6DJ9҉ zMWl3.5R0D S>'K(OMM8hY&-—*kX[ ~u[t \g|{g_~O_?0?o×'_Iɸr?k/6X(B"xҬEt۸uW&5Ai[WqsZnpz(HFA)Ȏ{+ߩAWk5R>.xXCwUM߭lZνV~>=|DQ,GWa7JÖsy}<_h#!=G ힽ_uW Gux[.X]훼 c0z<>Au=]~OwP$тn/ۓ߿_ھ}=aO'-]Dz{ E60Ÿ' }RN 5maou3 jqMk rCɤeq0H`/ۅ'6 ZCnv>ks#"C6]?PEW|M&Z#.NȹQDjIXa;63!e5rMVZծWt"Z7m:NOq=c հ#tqÏn6J̈́d6 Dvpncֵ'xvR /n[oH'I22ޚq\:1ڄ( .F]h/uޑ/(5Tio`F:#&tɰ.) .g#Ş׋r_`A^O~% NON!5$0& &Y D!Mldd-C1l- ޑ͂sU||R"]q w1`ְ V$ 7@;8lEM7X67i2 +:PS { -b:Aqp۷;цCToχyMW{UyFNrɬWcKu6ӳ\3}H֚V \v1V0sʾɺ܏07gAm{<EFAAsI2Jy`Ol1%Ejl׮v8֞$ʡ]Yz3#fkiH$K۫z⋜buh6?>o%*߾(}Op1e뼦wfٵt\jwUF/{_]ajGɔĹPA 4YLhk79)}"2~دHJ% t}?_oWtkYBW?=*o(a.3N#桋k"P MjtS8P$Kn^=ޯP^, M͸{9?8~QU&~"~v3_lPwLYXi\^5pio4lO/gܻX5/{9u32O dDqKlD|LBʩkXX|/o@kP #7CfU}AZ(iR(((2P0"R H>f| (":nyI^Q[ -A%)(5ɐʝVyF9{%~Q=Uwx~ ol[-H/8lc*‰6PԂ;K܏ ScxmQL' i* (rԽМH"F:rI"DNT^;@eQ ZʝCK(/''݄-+JY;Leb6v\O;(qawI(*|y$AnzZ"_۠H6~c-0'~3"rm06$2o4Wة}?4>e-i ()%+bC?/P 3Y55Bt Bxi(XhZ$!25:'SetJQIN2J\ =6*LM033/?EW  x* S^c@g)gnL1 qŪ{ n*ě|/ok=><˹p,_x9<_*xo񻕰0,DݰpgQ1ݵ™gk$)CR2*!$t$Y_9!Űr^P$JF /=gEe\-^` VF6H9x:ד))|ߜAN)4ֵ8ԽXnxy6 B SФ?/~MxlwcעG>,m k?~J!0AU.uߗC tއ"ZK26ѐ@`iiIhȚfSU&S-*hD£]Z^{Yݥgs^rNVȂ&54ߖ4_B)zEB iKC+$5BPJ3F9dIHh G2(+]AŢA!̆-aH4Zo1Vs: z{qܥ(t:T S"[@PPAI3Y7"r92P #ʀkVE{1ȳjr'D͜#/cXnϯ!=a=~5=?mNFF@ž  E=oGhz[.ج0U{RiCwbNtɂxWg>HKKKc6ʞϯO}p 0[BC(w&bEߞPcϲ=kM4suX&38|v= CǽO:Qb?B|I_I-?}kW)Dckj -xqC9i홇7Y-Bn񛴕 =PJ§ʿt]2Z_41މql>4 >-ge+>YGigM(r>xBf[i5FOQH_{ b Ţ* Š*Z3[P1#f [2 ύ E80pYgiƟ=}nw8?wG8P~c7f'zF_Us߇yN[R(ڐ.* 5 ~6F3zsxkٞ賻}wC? 
(m(q?mX͟9p=:KN>ɧ~9$2ҝ~m3/uO[~$!T#ZB+ }S tVyl9ڢm9?rOutn[Y]=:.v3%h\%D,h-qP geLϠ;n|YJfOS;7^3`TR>^N0?WVcDH+Ȋ"EV@ QPa]Xe/#bjZ6`cһgN}r}ogQ s~>~Ŵ(Z 0I`#lX/cťujmq;'ۧCGCD noRcMOOkc;?`Q:*UuprVErWCL)x,]iiVܡ ašuAg?w-W an >?nT>{6T*O5Dޏa#|hch̀hGhs?$30c.MؤGfXQޗg0,R2f*zߊ^g]¿ƦxeCt)"Y| Ȗ׉/e.h.*# բwr2!Y ATSC%;<5S.|xaXTCDM-SP<}.{8P1bq8X~OCwړ`-y-3Y(Bf[(#/$!B%j}`Rʰ譐!o #nQ4hL7 #;{_"8G eBo?[J%ΨW%UDkͤi8 z?O"A$Bn\{MqL=U{PӏI9Kq,$X(hC|od0H JPQnpdGel9xDl8NU"')Tq^$*Ot3~^61tzҚ m]aR(n9!׊qY7x9:n*AQE"'nJO_}Q*XS E)7gRg웝x4/M!#վap>7m>b)2S( $)|lM}pRT` ?į5އRU lLG"}l>qwxeY]L55l,^۹.e Dd @%t^xm2Zt_6r풨< y0V PAǣE1RG͇~IqCdfإ` 'f2 a?*kjtCΚ3:+rE\|'B~6N6:4{MEF4L8t<7?_s@.׵.]VyO3Y~A;%aJCqڋe^9uuu>r?QD8O9:G+DYlZׄ<&6*o¯lP4Ӥ{8!놻L *h AӞMid/8q u bG2r~.+~.IVn(ABuC~~٣/gVQI NENibX=>  S)'I;ϣ[=|])]k?hU/=};{a]`"~4- ?IĀ`*JWRk1 CIͱ\b2Au*/*=!"1B/7 ca"anĈȢP*(!R`QNjXOсdN98ݢIāD|R锍0k"&xrd8N00h*)((CEQHhтRKh^Q4 {kFL)EcڌbĞICYa';8ΖQP]<-DGGoϯˀە?~ԻYmOb!M Lfz?q<Һ7?OEYТ)_;H{uCޢ_~ԭRpp]GwLǯol՞`/N=]Q: p"A3FB=& 4OY5x|4Yc ˚S\;NH{^割=.Ԅv`: ug-kfԂ_9HJʅJt60x^5Τ^G0Ӵ#3vŨ-AE9qCtpy=*5*Z|5GC<'<<<ӑ2ǙgsvʄV9IvNTMJ/,߹^+'l~]? 9CCǚ?y`n)7[XVSN>Il$q%N=dTE<,>Eb=$.HoѾOkHDdp뷺i\xXK1NzADV Lt:&&QdexfΡ`{F*@m%޳hN0DD΢+mf=4(%*y3t߬z}Q+}ذP;zޛ\e2*/llcX,Udȕ Lf.ZR'fal'ȑU(H"+ohk"% w A?ը`*eM*Ƞ(i`iJjtHLggS-LBMSズ%lP7. E } K`t-l>SWF GAnԏ˫~ziׅi2i'"<H5Jo5 ]e7 OCXe.1p'I=0'H&9+:ViLJý9dXIITHlRJJ 59)f#P4)CM;S桠8-D-POCUNfL}q̴I'_Q0 gx053:C: ՑUrNᆓi Ry|7Ś8WƐ@T }8daE2 I!Ga)3C*w-TPD{QmS5L<\FNhǧiz@T ӕPߎN9Za)G25 fbp.YXsyÌNG,Wф6 -/uK HOt2 OqәDʓ?^@ɘBN;fn1iAI1|S邊B/_:&ZwՄ YMv0HE :s 7a2 b+m7#Шr݉R͚d2Q=d/to= <9HCL35i.5 j_UA[s mV`h-`n3'9Ndɝ9R=~↣F|Hzi5 '5T ݉=^gF8ǫP[l}_}7o?m$wǮȶ:-֍q,ۖ!"BxTݨEAA ;qi1 &4 Qh4AIQLj@jf;\ͽz(8c tic +DEnm2!<#mHeĝNgsgî۸R~\Ά$N$td r3js>,ɚ;>&^m4|;b5U3&n%LXF!kUtX0:|7el$S(mo3Jy==^usMO39ڪvHȰs =ĞrR`V&ց/N܋2Cq'='{s(a}ìDCK/=2ntW aEﶽO{t~?=}dU r<-B{ܮm0".ȲЏ)ŧ^4)3"Dr*z/CW=LqU=~kdo1B?l{%Crjf%(R?'ɿ9힒sPY'- œ e7e-E*%G$[0=kbimS扚qPF A<+}W뛂(fruVg )ETVOOexi>>Rd^g4Kn{i|mϒ~l,#LjW92)VY(Xs&$b\J>Lc~1s ; 33? K,\C'P6ECJ0WdD=ʹQzYDHeYQAIY!0jXG<:_3ȟ}y!(#ϡxW ɩQ>cR2 獠\%JU|r֕4vft[3No?ȻDǷZW[w=nO7k|׶nQ5.eWI{)CÃWeNspᐼ}Ja_}\[UmNn9Q]]nAe-KR,`g&cR jH!xT@2M~nu0O"m?!Sd;Sheb%-VE# $D`:@Y`-6KT.ƱM !Pv&38AK"Ibm;I F\li;,[i5%F+ bT̷e$)#,2KZF4$T aH仪j c{Iq;.gWmޗ'I+8%sCϯ=⚹&ԑH("P 迟u!:nÔFi!P:N3:OvԚݤGs|ecmE.nH˷ws&4)E$Re7ip؊""H3򨌒Q+Er,=L"C+BtN0*F+k S[d_QJd*Z,#i%{ӞxLbA Tj 2T$jiAAXi3N8U(oDdAmׂ J*!dlk *w4ii Q \MjP᪓@(;7K9tk/{g7%\_`LS{-z[I4Jddiuo~:C`9"s#DЈA0\v8ܸqZ x+ 4,K/Nou;$Y$K3G@y??Ah%H)SS3VWs=\ {'xvy)CrԉaŝNSV`=r3.Hk&PA+ :CT7|_|vv3*v^jMVSͻxoav{Ͽ\0r 9*EAoD?xP2vEAL2:@"<2@r$~0v6 |!6PՂbKVѸ0l0sc䝸tV֠1}` @7yCxi^ޏ%,Z{AќVm(v\ "Ү70ݮ+hb;6 @|#W`à8E9Rӳɲsq_oIٓlqkLM8<$ݝYt }'m7ֹ7Ha}WBH殨{T+*H={mSS]@%v$HvEZ?NCѾX,G ] u*zxgP(봂/66a/'3gPX%藺|Zl !\gzP6-nv\fGjNC!/(&-&\h}#n"#} d̯^z~kev7"gQo?{>ֱ<.F|y|1]Yo<>N5{t^Xg ~ c>dsuo|. *LXWMP"iyCTcS9>?;ˀ/2d9mcC>sӋE$86wq(4QfkzTm\I}o,coWu E@yo bN| -A{7um]޲wnimFKX "LV.@6\ b͉Wۨ!86KQٞvv@͙ W,3L~˘FqHO N!'D~UM{g<n@M:A]q}Im\^3tpw'z3O&w&_:ؚpd8}kgv:qg t05|0A^u*(7=UkDDz_23Iq?e!q|jΦ8CqM@2&Y#N@6@uߧ^0_p fBK<%M= >2c̝wwn"OM 9}2='lkA}ʭ*aȬO ؈A:>!(ЩtjQp` C(z| ߈1DD%xN~ZmCUp>Sެ9HA}3 (,MɥϾkUG $z ~9Xy#T;;47&ixlgu/mA܉j'avǮ( ӱv$߃8ŒO+||IU<|wuJa=`4̦3x ǻb|iZ]N !|mu8{ d|g"6<^^_>t#9B p= "ݖaF`6~%WjwqBzόaAMpY l<7G@;퓫蕴R X=ۧ[M˗w}ֈEy Bl^ G{#ѨG:A;;"/J=Z`jLYُ͔L6Wӭ}s8OT,@P!8sH:'SP g=RcҲV@E U[= .wtj#l胜|8ʯ@$G|HA=yKk!xHP / }a;GJH>O wfT _T5{ɻL2?ꇃC" ڡ|~.TJ :K==Sl _bSͦ2,N2KiO Rew~k7wP0m&ĂK Bgh6k*f4I {UUE|)(Mr$1Al:6%78T )=(wXp"q|-:e|IԯFfMQj=(է:@z!3ە֍s]mahdJ9l@ݏYl7{9Vigd Q1s3lNΟoT#+^ntw|&{É?.r +}-.׺CT]fw8 s_N1'U{9u:l0k1狫ϩsfȂvwpTZL|o&ЈP44C},<dcv0 itVuB0z܌btqE憌G3.(69=h! 
6fBKY<._tX.s,zZtr{ؒDo]FusjglOXVTCr;ښtzcJ#zĎiaI=#+xԴrm"DP9~6c#GAW rD kBTOyK|y=ӛ[AD%Hf[F!8#Z6vHr+n T0]G6%X7ͱ,X?aDi#GzCϱvxC Bw;GXG8kl3luZ\ 4c~ Rڝ p貐b!> )BPR9r(2\}s7\|捠`b.'^okk"B%~T' L"R#vB__z|` FD(AWh՗~$}i3g>{npZ `]qfPҸO֕F0c.Zl>sɋe%?o{-j_ɡy#\?ƎzQӷg& c"6g]eL!FS=نZ)PQ ~诫a65~ ;DƄk> oO?(2cI H SxIR5{YrNXo7J̹?FW~e̜ڡ-U*u0KPaPPo8؄//u[m|G]βdnKAηs ](m](x]&AeQ"CX_LV`_8b%Buab sH:Ih%0h2|]jO%ikMTFMR/01FUkY? 80ѽC\ ,^İ eűi`LB#.sd진R ÃX8x2iŒ2JB%^m}%ݥ;v0ݦK>=" 4DJe)+5wr4|CӄUKKev F{]ҳ 쌙 }#xIEeFZ4ӚCͼ[wGO,?O)~rj-T ` G#iEXXpG+l'-XP 9xzd^YQ7vbi*+ }FΪѩi3߸et{a^caTElZVQ}HZp=@.D̀0G}:IGX5%z7ˋ9 ʕ>Ei %$q`=C7{O#Ą ׅX2C(qzODTh$K%2D_ȹ.t.|X Q1Gd s1)HЌzyŻSय7Ja?&y}in\7!'$650H\m9q͆*Ab0ԟ1?ɊT˝C>7v !G2 I5I**9euv'1 ?27 x7ln k,Ycbrk5:xK)p/vG0"+hY1*[{X4uSxN>m4~)4iw|Oާ𸃇U.x„ \a僖VEddnW{}$pY{voCDr|;4Va #j4BuB$)Z`Hp(`-Fp ^3O7濝OGlh ?}/. o;NT罝ߪ%:\Mrr8 ϩ>Tߤ13I0Av@!P8fر" A M$T O}2 ڴ]sN'#U%;/޹gs*&<^3`#t ]FHFHT<_^nX91.L5^Phr>u^z7mVc0F1Ypn&}MtyMȏ2URbqY~鏭FL_iە4g){ 2q?Wz'=sa{%[.D:b.vo\) bG}>p8p+<, ^bK>\Di O2&q\"Câ'm/aIyG\x܏c^]att5U.Wn#^a_Zg2J2Cŷ! J(qX`GTY|S/]j}vk\Sz cjxC"1kOj) ")>[ 5wguo'jϡ[FCKx$ M\+T>\3C5+ j9DCjY wv욚̎ަ`H j,WH#˦芻oּ_a}xөCNđԢS=m7埘[@bF'YKNݲ7Ҹ5#B&q]ZvI؀B_z\hC!qX=y侅kl/}ePOR!R%XOx{=_/g}W'*s{hq(0v}l)4 qjYS[Dy"|>vuʭH0E2yzzZBgP{fu_J"()gW{D1+cr[7MH 6vbHľ"4\#+Y+Te-_~L5 GrR|<|7 1UnM3 k҅kF^[  rcYyOS]aS /^FO]wLd/C:k*tqN2֋D:sCO *F \tueM-'&g:u MK'xgdl<}#:ԇ  IړB_z?i|hl,cp~nCfhҴ3,KY?DCB!d세0L7/ a4.3fLH).!JC mYm5>cMجAj[E |ޖâ[h072!5҉}G'+ǯ8qP QCt >%cP*ԹH'MQ,$q%ovB]Pb+(Rr F k ]ɉE18R7Gn]~5-5TRd (QD͏@%V0=~~]ݿ+-l1'ńT4S^AP[ظ1?-!ѶnVk B}IOęMOېDd%uN^~M86J\? (~Fء"1 N<3)*}෾[\@So~ be tϮav>`}R~&aaKӡeX4Z=y5}e)hU 7f!_CտŇ e _f 舃N/Z:˩>S;9 sMl>fuqJgElc64@ݳZ,gHsfdѽn(~ S ՞69twi>:{zAT' M;_Q؈Hy>k@Rd>KS|>|j}od w% svN~$I<$x`~ꅧO>?D"ghѲ,oik^Ww*VPys0_޵π94-$"h >0S9_@z14}p>z9+Ձ A^D綵th_;𓓜-xҜ#B^5>d_,(xOo1(%~t/XNmfsxfoN z3i~$7 RZXtl8ksoxjeRnk.=uPK5WihnAGa`&N6Ae`v&?_ClM)hiGȫ %7k!3p6"D^V`'\$"{bFHh4C6E€<&.A!c#!.C?JB%Dg $ ۭ4*d9{%V&g"y7!Hgvb',- XX4H3uS,T ]A4YZ(%Jhpq4RQkDxH#H)wfTח:| ^?-{yB6Abv9>(狺> %^!.a`hzAѳ*I#ݧv|g}яfyx۴AwwÞJ G\0 I6liĩ6=JVtțu.<0rO/Vq3j~t?y:ŀ-nXY>>>MTA`{y{Abdbhv?=$za!ېmbTؑ(hNx,*kе;mS2Ψ(\j0ΏqCVܛm. K_l \}R~2=}L4Gwכ{|MpaI D"6H(eD*&6pqn#Sj1y4ښPMkhtcrSdAJX `1 @}x֗lx9@".0EP\;R lqx÷2h343<50Fӈw!m߃ ԏOqjv^??Cgꁆ5C9lnX.9~@X 5IN p8+# pf"jƜ8BO8~6#!=kTָ0vTKCI!O S8aDD$GDIQ?Lt?eYM}t\)Fi`Q> &֓/J_W2JH J-JL!2mr@ lq OѶGl@HZ[<:)2?|V?ك DQuu<3XtX+Z :R2G%DѦJZ$vj\LSC 'y\OoAQěLATq0F%R& DLTfq?MtzwXΩ['M/^}q=Q{$4\*=D**srTsi qpT OpP ꇪ:NXڂWd eߏY($'G,wcjN{uOLׅ)9 K{玙NƊKΨ)znp-:#6O:iFw?e8~݅|̶X!}+w`^deÁhi$@OYGdSEPQE('Bۍ< WscRhī/Ga{h;;ӑ5(~Y@+&j owPČh~\7VUE@El) E(ݕ'\uפU_-_Fvir%JIR.zxyugm(S*8U+"&睯PT9:E+LW~CCzTD5i͆5}|P @m񶗘,*y97PQs$933${$#?Gy"db[_60TҋoF\n#S:9k̄GiٞB%+zk￯(x t?G228,r,_8"9v}M+ѳ wgB}~""&S]C+Ei eĔ.m {}>tI~sRkc$a Vb` 02I(M𛇳,(Fsw0 *ƨr; YLtF>"Z?_ mB!>nDb9nILȓU{]|u nz?]#nx8 *~IQ87}I dKċbnOu?{'py3xo!?ƴ2<r~}}L]l2*9B9aϨIdEq72&MT>7~N;gC |f!T=&WxF?ثm: *4Fdc uf1 :EQ}`_DABChc5t#3SjxGAD^!`jሩy5'(i>|g%!ϵ wS,'?mf˂ۨ9f~[d2hbRWMy{a:S Azq\>VS8@}RRaHl L-+"ŔdT,rh]~>7")R"E$k0 ?و*\qSJ[)Zhz~׍T_bPPPđ*L?qQC Bi* d$+?ف^ӫ77l^ Ɏ2( 0b͡QAf2 Q!`,.Xe4 0 `dk&e1=q<a' @c3Y). 
(aXʁrBbJk JԆ[2 EC-TIGP6!UB@us+D-bSK2bbJkZn;i,7LA]fݢʆ7EQIX^QɬY*MwÙ;MH>THcoB*WQaVXFI~Y 4N 2w/?8)r Ȫr+7)tdP 5B7:%`2L5p SBRE4$shVzoמmDZNiCOYm!h}X&+< W5'u"]G]X811L02,!'lќjLLB,DO^@uAQ2!(M!L&*݊sVEQ9d]J, HPY54B{2ɉ9̪Z"$EɅY=`(= YYY>n^QQbRV[`5㯷[R"BoE!@B V( Cp dW B)Z5 @5Apbv^]4@%1< !@-\`enp`Ks>=XDە⭻{{]h\-& i}n"[6ǏDa[m*reg^jfZk"q]K,y=0H`Ϛ !w)߯SvLVFV1r\ DX,YƢb V ɠ()h""YZE%PCRӝv@`5TTAA3IHSIIIBA3THP4PQMP$1 TAJ҅JP+-UEPQCK[Gٰ^QffpdWi d뿑̋GB('cEȽ:l96q瞒jOʟڰ9>GԚR`v.ZNvĨRz3l[I'' tc'u.NQ6&2؍ѱn}Ī~^@qlm\')upߖt_^"2n 6' T<Ȥ Ĺ|/$v2ю֢Gk":;%Rz^$4PP4҃vQ"0g<C~}g!,PsJRAcyZ= uQ^]?0,Js7 *AbW(Q )S7yyEnidawY=Ф\ʘWy 3${1Gw9Y!3h8d^s3ʻqIkQA#SLjU*5m\*35'XJEjeVu3w5$97f2gfDԒeEQJ~۸0O ,̯=!*(pt^W NxBj ⵁP'g*dJ޼xCnYUD^b|\C dyjc"ë;Fc$ B78Xt Ftc D(-Q5%K`x6F]F0%"MÄhih̗eM;ه{x|4^{LRHeٛIR+$j- '۝qkI޼Rg 9{ܐy/Mv=إ,((iPR" ЇvӽDN^aߨ*ha Q "hJ ((bZ.biGBZPDJDso0Qg/z$2GU>i_[bGH2AEQAx< x?dtv=( fR'CAL<[%bpשS 2S.jbS}=柔f`OX CHM@tD"3 \q=(j& h "ZԎ &*h)YBJC .AY%4KJj31pu:D5*jL0%0p7`svl n`B1!QacSݑ"$dц2U'HT?:UTQwt M&C  V T D5>OAe}dLhhr (IRb ŠB:JP0) : CJ PPTSMUHDBP E- P% 2 BP4R,@SBDHY9 H,Ɣ$Y1Lb .BIHdDfR1@ƊIPyBTTݚ *@" a@* u)ER220̠rʁh* h)h(wWprij$"JxTk%7&`k3,EkL2R`I75ABd1@Dc1 (Bw0Y IPYe)%C%rPʢL\$ɍf;Wp BLB(AIE4&J4NstF` $p ܧP gɆ BSd(HRSEd4H9&@3N!b@,55$:T BAsg- rT!bK0"0G TV0^. H`T4HNHddj1S!*YP0- b$R/hc5YR~W~ngHxZz?W^#8wzcխih:J2P(FEqdR'Y f=26]|,{4=ZkQXhWA0Tȼ̔Haagl15! 70DoGaH/m\k}N_Y#̘#ChD]*~-+mz;ۣWᮝևu;a?=d#{ y;h&OS<Օ1_ ^fr0!]ϻ?|ޝV#<:5G?_޷%2VZѥ&?Kǁt *PT`yLg 1[Wn̓ܠ)t\Xi #xExV ,&?s PոE< IJ&DtpBniaED^ܜ0) w(mHs(HV>?ٳj)P.+p IyOR2r穛9^a EE Azft]'nq΃CƐ=f<Ihf^_[peI&ͯß/lj)(b ɢ'^bPtUMcJEAF q `HT!PXsw 'Sr{Mԗw;ŧ.YRߏ6 !55sלT`dF%InP|xL-z'oO85z>޲gd!`jdPu"+<-(jDjP*0O9-(s R\RJ HHRCPdђ'5w*j"@Ͳ M֔>=nې.lY >lۢ1i=|o;{:˙>](_?9pЗ9a憽NX L;]/ 1ڽi˾Tf{n1|Yf0USMu׷βfc~!~H<l Kw"#<>=9j4:3^lϽJѪ͋d~P既A;'X=!?z?[Aez/N n/ݫ oUwa6QdN /ƴAeipOCRlK_MC\Ik_Cg8]G;q>P^/˟\z~/ζ&znCxa4$S޼iNR"񄏯U> ;$; NOO3X9ꦒN~κX " ܿhɔޱ0K#ZV$gAGxʾ>1p8tëdx|Be>.Ѿw.;`3nBg/w3(wT4uj,߹dHbhea0e,a ¸s=4ϼsq-6Q)0.!%`u9 YE]6+xM\)g9Ãvkq21:GOb𧵾O=u|nx[0=e-KR0ĦrV:l)ze*-# q u|*Z9dWN Ofu<DW> 6x\uHp& #ʊP{HmC2C 0Z B % nU*z=78#<2 b{ˣs&7|E4F{kS|=T, OKksp_@P&N ؆]ܽ~<=RV#yvl@|%ywΘ)dQ#Zt"{m{ߍwYajAs0gR5\ZIQm߷9@xN>ssK >rqrSn(1FB| T&/nڂ,>^-ꮲyToU*/+M&M1کoU >purr) vޓbqF[(W@8׫;uN>i+7$~k QѲM8G--%PC>Z}gRQ=tP0{- DŽvz KSa>xLntyΞLo h:w۶d@kЭD^ P.K8:V֚V#qeNiOpY[*y/bm`*7xTvW46޹wAH n*+ܰ6D6-5A";#a ȵRg(/c.R^:i *ZB.HC| />gy$dؑó]xU6R0,%BߓIN߯>OW7gBi:j~1O}p1Yif}!Bc+4VO'Hbkm=ruGSxpUG_ogw*EOgmdPڡh \*lEOz^Qk?4/I8ȵ2wXϥA Ԧ&F#5VP(Bh(~jq4SOF >>w~i}dʚdU4%BL0PkV H 54 %?2--.2!Ƃ(JU(U*Wr`TOzC)Rc?Xڵ4,OW"ЅbZ#L @TT5o12\6 X"((@ 2 %MPHv adEE =aYTS,I*Ȫ hihh!( ZSp.Й,k *($#5D@IZDDX+QA@D*E(BaVuaG}n!P0E&&_`!#'ZeciEx}WbeSB_hrH!Pͬr B1Njܼ^ޝeW6ׯnqƛQai^r߈}3 <~^E#8;@=v:y_7ˤCh7i 6vKug o\h?O^Gơfvb_ `рq剛 kkur@ek ZX,x^5卵!=??rOsY!z~rzK]⎒Oi2{Ol8d?t&?>y$iN.Uu4ѹ O)Oy}eN%750іx% zH}-,FF7 Gi5Iɉ'h_00>uߠb3fRU~@?& okVЪu(i3J%ŀWDň ARS(aV L]Nٞ(UEd1=Ӓon'(+'0Ęxb0>/yNti{1D3ĦD' c5[C {hMۮnϞ]Њ*ak=Eġ5]t`5 HDRB&HF^wη%߬fhiuq"%vE`hb" F\[N\#`;zv6!dRSy/|S&9ÙE++a -^|kcio~e Wa86x-;aVj1<d~SUmMMݹ!Ŋ1qxt_;FsϠ{6sbAve-{LW:45&&JÜ[`E2r}Pb\]Ooo=^E4B5ȏM‹DewD% PNRgv>I~ļ&0r)QP^^IY|$bTƼzr| {.xb3ZAnX5.!hd-؁fʄ'O.N׼;!yV(4d֬k OpQ $9]p֧[l^nc++.EV0*qLu'H@0`S8ȏt~|e؅¡\bfeGCiTVZ^d+Ww8փA֏ϳ_S4:? cyad Vmƨ2.$ ;guxVlKTjr3n,:N)Ӻ]8\A"kYݢs}ǏiE5vU{]"#hOnߟ_^Hs H矈s&H{qRtMO̴*̓Cb:璣xY5{s/ʡ1cu=)TMo㭵FuC}mzUZQd6P7OYn Ͼ>}ؗH9r2WGXOw'&=~y(Zoy"Ng8 kk ~3=zK>_osɹ[|-8t2llDj٪Cpyq2^-GO뻾>b:9ݧY,ʰ[mOq"'ë7P)Ltz>| ǜ5@uXlFqH;Q)mRcAE` YS w{h yDxzY\`\!'b'޳]y{81vı1(mՂHsRvm3Qd{I(CPBIrCp_ & /.x.B,_A2; M+SDZ*{5v &H$ipWcX;vśZZMH2Bńjb1QoM~ 55b/7AV%$%x0AfC-VΧI0/VE 0BհZ8ce,6RXްWeXH * J @,\=삚ʂX4! 
tokenizers/R/RcppExports.R

# Generated by using Rcpp::compileAttributes() -> do not edit by hand
# Generator token: 10BE3573-1514-4C36-9D1C-5A225CD40393

generate_ngrams_batch <- function(documents_list, ngram_min, ngram_max, stopwords = character(), ngram_delim = " ") {
    .Call(`_tokenizers_generate_ngrams_batch`, documents_list, ngram_min, ngram_max, stopwords, ngram_delim)
}

skip_ngrams_vectorised <- function(words, skips, stopwords) {
    .Call(`_tokenizers_skip_ngrams_vectorised`, words, skips, stopwords)
}
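# A minimal usage sketch (not part of the generated file): the package's own
# R code calls these wrappers with a list of word vectors, as produced by
# tokenize_words(). The ordering of the returned n-grams is determined by the
# C++ implementation, so the output shown here is only illustrative.
generate_ngrams_batch(list(c("one", "two", "three")),
                      ngram_min = 2L, ngram_max = 3L)
# e.g. list(c("one two", "one two three", "two three"))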
#' #' @param x A character vector or a list of character vectors. If \code{x} is a #' character vector, it can be of any length, and each element will be #' tokenized separately. If \code{x} is a list of character vectors, each #' element of the list should have a length of 1. #' @return An integer vector containing the counted elements. If the input #' vector or list has names, they will be preserved. #' @rdname word-counting #' @examples #' count_words(mobydick) #' count_sentences(mobydick) #' count_characters(mobydick) #' @export count_words <- function(x) { check_input(x) named <- names(x) out <- stringi::stri_count_words(x) if (!is.null(named)) names(out) <- named out } #' @export #' @rdname word-counting count_characters <- function(x) { check_input(x) named <- names(x) out <- stringi::stri_count_boundaries(x, opts_brkiter = stringi::stri_opts_brkiter(type = "character") ) if (!is.null(named)) names(out) <- named out } #' @export #' @rdname word-counting count_sentences <- function(x) { check_input(x) named <- names(x) out <- stringi::stri_count_boundaries(x, opts_brkiter = stringi::stri_opts_brkiter(type = "sentence") ) if (!is.null(named)) names(out) <- named out } tokenizers/R/ngram-tokenizers.R0000644000176200001440000001457413256545214016314 0ustar liggesusers#' N-gram tokenizers #' #' These functions tokenize their inputs into different kinds of n-grams. The #' input can be a character vector of any length, or a list of character vectors #' where each character vector in the list has a length of 1. See details for an #' explanation of what each function does. #' #' @details #' #' \describe{ \item{\code{tokenize_ngrams}:}{ Basic shingled n-grams. A #' contiguous subsequence of \code{n} words. This will compute shingled n-grams #' for every value of between \code{n_min} (which must be at least 1) and #' \code{n}. } \item{\code{tokenize_skip_ngrams}:}{Skip n-grams. A subsequence #' of \code{n} words which are at most a gap of \code{k} words between them. The #' skip n-grams will be calculated for all values from \code{0} to \code{k}. } } #' #' These functions will strip all punctuation and normalize all whitespace to a #' single space character. #' #' @param x A character vector or a list of character vectors to be tokenized #' into n-grams. If \code{x} is a character vector, it can be of any length, #' and each element will be tokenized separately. If \code{x} is a list of #' character vectors, each element of the list should have a length of 1. #' @param n The number of words in the n-gram. This must be an integer greater #' than or equal to 1. #' @param n_min This must be an integer greater than or equal to 1, and less #' than or equal to \code{n}. #' @param k For the skip n-gram tokenizer, the maximum skip distance between #' words. The function will compute all skip n-grams between \code{0} and #' \code{k}. #' @param lowercase Should the tokens be made lower case? #' @param stopwords A character vector of stop words to be excluded from the #' n-grams. #' @param ngram_delim The separator between words in an n-gram. #' @param simplify \code{FALSE} by default so that a consistent value is #' returned regardless of length of input. If \code{TRUE}, then an input with #' a single element will return a character vector of tokens instead of a #' list. #' #' @return A list of character vectors containing the tokens, with one element #' in the list for each element that was passed as input. 
#' If \code{simplify = TRUE} and only a single element was passed as input,
#' then the output is a character vector of tokens.
#'
#' @examples
#' song <- paste0("How many roads must a man walk down\n",
#'                "Before you call him a man?\n",
#'                "How many seas must a white dove sail\n",
#'                "Before she sleeps in the sand?\n",
#'                "\n",
#'                "How many times must the cannonballs fly\n",
#'                "Before they're forever banned?\n",
#'                "The answer, my friend, is blowin' in the wind.\n",
#'                "The answer is blowin' in the wind.\n")
#'
#' tokenize_ngrams(song, n = 4)
#' tokenize_ngrams(song, n = 4, n_min = 1)
#' tokenize_skip_ngrams(song, n = 4, k = 2)
#' @name ngram-tokenizers
#' @export
#' @rdname ngram-tokenizers
tokenize_ngrams <- function(x, lowercase = TRUE, n = 3L, n_min = n,
                            stopwords = character(), ngram_delim = " ",
                            simplify = FALSE) {
  UseMethod("tokenize_ngrams")
}

#' @export
tokenize_ngrams.data.frame <- function(x, lowercase = TRUE, n = 3L, n_min = n,
                                       stopwords = character(),
                                       ngram_delim = " ", simplify = FALSE) {
  x <- corpus_df_as_corpus_vector(x)
  tokenize_ngrams(x, lowercase, n, n_min, stopwords, ngram_delim, simplify)
}

#' @export
tokenize_ngrams.default <- function(x, lowercase = TRUE, n = 3L, n_min = n,
                                    stopwords = character(), ngram_delim = " ",
                                    simplify = FALSE) {
  check_input(x)
  named <- names(x)
  if (n < n_min || n_min <= 0)
    stop("n and n_min must be integers, and n_min must be less than ",
         "or equal to n and greater than or equal to 1.")
  words <- tokenize_words(x, lowercase = lowercase)
  out <- generate_ngrams_batch(
    words,
    ngram_min = n_min,
    ngram_max = n,
    stopwords = stopwords,
    ngram_delim = ngram_delim
  )
  if (!is.null(named)) names(out) <- named
  simplify_list(out, simplify)
}

# Check the skip distance between words, and return FALSE if the skip is bigger
# than k
check_width <- function(v, k) {
  v_lead <- c(v[2:length(v)], NA_integer_)
  all(v_lead - v - 1 <= k, na.rm = TRUE)
}

get_valid_skips <- function(n, k) {
  max_dist <- k * (n - 1) + (n - 1)
  total_combinations <- choose(max_dist, n - 1)
  if (total_combinations > 5e3) {
    warning("Input n and k will produce a very large number of skip n-grams")
  }

  # Generate all possible combinations up to the maximum distance
  positions <- utils::combn(max_dist, n - 1, simplify = FALSE)

  # Prepend 0 to represent position of starting word. Use 0 indexed vectors
  # because these vectors go to Rcpp.
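  # A worked illustration (not in the original source): with n = 3 and k = 1,
  # max_dist is 4, so combn() proposes the position pairs (1,2), (1,3), (1,4),
  # (2,3), (2,4), and (3,4). After 0 is prepended below, check_width() keeps
  # only the vectors whose consecutive gaps are at most k + 1, leaving
  # c(0,1,2), c(0,1,3), c(0,2,3), and c(0,2,4).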
  positions <- lapply(positions, function(v) { c(0, v) })

  # Keep only the combination of positions with the correct skip between words
  keepers <- vapply(positions, check_width, logical(1), k)
  positions[keepers]
}

#' @export
#' @rdname ngram-tokenizers
tokenize_skip_ngrams <- function(x, lowercase = TRUE, n_min = 1, n = 3, k = 1,
                                 stopwords = character(), simplify = FALSE) {
  UseMethod("tokenize_skip_ngrams")
}

#' @export
tokenize_skip_ngrams.data.frame <- function(x, lowercase = TRUE, n_min = 1,
                                            n = 3, k = 1,
                                            stopwords = character(),
                                            simplify = FALSE) {
  x <- corpus_df_as_corpus_vector(x)
  tokenize_skip_ngrams(x, lowercase, n_min, n, k, stopwords, simplify)
}

#' @export
tokenize_skip_ngrams.default <- function(x, lowercase = TRUE, n_min = 1, n = 3,
                                         k = 1, stopwords = character(),
                                         simplify = FALSE) {
  check_input(x)
  named <- names(x)
  words <- tokenize_words(x, lowercase = lowercase)
  skips <- unique(unlist(
    lapply(n_min:n, get_valid_skips, k),
    recursive = FALSE, use.names = FALSE
  ))
  out <- skip_ngrams_vectorised(words, skips, stopwords)
  if (!is.null(named)) names(out) <- named
  simplify_list(out, simplify)
}
tokenizers/R/data-docs.R0000644000176200001440000000032413256545214014640 0ustar liggesusers#' The text of Moby Dick
#'
#' The text of Moby Dick, by Herman Melville, taken from Project Gutenberg.
#'
#' @format A named character vector with length 1.
#' @source \url{http://www.gutenberg.org/}
"mobydick"
tokenizers/R/coercion.R0000644000176200001440000000105313256545214014602 0ustar liggesusersis_corpus_df <- function(corpus) {
  stopifnot(inherits(corpus, "data.frame"),
            ncol(corpus) >= 2,
            all(names(corpus)[1L:2L] == c("doc_id", "text")),
            is.character(corpus$doc_id),
            is.character(corpus$text),
            nrow(corpus) > 0)
  TRUE # if it doesn't fail from the tests above then it fits the standard
}

corpus_df_as_corpus_vector <- function(corpus) {
  if (is_corpus_df(corpus)) {
    out <- corpus$text
    names(out) <- corpus$doc_id
  } else {
    stop("Not a corpus data.frame")
  }
  out
}
tokenizers/R/chunk-text.R0000644000176200001440000000453713256545214015075 0ustar liggesusers#' Chunk text into smaller segments
#'
#' Given a text or vector/list of texts, break the texts into smaller segments
#' each with the same number of words. This allows you to treat a very long
#' document, such as a novel, as a set of smaller documents.
#'
#' @details Chunking the text passes it through \code{\link{tokenize_words}},
#' which will strip punctuation and lowercase the text unless you provide
#' arguments to pass along to that function.
#'
#' @param x A character vector or a list of character vectors to be chunked.
#' If \code{x} is a character vector, it can be of any length,
#' and each element will be chunked separately. If \code{x} is a list of
#' character vectors, each element of the list should have a length of 1.
#' @param chunk_size The number of words in each chunk.
#' @param doc_id The document IDs as a character vector. This will be taken from
#' the names of the \code{x} vector if available. \code{NULL} is acceptable.
#' @param ... Arguments passed on to \code{\link{tokenize_words}}.
#' @examples
#' \dontrun{
#' chunked <- chunk_text(mobydick, chunk_size = 100)
#' length(chunked)
#' chunked[1:3]
#' }
#' @export
chunk_text <- function(x, chunk_size = 100, doc_id = names(x), ...) {
  check_input(x)
  stopifnot(chunk_size > 1)
  if (is.character(x) && length(x) == 1) {
    out <- chunk_individual_text(x = x, chunk_size = chunk_size,
                                 doc_id = doc_id, ...)
  } else {
    out <- lapply(seq_along(x), function(i) {
      chunk_individual_text(x = x[[i]], chunk_size = chunk_size,
                            doc_id = doc_id[[i]], ...)
    })
    out <- unlist(out, recursive = FALSE, use.names = TRUE)
  }
  out
}

chunk_individual_text <- function(x, chunk_size, doc_id, ...) {
  stopifnot(is.character(x), length(x) == 1)
  words <- tokenize_words(x, simplify = TRUE, ...)
  # A text with no more words than chunk_size forms a single chunk; otherwise
  # split the words into chunks of chunk_size words each
  if (length(words) <= chunk_size) {
    chunks <- list(words)
  } else {
    chunks <- split(words, ceiling(seq_along(words) / chunk_size))
  }
  if (!is.null(doc_id)) {
    num_chars <- stringi::stri_length(length(chunks))
    chunk_ids <- stringi::stri_pad_left(seq(length(chunks)),
                                        width = num_chars, pad = "0")
    names(chunks) <- stringi::stri_c(doc_id, chunk_ids, sep = "-")
  } else {
    names(chunks) <- NULL
  }
  out <- lapply(chunks, stringi::stri_c, collapse = " ")
  out
}
tokenizers/R/basic-tokenizers.R0000644000176200001440000002027513256545214016264 0ustar liggesusers#' Basic tokenizers
#'
#' These functions perform basic tokenization into words, sentences, paragraphs,
#' lines, and characters. The functions can be piped into one another to create
#' at most two levels of tokenization. For instance, one might split a text into
#' paragraphs and then word tokens, or into sentences and then word tokens.
#'
#' @name basic-tokenizers
#' @param x A character vector or a list of character vectors to be tokenized.
#' If \code{x} is a character vector, it can be of any length,
#' and each element will be tokenized separately. If \code{x} is a list of
#' character vectors, each element of the list should have a length of 1.
#' @param lowercase Should the tokens be made lower case? The default value
#' varies by tokenizer; it is only \code{TRUE} by default for the tokenizers
#' that you are likely to use last.
#' @param strip_non_alphanum Should punctuation and white space be stripped?
#' @param strip_punct Should punctuation be stripped?
#' @param strip_numeric Should numbers be stripped?
#' @param paragraph_break A string identifying the boundary between two
#' paragraphs.
#' @param stopwords A character vector of stop words to be excluded.
#' @param pattern A regular expression that defines the split.
#' @param simplify \code{FALSE} by default so that a consistent value is
#' returned regardless of length of input. If \code{TRUE}, then an input with
#' a single element will return a character vector of tokens instead of a
#' list.
#' @return A list of character vectors containing the tokens, with one element
#' in the list for each element that was passed as input. If \code{simplify =
#' TRUE} and only a single element was passed as input, then the output is a
#' character vector of tokens.
#' @importFrom stringi stri_split_boundaries stri_trans_tolower stri_trim_both #' stri_replace_all_charclass stri_split_fixed stri_split_lines #' stri_split_regex stri_subset_charclass #' @examples #' song <- paste0("How many roads must a man walk down\n", #' "Before you call him a man?\n", #' "How many seas must a white dove sail\n", #' "Before she sleeps in the sand?\n", #' "\n", #' "How many times must the cannonballs fly\n", #' "Before they're forever banned?\n", #' "The answer, my friend, is blowin' in the wind.\n", #' "The answer is blowin' in the wind.\n") #' #' tokenize_words(song) #' tokenize_words(song, strip_punct = FALSE) #' tokenize_sentences(song) #' tokenize_paragraphs(song) #' tokenize_lines(song) #' tokenize_characters(song) NULL #' @export #' @rdname basic-tokenizers tokenize_characters <- function(x, lowercase = TRUE, strip_non_alphanum = TRUE, simplify = FALSE) { UseMethod("tokenize_characters") } #' @export tokenize_characters.data.frame <- function(x, lowercase = TRUE, strip_non_alphanum = TRUE, simplify = FALSE) { x <- corpus_df_as_corpus_vector(x) tokenize_characters(x, lowercase, strip_non_alphanum, simplify) } #' @export tokenize_characters.default <- function(x, lowercase = TRUE, strip_non_alphanum = TRUE, simplify = FALSE) { check_input(x) named <- names(x) if (lowercase) x <- stri_trans_tolower(x) if (strip_non_alphanum) x <- stri_replace_all_charclass(x, "[[:punct:][:whitespace:]]", "") out <- stri_split_boundaries(x, type = "character") if (!is.null(named)) names(out) <- named simplify_list(out, simplify) } #' @export #' @rdname basic-tokenizers tokenize_words <- function(x, lowercase = TRUE, stopwords = NULL, strip_punct = TRUE, strip_numeric = FALSE, simplify = FALSE) { UseMethod("tokenize_words") } #' @export tokenize_words.data.frame <- function(x, lowercase = TRUE, stopwords = NULL, strip_punct = TRUE, strip_numeric = FALSE, simplify = FALSE) { x <- corpus_df_as_corpus_vector(x) tokenize_words(x, lowercase, stopwords, strip_punct, strip_numeric, simplify) } #' @export tokenize_words.default <- function(x, lowercase = TRUE, stopwords = NULL, strip_punct = TRUE, strip_numeric = FALSE, simplify = FALSE) { check_input(x) named <- names(x) if (lowercase) x <- stri_trans_tolower(x) out <- stri_split_boundaries(x, type = "word", skip_word_none = strip_punct, skip_word_number = strip_numeric) if (!strip_punct) { out <- lapply(out, stri_subset_charclass, "\\p{WHITESPACE}", negate = TRUE) } if (!is.null(named)) names(out) <- named if (!is.null(stopwords)) out <- lapply(out, remove_stopwords, stopwords) simplify_list(out, simplify) } #' @export #' @rdname basic-tokenizers tokenize_sentences <- function(x, lowercase = FALSE, strip_punct = FALSE, simplify = FALSE) { UseMethod("tokenize_sentences") } #' @export tokenize_sentences.data.frame <- function(x, lowercase = FALSE, strip_punct = FALSE, simplify = FALSE) { x <- corpus_df_as_corpus_vector(x) tokenize_sentences(x, lowercase, strip_punct, simplify) } #' @export tokenize_sentences.default <- function(x, lowercase = FALSE, strip_punct = FALSE, simplify = FALSE) { check_input(x) named <- names(x) x <- stri_replace_all_charclass(x, "[[:whitespace:]]", " ") out <- stri_split_boundaries(x, type = "sentence", skip_word_none = FALSE) out <- lapply(out, stri_trim_both) if (lowercase) out <- lapply(out, stri_trans_tolower) if (strip_punct) out <- lapply(out, stri_replace_all_charclass, "[[:punct:]]", "") if (!is.null(named)) names(out) <- named simplify_list(out, simplify) } #' @export #' @rdname basic-tokenizers 
tokenize_lines <- function(x, simplify = FALSE) {
  UseMethod("tokenize_lines")
}

#' @export
tokenize_lines.data.frame <- function(x, simplify = FALSE) {
  x <- corpus_df_as_corpus_vector(x)
  tokenize_lines(x, simplify)
}

#' @export
tokenize_lines.default <- function(x, simplify = FALSE) {
  check_input(x)
  named <- names(x)
  out <- stri_split_lines(x, omit_empty = TRUE)
  if (!is.null(named)) names(out) <- named
  simplify_list(out, simplify)
}

#' @export
#' @rdname basic-tokenizers
tokenize_paragraphs <- function(x, paragraph_break = "\n\n", simplify = FALSE) {
  UseMethod("tokenize_paragraphs")
}

#' @export
tokenize_paragraphs.data.frame <- function(x, paragraph_break = "\n\n",
                                           simplify = FALSE) {
  x <- corpus_df_as_corpus_vector(x)
  tokenize_paragraphs(x, paragraph_break, simplify)
}

#' @export
tokenize_paragraphs.default <- function(x, paragraph_break = "\n\n",
                                        simplify = FALSE) {
  check_input(x)
  named <- names(x)
  out <- stri_split_fixed(x, pattern = paragraph_break, omit_empty = TRUE)
  out <- lapply(out, stri_replace_all_charclass, "[[:whitespace:]]", " ")
  if (!is.null(named)) names(out) <- named
  simplify_list(out, simplify)
}

#' @export
#' @rdname basic-tokenizers
tokenize_regex <- function(x, pattern = "\\s+", simplify = FALSE) {
  UseMethod("tokenize_regex")
}

#' @export
tokenize_regex.data.frame <- function(x, pattern = "\\s+", simplify = FALSE) {
  x <- corpus_df_as_corpus_vector(x)
  tokenize_regex(x, pattern, simplify)
}

#' @export
tokenize_regex.default <- function(x, pattern = "\\s+", simplify = FALSE) {
  check_input(x)
  named <- names(x)
  out <- stri_split_regex(x, pattern = pattern, omit_empty = TRUE)
  if (!is.null(named)) names(out) <- named
  simplify_list(out, simplify)
}
tokenizers/R/tokenize_tweets.R0000644000176200001440000000625113256545214016231 0ustar liggesusers#' @rdname basic-tokenizers
#' @param strip_url If \code{TRUE}, URLs (beginning with \code{http(s)}) are
#' removed entirely; if \code{FALSE}, they are preserved intact as single
#' tokens.
#' @importFrom stringi stri_split_charclass stri_detect_regex stri_sub
#' @export
#' @examples
#' tokenize_tweets("@rOpenSci and #rstats see: https://cran.r-project.org",
#'                 strip_punct = TRUE)
#' tokenize_tweets("@rOpenSci and #rstats see: https://cran.r-project.org",
#'                 strip_punct = FALSE)
tokenize_tweets <- function(x, lowercase = TRUE, stopwords = NULL,
                            strip_punct = TRUE, strip_url = FALSE,
                            simplify = FALSE) {
  UseMethod("tokenize_tweets")
}

#' @export
tokenize_tweets.data.frame <- function(x, lowercase = TRUE, stopwords = NULL,
                                       strip_punct = TRUE, strip_url = FALSE,
                                       simplify = FALSE) {
  x <- corpus_df_as_corpus_vector(x)
  tokenize_tweets(x, lowercase, stopwords, strip_punct, strip_url, simplify)
}

#' @export
tokenize_tweets.default <- function(x, lowercase = TRUE, stopwords = NULL,
                                    strip_punct = TRUE, strip_url = FALSE,
                                    simplify = FALSE) {
  check_input(x)
  named <- names(x)

  # split on white space
  out <- stri_split_charclass(x, "\\p{WHITE_SPACE}")

  # get document indexes to vectorize tokens
  docindex <- c(1, cumsum(lengths(out)))
  # convert the list into a vector - avoids all those mapplys
  out <- unlist(out)

  # get the index of twitter hashtags and usernames
  index_twitter <- stri_detect_regex(out, "^#[A-Za-z]+\\w*|^@\\w+")
  # get the index of http(s) URLs
  index_url <- stri_detect_regex(out, "^http")

  if (strip_url) {
    out[index_url] <- ""
  }

  if (lowercase) {
    out[!(index_twitter | index_url)] <-
      stri_trans_tolower(out[!(index_twitter | index_url)])
  }

  if (strip_punct) {
    twitter_chars <- stri_sub(out[index_twitter], 1, 1)
    out[!index_url] <- stri_replace_all_charclass(out[!index_url], "\\p{P}", "")
    #stri_replace_all_charclass(out[!index_url], "[^\\P{P}#@]", "")
    out[index_twitter] <- paste0(twitter_chars, out[index_twitter])
  } else {
    # all except URLs
    out[!index_url] <- stri_split_boundaries(out[!index_url], type = "word")
    # rejoin the hashtags and usernames
    out[index_twitter] <- lapply(out[index_twitter], function(toks) {
      toks[2] <- paste0(toks[1], toks[2])
      toks[-1]
    })
  }

  # convert the vector back to a list
  out <- split(out, cut(
    seq_along(out),
    docindex,
    include.lowest = TRUE,
    labels = named
  ))
  # in case !strip_punct, otherwise has no effect
  out <- lapply(out, unlist)
  names(out) <- named

  # remove stopwords
  if (!is.null(stopwords)) out <- lapply(out, remove_stopwords, stopwords)

  # remove any blanks (from removing URLs)
  out <- lapply(out, function(toks) toks[toks != ""])

  simplify_list(out, simplify)
}
tokenizers/R/ptb-tokenizer.R0000644000176200001440000001270613256545214015605 0ustar liggesusers#' Penn Treebank Tokenizer
#'
#' This function implements the Penn Treebank word tokenizer.
#'
#' @details This tokenizer uses regular expressions to tokenize text similar to
#' the tokenization used in the Penn Treebank. It assumes that text has
#' already been split into sentences. The tokenizer does the following:
#'
#' \itemize{ \item{splits common English contractions, e.g. \verb{don't} is
#' tokenized into \verb{do n't} and \verb{they'll} is tokenized into
#' \verb{they 'll},} \item{handles punctuation characters as separate tokens,}
#' \item{splits commas and single quotes off from words, when they are
#' followed by whitespace,} \item{splits off periods that occur at the end of
#' the sentence.} }
#' @details This function is a port of the Python NLTK version of the Penn
#' Treebank Tokenizer.
#' @param x A character vector or a list of character vectors to be tokenized.
#' If \code{x} is a character vector, it can be of any length,
#' and each element will be tokenized separately.
#' If \code{x} is a list of
#' character vectors, each element of the list should have a length of 1.
#' @param lowercase Should the tokens be made lower case?
#' @param simplify \code{FALSE} by default so that a consistent value is
#' returned regardless of length of input. If \code{TRUE}, then an input with
#' a single element will return a character vector of tokens instead of a
#' list.
#' @return A list of character vectors containing the tokens, with one element
#' in the list for each element that was passed as input. If \code{simplify =
#' TRUE} and only a single element was passed as input, then the output is a
#' character vector of tokens.
#' @references
#' \href{http://www.nltk.org/_modules/nltk/tokenize/treebank.html#TreebankWordTokenizer}{NLTK
#' TreebankWordTokenizer}
#' @importFrom stringi stri_c stri_replace_all_regex stri_trim_both
#' stri_split_regex stri_opts_regex
#' @importFrom stringi stri_trans_tolower
#' @examples
#' song <- list(paste0("How many roads must a man walk down\n",
#'                     "Before you call him a man?"),
#'              paste0("How many seas must a white dove sail\n",
#'                     "Before she sleeps in the sand?\n"),
#'              paste0("How many times must the cannonballs fly\n",
#'                     "Before they're forever banned?\n"),
#'              "The answer, my friend, is blowin' in the wind.",
#'              "The answer is blowin' in the wind.")
#' tokenize_ptb(song)
#' tokenize_ptb(c("Good muffins cost $3.88\nin New York. Please buy me\ntwo of them.",
#'                "They'll save and invest more.",
#'                "Hi, I can't say hello."))
#' @export
#' @rdname ptb-tokenizer
tokenize_ptb <- function(x, lowercase = FALSE, simplify = FALSE) {
  UseMethod("tokenize_ptb")
}

#' @export
tokenize_ptb.data.frame <- function(x, lowercase = FALSE, simplify = FALSE) {
  x <- corpus_df_as_corpus_vector(x)
  tokenize_ptb(x, lowercase, simplify)
}

#' @export
tokenize_ptb.default <- function(x, lowercase = FALSE, simplify = FALSE) {
  check_input(x)
  named <- names(x)
  CONTRACTIONS2 <- c(
    "\\b(can)(not)\\b",
    "\\b(d)('ye)\\b",
    "\\b(gon)(na)\\b",
    "\\b(got)(ta)\\b",
    "\\b(lem)(me)\\b",
    "\\b(mor)('n)\\b",
    "\\b(wan)(na) "
  )
  CONTRACTIONS3 <- c(" ('t)(is)\\b", " ('t)(was)\\b")
  CONTRACTIONS4 <- c("\\b(whad)(dd)(ya)\\b", "\\b(wha)(t)(cha)\\b")

  # Starting quotes
  x <- stri_replace_all_regex(x, '^\\"', '``')
  x <- stri_replace_all_regex(x, '(``)', ' $1 ')
  x <- stri_replace_all_regex(x, '([ (\\[{<])"', '$1 `` ')

  # Punctuation
  x <- stri_replace_all_regex(x, '([:,])([^\\d])', ' $1 $2')
  x <- stri_replace_all_regex(x, '\\.{3}', ' ... ')
  x <- stri_replace_all_regex(x, '([,;@#$%&])', ' $1 ')
  x <- stri_replace_all_regex(x, '([^\\.])(\\.)([\\]\\)}>"\\\']*)?\\s*$',
                              '$1 $2$3 ')
  x <- stri_replace_all_regex(x, '([?!])', ' $1 ')
  x <- stri_replace_all_regex(x, "([^'])' ", "$1 ' ")

  # parens, brackets, etc
  x <- stri_replace_all_regex(x, '([\\]\\[\\(\\)\\{\\}\\<\\>])', ' $1 ')
  x <- stri_replace_all_regex(x, '--', ' -- ')

  # add extra space
  x <- stri_c(" ", x, " ")

  # ending quotes
  x <- stri_replace_all_regex(x, '"', " '' ")
  x <- stri_replace_all_regex(x, "(\\S)('')", "\\1 \\2 ")
  x <- stri_replace_all_regex(x, "([^' ])('[sS]|'[mM]|'[dD]|') ", "$1 $2 ")
  x <- stri_replace_all_regex(x, "([^' ])('ll|'LL|'re|'RE|'ve|'VE|n't|N'T) ",
                              "$1 $2 ")

  x <- stri_replace_all_regex(
    x,
    CONTRACTIONS2,
    " $1 $2 ",
    opts_regex = stri_opts_regex(case_insensitive = TRUE),
    vectorize_all = FALSE
  )
  x <- stri_replace_all_regex(
    x,
    CONTRACTIONS3,
    " $1 $2 ",
    opts_regex = stri_opts_regex(case_insensitive = TRUE),
    vectorize_all = FALSE
  )
  x <- stri_replace_all_regex(
    x,
    CONTRACTIONS4,
    " $1 $2 $3 ",
    opts_regex = stri_opts_regex(case_insensitive = TRUE),
    vectorize_all = FALSE
  )

  # return
  x <- stri_split_regex(stri_trim_both(x), '\\s+')
  if (lowercase) {
    x <- lapply(x, stri_trans_tolower)
  }
  if (!is.null(named)) {
    names(x) <- named
  }
  simplify_list(x, simplify)
}
tokenizers/vignettes/0000755000176200001440000000000013257220650014461 5ustar liggesuserstokenizers/vignettes/introduction-to-tokenizers.Rmd0000644000176200001440000001233413256545214022471 0ustar liggesusers---
title: "Introduction to the tokenizers Package"
author: "Lincoln Mullen"
output: rmarkdown::html_vignette
vignette: >
  %\VignetteIndexEntry{Introduction to the tokenizers Package}
  %\VignetteEngine{knitr::rmarkdown}
  %\VignetteEncoding{UTF-8}
---

```{r setup, include = FALSE}
knitr::opts_chunk$set(
  collapse = TRUE,
  comment = "#>"
)
```

## Package overview

In natural language processing, tokenization is the process of breaking
human-readable text into machine-readable components. The most obvious way to
tokenize a text is to split the text into words. But there are many other ways
to tokenize a text, the most useful of which are provided by this package.

The tokenizers in this package have a consistent interface. They all take
either a character vector of any length, or a list where each element is a
character vector of length one. The idea is that each element comprises a
text. Then each function returns a list with the same length as the input
vector, where each element in the list contains the tokens generated by the
function. If the input character vector or list is named, then the names are
preserved, so that the names can serve as identifiers.

Using the following sample text, the rest of this vignette demonstrates the
different kinds of tokenizers in this package.

```{r}
library(tokenizers)
options(max.print = 25)

james <- paste0(
  "The question thus becomes a verbal one\n",
  "again; and our knowledge of all these early stages of thought and feeling\n",
  "is in any case so conjectural and imperfect that farther discussion would\n",
  "not be worth while.\n",
  "\n",
  "Religion, therefore, as I now ask you arbitrarily to take it, shall mean\n",
  "for us _the feelings, acts, and experiences of individual men in their\n",
  "solitude, so far as they apprehend themselves to stand in relation to\n",
  "whatever they may consider the divine_. Since the relation may be either\n",
  "moral, physical, or ritual, it is evident that out of religion in the\n",
  "sense in which we take it, theologies, philosophies, and ecclesiastical\n",
  "organizations may secondarily grow.\n"
)
```

## Character and character-shingle tokenizers

The character tokenizer splits texts into individual characters.

```{r}
tokenize_characters(james)[[1]]
```

You can also tokenize into character-based shingles.

```{r}
tokenize_character_shingles(james, n = 3, n_min = 3,
                            strip_non_alphanum = FALSE)[[1]][1:20]
```

## Word and word-stem tokenizers

The word tokenizer splits texts into words.

```{r}
tokenize_words(james)
```

Word stemming is provided by the
[SnowballC](https://cran.r-project.org/package=SnowballC) package.

```{r}
tokenize_word_stems(james)
```

You can also provide a vector of stopwords which will be omitted. The
[stopwords package](https://github.com/quanteda/stopwords), which contains
stopwords for many languages from several sources, is recommended. This
argument also works with the n-gram and skip n-gram tokenizers.

```{r}
library(stopwords)
tokenize_words(james, stopwords = stopwords::stopwords("en"))
```

An alternative word tokenizer often used in NLP that preserves punctuation and
separates common English contractions is the Penn Treebank tokenizer.

```{r}
tokenize_ptb(james)
```

## N-gram and skip n-gram tokenizers

An n-gram is a contiguous sequence of words containing at least `n_min` words
and at most `n` words. This function will generate all such combinations of
n-grams, omitting stopwords if desired.

```{r}
tokenize_ngrams(james, n = 5, n_min = 2,
                stopwords = stopwords::stopwords("en"))
```

A skip n-gram is like an n-gram in that it takes the `n` and `n_min`
parameters. But rather than returning contiguous sequences of words, it will
also return sequences of n-grams skipping words with gaps between `0` and the
value of `k`. This function generates all such sequences, again omitting
stopwords if desired. Note that the number of tokens returned can be very
large.

```{r}
tokenize_skip_ngrams(james, n = 5, n_min = 2, k = 2,
                     stopwords = stopwords::stopwords("en"))
```

## Tweet tokenizer

Tokenizing tweets requires special attention, since usernames (`@whoever`) and
hashtags (`#hashtag`) use special characters that might otherwise be stripped
away.

```{r}
tokenize_tweets("Welcome, @user, to the tokenizers package. #rstats #forever")
```

## Sentence and paragraph tokenizers

Sometimes it is desirable to split texts into sentences or paragraphs prior to
tokenizing into other forms.

```{r, collapse=FALSE}
tokenize_sentences(james)
tokenize_paragraphs(james)
```

## Text chunking

When one has a very long document, sometimes it is desirable to split the
document into smaller chunks, each with the same length. This function chunks
a document and gives each of the chunks an ID to show their order. These
chunks can then be further tokenized.

```{r}
chunks <- chunk_text(mobydick, chunk_size = 100, doc_id = "mobydick")
length(chunks)
chunks[5:6]
tokenize_words(chunks[5:6])
```

## Counting words, characters, sentences

The package also offers functions for counting words, characters, and
sentences in a format which works nicely with the rest of the functions.
```{r}
count_words(mobydick)
count_characters(mobydick)
count_sentences(mobydick)
```
tokenizers/vignettes/tif-and-tokenizers.Rmd0000644000176200001440000000723213256545214020653 0ustar liggesusers---
title: "The Text Interchange Formats and the tokenizers Package"
author: "Lincoln Mullen"
output: rmarkdown::html_vignette
vignette: >
  %\VignetteIndexEntry{The Text Interchange Formats and the tokenizers Package}
  %\VignetteEngine{knitr::rmarkdown}
  %\VignetteEncoding{UTF-8}
---

```{r setup, include = FALSE}
knitr::opts_chunk$set(
  collapse = TRUE,
  comment = "#>"
)
```

The [Text Interchange Formats](https://github.com/ropensci/tif) are a set of
standards defined at an [rOpenSci](https://ropensci.org/) sponsored [meeting in
London](http://textworkshop17.ropensci.org/) in 2017. The formats allow R text
analysis packages to target defined inputs and outputs for corpora, tokens, and
document-term matrices. By adhering to these recommendations, R packages can
buy into an interoperable ecosystem.

The TIF recommendations are still a draft, but the tokenizers package
implements its recommendation to accept both of the corpora formats and to
output one of its recommended tokens formats.

Consider these two recommended forms of a corpus. One (`corpus_c`) is a named
character vector; the other (`corpus_d`) is a data frame. They both include a
document ID and the full text for each item. The data frame format obviously
allows for the use of other metadata fields besides the document ID, whereas
the other format does not. Using the coercion functions in the tif package, one
could switch back and forth between these formats. Tokenizers also supports a
corpus formatted as a named list where each element is a character vector of
length one (`corpus_l`), though this is not a part of the draft TIF standards.

```{r}
# Named list
(corpus_l <- list(man_comes_around = "There's a man goin' 'round takin' names",
                  wont_back_down = "Well I won't back down, no I won't back down",
                  bird_on_a_wire = "Like a bird on a wire"))

# Named character vector
(corpus_c <- unlist(corpus_l))

# Data frame
(corpus_d <- data.frame(doc_id = names(corpus_c), text = unname(corpus_c),
                        stringsAsFactors = FALSE))
```

All of the tokenizers in this package can accept any of those formats and will
return an identical output for each.

```{r}
library(tokenizers)

tokens_l <- tokenize_ngrams(corpus_l, n = 2)
tokens_c <- tokenize_ngrams(corpus_c, n = 2)
tokens_d <- tokenize_ngrams(corpus_d, n = 2)

# Are all these identical?
all(identical(tokens_l, tokens_c),
    identical(tokens_c, tokens_d),
    identical(tokens_l, tokens_d))
```

The output of all of the tokenizers is a named list, where each element of the
list corresponds to a document in the corpus. The names of the list are the
document IDs, and the elements are character vectors containing the tokens.

```{r}
tokens_l
```

This format can be coerced to a data frame of document IDs and tokens, one row
per token, using the coercion functions in the tif package. That tokens data
frame would look like this.
```{r, echo=FALSE} sample_tokens_df <- structure(list(doc_id = c("man_comes_around", "man_comes_around", "man_comes_around", "man_comes_around", "man_comes_around", "man_comes_around", "wont_back_down", "wont_back_down", "wont_back_down", "wont_back_down", "wont_back_down", "wont_back_down", "wont_back_down", "wont_back_down", "wont_back_down", "bird_on_a_wire", "bird_on_a_wire", "bird_on_a_wire", "bird_on_a_wire", "bird_on_a_wire"), token = c("there's a", "a man", "man goin", "goin round", "round takin", "takin names", "well i", "i won't", "won't back", "back down", "down no", "no i", "i won't", "won't back", "back down", "like a", "a bird", "bird on", "on a", "a wire")), .Names = c("doc_id", "token"), row.names = c(NA, -20L), class = "data.frame") head(sample_tokens_df, 10) ``` tokenizers/README.md0000644000176200001440000002371613257205112013735 0ustar liggesusers # tokenizers [![CRAN\_Status\_Badge](http://www.r-pkg.org/badges/version/tokenizers)](https://cran.r-project.org/package=tokenizers) [![DOI](http://joss.theoj.org/papers/10.21105/joss.00655/status.svg)](https://doi.org/10.21105/joss.00655) [![rOpenSci peer review](https://badges.ropensci.org/33_status.svg)](https://github.com/ropensci/onboarding/issues/33) [![CRAN\_Downloads](http://cranlogs.r-pkg.org/badges/grand-total/tokenizers)](https://cran.r-project.org/package=tokenizers) [![Travis-CI Build Status](https://travis-ci.org/ropensci/tokenizers.svg?branch=master)](https://travis-ci.org/ropensci/tokenizers) [![Appveyor Build status](https://ci.appveyor.com/api/projects/status/qx3vh3ukjgo99iu4/branch/master?svg=true)](https://ci.appveyor.com/project/lmullen/tokenizers-dkf3v/branch/master) [![Coverage Status](https://img.shields.io/codecov/c/github/ropensci/tokenizers/master.svg)](https://codecov.io/github/ropensci/tokenizers?branch=master) ## Overview This R package offers functions with a consistent interface to convert natural language text into tokens. It includes tokenizers for shingled n-grams, skip n-grams, words, word stems, sentences, paragraphs, characters, shingled characters, lines, tweets, Penn Treebank, and regular expressions, as well as functions for counting characters, words, and sentences, and a function for splitting longer texts into separate documents, each with the same number of words. The package is built on the [stringi](http://www.gagolewski.com/software/stringi/) and [Rcpp](http://www.rcpp.org/) packages for fast yet correct tokenization in UTF-8. See the “[Introduction to the tokenizers Package](http://lincolnmullen.com/software/tokenizers/articles/introduction-to-tokenizers.html)” vignette for an overview of all the functions in this package. This package complies with the standards for input and output recommended by the Text Interchange Formats. The TIF initiative was created at an rOpenSci meeting in 2017, and its recommendations are available as part of the [tif package](https://github.com/ropensci/tif). See the “[The Text Interchange Formats and the tokenizers Package](http://lincolnmullen.com/software/tokenizers/articles/tif-and-tokenizers.html)” vignette for an explanation of how this package fits into that ecosystem. ## Suggested citation If you use this package for your research, we would appreciate a citation. ``` r citation("tokenizers") #> #> To cite the tokenizers package in publications, please cite the #> paper in the Journal of Open Source Software: #> #> Lincoln A. Mullen et al., "Fast, Consistent Tokenization of #> Natural Language Text," Journal of Open Source Software 3, no. 
#> 23 (2018): 655, https://doi.org/10.21105/joss.00655. #> #> A BibTeX entry for LaTeX users is #> #> @Article{, #> title = {Fast, Consistent Tokenization of Natural Language Text}, #> author = {Lincoln A. Mullen and Kenneth Benoit and Os Keyes and Dmitry Selivanov and Jeffrey Arnold}, #> journal = {Journal of Open Source Software}, #> year = {2018}, #> volume = {3}, #> issue = {23}, #> pages = {655}, #> url = {https://doi.org/10.21105/joss.00655}, #> doi = {10.21105/joss.00655}, #> } ``` ## Installation You can install this package from CRAN: ``` r install.packages("tokenizers") ``` To get the development version from GitHub, use [devtools](https://github.com/hadley/devtools). ``` r # install.packages("devtools") devtools::install_github("ropensci/tokenizers") ``` ## Examples The tokenizers in this package have a consistent interface. They all take either a character vector of any length, or a list where each element is a character vector of length one, or a data.frame that adheres to the [tif corpus format](https://github.com/ropensci/tif). The idea is that each element (or row) comprises a text. Then each function returns a list with the same length as the input vector, where each element in the list contains the tokens generated by the function. If the input character vector or list is named, then the names are preserved, so that the names can serve as identifiers. For a tif-formatted data.frame, the `doc_id` field is used as the element names in the returned token list. ``` r library(magrittr) library(tokenizers) james <- paste0( "The question thus becomes a verbal one\n", "again; and our knowledge of all these early stages of thought and feeling\n", "is in any case so conjectural and imperfect that farther discussion would\n", "not be worth while.\n", "\n", "Religion, therefore, as I now ask you arbitrarily to take it, shall mean\n", "for us _the feelings, acts, and experiences of individual men in their\n", "solitude, so far as they apprehend themselves to stand in relation to\n", "whatever they may consider the divine_. Since the relation may be either\n", "moral, physical, or ritual, it is evident that out of religion in the\n", "sense in which we take it, theologies, philosophies, and ecclesiastical\n", "organizations may secondarily grow.\n" ) names(james) <- "varieties" tokenize_characters(james)[[1]] %>% head(50) #> [1] "t" "h" "e" "q" "u" "e" "s" "t" "i" "o" "n" "t" "h" "u" "s" "b" "e" #> [18] "c" "o" "m" "e" "s" "a" "v" "e" "r" "b" "a" "l" "o" "n" "e" "a" "g" #> [35] "a" "i" "n" "a" "n" "d" "o" "u" "r" "k" "n" "o" "w" "l" "e" "d" tokenize_character_shingles(james)[[1]] %>% head(20) #> [1] "the" "heq" "equ" "que" "ues" "est" "sti" "tio" "ion" "ont" "nth" #> [12] "thu" "hus" "usb" "sbe" "bec" "eco" "com" "ome" "mes" tokenize_words(james)[[1]] %>% head(10) #> [1] "the" "question" "thus" "becomes" "a" "verbal" #> [7] "one" "again" "and" "our" tokenize_word_stems(james)[[1]] %>% head(10) #> [1] "the" "question" "thus" "becom" "a" "verbal" #> [7] "one" "again" "and" "our" tokenize_sentences(james) #> $varieties #> [1] "The question thus becomes a verbal one again; and our knowledge of all these early stages of thought and feeling is in any case so conjectural and imperfect that farther discussion would not be worth while." #> [2] "Religion, therefore, as I now ask you arbitrarily to take it, shall mean for us _the feelings, acts, and experiences of individual men in their solitude, so far as they apprehend themselves to stand in relation to whatever they may consider the divine_." 
#> [3] "Since the relation may be either moral, physical, or ritual, it is evident that out of religion in the sense in which we take it, theologies, philosophies, and ecclesiastical organizations may secondarily grow." tokenize_paragraphs(james) #> $varieties #> [1] "The question thus becomes a verbal one again; and our knowledge of all these early stages of thought and feeling is in any case so conjectural and imperfect that farther discussion would not be worth while." #> [2] "Religion, therefore, as I now ask you arbitrarily to take it, shall mean for us _the feelings, acts, and experiences of individual men in their solitude, so far as they apprehend themselves to stand in relation to whatever they may consider the divine_. Since the relation may be either moral, physical, or ritual, it is evident that out of religion in the sense in which we take it, theologies, philosophies, and ecclesiastical organizations may secondarily grow. " tokenize_ngrams(james, n = 5, n_min = 2)[[1]] %>% head(10) #> [1] "the question" "the question thus" #> [3] "the question thus becomes" "the question thus becomes a" #> [5] "question thus" "question thus becomes" #> [7] "question thus becomes a" "question thus becomes a verbal" #> [9] "thus becomes" "thus becomes a" tokenize_skip_ngrams(james, n = 5, k = 2)[[1]] %>% head(10) #> [1] "the" "the question" "the thus" #> [4] "the becomes" "the question thus" "the question becomes" #> [7] "the question a" "the thus becomes" "the thus a" #> [10] "the thus verbal" tokenize_ptb(james)[[1]] %>% head(10) #> [1] "The" "question" "thus" "becomes" "a" "verbal" #> [7] "one" "again" ";" "and" tokenize_lines(james)[[1]] %>% head(5) #> [1] "The question thus becomes a verbal one" #> [2] "again; and our knowledge of all these early stages of thought and feeling" #> [3] "is in any case so conjectural and imperfect that farther discussion would" #> [4] "not be worth while." #> [5] "Religion, therefore, as I now ask you arbitrarily to take it, shall mean" tokenize_tweets("Hey @handle, #rstats is awesome!")[[1]] #> [1] "hey" "@handle" "#rstats" "is" "awesome" ``` The package also contains functions to count words, characters, and sentences, and these functions follow the same consistent interface. ``` r count_words(james) #> varieties #> 112 count_characters(james) #> varieties #> 673 count_sentences(james) #> varieties #> 13 ``` The `chunk_text()` function splits a document into smaller chunks, each with the same number of words. ## Contributing Contributions to the package are more than welcome. One way that you can help is by using this package in your R package for natural language processing. If you want to contribute a tokenization function to this package, it should follow the same conventions as the rest of the functions whenever it makes sense to do so. Please note that this project is released with a [Contributor Code of Conduct](CONDUCT.md). By participating in this project you agree to abide by its terms. 
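As a sketch of those conventions, a new tokenizer would validate its input, preserve any names, and return a consistent value. The function below is hypothetical (it is not part of the package); it assumes the internal helpers `check_input()` and `simplify_list()` that the existing tokenizers use.

``` r
# A hypothetical tokenizer following the package conventions: accept a
# character vector or list, preserve names, and return a list of character
# vectors (or a character vector when simplify = TRUE and the input has a
# single element).
tokenize_lowercase <- function(x, simplify = FALSE) {
  check_input(x)                 # validate the input like the other tokenizers
  named <- names(x)              # remember names so they can be restored
  out <- stringi::stri_split_boundaries(stringi::stri_trans_tolower(x),
                                        type = "word", skip_word_none = TRUE)
  if (!is.null(named)) names(out) <- named
  simplify_list(out, simplify)   # consistent return value
}
```

Following this pattern keeps the return type predictable for users no matter which tokenizer they call.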
----- [![rOpenSCi logo](http://ropensci.org/public_images/github_footer.png)](http://ropensci.org) tokenizers/MD50000644000176200001440000000631313257243614012771 0ustar liggesusers745f4f5c9ce42a84ca7fa9090bdcbc06 *DESCRIPTION 7a30856cdb7e752d64265b32a0922f6f *LICENSE 606f23e675bef96b574d101c62985c1d *NAMESPACE fba61afea9fac249a7767786efb6a10a *NEWS.md 94e39027d0e7806d4e3566a7a6d0e280 *R/RcppExports.R 71c4a90a0518c35485bfa6278315e390 *R/basic-tokenizers.R 106595ac5246e2a974fcc60b1b4af844 *R/character-shingles-tokenizers.R 5534dbd5a62a6066eb2028074cfbac5d *R/chunk-text.R d6f1464c9ff1e3d4a1056b64f6f64857 *R/coercion.R dbf0c1e5f12ed05731c9c112297e3ef5 *R/data-docs.R fde384b035d0ac834323cc8be76b084a *R/ngram-tokenizers.R 338e3bba2a45db52383a4c8ecacde68c *R/ptb-tokenizer.R c74106244d4cee4cdc9da0e674072094 *R/stem-tokenizers.R feca69d2c0b9d7ce3d0d3a217bd147f4 *R/tokenize_tweets.R cd40ea2f6a3fc48dedc65631864a671d *R/tokenizers-package.r 06d95e412855098fb7a94b0c5bfd6ea8 *R/utils.R d9e433bf9116749218ca662c9ec50fc1 *R/wordcount.R 012d3ad763cdf25a072fd943fad75ebf *README.md 6d8110bbfde9d62859695febc9d4fb3a *build/vignette.rds 8e14891d161d472a5322ba7fb6c76002 *data/mobydick.rda ca0a364901d1fe6d9d909c1c62e3c1fa *inst/CITATION ac046593710fe7a0ce061a6332ec76d4 *inst/doc/introduction-to-tokenizers.R fecf3fa83860b5a3ad7673045ab9d475 *inst/doc/introduction-to-tokenizers.Rmd 4d3dcf157861975d9c5f11f12f57d888 *inst/doc/introduction-to-tokenizers.html 39e8881eb3fdd7c856de6acefe999582 *inst/doc/tif-and-tokenizers.R 83b0cf559b9f00b5894edad9cbbf3c57 *inst/doc/tif-and-tokenizers.Rmd 0493814332830f299ef0b1e167ea6530 *inst/doc/tif-and-tokenizers.html 49171babab7c36f010cd66522462f75d *man/basic-tokenizers.Rd e5bb3dc97546882a21a24e8577d7c696 *man/chunk_text.Rd 17839e1d2f204f271263b1059b8fe08c *man/mobydick.Rd 1f7b13750e553011561287d91a9fb076 *man/ngram-tokenizers.Rd 8bf9b4ec8ed84d24f965277004aef956 *man/ptb-tokenizer.Rd 2346cd4d337b32e153f3054534bdc55e *man/shingle-tokenizers.Rd ec91146ab4882c2024e71d4e0171a628 *man/stem-tokenizers.Rd 5f1f3388033d3e6d476269342072aaf9 *man/tokenizers.Rd eac6ac67fd537abfd55cc275fcadf6a6 *man/word-counting.Rd 239069bbf8bd66d27f035cb55eff33dd *src/RcppExports.cpp eef7fdb1e51b4642a7772b721ed7e0ed *src/shingle_ngrams.cpp 50fed60f2603e066bba741b9d6c8a9e1 *src/skip_ngrams.cpp c0953a311f15601701b91fb4f8945fa0 *tests/testthat.R 6d62165c5b0fe90e453a5896636fc9c3 *tests/testthat/helper-data.R 4b370e97f5104ffebe37108982ac0cd1 *tests/testthat/moby-ch1.txt 6c8bd3a896047bcefc012014bc50ec30 *tests/testthat/moby-ch2.txt 4918dc776f7f311580d63698b6bdc859 *tests/testthat/moby-ch3.txt b9f9a569a7a0383f977930bc30f8ca0b *tests/testthat/test-basic.R 605e06f975d347e44f870e11a0d06f11 *tests/testthat/test-chunking.R 36a2b76abce927a2de29b757252c1530 *tests/testthat/test-encoding.R 41915a16dec981f71d543bf520aba151 *tests/testthat/test-ngrams.R 4f25fb5fbce016f11b1bf7fc6a30a5ba *tests/testthat/test-ptb.R 8c85339e86effefeb9618fb023e85eaa *tests/testthat/test-shingles.R 0767ce25b7f91e9ddf105dc3485cae8d *tests/testthat/test-stem.R 60314e70e34b2db422705909c2c964df *tests/testthat/test-tif.R e9575adee44db0cd38582683769fae2b *tests/testthat/test-tokenize_tweets.R 7e03244a849fdef37d4fe9007369045c *tests/testthat/test-utils.R 71b73f80c997d10284759f56e004dc33 *tests/testthat/test-wordcount.R fecf3fa83860b5a3ad7673045ab9d475 *vignettes/introduction-to-tokenizers.Rmd 83b0cf559b9f00b5894edad9cbbf3c57 *vignettes/tif-and-tokenizers.Rmd tokenizers/build/0000755000176200001440000000000013257220650013550 5ustar 
liggesuserstokenizers/build/vignette.rds0000644000176200001440000000043013257220650016104 0ustar liggesuserstokenizers/DESCRIPTION0000644000176200001440000000432213257243614014165 0ustar liggesusersPackage: tokenizers
Type: Package
Title: Fast, Consistent Tokenization of Natural Language Text
Version: 0.2.1
Date: 2018-03-29
Description: Convert natural language text into tokens. Includes tokenizers for
    shingled n-grams, skip n-grams, words, word stems, sentences, paragraphs,
    characters, shingled characters, lines, tweets, Penn Treebank, regular
    expressions, as well as functions for counting characters, words, and
    sentences, and a function for splitting longer texts into separate
    documents, each with the same number of words. The tokenizers have a
    consistent interface, and the package is built on the 'stringi' and 'Rcpp'
    packages for fast yet correct tokenization in 'UTF-8'.
License: MIT + file LICENSE
LazyData: yes
Authors@R: c(person("Lincoln", "Mullen", role = c("aut", "cre"),
    email = "lincoln@lincolnmullen.com",
    comment = c(ORCID = "0000-0001-5103-6917")),
    person("Os", "Keyes", role = c("ctb"),
    email = "ironholds@gmail.com",
    comment = c(ORCID = "0000-0001-5196-609X")),
    person("Dmitriy", "Selivanov", role = c("ctb"),
    email = "selivanov.dmitriy@gmail.com"),
    person("Jeffrey", "Arnold", role = c("ctb"),
    email = "jeffrey.arnold@gmail.com",
    comment = c(ORCID = "0000-0001-9953-3904")),
    person("Kenneth", "Benoit", role = c("ctb"),
    email = "kbenoit@lse.ac.uk",
    comment = c(ORCID = "0000-0002-0797-564X")))
URL: https://lincolnmullen.com/software/tokenizers/
BugReports: https://github.com/ropensci/tokenizers/issues
RoxygenNote: 6.0.1
Depends: R (>= 3.1.3)
Imports: stringi (>= 1.0.1), Rcpp (>= 0.12.3), SnowballC (>= 0.5.1)
LinkingTo: Rcpp
Suggests: covr, knitr, rmarkdown, stopwords (>= 0.9.0), testthat
VignetteBuilder: knitr
NeedsCompilation: yes
Packaged: 2018-03-29 17:26:00 UTC; lmullen
Author: Lincoln Mullen [aut, cre] (<https://orcid.org/0000-0001-5103-6917>),
  Os Keyes [ctb] (<https://orcid.org/0000-0001-5196-609X>),
  Dmitriy Selivanov [ctb],
  Jeffrey Arnold [ctb] (<https://orcid.org/0000-0001-9953-3904>),
  Kenneth Benoit [ctb] (<https://orcid.org/0000-0002-0797-564X>)
Maintainer: Lincoln Mullen <lincoln@lincolnmullen.com>
Repository: CRAN
Date/Publication: 2018-03-29 20:07:40 UTC
tokenizers/man/0000755000176200001440000000000013256545214013231 5ustar liggesuserstokenizers/man/chunk_text.Rd0000644000176200001440000000247113256545214015700 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/chunk-text.R
\name{chunk_text}
\alias{chunk_text}
\title{Chunk text into smaller segments}
\usage{
chunk_text(x, chunk_size = 100, doc_id = names(x), ...)
}
\arguments{
\item{x}{A character vector or a list of character vectors to be chunked.
If \code{x} is a character vector, it can be of any length,
and each element will be chunked separately. If \code{x} is a list of
character vectors, each element of the list should have a length of 1.}

\item{chunk_size}{The number of words in each chunk.}

\item{doc_id}{The document IDs as a character vector. This will be taken from
the names of the \code{x} vector if available. \code{NULL} is acceptable.}

\item{...}{Arguments passed on to \code{\link{tokenize_words}}.}
}
\description{
Given a text or vector/list of texts, break the texts into smaller segments
each with the same number of words. This allows you to treat a very long
document, such as a novel, as a set of smaller documents.
}
\details{
Chunking the text passes it through \code{\link{tokenize_words}},
which will strip punctuation and lowercase the text unless you provide
arguments to pass along to that function.
}
\examples{
\dontrun{
chunked <- chunk_text(mobydick, chunk_size = 100)
length(chunked)
chunked[1:3]
}
}
tokenizers/man/stem-tokenizers.Rd0000644000176200001440000000463713070504253016665 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/stem-tokenizers.R
\name{tokenize_word_stems}
\alias{tokenize_word_stems}
\title{Word stem tokenizer}
\usage{
tokenize_word_stems(x, language = "english", stopwords = NULL,
  simplify = FALSE)
}
\arguments{
\item{x}{A character vector or a list of character vectors to be tokenized.
If \code{x} is a character vector, it can be of any length,
and each element will be tokenized separately. If \code{x} is a list of
character vectors, each element of the list should have a length of 1.}

\item{language}{The language to use for word stemming. This must be one of
the languages available in the SnowballC package. A list is provided by
\code{\link[SnowballC]{getStemLanguages}}.}

\item{stopwords}{A character vector of stop words to be excluded}

\item{simplify}{\code{FALSE} by default so that a consistent value is
returned regardless of length of input. If \code{TRUE}, then an input with
a single element will return a character vector of tokens instead of a
list.}
}
\value{
A list of character vectors containing the tokens, with one element
in the list for each element that was passed as input. If \code{simplify =
TRUE} and only a single element was passed as input, then the output is a
character vector of tokens.
}
\description{
This function turns its input into a character vector of word stems. This is
just a wrapper around the \code{\link[SnowballC]{wordStem}} function from the
SnowballC package which does the heavy lifting, but this function provides a
consistent interface with the rest of the tokenizers in this package. The
input can be a character vector of any length, or a list of character
vectors where each character vector in the list has a length of 1.
}
\details{
This function will strip all white space and punctuation and make all word
stems lowercase.
}
\examples{
song <- paste0("How many roads must a man walk down\\n",
               "Before you call him a man?\\n",
               "How many seas must a white dove sail\\n",
               "Before she sleeps in the sand?\\n",
               "\\n",
               "How many times must the cannonballs fly\\n",
               "Before they're forever banned?\\n",
               "The answer, my friend, is blowin' in the wind.\\n",
               "The answer is blowin' in the wind.\\n")
tokenize_word_stems(song)
}
\seealso{
\code{\link[SnowballC]{wordStem}}
}
tokenizers/man/tokenizers.Rd0000644000176200001440000000142013070504253015702 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/tokenizers-package.r
\docType{package}
\name{tokenizers}
\alias{tokenizers}
\alias{tokenizers-package}
\title{Tokenizers}
\description{
A collection of functions with a consistent interface to convert natural
language text into tokens.
}
\details{
The tokenizers in this package have a consistent interface. They all take
either a character vector of any length, or a list where each element is a
character vector of length one. The idea is that each element comprises a
text. Then each function returns a list with the same length as the input
vector, where each element in the list is the tokens generated by the
function.
If the input character vector or list is named, then the names are preserved. } tokenizers/man/word-counting.Rd0000644000176200001440000000202413252224016016303 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/wordcount.R \name{count_words} \alias{count_words} \alias{count_characters} \alias{count_sentences} \title{Count words, sentences, characters} \usage{ count_words(x) count_characters(x) count_sentences(x) } \arguments{ \item{x}{A character vector or a list of character vectors. If \code{x} is a character vector, it can be of any length, and each element will be tokenized separately. If \code{x} is a list of character vectors, each element of the list should have a length of 1.} } \value{ An integer vector containing the counted elements. If the input vector or list has names, they will be preserved. } \description{ Count words, sentences, and characters in input texts. These functions use the \code{stringi} package, so they handle the counting of Unicode strings (e.g., characters with diacritical marks) in a way that makes sense to people counting characters. } \examples{ count_words(mobydick) count_sentences(mobydick) count_characters(mobydick) } tokenizers/man/shingle-tokenizers.Rd0000644000176200001440000000377113070504253017344 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/character-shingles-tokenizers.R \name{tokenize_character_shingles} \alias{tokenize_character_shingles} \title{Character shingle tokenizers} \usage{ tokenize_character_shingles(x, n = 3L, n_min = n, lowercase = TRUE, strip_non_alphanum = TRUE, simplify = FALSE) } \arguments{ \item{x}{A character vector or a list of character vectors to be tokenized into character shingles. If \code{x} is a character vector, it can be of any length, and each element will be tokenized separately. If \code{x} is a list of character vectors, each element of the list should have a length of 1.} \item{n}{The number of characters in each shingle. This must be an integer greater than or equal to 1.} \item{n_min}{This must be an integer greater than or equal to 1, and less than or equal to \code{n}.} \item{lowercase}{Should the characters be made lower case?} \item{strip_non_alphanum}{Should punctuation and white space be stripped?} \item{simplify}{\code{FALSE} by default so that a consistent value is returned regardless of length of input. If \code{TRUE}, then an input with a single element will return a character vector of tokens instead of a list.} } \value{ A list of character vectors containing the tokens, with one element in the list for each element that was passed as input. If \code{simplify = TRUE} and only a single element was passed as input, then the output is a character vector of tokens. } \description{ The character shingle tokenizer functions like an n-gram tokenizer, except the units that are shingled are characters instead of words. Options to the function let you determine whether non-alphanumeric characters like punctuation should be retained or discarded. 
}
\examples{
x <- c("Now is the hour of our discontent")
tokenize_character_shingles(x)
tokenize_character_shingles(x, n = 5)
tokenize_character_shingles(x, n = 5, strip_non_alphanum = FALSE)
tokenize_character_shingles(x, n = 5, n_min = 3, strip_non_alphanum = FALSE)
}
tokenizers/man/ptb-tokenizer.Rd0000644000176200001440000000511413252224016016304 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/ptb-tokenizer.R
\name{tokenize_ptb}
\alias{tokenize_ptb}
\title{Penn Treebank Tokenizer}
\usage{
tokenize_ptb(x, lowercase = FALSE, simplify = FALSE)
}
\arguments{
\item{x}{A character vector or a list of character vectors to be tokenized.
If \code{x} is a character vector, it can be of any length, and each
element will be tokenized separately. If \code{x} is a list of character
vectors, each element of the list should have a length of 1.}

\item{lowercase}{Should the tokens be made lower case?}

\item{simplify}{\code{FALSE} by default so that a consistent value is
returned regardless of length of input. If \code{TRUE}, then an input with
a single element will return a character vector of tokens instead of a
list.}
}
\value{
A list of character vectors containing the tokens, with one element
in the list for each element that was passed as input. If \code{simplify =
TRUE} and only a single element was passed as input, then the output is a
character vector of tokens.
}
\description{
This function implements the Penn Treebank word tokenizer.
}
\details{
This tokenizer uses regular expressions to tokenize text similar to
the tokenization used in the Penn Treebank. It assumes that text has
already been split into sentences. The tokenizer does the following:

\itemize{ \item{splits common English contractions, e.g. \verb{don't} is
tokenized into \verb{do n't} and \verb{they'll} is tokenized into
\verb{they 'll},} \item{handles punctuation characters as separate tokens,}
\item{splits commas and single quotes off from words, when they are
followed by whitespace,} \item{splits off periods that occur at the end of
the sentence.} }

This function is a port of the Python NLTK version of the Penn
Treebank Tokenizer.
}
\examples{
song <- list(paste0("How many roads must a man walk down\\n",
                    "Before you call him a man?"),
             paste0("How many seas must a white dove sail\\n",
                    "Before she sleeps in the sand?\\n"),
             paste0("How many times must the cannonballs fly\\n",
                    "Before they're forever banned?\\n"),
             "The answer, my friend, is blowin' in the wind.",
             "The answer is blowin' in the wind.")
tokenize_ptb(song)
tokenize_ptb(c("Good muffins cost $3.88\\nin New York. Please buy me\\ntwo of them.",
               "They'll save and invest more.",
               "Hi, I can't say hello."))
}
\references{
\href{http://www.nltk.org/_modules/nltk/tokenize/treebank.html#TreebankWordTokenizer}{NLTK
TreebankWordTokenizer}
}
tokenizers/man/ngram-tokenizers.Rd0000644000176200001440000000632013070504253017010 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/ngram-tokenizers.R
\name{ngram-tokenizers}
\alias{ngram-tokenizers}
\alias{tokenize_ngrams}
\alias{tokenize_skip_ngrams}
\title{N-gram tokenizers}
\usage{
tokenize_ngrams(x, lowercase = TRUE, n = 3L, n_min = n,
  stopwords = character(), ngram_delim = " ", simplify = FALSE)

tokenize_skip_ngrams(x, lowercase = TRUE, n_min = 1, n = 3, k = 1,
  stopwords = character(), simplify = FALSE)
}
\arguments{
\item{x}{A character vector or a list of character vectors to be tokenized
into n-grams.
If \code{x} is a character vector, it can be of any length,
and each element will be tokenized separately. If \code{x} is a list of
character vectors, each element of the list should have a length of 1.}

\item{lowercase}{Should the tokens be made lower case?}

\item{n}{The number of words in the n-gram. This must be an integer greater
than or equal to 1.}

\item{n_min}{This must be an integer greater than or equal to 1, and less
than or equal to \code{n}.}

\item{stopwords}{A character vector of stop words to be excluded from the
n-grams.}

\item{ngram_delim}{The separator between words in an n-gram.}

\item{simplify}{\code{FALSE} by default so that a consistent value is
returned regardless of length of input. If \code{TRUE}, then an input with
a single element will return a character vector of tokens instead of a
list.}

\item{k}{For the skip n-gram tokenizer, the maximum skip distance between
words. The function will compute all skip n-grams between \code{0} and
\code{k}.}
}
\value{
A list of character vectors containing the tokens, with one element
in the list for each element that was passed as input. If \code{simplify =
TRUE} and only a single element was passed as input, then the output is a
character vector of tokens.
}
\description{
These functions tokenize their inputs into different kinds of n-grams. The
input can be a character vector of any length, or a list of character vectors
where each character vector in the list has a length of 1. See details for an
explanation of what each function does.
}
\details{
\describe{ \item{\code{tokenize_ngrams}:}{ Basic shingled n-grams. A
contiguous subsequence of \code{n} words. This will compute shingled n-grams
for every value between \code{n_min} (which must be at least 1) and
\code{n}. } \item{\code{tokenize_skip_ngrams}:}{Skip n-grams. A subsequence
of \code{n} words with a gap of at most \code{k} words between them. The
skip n-grams will be calculated for all values from \code{0} to \code{k}. } }

These functions will strip all punctuation and normalize all whitespace to a
single space character.
}
\examples{
song <- paste0("How many roads must a man walk down\\n",
               "Before you call him a man?\\n",
               "How many seas must a white dove sail\\n",
               "Before she sleeps in the sand?\\n",
               "\\n",
               "How many times must the cannonballs fly\\n",
               "Before they're forever banned?\\n",
               "The answer, my friend, is blowin' in the wind.\\n",
               "The answer is blowin' in the wind.\\n")

tokenize_ngrams(song, n = 4)
tokenize_ngrams(song, n = 4, n_min = 1)
tokenize_skip_ngrams(song, n = 4, k = 2)
}
tokenizers/man/mobydick.Rd0000644000176200001440000000060313256545214015320 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/data-docs.R
\docType{data}
\name{mobydick}
\alias{mobydick}
\title{The text of Moby Dick}
\format{A named character vector with length 1.}
\source{
\url{http://www.gutenberg.org/}
}
\usage{
mobydick
}
\description{
The text of Moby Dick, by Herman Melville, taken from Project Gutenberg.
}
\keyword{datasets}
tokenizers/man/basic-tokenizers.Rd0000644000176200001440000000704613256545214017003 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/basic-tokenizers.R, R/tokenize_tweets.R
\name{basic-tokenizers}
\alias{basic-tokenizers}
\alias{tokenize_characters}
\alias{tokenize_words}
\alias{tokenize_sentences}
\alias{tokenize_lines}
\alias{tokenize_paragraphs}
\alias{tokenize_regex}
\alias{tokenize_tweets}
\title{Basic tokenizers}
\usage{
tokenize_characters(x, lowercase = TRUE, strip_non_alphanum = TRUE,
  simplify = FALSE)

tokenize_words(x, lowercase = TRUE, stopwords = NULL, strip_punct = TRUE,
  strip_numeric = FALSE, simplify = FALSE)

tokenize_sentences(x, lowercase = FALSE, strip_punct = FALSE,
  simplify = FALSE)

tokenize_lines(x, simplify = FALSE)

tokenize_paragraphs(x, paragraph_break = "\\n\\n", simplify = FALSE)

tokenize_regex(x, pattern = "\\\\s+", simplify = FALSE)

tokenize_tweets(x, lowercase = TRUE, stopwords = NULL, strip_punct = TRUE,
  strip_url = FALSE, simplify = FALSE)
}
\arguments{
\item{x}{A character vector or a list of character vectors to be
tokenized. If \code{x} is a character vector, it can be of any length,
and each element will be tokenized separately. If \code{x} is a list of
character vectors, each element of the list should have a length of 1.}

\item{lowercase}{Should the tokens be made lower case? The default varies
by tokenizer: it is \code{TRUE} only for the tokenizers likely to be used
as the final step of a pipeline (characters, words, and tweets), and
\code{FALSE} for tokenizers likely to be intermediate steps (sentences).}

\item{strip_non_alphanum}{Should punctuation and white space be stripped?}

\item{simplify}{\code{FALSE} by default so that a consistent value is
returned regardless of length of input. If \code{TRUE}, then an input
with a single element will return a character vector of tokens instead
of a list.}

\item{stopwords}{A character vector of stop words to be excluded.}

\item{strip_punct}{Should punctuation be stripped?}

\item{strip_numeric}{Should numbers be stripped?}

\item{paragraph_break}{A string identifying the boundary between two
paragraphs.}

\item{pattern}{A regular expression that defines the split.}

\item{strip_url}{Should URLs (starting with \code{http(s)}) be removed
entirely? If \code{FALSE} (the default), they are preserved intact.}
}
\value{
A list of character vectors containing the tokens, with one element in
the list for each element that was passed as input. If \code{simplify =
TRUE} and only a single element was passed as input, then the output is
a character vector of tokens.
}
\description{
These functions perform basic tokenization into words, sentences,
paragraphs, lines, and characters. The functions can be piped into one
another to create at most two levels of tokenization. For instance, one
might split a text into paragraphs and then word tokens, or into
sentences and then word tokens.
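}
\details{
As a minimal sketch of the two-level tokenization described above, one
might split a text into sentences and then into word tokens, using only
the functions documented on this page. Note that \code{tokenize_sentences}
returns a list, so its first element is passed on to
\code{tokenize_words}:
\preformatted{
sents <- tokenize_sentences("It was the best of times. It was the worst of times.")
tokenize_words(sents[[1]])
}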
} \examples{ song <- paste0("How many roads must a man walk down\\n", "Before you call him a man?\\n", "How many seas must a white dove sail\\n", "Before she sleeps in the sand?\\n", "\\n", "How many times must the cannonballs fly\\n", "Before they're forever banned?\\n", "The answer, my friend, is blowin' in the wind.\\n", "The answer is blowin' in the wind.\\n") tokenize_words(song) tokenize_words(song, strip_punct = FALSE) tokenize_sentences(song) tokenize_paragraphs(song) tokenize_lines(song) tokenize_characters(song) tokenize_tweets("@rOpenSci and #rstats see: https://cran.r-project.org", strip_punct = TRUE) tokenize_tweets("@rOpenSci and #rstats see: https://cran.r-project.org", strip_punct = FALSE) } tokenizers/LICENSE0000644000176200001440000000005412775200571013461 0ustar liggesusersYEAR: 2016 COPYRIGHT HOLDER: Lincoln Mullen