biocViews/0000755000175000017500000000000014147514557012353 5ustar nileshnileshbiocViews/DESCRIPTION0000644000175000017500000000274714140322302014045 0ustar nileshnileshPackage: biocViews Title: Categorized views of R package repositories Description: Infrastructure to support 'views' used to classify Bioconductor packages. 'biocViews' are directed acyclic graphs of terms from a controlled vocabulary. There are three major classifications, corresponding to 'software', 'annotation', and 'experiment data' packages. biocViews: Infrastructure URL: http://bioconductor.org/packages/BiocViews BugReports: https://github.com/Bioconductor/BiocViews/issues Version: 1.62.1 License: Artistic-2.0 Author: VJ Carey , BJ Harshfield , S Falcon , Sonali Arora, Lori Shepherd Maintainer: Bioconductor Package Maintainer Depends: R (>= 3.6.0) Imports: Biobase, graph (>= 1.9.26), methods, RBGL (>= 1.13.5), tools, utils, XML, RCurl, RUnit, BiocManager Suggests: BiocGenerics, knitr, commonmark Collate: AllClasses.R AllGenerics.R as-methods.R htmlDoc-methods.R htmlFilename-methods.R htmlValue-methods.R show-methods.R getPackNames.R packageDetails.R pump.R repository.R showvoc.R getPackageNEWS.R validation_tests.R recommendBiocViews.R git_url: https://git.bioconductor.org/packages/biocViews git_branch: RELEASE_3_14 git_last_commit: b4b6834 git_last_commit_date: 2021-11-01 Date/Publication: 2021-11-02 NeedsCompilation: no Packaged: 2021-11-02 20:51:14 UTC; biocbuild biocViews/README.md0000644000175000017500000000071714136047116013625 0ustar nileshnilesh[](https://bioconductor.org/) **biocViews** is an R/Bioconductor package that implements the infrastructure to support _views_ used to classify Bioconductor packages. See https://bioconductor.org/packages/biocViews for more information including how to install the release version of the package (please refrain from installing directly from GitHub). 
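A minimal sketch of installing and trying the package as recommended in the README above (assumes R >= 3.6.0 and network access; `BiocManager` is Bioconductor's standard installer, and `getSubTerms`/`biocViewsVocab` are documented later in this package):

```r
## Install biocViews from Bioconductor rather than GitHub,
## as the README recommends. BiocManager is Bioconductor's installer.
if (!requireNamespace("BiocManager", quietly = TRUE))
    install.packages("BiocManager")
BiocManager::install("biocViews")

## Quick check: load the views vocabulary DAG and list the terms
## under the "Software" branch of the vocabulary.
library(biocViews)
data(biocViewsVocab)
head(getSubTerms(biocViewsVocab, "Software"))
```

The vocabulary is a directed acyclic graph, so `getSubTerms` returns the named term together with all of its descendants.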
biocViews/man/0000755000175000017500000000000014136047116013114 5ustar nileshnileshbiocViews/man/extractTopLevelFiles.Rd0000644000175000017500000000103414136047116017511 0ustar nileshnilesh\name{extractTopLevelFiles} \alias{extractTopLevelFiles} \title{Extract files from the top level of source package tarballs} \description{ Extracts files from the top level of source tarballs of packages. } \usage{ extractTopLevelFiles(reposRoot, srcContrib, destDir, fileName) } %- maybe also 'usage' for other objects documented here. \arguments{ \item{reposRoot}{Top level path for CRAN-style repos} \item{srcContrib}{Location of source packages} \item{destDir}{directory in which to extract the files} \item{fileName}{name of the file to extract} } \keyword{utilities} biocViews/man/recommendPackages.Rd0000644000175000017500000000252614136047116017020 0ustar nileshnilesh\name{recommendPackages} \alias{recommendPackages} \title{ Recommend Packages using existing biocViews. } \description{ biocViews are "keywords" which are used to describe a given package. They are broadly divided into three categories, representing the type of packages present in the Bioconductor Project - Software, Annotation Data and Experiment Data. This function finds packages that are tagged with the given biocViews. } \usage{ recommendPackages(biocViews, use.release=TRUE, intersect.views=TRUE) } \arguments{ \item{biocViews}{ A character vector containing a list of biocViews. Currently only biocViews from the software branch are supported. } \item{use.release}{ A logical value indicating whether packages should be recommended from the release branch of Bioconductor. } \item{intersect.views}{ A logical value indicating whether to return only packages tagged with all of the input biocViews, or packages tagged with any one or more of them. } } \value{ A character vector containing a list of packages. If multiple biocViews are given as input, the result contains packages tagged with all, or at least one, of the input biocViews. } \author{ Sonali Arora.
} \examples{ recommendPackages(c("clustering", "classification")) } biocViews/man/extractManuals.Rd0000644000175000017500000000262214136047116016400 0ustar nileshnilesh\name{extractManuals} \alias{extractManuals} \alias{extractCitations} \title{Extract Rd man pages and build pdf reference manuals from local package repository} \description{ This function extracts Rd man pages and builds pdf reference manuals from the \code{man} subdirectory of R source package archives (\code{.tar.gz}) found in a local package repository. All Rd files found in \code{man} will be extracted and used during the pdf construction process. Only source package archives will be processed. The constructed pdf files will be extracted under \code{destDir} and will be found in \code{PKGNAME/man/*.pdf}. Prior to extraction, all Rd and pdf files in \code{destDir/PKGNAME/man} will be removed. } \usage{ extractManuals(reposRoot, srcContrib, destDir) } %- maybe also 'usage' for other objects documented here. \arguments{ \item{reposRoot}{character vector giving the path to the root of the local CRAN-style package repository} \item{srcContrib}{character vector giving the relative path from the \code{reposRoot} to the source packages. In a standard CRAN-style repository, this will be \code{src/contrib}.} \item{destDir}{character vector specifying the directory in which the extracted files will be written. If missing, files will be written to \code{/manuals}.} } \author{Patrick Aboyoun} \keyword{utilities} biocViews/man/writeTopLevelView.Rd0000644000175000017500000000133514136047116017045 0ustar nileshnilesh\name{writeTopLevelView} \alias{writeTopLevelView} \title{Write the view for the root of a vocabulary to disk} \description{ Given a directory and a vocabulary represented as a \code{graphNEL} containing a DAG of terms, write the top-level term to disk as HTML. This assumes your vocabulary has a single term with no parents.
} \usage{ writeTopLevelView(dir, vocab) } %- maybe also 'usage' for other objects documented here. \arguments{ \item{dir}{A string giving a directory in which to write the HTML file} \item{vocab}{A \code{graphNEL} instance giving the DAG of terms. It should have a root node. That is, there should be exactly one node with no incoming edges.} } \author{S. Falcon} \keyword{utilities } biocViews/man/RepositoryDetail-class.Rd0000644000175000017500000000247714136047116020022 0ustar nileshnilesh\name{RepositoryDetail-class} \docType{class} \alias{RepositoryDetail-class} \alias{rdPackageTable-class} \alias{htmlDoc,RepositoryDetail-method} \alias{htmlFilename,RepositoryDetail-method} \alias{htmlValue,RepositoryDetail-method} \alias{htmlValue,rdPackageTable-method} \title{Class "RepositoryDetail"} \description{Representation of R package repository index} \section{Objects from the Class}{ Objects can be created by calls of the form \code{new("RepositoryDetail", ...)}. } \section{Slots}{ \describe{ \item{\code{Title}:}{Object of class \code{"character"} giving the title for the repository.} \item{\code{reposRoot}:}{Object of class \code{"character"} giving the root URL of the repository} \item{\code{homeUrl}:}{Object of class \code{"character"} ?} \item{\code{htmlDir}:}{Object of class \code{"character"} ? } \item{\code{packageList}:}{Object of class \code{"list"} consisting of objects of class \code{PackageDetail-class} } } } \section{Extends}{ Class \code{"Htmlized"}, directly. } \section{Methods}{ \describe{ \item{htmlDoc}{\code{signature(object = "RepositoryDetail")}: ... } \item{htmlFilename}{\code{signature(object = "RepositoryDetail")}: ... } \item{htmlValue}{\code{signature(object = "RepositoryDetail")}: ... 
} } } \author{Seth Falcon} \keyword{classes} biocViews/man/htmlDoc.Rd0000644000175000017500000000111714136047116014775 0ustar nileshnilesh\name{htmlDoc} \alias{htmlDoc} \title{Create a complete HTML document representation of an object} \description{ This generic function should return an \code{XMLNode} instance representing the specified object in HTML as a complete HTML document. } \usage{ htmlDoc(object, ...) } %- maybe also 'usage' for other objects documented here. \arguments{ \item{object}{An object} \item{\dots}{Not currently used.} } \value{ An instance of \code{XMLNode} from the \code{XML} package. } \author{Seth Falcon} \seealso{\code{\link{htmlValue}}, \code{\link{htmlFilename}}} \keyword{methods} biocViews/man/guessPackageType.Rd0000644000175000017500000000203514136047116016647 0ustar nileshnilesh\name{guessPackageType} \alias{guessPackageType} \title{ Guess Package Type (Software, ExperimentData, AnnotationData) using existing biocViews. } \description{ biocViews are "keywords" which are used to describe a given package. They are broadly divided into three categories, representing the type of packages present in the Bioconductor Project - Software, Annotation Data and Experiment Data. biocViews are supposed to come from only one of the three fields, but this function will check the list of biocViews and guess the package type based on how many biocViews came from each field. } \usage{ guessPackageType(biocViews) } \arguments{ \item{biocViews}{ A character vector containing a list of biocViews.} } \value{ A character(1) of package type: either "Software", "ExperimentData", or "AnnotationData".
} \author{ Lori Shepherd } \examples{ guessPackageType(c("clustering", "classification")) guessPackageType(c("Organism", "Homo Sapien")) } biocViews/man/getPacksAndViews.Rd0000644000175000017500000000234714136047116016613 0ustar nileshnilesh\name{getPacksAndViews} \alias{getPacksAndViews} \alias{permulist} %Undocumented code objects: % makeVocInfo pump tellSubTop tellSuperTop %Undocumented data sets: % gg oct05 \alias{makeVocInfo} \alias{pump} \alias{tellSubTop} \alias{tellSuperTop} \title{Parse VIEWS file for views and packages} \description{ Given a repository URL, download and parse the VIEWS file. } \usage{ getPacksAndViews(reposURL, vocab, defaultView, local=FALSE) } \arguments{ \item{reposURL}{character vector giving the URL of a CRAN-style repository containing a VIEWS file at the top-level.} \item{vocab}{A \code{\link[graph]{graph-class}} object representing the ontology of views. This graph should be a directed acyclic graph (DAG).} \item{defaultView}{A string giving the term to use for packages that do not list a term of their own via the \code{biocViews} field in the \file{DESCRIPTION} file.} \item{local}{logical indicating whether certain links should be absolute (using \code{reposURL}) or relative.} } \value{ A list with named elements: \code{views}: Vector of view memberships. Names are package names. \code{pkgList}: A list of \code{\link{PackageDetail-class}} objects. } \author{Seth Falcon} \keyword{utilities} biocViews/man/extractVignettes.Rd0000644000175000017500000000267514136047116016750 0ustar nileshnilesh\name{extractVignettes} \alias{extractVignettes} \alias{extractHTMLDocuments} \title{Extract pdf vignettes from local package repository} \description{ These functions extract pdf or HTML files from the \code{inst/doc} subdirectory of R source package archives (\code{.tar.gz}) found in a local package repository. All pdf files found in \code{inst/doc} will be extracted.
With \code{extractHTMLDocuments}, all HTML files except \code{index.html} will be extracted. Only source package archives will be processed. The extracted pdf or HTML files will be placed under \code{destDir} and will be found in \code{PKGNAME/inst/doc/}. Prior to extraction, all pdf files in \code{destDir/PKGNAME/inst/doc} will be removed. } \usage{ extractVignettes(reposRoot, srcContrib, destDir) extractHTMLDocuments(reposRoot, srcContrib, destDir) } %- maybe also 'usage' for other objects documented here. \arguments{ \item{reposRoot}{character vector giving the path to the root of the local CRAN-style package repository} \item{srcContrib}{character vector giving the relative path from the \code{reposRoot} to the source packages. In a standard CRAN-style repository, this will be \code{src/contrib}.} \item{destDir}{character vector specifying the directory in which the extracted files will be written. If missing, files will be written to \code{/vignettes}.} } \author{Seth Falcon} \keyword{utilities} biocViews/man/writeHtmlDoc.Rd0000644000175000017500000000070414136047116016011 0ustar nileshnilesh\name{writeHtmlDoc} \alias{writeHtmlDoc} \title{Write an XML DOM containing HTML to a file} \description{ Given a DOM tree from the XML package and a filename, write the DOM to disk creating an HTML file. } \usage{ writeHtmlDoc(html, file) } %- maybe also 'usage' for other objects documented here. \arguments{ \item{html}{A DOM object from the XML package} \item{file}{A string giving the filename} } \author{S.
Falcon} \keyword{ utilities } biocViews/man/getPackageTitles.Rd0000644000175000017500000000467714136047116016631 0ustar nileshnilesh\name{getPackageTitles} \alias{getPackageTitles} \alias{getPackageDescriptions} \title{ Retrieve list of package titles and print package Description } \description{ These functions visit two Bioconductor release branches, identifying packages that differ between the \sQuote{current} repository and the \sQuote{previous} release. The devel branch of Bioconductor is used to retrieve package descriptions. } \usage{ getPackageTitles(prevBranch="RELEASE_3_6", currBranch="master", manifest=c("software.txt", "data-experiment.txt", "workflows.txt"), status = c("new", "removed")) getPackageDescriptions(pkgs, outfile, output=c("md", "text"), relativeLink=FALSE) } \arguments{ \item{prevBranch}{\code{character(1)} Bioconductor branch to compare to} \item{currBranch}{\code{character(1)} Bioconductor branch for current packages.} \item{manifest}{\code{character(1)} Which repository of packages to compare. software.txt is for software packages, data-experiment.txt for data experiment packages, and workflows.txt for workflow packages} \item{status}{get new or removed package list comparing currBranch to prevBranch} \item{pkgs}{character() A list of packages for which to retrieve the DESCRIPTION} \item{outfile}{\code{character(1)} file path to the location where DESCRIPTIONS will be printed.} \item{output}{\code{character(1)} output to text or markdown format.} \item{relativeLink}{Should links to packages be relative links on the bioconductor.org website or include the full url 'https://bioconductor.org'. default: FALSE is full url.} } \value{ A list of package titles.
} \author{ Martin Morgan \email{mtmorgan@fhcrc.org} and Lori Shepherd } \examples{ \dontrun{ # At release time get a list of new, removed, or deprecated packages # get new packages in release 3.7 that are not in 3.6 newSoft = getPackageTitles() # get removed packages from 3.6 rmSoft = getPackageTitles(currBranch="RELEASE_3_7", status="removed") # get deprecated packages for 3.7 deprecatedSoft = setdiff(getPackageTitles(status="removed"), rmSoft) # repeat the above for data-experiment packages newData = getPackageTitles(manifest="data-experiment.txt") rmData = getPackageTitles(currBranch="RELEASE_3_7", manifest="data-experiment.txt", status="removed") deprecatedData = setdiff(getPackageTitles(manifest="data-experiment.txt", status="removed"), rmData) } } \keyword{manip}% __ONLY ONE__ keyword per line biocViews/man/htmlFilename.Rd0000644000175000017500000000114314136047116016007 0ustar nileshnilesh\name{htmlFilename} \alias{htmlFilename} \alias{htmlFilename,character-method} \title{Return a filename for an object's HTML representation} \description{ This function returns a string containing an appropriate filename for storing the object's HTML representation. } \usage{ htmlFilename(object, ...) } %- maybe also 'usage' for other objects documented here. \arguments{ \item{object}{An object.} \item{\dots}{Not currently used} } \value{ A character vector of length one containing the filename. } \author{Seth Falcon} \seealso{\code{\link{htmlValue}}, \code{\link{htmlDoc}} } \keyword{methods} biocViews/man/htmlValue.Rd0000644000175000017500000000076314136047116015352 0ustar nileshnilesh\name{htmlValue} \alias{htmlValue} \title{HTML Representation of an Object} \description{ This generic function should return an \code{XMLNode} instance representing the specified object in HTML. } \usage{ htmlValue(object) } %- maybe also 'usage' for other objects documented here. \arguments{ \item{object}{An object} } \value{ An instance of \code{XMLNode} from the \code{XML} package.
} \author{Seth Falcon} \seealso{\code{\link{htmlDoc}}, \code{\link{htmlFilename}}} \keyword{methods} biocViews/man/write_VIEWS.Rd0000644000175000017500000000464614136047116015524 0ustar nileshnilesh\name{write_VIEWS} \alias{write_VIEWS} \title{Write a VIEWS control file for a CRAN-style package repository} \description{ This function writes a \code{VIEWS} file to the top-level of a CRAN-style package repository. The \code{VIEWS} file is in DCF format and describes all packages found in the repository. The \code{VIEWS} file contains the complete \code{DESCRIPTION} file for each source package in the repository. In addition, metadata for available binary packages and vignettes is centralized here. } \usage{ write_VIEWS(reposRootPath, fields = NULL, verbose = FALSE, vignette.dir = "vignettes", manifestFile = NA, meatPath = NA) } \arguments{ \item{reposRootPath}{character vector containing the path to the CRAN-style repository root directory.} \item{fields}{Any additional fields to include. You shouldn't need this, but if you have added fields to the DESCRIPTION files of the packages in the repository, you may want it.} \item{verbose}{logical, if \code{TRUE}, print progress messages.} \item{vignette.dir}{character specifying where to look for vignettes.} \item{manifestFile}{character(1). File path location to Bioconductor formatted manifest file that lists all current packages. This file will be used in the write_VIEWS function to cross check successfully built packages with all expected packages. Packages that have not built will be given a dummy entry for a complete listing in bioc_VIEWS. If NA, the cross check is skipped and packages not built on any system will be missing from biocVIEWS} \item{meatPath}{character(1). File path location to the directory containing cloned repositories of Bioconductor packages. If manifestFile is used for cross checking and the meatPath is provided, entries from the DESCRIPTION file are manually entered into biocVIEWS information.
If NA, dummy values for the minimal fields needed for landing page generation are included with ERROR. This attempts to fill in as much information as possible for packages that have failed to build.} } \author{Seth Falcon} \section{Warning}{ This function uses a private function from the \code{tools} package: \code{tools:::.build_repository_package_db}. } \seealso{ \code{\link[tools:writePACKAGES]{write_PACKAGES}}, \code{\link{extractVignettes}}, \code{\link{genReposControlFiles}}, \code{\link{write_REPOSITORY}} } \keyword{utilities} biocViews/man/getBiocSubViews.Rd0000644000175000017500000000434514136047116016455 0ustar nileshnilesh\name{getBiocSubViews} \alias{getBiocSubViews} \title{Build a list of BiocView objects from a package repository} \description{ This function returns a list of \code{\link{BiocView-class}} objects corresponding to the subgraph of the views DAG induced by \code{topTerm}. In short, this does the same thing as \code{\link{getBiocViews}}, but limits the vocabulary to \code{topTerm} and all of its descendants. } \usage{ getBiocSubViews(reposUrl, vocab, topTerm, local = FALSE, htmlDir = "") } %- maybe also 'usage' for other objects documented here. \arguments{ \item{reposUrl}{URL for a CRAN-style repository that hosts a \code{VIEWS} file at the top-level.} \item{vocab}{A \code{\link[graph]{graph-class}} object representing the ontology of views. This graph should be a directed acyclic graph (DAG).} \item{topTerm}{A string giving the name of the subview DAG. This view and all of its descendants will be included in the result.} \item{local}{logical indicating whether to assume a local package repository.
The default is \code{FALSE} in which case absolute links to package detail pages are created.} \item{htmlDir}{if the \code{local} argument is \code{TRUE}, this will be used as the relative path for package HTML files.} } \details{ The root of the vocabulary DAG is implicitly included in the view creation process in order to build views with a link back to the top. It is removed from the return list. This function is tailored to generation of Bioconductor Task Views. With the current vocabulary, it probably only makes sense to call it with \code{topTerm} set to one of \code{"Software"}, \code{"AnnotationData"}, or \code{"ExperimentData"}. This is a hack to allow the biocViews code to manage HTML views across more than one repository. } \value{ A list of \code{BiocView-class} objects. The names of the list give the name of the corresponding view. } \author{Seth Falcon} \seealso{ \code{\link{write_VIEWS}}, \code{\link{writeBiocViews}} } \examples{ data(biocViewsVocab) reposPath <- system.file("doc", package="biocViews") reposUrl <- paste("file://", reposPath, sep="") biocViews <- getBiocSubViews(reposUrl, biocViewsVocab, "Software") print(biocViews[1:2]) } \keyword{utilities} biocViews/man/validation_tests.Rd0000644000175000017500000000112014136047116016755 0ustar nileshnilesh\name{validate_bioc_views} \alias{validate_bioc_views} \alias{validation_tests} \title{ Validate a package's biocViews. } \description{ Ensures that a package has biocViews and that they are valid. Function is designed to be called from the unit tests of another package. } \usage{ validate_bioc_views(pkg) } \arguments{ \item{pkg}{\code{character(1)} Name of package to validate.} } \value{ \code{invisible(NULL)} if tests pass.
} \author{ Dan Tenenbaum \url{dtenenba@fhcrc.org} } \examples{ validate_bioc_views("biocViews") } \keyword{manip}% __ONLY ONE__ keyword per line biocViews/man/getSubTerms.Rd0000644000175000017500000000131414136047116015646 0ustar nileshnilesh\name{getSubTerms} \alias{getSubTerms} \title{Retrieve a term and its children from a vocab DAG} \description{ Given a Directed Acyclic Graph (DAG) represented as a \code{graphNEL} instance, return a character vector consisting of the specified \code{term} and all of its descendants. That is, give the list of terms for which a path exists starting at \code{term}. } \usage{ getSubTerms(dag, term) } \arguments{ \item{dag}{A \code{graphNEL} representing a DAG} \item{term}{A string giving a term in the vocabulary (a node in \code{dag})} } \value{ A character vector of term names. } \author{S. Falcon} \examples{ data(biocViewsVocab) getSubTerms(biocViewsVocab, "Software") } \keyword{utilities} biocViews/man/extractNEWS.Rd0000644000175000017500000000070514136047116015554 0ustar nileshnilesh\name{extractNEWS} \alias{extractNEWS} \title{Extract NEWS files from source package tarballs} \description{ Extracts NEWS files from source tarballs of packages. } \usage{ extractNEWS(reposRoot, srcContrib, destDir) } %- maybe also 'usage' for other objects documented here. \arguments{ \item{reposRoot}{Top level path for CRAN-style repos} \item{srcContrib}{Location of source packages} \item{destDir}{where to extract} } \keyword{utilities} biocViews/man/biocViewsVocab.Rd0000644000175000017500000000144014136047116016307 0ustar nileshnilesh\name{biocViewsVocab} \alias{biocViewsVocab} \docType{data} \title{Bioconductor Task Views Vocabulary Data} \description{ A \code{\link[graph]{graphNEL-class}} instance representing the Bioconductor Task Views as a directed graph. } \usage{data(biocViewsVocab)} \format{ The format is: graphNEL instance } \details{ The source for the vocabulary data is in the dot directory of the package in file biocViewsVocab.dot. 
This is transformed to GXL using the dot2gxl command line utility from the graphviz package. Then the \code{fromGXL} function from the \code{graph} package is used to convert to \code{graphNEL-class}. } \examples{ data(biocViewsVocab) biocViewsVocab ## If you have Rgraphviz available, you can ## plot the vocabulary with plot(biocViewsVocab) } \keyword{datasets} biocViews/man/genReposControlFiles.Rd0000644000175000017500000000437114136047116017516 0ustar nileshnilesh\name{genReposControlFiles} \alias{genReposControlFiles} \title{Generate CRAN-style repository control files} \description{ This function generates control files for CRAN-style repositories. For each path specified in \code{contribPaths} a \code{PACKAGES} file is written. In addition, two top-level control files are created: \code{REPOSITORY} contains information about the specified contrib paths. \code{VIEWS} contains metadata for all packages in the repository including the paths to any extracted vignettes, if found. This file is useful for generating HTML views of the repository. } \usage{ genReposControlFiles(reposRoot, contribPaths, manifestFile = NA, meatPath = NA) } %- maybe also 'usage' for other objects documented here. \arguments{ \item{reposRoot}{character vector containing the path to the CRAN-style repository root directory.} \item{contribPaths}{A named character vector. Valid names are \code{source}, \code{win.binary}, \code{mac.binary}, \code{mac.binary.mavericks}, and \code{mac.binary.el-capitan}. Values indicate the paths to the package archives relative to the \code{reposRoot}.} \item{manifestFile}{character(1). File path location to Bioconductor formatted manifest file that lists all current packages. This file will be used in the write_VIEWS function to cross check successfully built packages with all expected packages. Packages that have not built will be given a dummy entry for a complete listing in bioc_VIEWS.
If NA, the cross check is skipped and packages not built on any system will be missing from biocVIEWS} \item{meatPath}{character(1). File path location to the directory containing cloned repositories of Bioconductor packages. If manifestFile is used for cross checking and the meatPath is provided, entries from the DESCRIPTION file are manually entered into biocVIEWS information. If NA, dummy values for the minimal fields needed for landing page generation are included with ERROR. This attempts to fill in as much information as possible for packages that have failed to build.} } \author{Seth Falcon} \seealso{ \code{\link[tools:writePACKAGES]{write_PACKAGES}}, \code{\link{extractVignettes}}, \code{\link{write_REPOSITORY}}, \code{\link{write_VIEWS}} } \keyword{utilities} biocViews/man/write_SYMBOLS.Rd0000644000175000017500000000223714136047116015751 0ustar nileshnilesh\name{write_SYMBOLS} \alias{write_SYMBOLS} \title{Write a SYMBOLS file} \description{ Writes a DCF formatted file, SYMBOLS, containing the symbols exported by each package in a directory containing R package source directories. } \usage{ write_SYMBOLS(dir, verbose = FALSE, source.dirs=FALSE) } \arguments{ \item{dir}{The root of a CRAN-style package repository containing source packages. When \code{source.dirs} is \code{TRUE}, \code{dir} should be a directory containing R package source directories} \item{verbose}{Logical. When \code{TRUE}, progress is printed to the standard output.} \item{source.dirs}{Logical. When \code{TRUE}, interpret \code{dir} as a directory containing source package directories. When \code{FALSE}, the default, \code{dir} is assumed to be the root of a CRAN-style package repository and the function will operate on the source package tarballs in \code{dir/src/contrib}.} } \value{ Returns \code{NULL}. Called for the side-effect of creating a SYMBOLS file in \code{dir}. } \author{S.
Falcon} \seealso{ \code{\link[tools:writePACKAGES]{write_PACKAGES}} \code{\link{write_VIEWS}} } \keyword{utilities} biocViews/man/recommendBiocViews.Rd0000644000175000017500000000463614136047116017200 0ustar nileshnilesh\name{recommendBiocViews} \alias{recommendBiocViews} \title{ Recommend biocViews for an existing Package. } \description{ Packages being added to the Bioconductor Project require biocViews in their DESCRIPTION file. (Note that the field name "biocViews" is case-sensitive and must begin with a lower-case 'b'.) biocViews are "keywords" which are used to describe a given package. They are broadly divided into three categories, representing the type of packages present in the Bioconductor Project - Software, Annotation Data and Experiment Data. } \usage{ recommendBiocViews(pkgdir, branch) } \arguments{ \item{pkgdir}{ The path of the package directory. } \item{branch}{ The branch which your package will belong to. It can be either 'Software', 'AnnotationData' or 'ExperimentData'. } } \details{ This function parses the package directory provided by the user to recommend biocViews to the user. The output is a suggested list - the user of this function is expected to go through this list and find which biocViews best describe his or her package. It uses the following strategies. \itemize{ \item It parses the "Description", "Title", "Package" fields of the DESCRIPTION file to find biocViews. \item It looks up the biocViews of the packages in the "Depends" field of the given package to recommend biocViews. \item It parses the text from the man pages and the vignettes to suggest biocViews. } Please note the following: \itemize{ \item Do not make up your own biocViews. \item Double check the spelling and case of the biocViews added. \item Please add biocViews only from the appropriate branch. e.g. Software packages should have only Software biocViews. } } \value{ A list is returned with three character vectors: "current", "recommended" and "remove".
\itemize{ \item "current" contains the biocViews from the package's DESCRIPTION file. \item "recommended" are the recommended biocViews - This is a suggested list which the user can add in addition to "current" biocViews - the user is expected to go through this list and find which biocViews best describe their package. \item "remove" are those biocViews which are inconsistent with the Bioconductor biocViews. (Hint - check for spelling, cases and plural) } } \author{ Sonali Arora. } biocViews/man/biocViews-package.Rd0000644000175000017500000000616614136047116016737 0ustar nileshnilesh\name{biocViews-package} \alias{biocViews-package} \alias{biocViews} \docType{package} \title{ Categorized views of R package repositories } \description{ Structures for vocabularies and narratives of views. This can be used to create HTML views of the package structure in a Bioconductor repository. } \details{ \tabular{ll}{ Package: \tab biocViews\cr Version: \tab 1.11.4\cr Depends: \tab R (>= 2.4.0), methods, utils\cr Imports: \tab tools, Biobase, graph (>= 1.9.26), RBGL (>= 1.13.5), XML\cr Suggests: \tab Biobase\cr License: \tab Artistic-2.0\cr URL: \tab http://www.bioconductor.org/packages/release/BiocViews.html\cr biocViews: \tab Infrastructure\cr } Index: \preformatted{ BiocView-class Class "BiocView" Htmlized-class Class "Htmlized" PackageDetail-class Class "PackageDetail" RepositoryDetail-class Class "RepositoryDetail" biocViewsVocab Bioconductor Task Views Vocabulary Data extractVignettes Extract pdf vignettes from local package repository genReposControlFiles Generate CRAN-style repository control files getBiocSubViews Build a list of BiocView objects from a package repository getBiocViews Build a list of BiocView objects from a package repository getPacksAndViews Parse VIEWS file for views and packages getSubTerms Retrieve a term and its children from a vocab DAG htmlDoc Create a complete HTML document representation of an object htmlFilename Return a filename for an 
object's HTML representation htmlValue HTML Representation of an Object writeBiocViews Write a list of BiocView objects to HTML writeHtmlDoc Write an XML DOM containing HTML to a file writePackageDetailHtml Write HTML files for packages in a CRAN-style repository writeRepositoryHtml Write package descriptions and a repository index as HTML writeTopLevelView Write the view for the root of a vocabulary to disk write_REPOSITORY Write a REPOSITORY control file for a CRAN-style package repository write_SYMBOLS Write a SYMBOLS file write_VIEWS Write a VIEWS control file for a CRAN-style package repository } The terms of the vocabulary are stored in a DAG, which can be loaded as the serialized data object \code{biocViewsVocab}. For listing of available terms use function \code{getSubTerms}. Further information is available in the following two vignettes: \tabular{ll}{ \code{HOWTO-BCV} \tab Basic package usage\cr \code{createReposHtml} \tab Further information for repository admins\cr } } \author{ VJ Carey , BJ Harshfield , S Falcon Maintainer: Biocore Team c/o BioC user list } \keyword{ package } \examples{ data(biocViewsVocab) getSubTerms(biocViewsVocab, "Technology") } biocViews/man/getBiocViews.Rd0000644000175000017500000000330214136047116015773 0ustar nileshnilesh\name{getBiocViews} \alias{getBiocViews} \title{Build a list of BiocView objects from a package repository} \description{ Given the URL to a CRAN-style package repository containing a \code{VIEWS} file at the top-level and a \code{\link[graph]{graph-class}} object representing a DAG of views, this function returns a list of \code{\link{BiocView-class}} objects. } \usage{ getBiocViews(reposUrl, vocab, defaultView, local = FALSE, htmlDir = "") } %- maybe also 'usage' for other objects documented here. \arguments{ \item{reposUrl}{URL for a CRAN-style repository that hosts a \code{VIEWS} file at the top-level.} \item{vocab}{A \code{\link[graph]{graph-class}} object representing the ontology of views. 
This graph should be a directed acyclic graph (DAG).} \item{defaultView}{A string giving the term to use for packages that do not list a term of their own via the \code{biocViews} field in the \file{DESCRIPTION} file.} \item{local}{logical indicating whether to assume a local package repository. The default is \code{FALSE} in which case absolute links to package detail pages are created.} \item{htmlDir}{if the \code{local} argument is \code{TRUE}, this will be used as the relative path for package HTML files.} } \value{ A list of \code{BiocView-class} objects. The names of the list give the name of the corresponding view. } \author{Seth Falcon} \seealso{ \code{\link{write_VIEWS}}, \code{\link{writeBiocViews}} } \examples{ data(biocViewsVocab) reposPath <- system.file("doc", package="biocViews") reposUrl <- paste("file://", reposPath, sep="") biocViews <- getBiocViews(reposUrl, biocViewsVocab, "NoViewProvided") print(biocViews[1:2]) } \keyword{utilities} biocViews/man/writePackageDetailHtml.Rd0000644000175000017500000000137314136047116017765 0ustar nileshnilesh\name{writePackageDetailHtml} \alias{writePackageDetailHtml} \title{Write HTML files for packages in a CRAN-style repository} \description{ This function creates package "homepages" that describe the package and provide links to download package artifacts in the repository. } \usage{ writePackageDetailHtml(pkgList, htmlDir = "html", backgroundColor="transparent") } %- maybe also 'usage' for other objects documented here. 
\arguments{
  \item{pkgList}{A list of \code{PackageDescription} objects.}
  \item{htmlDir}{The files will be written to this directory.}
  \item{backgroundColor}{A character vector giving the background color
    for the body in the CSS file.}
}
\author{Seth Falcon}
\seealso{\code{\link{writeRepositoryHtml}}}
\keyword{utilities}
biocViews/man/BiocView-class.Rd0000644000175000017500000000363014136047116016217 0ustar nileshnilesh\name{BiocView-class}
\docType{class}
\alias{BiocView-class}
\alias{bvTitle-class}
\alias{bvPackageTable-class}
\alias{bvSubViews-class}
\alias{bvParentViews-class}
\alias{coerce,BiocView,rdPackageTable-method}
\alias{htmlDoc,BiocView-method}
\alias{htmlFilename,BiocView-method}
\alias{htmlValue,BiocView-method}
\alias{htmlValue,bvSubViews-method}
\alias{htmlValue,bvParentViews-method}
\alias{show,BiocView-method}
\title{Class "BiocView" }
\description{Representation of a Bioconductor "view".}
\section{Objects from the Class}{
Objects can be created by calls of the form \code{new("BiocView", ...)}.
}
\section{Slots}{
  \describe{
    \item{\code{name}:}{Object of class \code{"character"} giving the
      name of the view. }
    \item{\code{subViews}:}{Object of class \code{"character"} giving
      the names of the subviews of this view.}
    \item{\code{parentViews}:}{Object of class \code{"character"}
      giving the names of the views that are this view's parents.}
    \item{\code{Title}:}{Object of class \code{"character"} giving a
      longer description of the view.}
    \item{\code{reposRoot}:}{Object of class \code{"character"} URL
      for repository }
    \item{\code{homeUrl}:}{Object of class \code{"character"} ? }
    \item{\code{htmlDir}:}{Object of class \code{"character"} ? }
    \item{\code{packageList}:}{Object of class \code{"list"}
      consisting of \code{PackageDetail-class} objects }
  }
}
\section{Extends}{
Class \code{"RepositoryDetail"}, directly.
Class \code{"Htmlized"}, directly.
}
\section{Methods}{
  \describe{
    \item{coerce}{\code{signature(from = "BiocView", to = "rdPackageTable")}: ...
}
    \item{htmlDoc}{\code{signature(object = "BiocView")}: ... }
    \item{htmlFilename}{\code{signature(object = "BiocView")}: ... }
    \item{htmlValue}{\code{signature(object = "BiocView")}: ... }
    \item{show}{\code{signature(object = "BiocView")}: ... }
  }
}
\author{Seth Falcon}
\keyword{classes}
biocViews/man/writeRepositoryHtml.Rd0000644000175000017500000000363214136047116017466 0ustar nileshnilesh\name{writeRepositoryHtml}
\alias{writeRepositoryHtml}
\title{Write package descriptions and a repository index as HTML}
\description{
  This function generates an HTML file for each package in a repository
  and generates an \code{index.html} file that provides an alphabetized
  listing of the packages.
}
\usage{
writeRepositoryHtml(reposRoot, title, reposUrl = "..", viewUrl = "../..",
                    reposFullUrl=reposUrl, downloadStatsUrl="",
                    devHistoryUrl="", link.rel = TRUE,
                    backgroundColor="transparent")
}
\arguments{
  \item{reposRoot}{string specifying the path to the root of the
    CRAN-style package repository.}
  \item{title}{string giving the title for the repository.}
  \item{reposUrl}{string giving the prefix for URLs in links generated
    on the package description pages.  The default is \code{".."}, which
    works well if the package description HTML files are written to an
    \code{html} subdirectory under the root of the repository.}
  \item{viewUrl}{string giving the prefix for the URL in links to the
    view pages.  The biocViews terms will be linked to views summary
    pages with this prefix.}
  \item{reposFullUrl}{string giving the full prefix for URLs in links
    generated on the package description pages.  The default is
    \code{reposUrl}.}
  \item{downloadStatsUrl}{string giving the prefix for the URL in links
    to the download history statistics pages.}
  \item{devHistoryUrl}{string giving the prefix for the URL in links to
    the development changelog.}
  \item{link.rel}{logical indicating whether the index page should
    generate relative URL links.  The default is \code{TRUE}.
If you are generating HTML for a remote repository, you will want to set this to \code{FALSE}.} \item{backgroundColor}{A character vector giving the background color for the body in the CSS file.} } \author{Seth Falcon} \keyword{utilities} biocViews/man/getPackageNEWS.Rd0000644000175000017500000000410214136047116016130 0ustar nileshnilesh\name{getPackageNEWS} \alias{getPackageNEWS} \alias{printNEWS} \title{ Retrieve and print package NEWS } \description{ These functions visit two Bioconductor releases, identifying packages that are present in the \sQuote{current} repository and have NEWS since the base version of the same package in the \sQuote{previous} release. All NEWS is reported for packages only in the current repository. } \usage{ getPackageNEWS(prevRepos="3.6", currRepos="3.7", repo=c("bioc", "data/experiment", "workflows"), srcdir = NULL) printNEWS(dbs, destfile, overwrite = FALSE, width = 68, output=c("md", "text"), relativeLink=FALSE, ...) } \arguments{ \item{prevRepos}{\code{character(1)} Bioconductor version from which NEWS starts.} \item{currRepos}{\code{character(1)} Bioconductor version for current packages.} \item{repo}{\code{character(1)} Which repository to get NEWS for. 
\code{bioc} is for software packages, \code{data/experiment} is for
    experiment data packages, and \code{workflows} is for workflow
    packages.}
  \item{srcdir}{Path to a local checkout of the package repositories; if
    \code{NULL}, files on the main builders are used.}
  \item{dbs}{A list of \code{news_db} elements, as returned by
    \code{getPackageNEWS}.}
  \item{destfile}{\code{character(1)} file path to the location where
    NEWS will be printed.}
  \item{overwrite}{\code{logical(1)} indicating whether \code{destfile}
    can be over-written, if it exists.}
  \item{width}{\code{numeric(1)} number of characters news items are to
    be wrapped to, excluding indent.}
  \item{output}{\code{character(1)} output to text or markdown format.}
  \item{relativeLink}{Should links to packages be relative links on the
    bioconductor.org website, or include the full url
    'https://bioconductor.org'?  The default, \code{FALSE}, uses the
    full url.}
  \item{...}{additional arguments, unused.}
}
\value{
  A list of \code{news_db} files, as returned by \code{utils::news}, for
  each package for which relevant NEWS is available.
}
\author{
  Martin Morgan \url{mtmorgan@fhcrc.org} and Lori Shepherd
}
\keyword{manip}% __ONLY ONE__ keyword per line
biocViews/man/getCurrentbiocViews.Rd0000644000175000017500000000134714136047116017405 0ustar nileshnilesh\name{getCurrentbiocViews}
\alias{getCurrentbiocViews}
\title{ Get a list of biocViews for each branch }
\description{
  This function returns a list containing all the biocViews that are
  present on the Bioconductor website.
}
\usage{
getCurrentbiocViews()
}
\details{
  It parses the dot file present inside the biocViews package.
}
\value{
  It returns a named list with 3 components.
  \item{Software}{biocViews from the software branch}
  \item{ExperimentData}{biocViews from the ExperimentData branch}
  \item{AnnotationData}{biocViews from the AnnotationData branch}
}
\author{
  Sonali Arora
}
\examples{
ans <- getCurrentbiocViews()
## only the first 6 from each branch are shown here.
lapply(ans, head) }biocViews/man/PackageDetail-class.Rd0000644000175000017500000001503714136047116017172 0ustar nileshnilesh\name{PackageDetail-class} \docType{class} \alias{PackageDetail-class} \alias{pdAuthorMaintainerInfo-class} \alias{pdVignetteInfo-class} \alias{pdDownloadInfo-class} \alias{pdDetailsInfo-class} \alias{pdDescriptionInfo-class} \alias{pdVigsAndDownloads-class} \alias{htmlDoc,PackageDetail-method} \alias{htmlFilename,PackageDetail-method} \alias{htmlValue,PackageDetail-method} \alias{htmlValue,pdAuthorMaintainerInfo-method} \alias{htmlValue,pdVignetteInfo-method} \alias{htmlValue,pdDownloadInfo-method} \alias{htmlValue,pdDetailsInfo-method} \alias{htmlValue,pdDescriptionInfo-method} \alias{htmlValue,pdVigsAndDownloads-method} \title{Class "PackageDetail"} \description{Representation of R package metadata. Most slots correspond to fields in a package's DESCRIPTION file.} \section{Objects from the Class}{ Objects can be created by calls of the form \code{new("PackageDetail", ...)}. 
} \section{Slots}{ \describe{ \item{\code{Package}:}{Object of class \code{"character"} see DESCRIPTION } \item{\code{Version}:}{Object of class \code{"character"} see DESCRIPTION } \item{\code{Title}:}{Object of class \code{"character"} see DESCRIPTION } \item{\code{Description}:}{Object of class \code{"character"} see DESCRIPTION } \item{\code{Author}:}{Object of class \code{"character"} see DESCRIPTION } \item{\code{Maintainer}:}{Object of class \code{"character"} see DESCRIPTION } \item{\code{Depends}:}{Object of class \code{"character"} see DESCRIPTION } \item{\code{Imports}:}{Object of class \code{"character"} see DESCRIPTION } \item{\code{Suggests}:}{Object of class \code{"character"} see DESCRIPTION } \item{\code{SystemRequirements}:}{Object of class \code{"character"} see DESCRIPTION } \item{\code{License}:}{Object of class \code{"character"} see DESCRIPTION } \item{\code{URL}:}{Object of class \code{"character"} see DESCRIPTION } \item{\code{biocViews}:}{Object of class \code{"character"} see DESCRIPTION } \item{\code{vignettes}:}{Object of class \code{"character"} giving paths to vignette pdf files in the repository } \item{\code{vignetteScripts}:}{Object of class \code{"character"} giving paths to vignette Stangled R files in the repository } \item{\code{vignetteTitles}:}{Object of class \code{"character"} giving the titles of the vignette files in the repository } \item{\code{source.ver}:}{Object of class \code{"character"} version string for the source package} \item{\code{win.binary.ver}:}{Object of class \code{"character"} version string for the 32-bit Windows binary package } \item{\code{mac.binary}:}{Object of class \code{"character"} version string for the macOS High Sierra binary package } \item{\code{mac.binary.mavericks.ver}:}{Object of class \code{"character"} version string for the OS X Mavericks binary package } \item{\code{mac.binary.el-capitan.ver}:}{Object of class \code{"character"} version string for the OS X El Capitan binary package 
}
    \item{\code{downloadStatsUrl}:}{Object of class \code{"character"}
      An optional URL for the download history statistics. }
    \item{\code{manuals}:}{Object of class \code{"character"} giving
      paths to reference manual pdf files in the repository}
    \item{\code{dependsOnMe}:}{Object of class \code{"character"} giving
      packages found in the repository that depend on this package}
    \item{\code{importsMe}:}{Object of class \code{"character"} giving
      packages found in the repository that import this package}
    \item{\code{suggestsMe}:}{Object of class \code{"character"} giving
      packages found in the repository that suggest this package}
    \item{\code{functionIndex}:}{Object of class \code{"character"} Not
      used.  Intended to hold function index data. }
    \item{\code{reposFullUrl}:}{Object of class \code{"character"} The
      full URL of the root of the repository. }
    \item{\code{reposRoot}:}{Object of class \code{"character"} The URL
      for the root of the repository. }
    \item{\code{viewRoot}:}{Object of class \code{"character"} The URL
      for the view of the repository. }
    \item{\code{devHistoryUrl}:}{Object of class \code{"character"} The
      URL for the development changelog. }
  }
}
\section{Extends}{
Class \code{"Htmlized"}, directly.
}
\section{Methods}{
  \describe{
    \item{htmlDoc}{\code{signature(object = "PackageDetail")}: Return an
      \code{XMLNode} instance containing a complete HTML document
      representation of the package.}
    \item{htmlFilename}{\code{signature(object = "PackageDetail")}:
      Return a filename appropriate for the HTML document
      representation. }
    \item{htmlValue}{\code{signature(object = "PackageDetail")}: Return
      an \code{XMLNode} instance containing an HTML representation of
      the package. }
  }
}
\section{Details}{
\code{pdAuthorMaintainerInfo-class}
\code{pdVignetteInfo-class}
\code{pdDownloadInfo-class}
\code{pdDetailsInfo-class}
\code{pdDescriptionInfo-class}
\code{pdVigsAndDownloads-class}
Dummy classes for HTML generation.
Each dummy class is a simple extension (it does not add any slots).
The purpose of each dummy class is to allow for method dispatch to
generate HTML via the \code{\link{htmlValue}} method.
You can convert a \code{PackageDetail} instance to one of the dummy
classes like this: \code{descInfo <- as(pdObj, "pdDescriptionInfo")}
}
\author{Seth Falcon}
\examples{
pd <- new("PackageDetail",
          Package="MyFancyPackage",
          Version="1.2.3",
          Title="A Fancy Package",
          Description="This package does fancy things",
          Author="A. Coder",
          Maintainer="A. Coder ",
          Depends="methods",
          Imports="ASimplePackage",
          Suggests="MyDataPackage",
          biocViews="Infrastructure",
          vignettes="vignettes/MyFancyPackage/inst/doc/MFP1.pdf,\nvignettes/MyFancyPackage/inst/doc/MFP2.pdf",
          vignetteScripts="vignettes/MyFancyPackage/inst/doc/MFP1.R\nvignettes/MyFancyPackage/inst/doc/MFP2.R",
          vignetteTitles="MFP1 Document,\nMFP2 Document",
          source.ver="src/contrib/MyFancyPackage_1.2.3.tar.gz",
          win.binary.ver="bin/windows/contrib/4.0/MyFancyPackage_1.2.2.zip",
          mac.binary.ver="bin/macosx/contrib/4.0/MyFancyPackage_1.2.3.tgz",
          dependsOnMe=c("PackageThatExposesMe"),
          importsMe=c("AnEvenFancierPackage","AMuchFancierPackage"),
          suggestsMe="PackageThatUsesMeInVignette",
          reposRoot="http://foo.bar.org")
html <- htmlValue(pd)
pd
}
\keyword{classes}
biocViews/man/write_REPOSITORY.Rd0000644000175000017500000000237614136047116016344 0ustar nileshnilesh\name{write_REPOSITORY}
\alias{write_REPOSITORY}
\title{Write a REPOSITORY control file for a CRAN-style package repository}
\description{
  This function writes a \code{REPOSITORY} file at the top-level of a
  CRAN-style repository.  This file is DCF formatted and describes the
  location of packages available in the repository.
Here is an example for a repository containing source packages, and Windows and Mac binary packages: \preformatted{ source: src/contrib win.binary: bin/windows/contrib/4.0 mac.binary: bin/macosx/contrib/4.0 provides: source, win.binary, mac.binary } } \usage{ write_REPOSITORY(reposRootPath, contribPaths) } \arguments{ \item{reposRootPath}{character vector containing the path to the CRAN-style repository root directory.} \item{contribPaths}{A named character vector. Valid names are \code{source}, \code{win.binary}, \code{mac.binary}, \code{mac.binary.mavericks}, and \code{mac.binary.el-capitan}. Values indicate the paths to the package archives relative to the \code{reposRoot}.} } \author{Seth Falcon} \seealso{ \code{\link[tools:writePACKAGES]{write_PACKAGES}}, \code{\link{extractVignettes}}, \code{\link{genReposControlFiles}}, \code{\link{write_VIEWS}} } \keyword{utilities} biocViews/man/Htmlized-class.Rd0000644000175000017500000000076214136047116016273 0ustar nileshnilesh\name{Htmlized-class} \docType{class} \alias{Htmlized-class} \alias{htmlDoc,Htmlized-method} \title{Class "Htmlized"} \description{A virtual class for HTML serialization method dispatch. } \section{Objects from the Class}{A virtual Class: No objects may be created from it.} \section{Methods}{ \describe{ \item{htmlDoc}{\code{signature(object = "Htmlized")}: Return the html-ized representation of \code{object} as a complete HTML document.} } } \author{Seth Falcon} \keyword{classes} biocViews/man/writeRFilesFromVignettes.Rd0000644000175000017500000000140114136047116020353 0ustar nileshnilesh\name{writeRFilesFromVignettes} \alias{writeRFilesFromVignettes} \title{Write R files from vignettes} \description{ Ensures that .R files from vignette code chunks are written out. } \usage{ writeRFilesFromVignettes(reposRoot, reposUrl="..", viewUrl="../..", reposFullUrl=reposUrl, downloadStatsUrl="", devHistoryUrl="") } %- maybe also 'usage' for other objects documented here. 
\arguments{ \item{reposRoot}{Root directory of a CRAN-style repository} \item{reposUrl}{URL of repository} \item{viewUrl}{url of VIEWS file} \item{reposFullUrl}{Full URL of VIEWS file} \item{downloadStatsUrl}{URL to download stats page} \item{devHistoryUrl}{Dev history URL} } \keyword{utilities} biocViews/man/writeBiocViews.Rd0000644000175000017500000000130314136047116016345 0ustar nileshnilesh\name{writeBiocViews} \alias{writeBiocViews} \title{Write a list of BiocView objects to HTML} \description{ This function serializes a list of \code{\link{BiocView-class}} objects to a series of HTML files. } \usage{ writeBiocViews(bvList, dir, backgroundColor="transparent") } \arguments{ \item{bvList}{A list of \code{BiocView-class} objects} \item{dir}{A character vector giving the directory where the HTML files will be written.} \item{backgroundColor}{A character vector giving the background color for the body in the CSS file.} } \author{Seth Falcon} \seealso{ \code{\link{getBiocViews}}, \code{\link{genReposControlFiles}}, \code{\link{write_VIEWS}} } \keyword{utilities} biocViews/vignettes/0000755000175000017500000000000014140322302014335 5ustar nileshnileshbiocViews/vignettes/HOWTO-BCV.Rnw0000644000175000017500000001356114136047116016357 0ustar nileshnilesh%\VignetteIndexEntry{biocViews-HOWTO} % % NOTE -- ONLY EDIT THE .Rnw FILE!!! The .tex file is % likely to be overwritten. 
%
\documentclass[12pt]{article}
\usepackage{amsmath}
\usepackage[authoryear,round]{natbib}
\usepackage{hyperref}
\textwidth=6.2in
\textheight=8.5in
%\parskip=.3cm
\oddsidemargin=.1in
\evensidemargin=.1in
\headheight=-.3in
\newcommand{\scscst}{\scriptscriptstyle}
\newcommand{\scst}{\scriptstyle}
\newcommand{\Rfunction}[1]{{\texttt{#1}}}
\newcommand{\Robject}[1]{{\texttt{#1}}}
\newcommand{\Rpackage}[1]{{\textit{#1}}}
\newcommand{\Rmethod}[1]{{\texttt{#1}}}
\newcommand{\Rfunarg}[1]{{\texttt{#1}}}
\newcommand{\Rclass}[1]{{\textit{#1}}}
\textwidth=6.2in
\bibliographystyle{plainnat}
\begin{document}
%\setkeys{Gin}{width=0.55\textwidth}
\title{HOWTO generate biocViews HTML}
\author{S. Falcon and V.J. Carey}
\maketitle
<<>>=
library("biocViews")
library("Biobase")
@
\section{Overview}
The purpose of \Rpackage{biocViews} is to create HTML pages that
categorize packages in a Bioconductor package repository according to
terms, or \textit{views}, in a controlled vocabulary.
The fundamental resource is the VIEWS file placed at the root of a
repository.  This file contains the complete DESCRIPTION file contents
for each package along with additional metadata describing the location
of package artifacts such as archive files for different platforms and
vignettes.
The standard behavior of the view generation program is to query the
repository over the internet.  This package includes a static sample
VIEWS file so that the examples in this document can run without
internet access.
\section{Establishing a vocabulary of terms}
We use \texttt{dot} to describe the vocabulary.  For details on the
\texttt{dot} syntax, see \url{http://www.graphviz.org/doc/info/lang.html}.
<<>>=
vocabFile <- system.file("dot/biocViewsVocab.dot", package="biocViews")
cat(readLines(vocabFile)[1:20], sep="\n")
cat("...\n")
@
The dot description is transformed to a GXL document using
\texttt{dot2gxl}, a tool included in the graphviz distribution.
The GXL is then converted to a \Rclass{graphNEL} instance using
\Rfunction{fromGXL} from the \Rpackage{graph} package.  There is a
helper script in the root of the \Rpackage{biocViews} package called
\texttt{updateVocab.sh} that automates the update process if the
required tools are available.
The script will also attempt to dump the ontology graph into a local
SQLite database using tools from \Rpackage{DBI} and \Rpackage{RSQLite}.
The information in this database can be used to create a dynamic HTML
representation of the graph by means of a PHP script.
The definition of the vocabulary lacks a notion of order.  Since the
purpose of the vocabulary is primarily for display, a valuable
improvement would be to use graph attributes to allow the ordering of
the terms.  Another missing piece is a place to put a text description
of each term.  This could also be achieved using graph attributes.
\subsection{Use Case: adding a term to the vocabulary}
To add a new term to the vocabulary:
\begin{enumerate}
\item edit the \textit{dot} file \texttt{dot/biocViewsVocab.dot} and add
  the desired term.  Note that terms cannot contain spaces and that the
  underscore character, \verb+_+, should be used instead.
\item ensure that R and dot2gxl are on your PATH.
\item cd into the biocViews working copy directory.
\item run the updateVocab.sh script.
\item reinstall the package and test that the new term is part of the
  vocabulary.  In short, you will load the data using
  \texttt{data(biocViewsVocab)} and check that the term is a node of
  the graph instance.
\item commit changes to svn.
\end{enumerate}
\subsection{Use Case: updating the Bioconductor website}
This is for the Bioconductor web administrator:
\begin{enumerate}
\item update the local copy of biocViews using \texttt{svn update}.
\item find the R instance that is used to generate HTML pages on the
  Bioconductor website, and install the updated \texttt{biocViews}.
\item re-generate the related HTML pages by using
  \texttt{/home/biocadmin/bin/prepareRepos-*.sh} and
  \texttt{/home/biocadmin/bin/pushRepos-*.sh}.
\end{enumerate}
\section{Querying a repository}
To generate a list of \Rclass{BiocViews} objects that can be used to
generate HTML views, you will need the repository URL and a graph
representation of the vocabulary.
There are three main Bioconductor package repositories: a software
repository containing analytic packages, an annotation data repository,
and an experiment data repository.  The vocabulary of terms has a single
top-level node; all other nodes have at least one parent.  The top-level
node, \textit{BiocViews}, has three children that correspond to the
three main Bioconductor repositories: \textit{Software},
\textit{AnnotationData}, and \textit{ExperimentData}.  Views for each
repository are created separately using \Rfunction{getBiocSubViews}.
Below, we demonstrate how to build the \textit{Software} set of views.
<<>>=
data(biocViewsVocab)
reposPath <- system.file("doc", package="biocViews")
reposUrl <- paste("file://", reposPath, sep="")
biocViews <- getBiocSubViews(reposUrl, biocViewsVocab, topTerm="Software")
print(biocViews[1:2])
@
To query the currently available vocabulary terms, use function
\Rfunction{getSubTerms} on the \Rclass{graphNEL} object
\Robject{biocViewsVocab}.  The second argument of this function takes a
character string giving the base term for which all subterms should be
returned.  For a complete list use \Rfunarg{term="BiocViews"}.
<<>>=
getSubTerms(biocViewsVocab, term="Technology")
@
\section{Generating HTML}
By default, the set of HTML views will link to package description
pages located in the html subdirectory of the remote repository.
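For a fully local preview, the \texttt{local} and \texttt{htmlDir} arguments documented on the \texttt{getBiocViews} help page can be used instead. A minimal sketch, reusing the sample VIEWS file shipped with the package (the \texttt{"NoViewProvided"} default view follows the help-page example):

```r
## Sketch: build views with relative (local) package links.
## local=TRUE makes links relative; htmlDir gives the relative
## path where the per-package HTML pages are expected to live.
library("biocViews")
data(biocViewsVocab)
reposPath <- system.file("doc", package="biocViews")
reposUrl <- paste("file://", reposPath, sep="")
localViews <- getBiocViews(reposUrl, biocViewsVocab, "NoViewProvided",
                           local=TRUE, htmlDir="html")
```

The resulting list can then be written out with \texttt{writeBiocViews} as shown below.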
<<>>=
viewsDir <- file.path(tempdir(), "biocViews")
dir.create(viewsDir)
writeBiocViews(biocViews, dir=viewsDir)
dir(viewsDir)[1:2]
@
\end{document}
biocViews/vignettes/createReposHtml.Rnw0000644000175000017500000001565314136047116020154 0ustar nileshnilesh%\VignetteIndexEntry{biocViews-CreateRepositoryHTML}
%
% NOTE -- ONLY EDIT THE .Rnw FILE!!!  The .tex file is
% likely to be overwritten.
%
\documentclass[12pt]{article}
\usepackage{amsmath}
\usepackage[authoryear,round]{natbib}
\usepackage{hyperref}
\textwidth=6.2in
\textheight=8.5in
\oddsidemargin=.1in
\evensidemargin=.1in
\headheight=-.3in
\newcommand{\scscst}{\scriptscriptstyle}
\newcommand{\scst}{\scriptstyle}
\newcommand{\Rfunction}[1]{{\texttt{#1}}}
\newcommand{\Robject}[1]{{\texttt{#1}}}
\newcommand{\Rpackage}[1]{{\textit{#1}}}
\newcommand{\Rmethod}[1]{{\texttt{#1}}}
\newcommand{\Rfunarg}[1]{{\texttt{#1}}}
\newcommand{\Rclass}[1]{{\textit{#1}}}
\textwidth=6.2in
\bibliographystyle{plainnat}
\begin{document}
%\setkeys{Gin}{width=0.55\textwidth}
\title{HOWTO generate repository HTML}
\author{S. Falcon}
\maketitle
<<>>=
library("biocViews")
@
\section{Overview}
This document assumes you have a collection of R packages on local disk
that you would like to prepare for publishing to the web.  The end
result we are going for is:
\begin{enumerate}
\item Packages organized per the CRAN-style repository standard
\item PACKAGES files created for \texttt{install.packages} access
\item A VIEWS file created for generating biocViews
\item A vignette directory created containing the extracted vignette pdf
  files from each source package in the repository.
\item An html directory created containing html descriptions of each
  package with links for downloading available artifacts.
\item A simple alphabetical listing index.html file
\end{enumerate}
\section{CRAN-style Layout}
Establish a top-level directory for the repository; we will refer to
this directory as reposRoot.
Place your packages as follows:
\begin{description}
\item[src/contrib] Contains all source packages (*.tar.gz).
\item[bin/windows/contrib/x.y] Contains all win.binary packages (*.zip).
  Where x.y is the major.minor version number of R.
\item[bin/macosx/contrib/x.y] Contains the mac.binary (High Sierra)
  (*.tgz) packages.
\end{description}
You will need the following parameters:
<<>>=
reposRoot <- "path/to/reposRoot"
## The names are essential
contribPaths <- c(source="src/contrib",
                  win.binary="bin/windows/contrib/4.0",
                  mac.binary="bin/macosx/contrib/4.0")
@
\section{Extracting vignettes}
The \Rfunction{extractVignettes} function extracts pdf files from
inst/doc.  The default is to extract to reposRoot/vignettes.
<<>>=
extractVignettes(reposRoot, contribPaths["source"])
@
\section{Generating the control files}
The \Rfunction{genReposControlFiles} function will generate the PACKAGES
files for each contrib path and also create a VIEWS file with complete
info for later use by biocViews.
<<>>=
genReposControlFiles(reposRoot, contribPaths)
@
\section{Generating the HTML}
The \Rfunction{writeRepositoryHtml} function will generate HTML detail
files for each package in reposRoot/html.  The function will also create
an index.html file at the top level.
Two CSS files are included with \Rpackage{biocViews} that are
automatically copied alongside the appropriate HTML files during the
HTML generation process.  These CSS files are:
\begin{verbatim}
reposRoot/repository-detail.css
reposRoot/html/package-detail.css
\end{verbatim}
\section{Design and extension notes}
The basic idea is that using the VIEWS file and the known repository
structure (location of packages and extracted vignettes), we represent
the details for each package in the repository in a
\Rclass{PackageDetail-class} instance.  \Rclass{PackageDetail-class}
objects know how to write themselves to HTML using the
\Rmethod{htmlValue} method.
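Putting the preceding sections together, a complete (non-evaluated) repository build might look like the following sketch; the repository root path and title are assumptions, while the function calls follow the signatures documented in this package:

```r
## Sketch of a full repository build; "/path/to/reposRoot" and the
## title are placeholder assumptions.
library("biocViews")
reposRoot <- "/path/to/reposRoot"
contribPaths <- c(source="src/contrib",
                  win.binary="bin/windows/contrib/4.0",
                  mac.binary="bin/macosx/contrib/4.0")
extractVignettes(reposRoot, contribPaths["source"])   # pull out pdfs
genReposControlFiles(reposRoot, contribPaths)         # PACKAGES + VIEWS
writeRepositoryHtml(reposRoot, title="My Package Repository")
```

Each step only depends on the files written by the previous one, so the three calls can be re-run independently as packages are added.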
We used the \Rpackage{XML} package's \Rfunction{xmlOutputDOM} function
to build up the HTML documents.  Each HTML-producing class extends
\Rclass{Htmlized-class}, which contains a slot to hold the DOM tree and
provides a place to put methods that are not specific to any given HTML
outputting class.
In terms of extending this to generate the biocViews, have a look at
\Rfunction{setDependsOnMeImportsMeSuggestsMe} which builds up an
adjacency matrix representing package dependencies, importations, and
suggestions.  The matrix is square with rows and columns labeled with
the names of the packages.  The entries are 0/1 with $a_{ij}=1$ meaning
that package $j$ depends on package $i$.
\subsection{Details on HTML generation}
I started by breaking the \Rmethod{htmlValue} method for
\Rclass{PackageDetail-class} into one helper function for each logical
section of the HTML we produce (author, description, details, downloads,
and vignettes).  That made the long method short enough to be readable.
To be able to mix and match the different chunks and more easily create
new renderings, it seemed easiest to render each chunk to HTML with a
method.  One possibility is a function \Rfunction{htmlChunk(object,
``descriptions'')} where the dispatch would be done using a switch
statement or similar.
A more flexible approach is to create dummy classes for each output
``chunk''.  Each dummy class contains (subclasses)
\Rclass{PackageDetail} and that's it.  We can then take advantage of the
behavior of the \Rmethod{as} method to convert.
<<>>=
## Define classes like this for each logical document chunk
setClass("pdAuthorMaintainerInfo", contains="PackageDetail")
setClass("pdVignetteInfo", contains="PackageDetail")

## Then define a htmlValue method
setMethod("htmlValue", signature(object="pdDescriptionInfo"),
          function(object) {
              node <- xmlNode("p", cleanText(object@Description),
                              attrs=c(class="description"))
              node
          })

## Then you can make use of all this...
## Assume object contains a PackageDetail instance
authorInfo <- as(object, "pdAuthorMaintainerInfo")
dom$addNode(htmlValue(authorInfo))
@
One advantage of this setup is that we can now define a method to
generate complete HTML documents that will work for all the dummy
classes.  Hence mix and match.
\subsection{A note on the htmlValue method for PackageDetail}
We could parameterize as follows.  Not sure this makes things easier to
follow, but it does demonstrate how you could start building up
documents in a more programmatic fashion.
\begin{verbatim}
details <- list(heading=list(tag="h3", text="Details"),
                content="pdDetailsInfo")
downloads <- list(heading=list(tag="h3", text="Download Package"),
                  content="pdDownloadInfo")
vignettes <- list(heading=list(tag="h3", text="Vignettes (Documentation)"),
                  content="pdVignetteInfo")

doSection <- function(sec) {
    dom$addTag(sec$heading$tag, sec$heading$text)
    secObj <- as(object, sec$content)
    dom$addNode(htmlValue(secObj))
}
lapply(list(details, downloads, vignettes), doSection)
\end{verbatim}
\end{document}
biocViews/build/0000755000175000017500000000000014140322302013424 5ustar nileshnileshbiocViews/build/vignette.rds0000644000175000017500000000035614140322302015767 0ustar nileshnileshbiocViews/tests/0000755000175000017500000000000014136047116013503 5ustar nileshnileshbiocViews/tests/runTests.R0000644000175000017500000000004714136047116015456 0ustar 
nileshnileshBiocGenerics:::testPackage("biocViews")biocViews/R/0000755000175000017500000000000014140033373012535 5ustar nileshnileshbiocViews/R/showvoc.R0000644000175000017500000000066414136047116014363 0ustar nileshnilesh showVoc <- function(g,outfile=tempfile()) { top <- adj(g, nodes(g)[1])[[1]] dd <- xmlTree("a") dd$addTag("body", close=FALSE) for (i in 1:length(top)) { dd$addTag("H2", top[i]) nxt <- adj(g, top[i])[[1]] if (length(nxt)>0) { dd$addTag("UL", close=FALSE) for (j in 1:length(nxt)) dd$addTag("LI", nxt[j]) dd$closeTag() } } dd$closeTag() cat(saveXML(dd$value()),sep="\n",file=outfile) } biocViews/R/as-methods.R0000644000175000017500000000016314136047116014731 0ustar nileshnileshsetAs("BiocView", "rdPackageTable", function(from) { as(as(from, "RepositoryDetail"), "rdPackageTable") }) biocViews/R/AllGenerics.R0000644000175000017500000000036114136047116015055 0ustar nileshnileshsetGeneric("htmlDoc", function(object, ...) standardGeneric("htmlDoc")) setGeneric("htmlValue", function(object) standardGeneric("htmlValue")) setGeneric("htmlFilename", function(object, ...) standardGeneric("htmlFilename")) biocViews/R/htmlValue-methods.R0000644000175000017500000004343714136047116016302 0ustar nileshnileshwriteHtmlDoc <- function(html, file) { ## Temporary fix: we open and close 'file' here instead of passing it ## directly to saveXML because of a bug in current XML::saveXML ## (from XML 1.3-2). Bug reported to XML's author on 2006-12-14. Herve. f <- file(file, open="w") ## another temp fix: write the DOCTYPE header here, perhaps we should ## use prefix for this in the call to saveXML? 
    writeLines(paste('<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"',
                     '"http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">'),
               con=f)
    saveXML(html, f, prefix="")
    close(f)
}

tableHelper <- function(tableData, table.attrs) {
    dom <- xmlOutputDOM("table", attrs=table.attrs)
    odd <- TRUE
    for (fld in names(tableData)) {
        rowClass <- if(odd) "row_odd" else "row_even"
        odd <- !odd
        dom$addTag("tr", attrs=c(class=rowClass), close=FALSE)
        dom$addTag("th", fld)
        dom$addTag("td", tableData[[fld]])
        dom$closeTag()
    }
    dom$closeTag() ## end details table
    dom$value()
}

cleanText <- function(text) {
    text <- gsub("&([a-zA-Z0-9#]+;)", "@_@_@\\1", text)
    text <- gsub("&", "&amp;", text, fixed=TRUE)
    text <- gsub("@_@_@([a-zA-Z0-9#]+;)", "&\\1", text)
    text <- gsub("<", "&lt;", text, fixed=TRUE)
    text <- gsub(">", "&gt;", text, fixed=TRUE)
    text
}

setMethod("htmlValue", signature(object="rdPackageTable"),
          function(object) {
              dom <- xmlOutputDOM("table", attrs=c(class="repos_index"))
              odd <- TRUE
              alphaOrder <- order(tolower(names(object@packageList)))
              dom$addTag("tr", close=FALSE)
              dom$addTag("th", "Package")
              dom$addTag("th", "Maintainer")
              dom$addTag("th", "Title")
              dom$closeTag()
              for (pkg in object@packageList[alphaOrder]) {
                  rowClass <- if(odd) "row_odd" else "row_even"
                  odd <- !odd
                  dom$addTag("tr", attrs=c(class=rowClass), close=FALSE)
                  dom$addTag("td", attrs=c(class="package"), close=FALSE)
                  if (length(object@reposRoot) > 0)
                      root <- paste(object@reposRoot, object@htmlDir, sep="/")
                  else
                      root <- object@htmlDir
                  infoPage <- paste(root, htmlFilename(pkg), sep="/")
                  dom$addTag("a", attrs=c(href=infoPage), pkg@Package)
                  dom$closeTag()
                  dom$addTag("td", removeEmail(pkg@Maintainer),
                             attrs=c(class="maintainer"))
                  dom$addTag("td", pkg@Title, attrs=c(class="title"))
                  dom$closeTag() ## end tr
              }
              dom$value()
          })

setMethod("htmlValue", signature(object="RepositoryDetail"),
          function(object) {
              dom <- xmlOutputDOM("div", attrs=c(class="RepositoryDetail"))
              dom$addTag("h1", cleanText(object@Title))
              ## Package table
              pkgTable <- as(object, "rdPackageTable")
              dom$addNode(htmlValue(pkgTable))
              dom$value()
          })

setMethod("htmlValue",
signature(object="pdAuthorMaintainerInfo"), function(object) { dom <- xmlOutputDOM("table", attrs=c(class="author_info")) dom$addTag("tr", close=FALSE) dom$addTag("td", "Author") dom$addTag("td", cleanText(removeEmail(object@Author))) dom$closeTag() dom$addTag("tr", close=FALSE) dom$addTag("td", "Maintainer") dom$addTag("td", cleanText(removeEmail(object@Maintainer))) dom$closeTag() dom$value() }) setMethod("htmlValue", signature(object="pdVignetteInfo"), function(object) { dom <- xmlOutputDOM("table", attrs=c(class="vignette")) odd <- TRUE rowClass <- "row_odd" if (length(object@vignettes) > 0) { vignetteTitles <- ifelse(nzchar(object@vignetteTitles), object@vignetteTitles, basename(object@vignettes)) for (i in order(vignetteTitles)) { rowClass <- if(odd) "row_odd" else "row_even" dom$addTag("tr", attrs=c(class=rowClass), close=FALSE) dom$addTag("th", vignetteTitles[i]) dom$addTag("td", close=FALSE) pdflink <- paste(object@reposRoot, object@vignettes[i], sep="/") dom$addTag("a", "PDF", attrs=c(href=pdflink)) dom$closeTag() if (nchar(object@vignetteScripts[i]) > 0) { dom$addTag("td", close=FALSE) Rlink <- paste(object@reposRoot, object@vignetteScripts[i], sep="/") dom$addTag("a", "R Script", attrs=c(href=Rlink)) dom$closeTag() } dom$closeTag() ## end tr odd <- !odd } } else { dom$addTag("tr", attrs=c(class=rowClass), close=FALSE) dom$addTag("td", "No vignettes available") dom$closeTag() odd <- !odd } rowClass <- if(odd) "row_odd" else "row_even" if (length(object@manuals) > 0 && !is.na(object@manuals[1])) { dom$addTag("tr", attrs=c(class=rowClass), close=FALSE) dom$addTag("td", close=FALSE) mlink <- paste(object@reposRoot, object@manuals[1], sep="/") dom$addTag("a", "Reference Manual", attrs=c(href=mlink)) dom$closeTag() dom$closeTag() ## end tr odd <- !odd } else { dom$addTag("tr", attrs=c(class=rowClass), close=FALSE) dom$addTag("td", "No reference manual available") dom$closeTag() odd <- !odd } dom$value() }) setMethod("htmlValue", 
signature(object="pdDownloadInfo"), function(object) { flds <- c(source="source.ver", win.binary="win.binary.ver", mac.binary="mac.binary.ver", mac.binary.mavericks="mac.binary.mavericks.ver", `mac.binary.el-capitan`="mac.binary.el-capitan.ver") fileTypes <- list(source="Package source", win.binary="Windows 32-bit binary", mac.binary="macOS 10.13 (High Sierra) binary", mac.binary.mavericks="MacOS X 10.9 (Mavericks) binary", `mac.binary.el-capitan`="MacOS X 10.11 (El Capitan) binary") makeLinkHelper <- function(type) { isAvailable = TRUE archs <- slot(object, "Archs") if (length(archs) > 0 && nchar(archs) > 0) { if (type == "win.binary") { if (length(grep("i386", archs, value=TRUE)) == 0) { isAvailable = FALSE } } } pkgPath <- slot(object, flds[type]) if (isAvailable && !is.na(pkgPath) && length(pkgPath) > 0 && pkgPath != "") { ref <- paste(object@reposRoot, pkgPath, sep="/") aTag <- xmlNode("a", basename(pkgPath), attrs=c(href=ref)) } else { aTag <- "Not Available" } aTag } fileLinks <- lapply(names(fileTypes), makeLinkHelper) names(fileLinks) <- fileTypes downloadStatsUrl <- slot(object, "downloadStatsUrl") if ((length(downloadStatsUrl) == 1) && (nchar(downloadStatsUrl) > 0)) { fileLinks <- c(fileLinks, list("Package Downloads Report" = xmlNode("a", "Downloads Stats", attrs=c(href=paste(downloadStatsUrl, "/", slot(object, "Package"), ".html", sep=""))))) } domValue <- tableHelper(fileLinks, table.attrs=list(class="downloads")) domValue }) setMethod("htmlValue", signature(object="pdDetailsInfo"), function(object) { ## link generating functions buildLinks <- function(x, root, class, check = FALSE) { nodes <- lapply(x, function(y) { urlError <- FALSE if (nchar(y) == 0 || length(root) == 0) { urlError <- TRUE } else { if (check) { oldWarn <- options()[["warn"]] options(warn = -1) for (i in seq_len(length(root))) { link <- paste(root[i], "/", y, ".html", sep="") con <- try(url(link, "r"), silent = TRUE) if (class(con)[[1]] != "try-error") break; } options(warn = 
oldWarn) if (class(con)[[1]] == "try-error") { urlError <- TRUE } else { close(con) } } else { link <- paste(root[1], "/", y, ".html", sep="") } } if (urlError) { node <- y } else { node <- xmlNode("a", y, attrs=c(href=link)) } return(node) }) if (length(nodes) == 0) { args <- list() } else if (length(nodes) == 1) { args <- nodes } else { args <- vector("list", 2*length(nodes) - 1) args[seq(1, 2*length(nodes) - 1, by = 2)] <- nodes args[seq(2, 2*(length(nodes) - 1), by = 2)] <- list(", ") } args <- c(list(name = "div"), args, list(attrs = c(class=class))) return(do.call(xmlNode, args)) } buildViewLinks <- function(x) buildLinks(x, object@viewRoot, class="views") buildPkgLinks <- function(x) buildLinks(x, paste(object@reposFullUrl, "/html", sep=""), class="packages", check=TRUE) buildURLLink <- function(u) { if (!length(u) || nchar(u) == 0) node <- "" else node <- xmlNode("a", u, attrs=c(href=u)) return(node) } ## create list elements for fields flds <- c("biocViews"="biocViews", "Depends"="Depends", "Imports"="Imports", "Suggests"="Suggests", "System Requirements"="SystemRequirements", "License"="License", "URL"="URL", "Depends On Me"="dependsOnMe", "Imports Me"="importsMe", "Suggests Me"="suggestsMe", "Development History"="devHistoryUrl") tableDat <- vector("list", length = length(flds)) names(tableDat) <- flds ## add biocViews info tableDat[["biocViews"]] <- buildViewLinks(object@biocViews) ## add Depends, Imports, Suggests, dependsOnMe, importsMe, suggestsMe pkgFlds <- c("Depends", "Imports", "Suggests", "dependsOnMe", "importsMe", "suggestsMe") tableDat[pkgFlds] <- lapply(pkgFlds, function(x) buildPkgLinks(slot(object, x))) ## add SystemRequirements and License info otherFlds <- c("SystemRequirements", "License") tableDat[otherFlds] <- lapply(otherFlds, function(x) paste(slot(object, x), collapse=", ")) ## add URL info tableDat[["URL"]] <- buildURLLink(object@URL) ## add development history devHistoryUrl <- object@devHistoryUrl if ((length(devHistoryUrl) == 1) 
&& (nchar(devHistoryUrl) > 0)) { tableDat[["devHistoryUrl"]] <- xmlNode("a", "Bioconductor Changelog", attrs=c(href=paste(devHistoryUrl, "/", object@Package, sep=""))) } else { flds <- flds[- match("devHistoryUrl", flds)] tableDat[["devHistoryUrl"]] <- NULL } ## rename rows names(tableDat) <- names(flds) domValue <- tableHelper(tableDat, table.attrs=list(class="details")) domValue }) setMethod("htmlValue", signature(object="pdDescriptionInfo"), function(object) { node <- xmlNode("p", cleanText(object@Description), attrs=c(class="description")) node }) setMethod("htmlValue", signature(object="PackageDetail"), function(object) { dom <- xmlOutputDOM("div", attrs=c(class="PackageDetail")) ## Heading dom$addTag("h1", object@Package) dom$addTag("h2", cleanText(object@Title)) ## Description descInfo <- as(object, "pdDescriptionInfo") dom$addNode(htmlValue(descInfo)) ## Author info authorInfo <- as(object, "pdAuthorMaintainerInfo") dom$addNode(htmlValue(authorInfo)) ## Installation Instructions dom$addTag("div", attrs=c(class="installInstruct"), close=FALSE) dom$addTag("p", paste("To install this package,", "start R and enter:"), attrs=c(class="install")) dom$addTag("pre", paste(" if (!require(\"BiocManager\"))", "\n install.packages(\"BiocManager\")", "\n BiocManager::install(\"", object@Package, "\")", sep="") ) dom$closeTag() # div ## Documentation dom$addTag("h3", "Documentation") vigInfo <- as(object, "pdVignetteInfo") dom$addNode(htmlValue(vigInfo)) ## Details dom$addTag("h3", "Details") detailsInfo <- as(object, "pdDetailsInfo") dom$addNode(htmlValue(detailsInfo)) ## Package Downloads dom$addTag("h3", "Package Downloads") downloadInfo <- as(object, "pdDownloadInfo") dom$addNode(htmlValue(downloadInfo)) return(dom$value()) }) viewsHelper <- function(views) { dom <- xmlOutputDOM("ul") for (v in views) { link <- htmlFilename(v) dom$addTag("li", close=FALSE) dom$addTag("a", v, attrs=c(href=link)) dom$closeTag() } dom$value() } setMethod("htmlValue", 
signature(object="bvSubViews"), function(object) { dom <- xmlOutputDOM("div", attrs=c(class="bv_subviews")) dom$addTag("h2", "Subviews") dom$addNode(viewsHelper(object@subViews)) dom$value() }) setMethod("htmlValue", signature(object="bvParentViews"), function(object) { dom <- xmlOutputDOM("div", attrs=c(class="bv_parentviews")) dom$addTag("h2", "Subview of") dom$addNode(viewsHelper(object@parentViews)) dom$value() }) setMethod("htmlValue", signature(object="BiocView"), function(object) { dom <- xmlOutputDOM("div", attrs=c(class="BiocView")) ## Heading dom$addTag("h1", paste("Bioconductor Task View:", object@name)) ## Parent Views if (length(object@parentViews) > 0) { parentViews <- as(object, "bvParentViews") dom$addNode(htmlValue(parentViews)) } ## Subviews if (length(object@subViews) > 0) { subViews <- as(object, "bvSubViews") dom$addNode(htmlValue(subViews)) } dom$addTag("h2", "Packages in view") if (length(object@packageList) > 0) { pkgTable <- as(object, "rdPackageTable") dom$addNode(htmlValue(pkgTable)) } else { dom$addTag("p", "No packages in this view") } dom$value() }) biocViews/R/getPackageNEWS.R0000644000175000017500000002217114136047116015420 0ustar nileshnilesh.msg <- function(fmt, ..., width=getOption("width")) ## Use this helper to format all error / warning / message text { strwrap(sprintf(fmt, ...), width=width, exdent=4) } ## collate package NEWS files using starting version number in ## prevRepos, and membership in currRepos as references. 
Package ## source tree rooted at srcDir, possibly as tarred files
# repo: bioc data/experiment workflows
getPackageNEWS <- function(prevRepos="3.6", currRepos="3.7",
                           repo=c("bioc", "data/experiment", "workflows"),
                           srcdir=NULL){
    repo <- match.arg(repo)
    URL_BASE <- "http://master.bioconductor.org/packages/"
    VIEWS <- "%s%s/%s/VIEWS"
    prevUrl <- url(sprintf(VIEWS, URL_BASE, prevRepos, repo))
    prev <- read.dcf(prevUrl, fields=c("Package", "Version"))
    rownames(prev) <- prev[,1]
    close(prevUrl)
    currUrl <- url(sprintf(VIEWS, URL_BASE, currRepos, repo))
    curr <- read.dcf(currUrl, fields=c("Package", "Version"))
    rownames(curr) <- curr[,1]
    close(currUrl)
    prev <- prev[rownames(prev) %in% rownames(curr),]
    newpkgs <- setdiff(rownames(curr), rownames(prev))
    idx <- package_version(curr[newpkgs, "Version"], strict=FALSE) >= "0.99.0"
    newpkgs <- newpkgs[idx]
    vers <- c(sub("\\.[[:digit:]]?$", ".0", prev[,"Version"]),
              setNames(rep("0.0", length(newpkgs)), newpkgs))
    if (is.null(srcdir)){
        temp = tempdir()
        system(paste0("scp -r webadmin@master.bioconductor.org:/extra/www/bioc/packages/",
                      currRepos, "/", repo, "/news ", temp))
        srcdir <- paste0(temp, "/news")
    }
    getNews <- function(pkg, ver, srcdir) {
        newsloc <- file.path(srcdir, pkg, c("inst", "inst", "inst", ".","."),
                             c("NEWS.Rd", "NEWS", "NEWS.md", "NEWS.md", "NEWS"))
        news <- head(newsloc[file.exists(newsloc)], 1)
        if (0L == length(news))
            return(NULL)
        tryCatch({
            db <- if (grepl("Rd$", news)){
                tools:::.build_news_db_from_package_NEWS_Rd(news)
            } else if (grepl("md$", news)){
                tools:::.build_news_db_from_package_NEWS_md(news)
            } else {
                tools:::.news_reader_default(news)
            }
            if (!is.null(db))
                utils::news(Version > ver, db=db)
            else NULL
        }, error=function(...)
NULL) } ret <- Filter(function(x) !is.null(x) && 0L != nrow(x), Map(getNews, names(vers), vers, srcdir)) nms <- names(ret) s <- sort(nms) newRet <- ret[s] } ## based on tools:::.build_news_db() getNEWSFromFile <- function (dir, destfile, format = NULL, reader = NULL, output=c("md", "text")) { mdIfy <- function(txt) { lines <- strsplit(txt, "\n") segs <- lines[[1]] segs <- sub("^ o +", "- ", segs) segs <- sub("^\t", " ", segs) return(paste(segs, collapse="\n")) } newsRdFile <- file.path(dir, "NEWS.Rd") ## should never be found newsRdFile2 <- file.path(dir, "inst", "NEWS.Rd") if (!file_test("-f", newsRdFile) && !file_test("-f", newsRdFile2)) { newsMdFile <- file.path(dir, "NEWS.md") newsMdFile2 <- file.path(dir, "inst", "NEWS.md") if (!file_test("-f", newsMdFile) && !file_test("-f", newsMdFile2)) { nfile <- file.path(dir, "NEWS") nfile2 <- file.path(dir, "inst", "NEWS") if (!file_test("-f", nfile) && !file_test("-f", nfile2)) return(invisible()) nfile <- ifelse(file_test("-f", nfile), nfile, nfile2) if (!is.null(format)) .NotYetUsed("format", FALSE) if (!is.null(reader)) .NotYetUsed("reader", FALSE) file <- file(destfile, "w+") on.exit(close(file)) news <- paste(readLines(nfile), collapse="\n") if ("md" == output) news = mdIfy(news) cat(news, file=file) return(invisible()) } newsMdFile <- ifelse(file_test("-f", newsMdFile), newsMdFile, newsMdFile2) file <- file(destfile, "w+") on.exit(close(file)) db <- tools:::.build_news_db_from_package_NEWS_md(newsMdFile) news <- NULL try(news <- capture.output(print(db))) if (is.null(news)) { message(sprintf("Error building news database for %s/%s", dir, destfile)) return(invisible()) } news <- paste(news, collapse="\n") if ("md" == output) news <- mdIfy(news) cat(news, file=file) return(invisible()) } newsRdFile <- ifelse(file_test("-f", newsRdFile), newsRdFile, newsRdFile2) file <- file(destfile, "w+") on.exit(close(file)) db <- tools:::.build_news_db_from_package_NEWS_Rd(newsRdFile) news <- NULL try(news <- 
capture.output(print(db))) if (is.null(news)) { message(sprintf("Error building news database for %s/%s", dir, destfile)) return(invisible()) } news <- paste(news, collapse="\n") if ("md" == output) news <- mdIfy(news) cat(news, file=file) return(invisible()) } printNEWS <- function(dbs, destfile, overwrite=FALSE, width=68, output=c("md", "text"), relativeLink=FALSE, ...) { output <- match.arg(output) dbs <- lapply(dbs, function(db) { db[["Text"]] <- sapply(db[["Text"]], function(elt) { elt <- unlist(strsplit(elt, "\n")) paste(strwrap(elt, width=options()[["width"]] - 10), collapse="\n") }) db }) urlBase <- ifelse(relativeLink, "/packages/","https://bioconductor.org/packages/") txt <- capture.output({ for (i in seq_along(dbs)) { tryCatch({ cat(sprintf( "\n[%s](%s%s)\n%s\n\n", names(dbs)[[i]], urlBase, names(dbs)[[i]], paste(rep("-", nchar(names(dbs)[[i]])), collapse=""))) print(dbs[[i]]) }, error=function(err) { warning("print() failed for ", sQuote(names(dbs)[[i]]), immediate.=TRUE, call.=FALSE) }) } }) if ("md" == output) { txt <- sub("^ o ", "-", txt) txt <- sub("^\t", " ", txt) } if (!is(destfile, "connection")) { if (file.exists(destfile) && !overwrite) stop(.msg("'%s' exists and overwrite=FALSE", destfile)) file <- file(destfile, "w+") on.exit(close(file)) } else file = destfile writeLines(txt, file) } # manifest: software.txt data-experiment.txt workflows.txt # status: new or removed getPackageTitles <- function(prevBranch="RELEASE_3_6", currBranch="master", manifest=c("software.txt", "data-experiment.txt", "workflows.txt"), status = c("new", "removed")){ manifest <- match.arg(manifest) status <- match.arg(status) GIT_ARCHIVE <- "git archive --remote=ssh://git@git.bioconductor.org/admin/manifest %s %s | tar -xO" prevRepo <- system(sprintf(GIT_ARCHIVE, prevBranch, manifest), intern=TRUE) prevRepo <- trimws(gsub(pattern = "Package: ", replacement="", prevRepo[-which(prevRepo=="")])) currRepo <- system(sprintf(GIT_ARCHIVE, currBranch, manifest), intern=TRUE) 
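    ## Illustrative sketch (not part of the original source; hypothetical
    ## package names): the manifests read above are plain package lists, and
    ## the setdiff() calls below compare them, e.g.
    ##   prevRepo <- c("pkgA", "pkgB"); currRepo <- c("pkgB", "pkgC")
    ##   setdiff(currRepo, prevRepo)   # "pkgC" -> newly added
    ##   setdiff(prevRepo, currRepo)   # "pkgA" -> removed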
currRepo <- trimws(gsub(pattern = "Package: ", replacement="", currRepo[-which(currRepo=="")])) # switch statement pkgs <- switch(status, new = setdiff(currRepo, prevRepo), removed = setdiff(prevRepo, currRepo) ) pkgs } printNewPackageTitles <- function(titles, destfile, overwrite=FALSE) { if (!is(destfile, "connection")) { if (file.exists(destfile) && !overwrite) stop(.msg("'%s' exists and overwrite=FALSE", destfile)) file <- file(destfile, "w+") on.exit(close(file)) } else file = destfile cat(strwrap(sprintf("\n- %s: %s", names(titles), titles), width=70, exdent=2), file=stdout(), sep="\n") } getPackageDescriptions <- function(pkgs, outfile, output=c("md", "text"),relativeLink=FALSE) { output <- match.arg(output) if (output == "text") exdent = 4 else exdent = 2 plower <- tolower(pkgs) names(plower) <- pkgs pkgs <- names(sort(plower)) file <- tempfile() DESC_FILE <- "git archive --remote=ssh://git@git.bioconductor.org/packages/%s master DESCRIPTION|tar -xO > %s" urlBase <- ifelse(relativeLink, "/packages/","https://bioconductor.org/packages/") desc = lapply(pkgs, function(pkg) { system(sprintf(DESC_FILE, pkg, file)) d = read.dcf(file)[,"Description"] paste(strwrap(sprintf("- [%s](%s%s) %s", pkg, urlBase, pkg, d), width=70, exdent=exdent), collapse="\n") }) cat(noquote(unlist(desc)), sep="\n\n", file=outfile) invisible(NULL) } biocViews/R/recommendBiocViews.R0000644000175000017500000003312014136047116016450 0ustar nileshnilesh.cleanupDependency <- function(input) { if (is.null(input)) return(character(0)) output <- gsub("\\s", "", input) output <- gsub("\\([^)]*\\)", "", output) res <- strsplit(output, ",")[[1]] unique(res[which(res != "R")]) } .parseDot <- function(dot) { dot <- sub(" *; *$", "", dot[grepl("^[[:space:][:alpha:]]+->", dot)]) unique(unlist(strsplit(dot, " *-> *"))) } getCurrentbiocViews <- function() { #read biocViews from dot file. 
biocViewdotfile <- system.file("dot","biocViewsVocab.dot", package="biocViews") if(!file.exists(biocViewdotfile)) stop("Package biocViews not found.") dot <- readLines(biocViewdotfile) Software <- dot[seq(grep("BiocViews -> Software", dot), grep("BiocViews -> AnnotationData", dot) - 1)] AnnotationData <- dot[seq(grep("BiocViews -> AnnotationData", dot), grep("BiocViews -> ExperimentData", dot) - 1)] ExperimentData <- dot[seq(grep("BiocViews -> ExperimentData", dot), grep("BiocViews -> Workflow", dot) - 1)] Workflow <- dot[seq(grep("BiocViews -> Workflow", dot), length(dot),1)] Software <- .parseDot(Software) ExperimentData <- .parseDot(ExperimentData) AnnotationData <- .parseDot(AnnotationData) Workflow <- .parseDot(Workflow) list(Software= Software ,ExperimentData= ExperimentData, AnnotationData= AnnotationData, Workflow= Workflow) } .findBranchReadDot <- function(current, branch) { ans <- getCurrentbiocViews() Software <- ans$Software ExperimentData <- ans$ExperimentData AnnotationData <- ans$AnnotationData Workflow <- ans$Workflow find_branch <- NULL if(length(current) != 0){ idx<- list(Software = match(current, Software), AnnotationData = match(current, AnnotationData), ExperimentData = match(current, ExperimentData), Workflow = match(current, Workflow)) atrue <- sapply(idx, function(x) any(!is.na(x))) #which branch has hit find_branch <- names(which(atrue==TRUE)) if(length(find_branch)>1) message("You have biocViews from multiple branches.") } if(length(find_branch)==0 & length(branch)==3){ txt <- paste0("Incorrect biocViews in file & no branch specified. 
Can't recommend biocViews")
        message(paste(strwrap(txt,exdent=2), collapse="\n"))
    }
    if(length(branch)==3 & length(find_branch)==1) {
        branch <- find_branch
    }
    if( length(branch)==1 & length(find_branch)==1) {
        if( length(branch)!=3 & (tolower(branch)!=tolower(find_branch))){
            txt <- paste0("You have specified ",branch,
                          " branch but your package contains biocViews from ",
                          find_branch, " branch.")
            message(paste(strwrap(txt,exdent=2), collapse="\n"))
        }
    }
    # return appropriate dot terms based on branch.
    if (tolower(branch)=="software")
        returndot <- Software
    else if(tolower(branch)=="experimentdata")
        returndot <- ExperimentData
    else if(tolower(branch)=="annotationdata")
        returndot <- AnnotationData
    else
        returndot <- Workflow
    returndot
}

.wordsfromDESCRIPTION <- function(pkgdir) {
    ## strategy 1 - parse the words in the DESCRIPTION file to get
    ## biocViews
    descr_file <- file.path(pkgdir,"DESCRIPTION")
    dcf <- read.dcf(descr_file, c("Description", "Title", "Package","biocViews"))
    words1 <- unique(unlist(strsplit(dcf, " ")))
    ## strategy 2 - get biocViews of packages in depends field.
    pkgs <- read.dcf(descr_file, "Depends")
    pkgs <- unlist(strsplit(gsub("[0-9.()>= ]", "", pkgs), ","))
    urls <- .getBioCDevelUrl(devel=TRUE, branch="software")
    words2 <- character()
    con <- url(urls)
    biocpkgs <- read.dcf(con,"Package")
    idx <- which(biocpkgs %in% pkgs)
    if (length(idx)!=0) {
        wrd <- read.dcf(con, "biocViews")[idx]
        wrd <- unique(unlist(strsplit(wrd, ", ")))
        words2 <- c(words2,wrd)
    }
    close(con)
    if (length(words2)!=0) {
        words <- c(words1, words2)
    } else {
        words <- c(words1)
    }
    words
}

.wordsfromMANVIN <- function(pkgdir, man, vig) {
    manfls <- character(0)
    vinfls <- character(0)
    ## strategy 3 - man pages parsing.
    if(man)
        manfls <- list.files(file.path(pkgdir,"man"), full.names=TRUE,
                             pattern="\\.Rd$")
    ## strategy 4 - vignette pages parsing.
    if(vig)
        vinfls <- list.files(file.path(pkgdir,"vignettes"), full.names=TRUE,
                             pattern="\\.Rnw$")
    allfls <- c(manfls,vinfls)
    if(length(allfls)==0){
        all_words <- NA
    }else{
        q <- lapply(allfls, readLines)
        temp <- unlist(strsplit(q[[1]], "[[:punct:]]", perl = TRUE))
        temp <- unlist(strsplit(temp, "[[:space:]]", perl = TRUE))
        all_words <- unique(temp[temp != ""])
    }
    all_words
}

recommendBiocViews <- function(pkgdir,
                               branch= c("Software", "AnnotationData",
                                         "ExperimentData")) {
    if(!file.exists(pkgdir))
        stop("Package Directory not found.")
    if(!file.exists(file.path(pkgdir,"DESCRIPTION")))
        stop("No DESCRIPTION file found.")
    ## existing biocViews in the test package?
    current <- read.dcf(file.path(pkgdir,"DESCRIPTION"),
                        c("biocViews", "BiocViews"))
    current <- .cleanupDependency(current)
    if(length(current)==0 & missing(branch)){
        txt <- "No existing biocViews found in this package and cannot determine the branch of package to recommend biocViews"
        stop(paste(strwrap(txt,exdent=2), collapse="\n"))
    }
    words1 <- .wordsfromDESCRIPTION(pkgdir)
    m <- file.exists(file.path(pkgdir,"man"))
    v <- file.exists(file.path(pkgdir,"vignettes"))
    man <- character(0)
    vig <- character(0)
    if(all(m,v)){
        all_words<- .wordsfromMANVIN(pkgdir, man=TRUE, vig=TRUE)
    } else{
        if(!m){
            message("No man pages found.")
            all_words<- .wordsfromMANVIN(pkgdir, man=FALSE, vig=TRUE)
        }
        if(!v){
            message("No vignettes found.")
            all_words<- .wordsfromMANVIN(pkgdir, man=TRUE, vig=FALSE)
        }
    }
    words1 <- c(words1,all_words)
    words1 <- unlist(sapply(words1,.cleanupDependency, USE.NAMES = FALSE) )
    dotterms <- .findBranchReadDot(current, branch)
    ### split "DecisionTree" into "Decision", "Tree"
    terms <- sapply(dotterms, function(x){
        m <- gregexpr(pattern= "[[:upper:]]", text = x, ignore.case=FALSE)
        s1 <- unlist(regmatches(x,m))
        s2 <- unlist(strsplit(x, "[[:upper:]]"))
        if(length(s2)!=length(s1))
            s2 <- s2[-1]
        word<-function(s1,s2) paste0(s1,s2)
        ans <- mapply(word, s1,s2, USE.NAMES=FALSE)
        if(length(ans)==0)
            ans <- x
        ans
    }, simplify = TRUE)
    terms <-
lapply(terms, function(z){ z<- setdiff(z,"Data") unlist(strsplit(z,"_")) }) if(branch=="ExperimentData") { terms$CpGIslandData <- c("cpg", "island") terms$GEO <- "GEO" terms$HapMap <- "HapMap" terms$SNPData <- "SNP" terms$DNASeqData <- c("DNA","Seq") terms$RNASeqData <- c("RNA","Seq") terms$ChIPSeqData <- c("ChIP","Seq") terms$RIPSeqData <- c("RIP","Seq") terms$COPDData <-"COPD" terms$qPCRData <- "pcr" terms$SAGEData <-"sage" } # combine words from all sources and map words1 <- unique(unlist(strsplit(words1,"\n"))) words1 <- unique(unlist(strsplit(words1,"-"))) words1 <- unique(unlist(strsplit(words1,"_"))) words1 <- gsub("[.]","",words1) ## match against biocViews. idx <- which(tolower(dotterms) %in% tolower(words1)) temp <- dotterms[idx] ## only if both "decision" and "tree" are found add biocView "DecisionTree" split_word <- mapply(FUN= function(x,y){ i <- which(tolower(x) %in% tolower(words1)) ifelse(length(i)==length(x), y, NA) }, terms, names(terms), USE.NAMES=FALSE) suggest_bioc <- unique(c(split_word[complete.cases(split_word)], temp)) commonbiocViews <- c("Infrastructure","Software", "AssayDomain","BiologicalQuestion","Infrastructure", "ResearchField","StatisticalMethod","Technology", "Annotation","Visualization","DataRepresentation", "miRNA","SNP","qPCR","SAGE","Genetics", "GenomeAnnotation", "SpecimenSource","OrganismData", "DiseaseModel","TechnologyData","AssayDomainData", "RepositoryData") suggest_bioc <- setdiff(suggest_bioc,commonbiocViews) ## setdiff between current and suggested biocViews. if(length(current)!=0){ new_bioc <- setdiff(suggest_bioc, current) }else{ new_bioc <- suggest_bioc } ## some pkgs have terms which do not belong to software branch. 
    remove <- c(intersect(current, commonbiocViews),
                setdiff(current, dotterms))
    list(current = paste(current, collapse=", "),
         recommended = paste(new_bioc, collapse=", "),
         remove = paste(remove, collapse=", "))
}

.getBioCDevelUrl <- function(devel=TRUE, branch) {
    con <- url("http://bioconductor.org/js/versions.js")
    x <- readLines(con)
    pattern <- ifelse(devel, "develVersion", "releaseVersion")
    dv <- x[grep(pattern, x)]
    devel_version <- strsplit(dv, '"')[[1]][2]
    repos <- switch(tolower(branch),
                    software="/bioc/",
                    experimentdata="/data/experiment/",
                    annotationdata="/data/annotation/")
    close(con)
    paste0("http://bioconductor.org/packages/", devel_version, repos, "VIEWS")
}

recommendPackages <- function(biocViews, use.release=TRUE,
                              intersect.views=TRUE) {
    if(length(biocViews)==0) # return available biocViews
        stop("Input some biocViews to get recommended packages.")
    toMatch <- paste(biocViews, collapse="|")
    ## check if the input biocViews are defined by us.
    existingbiocViews <- getCurrentbiocViews()
    match <- sapply(existingbiocViews, function(x){
        length(unique(grep(toMatch, x, ignore.case=TRUE)))
    })
    if(all(match==0L))
        stop("See: http://bioconductor.org/packages for valid biocViews")
    ## which branch do these biocViews belong to?
    branch <- names(match)[match != 0L]
    if (length(branch) != 1L)
        stop("Input biocViews belong to branches ",
             paste(sQuote(branch), collapse=", "),
             "; choose from 1 branch only")
    ## recommend packages based on branch
    url <- .getBioCDevelUrl(devel=!use.release, branch)
    con <- url(url)
    tbl <- read.dcf(con, fields=c("Package", "biocViews"))
    close(con)
    ## get child biocViews of input biocView
    ## eg: if biocView is 'Alignment' then we should get packages tagged
    ## with 'MultipleSequenceAlignment' also!
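    ## Illustrative sketch (hypothetical return value, not from the original
    ## source): .getChildEdgeFromDot() walks the vocabulary DAG, so a call
    ## like
    ##   .getChildEdgeFromDot("Alignment")
    ## could return descendants such as "MultipleSequenceAlignment"; the
    ## matching against the VIEWS table then catches packages tagged with
    ## either the term itself or any of its children.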
    biocViews <- c(biocViews, .getChildEdgeFromDot(biocViews))
    idx0 <- sapply(tbl[,"biocViews"], function(table, x) {
        y <- gsub("\n", " ", table)
        y <- unlist(strsplit(y, ","))
        y <- gsub("^\\s+|\\s+$", "", y) # remove trailing/leading white spaces
        tolower(x) %in% tolower(y)
    } , biocViews)
    if(length(biocViews)==1L){
        ## a list is returned. No operation needs to be done
        return(tbl[idx0, "Package"])
    }
    ## if intersect.views = TRUE then 'and' operation is carried out.
    ## eg: Packages tagged with both biocView 'a' and 'b' will be returned.
    colnames(idx0) <- tbl[,"Package"]
    if (intersect.views)
        pkg <- colnames(idx0)[colSums(idx0)==length(biocViews)] # and operation
    else{
        pkg <- colnames(idx0)[colSums(idx0)!=0] # or operation
    }
    pkg
}

.getChildEdgeFromDot <- function(biocView) {
    ans <- .getChildren(biocView)
    ans <- unlist(ans)
    names(ans) <- NULL
    ans[!(ans %in% "1")]
}

.getChildren <- function(biocView) {
    biocViewsVocab <- NULL
    data(biocViewsVocab, package="biocViews", envir=environment())
    ans <- unlist(edges(biocViewsVocab, biocView))
    if(length(ans)==0)
        return("1")
    else
        return(c(ans, .getChildren(ans)))
}

# Best guess; if it cannot be determined, defaults to Software
guessPackageType <- function(biocViews){
    if(length(biocViews)==0){
        return("Software")
    } else{
        toMatch <- paste0("^",paste(biocViews, collapse="$|^"), "$")
        ## check if the input biocViews are defined by us.
existingbiocViews <- getCurrentbiocViews() match <- sapply(existingbiocViews, function(x){ length(unique(grep(toMatch, x))) }) if(all(match==0L)) return("Software") branch <- names(match)[which(match == max(match))] return(as.character(branch[1])) } } biocViews/R/getPackNames.R0000644000175000017500000002316114136047116015232 0ustar nileshnilesh## FIXME this is really dumb as does not ## use the information in VIEWS (VignetteBuilder field) myStangle <- function(file) { oldwd <- setwd(dirname(file)) on.exit(setwd(oldwd)) bfile <- basename(file) tryCatch(Stangle(bfile), error=function(e){ if (file.exists(bfile) && grepl("\\.Rnw$", bfile, ignore.case=TRUE)) { rfile <- sub("\\.Rnw$", ".R", bfile, ignore.case=TRUE) if (file.exists(rfile)) unlink(rfile) if (!requireNamespace("knitr")) { stop("'knitr' package required to tangle knitr vignettes") } tryCatch(knitr::purl(bfile),error=function(e){ print(sprintf("Error purling %s!", bfile)) }) } }) } readPackageInfo <- function(file, fields = NULL, all = FALSE) { info <- read.dcf(file = file, fields = fields, all = all) if ("vignettes" %in% colnames(info)) { info <- cbind(info, "vignetteScripts" = unlist(lapply(strsplit(info[,"vignettes"], ",\n"), function(vigs) { paste(lapply(sub("pdf$", "Rnw", vigs), function(v) { if (file.exists(v) && (!file.info(v)$isdir)) { myStangle(v) rfile <- sub("Rnw$", "R", basename(v)) file.copy(rfile, dirname(v), overwrite = TRUE) file.remove(rfile) sub("Rnw$", "R", v) } else { v <- sub("Rnw$", "rnw", v) if (file.exists(v) && (!file.info(v)$isdir)) { myStangle(v) rfile <- sub("rnw$", "R", basename(v)) file.copy(rfile, dirname(v), overwrite = TRUE) file.remove(rfile) sub("rnw$", "R", v) } else "" } }), collapse = ",\n") })), "vignetteTitles" = unlist(lapply(strsplit(info[,"vignettes"], ",\n"), function(vigs) { paste(lapply(sub("pdf$", "Rnw", vigs), function(v) { if (file.exists(v)) tools:::vignetteInfo(v)[["title"]] else { v <- sub("Rnw$", "rnw", v) if (file.exists(v)) 
tools:::vignetteInfo(v)[["title"]] else sub("rnw$", "pdf", basename(v)) } }), collapse = ",\n") }))) } ## TODO: make the 'manuals' path configurable if ("Package" %in% colnames(info)) { info <- cbind(info, "manuals" = unlist(lapply(info[,"Package"], function(pkg) { man <- file.path("manuals", pkg, "man", paste(pkg, ".pdf", sep = "")) if (file.exists(man)) man else NA_character_ }))) } info } writeBiocViews <- function(bvList, dir, backgroundColor="transparent") { ## bvList is a list of BiocViews objects ## dir is the output directory in which to write the views. for (bv in bvList) { fn <- file.path(dir, htmlFilename(bv)) html <- htmlDoc(bv) writeHtmlDoc(html, fn) } ## copy the css cssName <- "repository-detail.css" cssPath <- system.file(file.path("css", paste(cssName, ".in", sep="")), package="biocViews") res <- try(copySubstitute(cssPath, file.path(dir, cssName), symbolValues=list("BACKGROUND_COLOR"=backgroundColor)), silent=TRUE) res } writeTopLevelView <- function(dir, vocab) { top <- getRootNode(vocab) mainSubViews <- edges(vocab)[[top]] topView <- new("BiocView", name=top, subViews=mainSubViews) fn <- file.path(dir, htmlFilename(topView)) writeHtmlDoc(htmlDoc(topView), fn) } getBiocViews <- function(reposUrl, vocab, defaultView, local=FALSE, htmlDir="") { viewList <- getPacksAndViews(reposUrl, vocab, defaultView, local) viewRoster <- permulist(viewList$views, vocab) if (local) { reposUrl <- character(0) if (!htmlDir == "") { htmlDir <- htmlDir } } biocViews <- loadViews(vocab, viewRoster, viewList$pkgList, reposUrl, htmlDir) biocViews } getBiocSubViews <- function(reposUrl, vocab, topTerm, local=FALSE, htmlDir="") { root <- getRootNode(vocab) terms <- getSubTerms(vocab, topTerm) subVocab <- subGraph(c(root, terms), vocab) bvl <- getBiocViews(reposUrl, subVocab, topTerm, local, htmlDir) bvl[terms] ## exclude root } loadViews <- function(viewGraph, viewRoster, pkgList, reposUrl, htmlDir="html") { views <- nodes(viewGraph) viewmat <- as(viewGraph, "matrix") 
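    ## Illustrative sketch (tiny hypothetical vocabulary, assuming graphNEL
    ## semantics from the 'graph' package; not part of the original source):
    ##   g <- graph::graphNEL(nodes=c("Software", "Clustering"),
    ##                        edgeL=list(Software=list(edges="Clustering"),
    ##                                   Clustering=list()),
    ##                        edgemode="directed")
    ##   as(g, "matrix")["Software", "Clustering"]   # 1
    ## A 1 in row i, column j marks term j as a subview of term i, which is
    ## what viewFactory() reads off below.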
viewFactory <- function(name) { subViews <- viewmat[name, ] == 1 if (any(subViews)) subViews <- views[subViews] else subViews <- character(0) parentViews <- viewmat[ , name] == 1 if (any(parentViews)) parentViews <- views[parentViews] else parentViews <- character(0) if (name %in% names(viewRoster)) { pkgsInView <- pkgList[viewRoster[[name]]] } else pkgsInView <- list() new("BiocView", name=name, subViews=subViews, parentViews=parentViews, packageList=pkgsInView, htmlDir=htmlDir, reposRoot=reposUrl) } biocViews <- lapply(views, viewFactory) names(biocViews) <- views biocViews } getPacksAndViews <- function(reposURL, vocab, defaultView, local=FALSE) { tmpf <- tempfile() on.exit(unlink(tmpf)) method <- "auto" ## FIXME: needs error checking and to look for VIEWS.gz first z <- download.file(url=paste(reposURL, "VIEWS", sep="/"), destfile=tmpf, method=method, cacheOK=FALSE, quiet=TRUE, mode="wb") pmat <- readPackageInfo(file=tmpf) bcvl <- pkgList <- vector(mode="list", length=nrow(pmat)) if (nrow(pmat) == 0L) return(list(views=bcvl, pkgList=pkgList)) ns <- pmat[,"Package"] ## The DESCRIPTION fields we try to parse for tags DESC_FIELDS <- c("biocViews") names(bcvl) <- ns for (tagCol in DESC_FIELDS) { if (tagCol %in% colnames(pmat)) { tags <- pmat[, tagCol] names(tags) <- ns bcvl <- processTagsField(tags, bcvl, defaultView) } } ## In case none of the fields were available, make sure everyone ## gets a NoViewsProvided tag. bcvl <- lapply(bcvl, function(x) { if (is.null(x)) defaultView else x }) bcvl <- normalizeTags(bcvl, vocab) if (!local) pkgList <- createPackageDetailList(pmat, reposURL) else pkgList <- createPackageDetailList(pmat) list(views=bcvl, pkgList=pkgList) } normalizeTags <- function(tagList, vocab) { ## try to match tags to the vocab ignoring case. ## If found, replace with case as found in vocab. 
    knownTerms <- nodes(vocab)
    knownTermsLower <- tolower(knownTerms)
    tagList <- lapply(tagList, function(x) {
        idx <- match(tolower(x), knownTermsLower)
        unknown <- is.na(idx)
        if (any(unknown)) {
            warning("Dropping unknown biocViews terms:\n",
                    paste(x[unknown], collapse=", "),
                    call.=FALSE)
            idx <- idx[!is.na(idx)]
        }
        knownTerms[unique(idx)] ## remove duplicates
    })
    tagList
}

processTagsField <- function(tags, tagList, defaultTag) {
    ## Given a named character vector of comma separated tags,
    ## parse the tags and append data to the given tagList.
    ## Names of tags and tagList must match.
    ## NOTE: all.equal() returns a character vector (not FALSE) on
    ## mismatch, so it must be wrapped in isTRUE() before negation.
    if (!isTRUE(all.equal(names(tags), names(tagList))))
        stop("Names of tags and tagList must match")
    tags[is.na(tags)] <- defaultTag
    tags <- gsub("\\\n", "", tags)
    fieldSp <- strsplit(tags, ", *")
    names(fieldSp) <- names(tagList)
    for (n in names(tagList)) {
        tagList[[n]] <- c(tagList[[n]], fieldSp[[n]])
    }
    tagList
}

getSubTerms <- function(dag, term) {
    c(term, names(acc(dag, term)[[1]]))
}

permulist <- function(allv, vocab, interp=TRUE) {
    if (length(allv) == 0L)
        return(list())
    lens <- sapply(allv, length)
    packnames <- names(allv)
    repp <- rep(packnames, lens)
    ans <- split(repp, unlist(allv))
    if (interp)
        ans <- pump(ans, vocab)
    return(ans)
}

biocViews/R/htmlDoc-methods.R

makeHtmlHeader <- function(title, stylesheet) {
    ## Right now xmlTree's addNode method doesn't accept XMLNode objects
    ## html <- xmlTree("html",
    ##                 attrs=c(xmlns="http://www.w3.org/1999/xhtml",
    ##                         "xml:lang"="en", lang="en"),
    ##                 dtd=c('html',
    ##                       '-//W3C//DTD XHTML 1.0 Strict//EN',
    ##                       'http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd'))
    html <- xmlOutputDOM("html",
                        attrs=c(xmlns="http://www.w3.org/1999/xhtml",
                                "xml:lang"="en", lang="en"))
    ### gaah! header is only supported by xmlOutputBuffer ! :-(
    ### so instead we write out the DOCTYPE in the writeDoc method.
## header=paste('')) html$addTag("head", close=FALSE) html$addTag("title", title) myAttrs <- c(rel="stylesheet", type="text/css", href=stylesheet) html$addTag("link", attrs=myAttrs) html$closeTag() html } setMethod("htmlDoc", signature(object="Htmlized"), function(object, ..., title, stylesheet="style.css") { dom <- makeHtmlHeader(title, stylesheet) dom$addTag("body", close=FALSE) dom$addNode(htmlValue(object)) dom$closeTag() dom$value() }) setMethod("htmlDoc", signature(object="PackageDetail"), function(object, ...) { title <- object@Package stylesheet="package-detail.css" callNextMethod(object=object, title=title, stylesheet=stylesheet) }) setMethod("htmlDoc", signature(object="RepositoryDetail"), function(object, ...) { title <- object@Title stylesheet="repository-detail.css" callNextMethod(object=object, title=title, stylesheet=stylesheet) }) setMethod("htmlDoc", signature(object="BiocView"), function(object, ...) { title <- paste("Bioconductor Task View", object@name) stylesheet="repository-detail.css" callNextMethod(object=object, title=title, stylesheet=stylesheet) }) biocViews/R/AllClasses.R0000644000175000017500000000615214136047116014717 0ustar nileshnilesh## TODO: make the reposInfo list into an S4 class to represent ## repository data setClass("Htmlized", representation("VIRTUAL")) setClass("PackageDetail", contains="Htmlized", representation("Package"="character", "Version"="character", "Title"="character", "Description"="character", "Author"="character", "Maintainer"="character", "Depends"="character", "Imports"="character", "Suggests"="character", "SystemRequirements"="character", "License"="character", "URL"="character", "biocViews"="character", "vignettes"="character", "vignetteScripts"="character", "vignetteTitles"="character", "htmlTitles"="character", "source.ver"="character", "win.binary.ver"="character", "mac.binary.ver"="character", "mac.binary.mavericks.ver"="character", "mac.binary.el-capitan.ver"="character", 
"downloadStatsUrl"="character", "manuals"="character", "dependsOnMe"="character", "importsMe"="character", "suggestsMe"="character", "functionIndex"="character", "reposFullUrl"="character", "reposRoot"="character", "viewRoot"="character", "devHistoryUrl"="character", "Archs"="character")) ## Define a subclass of PackageDetail for each "chunk" of the object ## that we want to be able to render to HTML separately. setClass("pdAuthorMaintainerInfo", contains="PackageDetail") setClass("pdVignetteInfo", contains="PackageDetail") setClass("pdDownloadInfo", contains="PackageDetail") setClass("pdDetailsInfo", contains="PackageDetail") setClass("pdDescriptionInfo", contains="PackageDetail") setClass("pdVigsAndDownloads", contains="PackageDetail") setClass("RepositoryDetail", contains="Htmlized", representation(Title="character", reposRoot="character", homeUrl="character", htmlDir="character", packageList="list")) setClass("rdPackageTable", contains="RepositoryDetail") setClass("BiocView", contains=c("RepositoryDetail"), representation(name="character", subViews="character", parentViews="character")) setClass("bvTitle", contains="BiocView") setClass("bvPackageTable", contains="BiocView") setClass("bvSubViews", contains="BiocView") setClass("bvParentViews", contains="BiocView") ## Outline ## 1. given a repos with VIEWS file, run extractVignettes. ## 2. Now get a pkgList using loadPackageDetails() ## 3. write HTML biocViews/R/validation_tests.R0000644000175000017500000000112214136047116016235 0ustar nileshnilesh## These are test functions that can be called from the unit tests of any BioC ## package. 
validate_bioc_views <- function(pkg) {
    bvStr <- packageDescription(pkg)$biocViews
    checkTrue(!is.null(bvStr),
              paste("No biocViews defined for package", pkg))
    bvStr <- gsub(" ", "", bvStr)
    views <- strsplit(bvStr, ",")[[1]]
    biocViewsVocab <- NULL
    data("biocViewsVocab", envir=environment())
    nodes <- nodes(biocViewsVocab)
    for (view in views) {
        checkTrue(view %in% nodes,
                  paste("Invalid view", view, "in package", pkg))
    }
    invisible(NULL)
}

biocViews/R/pump.R

getRootNode <- function(g) {
    rootIdx <- which(sapply(inEdges(g), length) == 0)
    nodes(g)[rootIdx]
}

pump <- function(viewlist, vocab) {
    # make packages annotated to hypernyms of their
    # direct annotations
    vs <- names(viewlist)
    root <- getRootNode(vocab)
    for (v in vs) {
        st <- tellSuperTop(v, vocab, root)
        if (length(st) > 0) {
            for (sti in st)
                viewlist[[sti]] <- union(viewlist[[sti]], viewlist[[v]])
        }
    }
    viewlist
}

tellSuperTop <- function(topic, vocab, root) {
    # returns vector of supertopics
    if (length(topic) > 1)
        stop("must have length 1 topic")
    if (!(topic %in% nodes(vocab))) {
        warning(paste("attempt to interpolate term [", topic,
                      "] that is not even in the vocabulary!",
                      "just returning term"))
        return(topic)
    }
    path <- sp.between(vocab, root, topic)[[1]]$path_detail
    path[-c(1, length(path))]
}

tellSubTop <- function(topic, vocab) {
    if (length(topic) > 1)
        stop("must have length 1 topic")
    # returns vector of subtopics
    if (!(topic %in% nodes(vocab))) {
        warning(paste("attempt to interpolate term [", topic,
                      "] that is not even in the vocabulary! 
just returning term")) return(topic) } desc <- acc( vocab, topic )[[1]] names(desc)[desc==1] } biocViews/R/show-methods.R0000644000175000017500000000075414136047116015314 0ustar nileshnileshsetMethod("show", signature(object="BiocView"), function(object) { cat("Bioconductor View:", object@name, "\n") cat("Parent Views:\n") print(object@parentViews) cat("Subviews:\n") print(object@subViews) cat("Contains packages:\n") if (length(object@packageList)) print(names(object@packageList)) else cat("\n") }) biocViews/R/htmlFilename-methods.R0000644000175000017500000000101014136047116016723 0ustar nileshnileshsetMethod("htmlFilename", signature(object="character"), function(object) paste(object, ".html", sep="")) setMethod("htmlFilename", signature(object="RepositoryDetail"), function(object) "index.html") setMethod("htmlFilename", signature(object="BiocView"), function(object) { paste(object@name, ".html", sep="") }) setMethod("htmlFilename", signature(object="PackageDetail"), function(object) { htmlFilename(object@Package) }) biocViews/R/packageDetails.R0000644000175000017500000001314414136047116015571 0ustar nileshnileshloadPackageDetails <- function(reposRoot, reposUrl="..", viewUrl="../..", reposFullUrl=reposUrl, downloadStatsUrl="", devHistoryUrl="") { ## Return a list of PackageDetail objects representing ## the packages contained in the repository located ## on the local filesystem at reposRoot. ## ## reposRoot - Path to local filesystem CRAN-style repository ## ## FIXME: should allow reading VIEWS from a URL also. 
viewsFile <- file.path(reposRoot, "VIEWS") pkgMat <- readPackageInfo(viewsFile) createPackageDetailList(pkgMat, reposUrl, viewUrl, reposFullUrl, downloadStatsUrl, devHistoryUrl) } createPackageDetailList <- function(viewMat, reposUrl="..", viewUrl=character(0), reposFullUrl=reposUrl, downloadStatsUrl="", devHistoryUrl="") { if (nrow(viewMat) == 0L) return(list()) pkgList <- apply(viewMat, 1, viewRowToPackageDetail) names(pkgList) <- viewMat[, "Package"] pkgList <- setDependsOnMeImportsMeSuggestsMe(pkgList) pkgList <- lapply(pkgList, function(p) { p@devHistoryUrl <- devHistoryUrl p@downloadStatsUrl <- downloadStatsUrl p@reposFullUrl <- reposFullUrl p@reposRoot <- reposUrl p@viewRoot <- viewUrl p }) return(pkgList) } setDependsOnMeImportsMeSuggestsMe <- function(pkgDetailsList) { ## Add list of packages that depend on and suggest each package ## listed in pkgDetailsList, a list of PackageDetail objects. pkgNames <- names(pkgDetailsList) depCols <- lapply(pkgDetailsList, function(x) pkgNames %in% x@Depends) depMat <- do.call(cbind, depCols) colnames(depMat) <- rownames(depMat) <- pkgNames impCols <- lapply(pkgDetailsList, function(x) pkgNames %in% x@Imports) impMat <- do.call(cbind, impCols) colnames(impMat) <- rownames(impMat) <- pkgNames sugCols <- lapply(pkgDetailsList, function(x) pkgNames %in% x@Suggests) sugMat <- do.call(cbind, sugCols) colnames(sugMat) <- rownames(sugMat) <- pkgNames setDepsImpsSugs <- function(pkg) { deps <- pkgNames[which(depMat[pkg@Package, ])] imps <- pkgNames[which(impMat[pkg@Package, ])] sugs <- pkgNames[which(sugMat[pkg@Package, ])] pkg@dependsOnMe <- deps pkg@importsMe <- imps pkg@suggestsMe <- sugs return(pkg) } return(lapply(pkgDetailsList, setDepsImpsSugs)) } viewRowToPackageDetail <- function(row) { ## Given a row from a VIEWS package description matrix as returned by ## calling read.dcf through readPackageInfo on a VIEWS file, return a ## PackageDetail instance. 
pkg <- new("PackageDetail") ## assume we have names on the row flds <- names(row) ourSlots <- slotNames(getClass("PackageDetail")) for (fld in flds) { if (! fld %in% ourSlots) next val <- row[[fld]] ## FIXME: are we sure we want to get rid of the NA's here? if (is.na(val)) val <- "" slot(pkg, fld) <- val } ## Fix vector fields ## FIXME: we are using a private func from tools. Also, ## this func gives more structure (version info) which for now we ## ignore. cleanPkgField <- function(val) { val <- names(tools:::.split_dependencies(val)) if (is.null(val)) val <- character(0) val } cleanField <- function (x) { x <- unlist(strsplit(x, ",")) if (!length(x)) return(character(0)) x <- unique(sub("^[[:space:]]*(.*)[[:space:]]*$", "\\1", x)) x } cleanVigs <- function(vigs) { if (length(vigs) > 0 && !is.na(vigs)) { vigs <- gsub("\n", "", vigs) ans <- strsplit(vigs, ", *")[[1]] } else { ans <- character(0) } return(ans) } pkg@Depends <- cleanPkgField(pkg@Depends) pkg@Suggests <- cleanPkgField(pkg@Suggests) pkg@Imports <- cleanPkgField(pkg@Imports) pkg@biocViews <- cleanField(pkg@biocViews) pkg@vignettes <- cleanVigs(pkg@vignettes) pkg@vignetteScripts <- cleanVigs(pkg@vignetteScripts) pkg@vignetteTitles <- cleanVigs(pkg@vignetteTitles) pkg@htmlTitles <- cleanVigs(pkg@htmlTitles) return(pkg) } removeEmail <- function(line) { line <- gsub("<[a-zA-Z0-9._-]+@[a-zA-Z0-9._-]+>", "", line) line <- gsub("[a-zA-Z0-9._-]+@[a-zA-Z0-9._-]+", "", line) line <- sub(" +$", "", line) line } mangleEmail <- function(line) { chrA <- c("À", "Á", "Â", "Ã", "Ä", "Å", "Æ") chrO <- c("Ò", "Ó", "Ô", "Õ", "Ö") makeAT <- function() { i <- sample(seq(length=length(chrA), 1)) paste(" ", chrA[i], "T", " ", sep="") } makeDOT <- function() { i <- sample(seq(length=length(chrO), 1)) paste(" ", "D", chrO[i], "T", " ", sep="") } emailStarts <- gregexpr("<", line, fixed=TRUE)[[1]] emailEnds <- gregexpr(">", line, fixed=TRUE)[[1]] emails <- sapply(seq(length=length(emailStarts)), function(x) substr(line, 
                                           emailStarts[x], emailEnds[x]))
    emails <- sapply(emails, function(line) {
        AT <- makeAT()
        DOT <- makeDOT()
        line <- gsub("@", AT, line, fixed=TRUE)
        line <- gsub("\\.", DOT, line, fixed=TRUE)
        line
    })
    other <- strsplit(line, "<[^>]+@[^>]+>")[[1]]
    paste(other, emails, collapse="")
}

biocViews/R/repository.R

genReposControlFiles <- function(reposRoot, contribPaths, manifestFile=NA,
                                 meatPath=NA)
{
    ## Generate all control files for BioC hosted R
    ## package repositories
    message("Generating repos control files:")
    message("- write_REPOSITORY() ... ", appendLF=FALSE)
    t <- system.time(write_REPOSITORY(reposRoot, contribPaths))[["elapsed"]]
    message(sprintf("OK (total time: %.2fs)", t))
    ## Write PACKAGES files for all contrib paths
    packagesPaths <- file.path(reposRoot, contribPaths)
    names(packagesPaths) <- names(contribPaths)
    for (type in names(packagesPaths)) {
        path <- packagesPaths[[type]]
        if (type == "win64.binary") {
            type <- "win.binary"
        } else if (substr(type, 1, 10) == "mac.binary") {
            type <- "mac.binary"
        }
        message("- write_PACKAGES() to ", path, " ... ", appendLF=FALSE)
        t <- system.time(write_PACKAGES(path, type=type))[["elapsed"]]
        message(sprintf("OK (total time: %.2fs)", t))
    }
    ## Write a VIEWS file at the top-level containing
    ## detailed package info
    message("- write_VIEWS() ... ", appendLF=FALSE)
    t <- system.time(
        write_VIEWS(reposRoot, manifestFile=manifestFile, meatPath=meatPath)
    )[["elapsed"]]
    message(sprintf("OK (total time: %.2fs)", t))
    ## Write a SYMBOLS file at the top-level containing the
    ## exported symbols for all packages that have name
    ## spaces. This is used to build a searchable index.
    #message("- write_SYMBOLS() ... 
", appendLF=FALSE) #t <- system.time(write_SYMBOLS(reposRoot))[["elapsed"]] #message(sprintf("OK (total time: %.2fs)", t)) message("DONE Generating repos control files.") } pkgName <- function(tarball) { strsplit(basename(tarball), "_", fixed=TRUE)[[1L]][1L] } unpack <- function(tarball, unpackDir, wildcards) { args <- c("-C", unpackDir, "-xzf", tarball, "--wildcards", wildcards) system2("tar", args, stderr=NULL) } cleanUnpackDir <- function(tarball, unpackDir, subDir="", pattern=NULL) { ## Delete files from a previous extraction pkg <- pkgName(tarball) pkgDir <- file.path(unpackDir, pkg, subDir) files <- list.files(pkgDir, pattern=pattern, full.names=TRUE, recursive=is.null(pattern), include.dirs=is.null(pattern)) unlink(files) } extractManuals <- function(reposRoot, srcContrib, destDir) { ## Extract Rd man pages from source package tarballs and ## convert to pdf documents ## ## reposRoot - Top level path for CRAN-style repos ## srcContrib - Location of source packages ## destDir - where to extract. 
## ## Notes: ## Under destDir, for tarball foo_1.2.3.tar.gz, you will ## get destDir/foo/man/*.pdf ## if (missing(destDir)) destDir <- file.path(reposRoot, "manuals") buildManualsFromTarball <- function(tarball, unpackDir=".") { ## helper function to unpack pdf & Rd files from the vig status <- TRUE cleanUnpackDir(tarball, unpackDir, "man", ".*\\.(pdf|Rd|rd)$") ret <- unpack(tarball, unpackDir, "'*/man/*.[Rr]d'") if (ret != 0) { warning("non-zero exit status ", ret, " extracting man pages: ", tarball) status <- FALSE } else { pkg <- pkgName(tarball) pkgDir <- file.path(unpackDir, pkg, "man") RCmd <- file.path(Sys.getenv("R_HOME"), "bin", "R") Rd2pdfCmd <- paste0( RCmd, " CMD Rd2pdf --no-preview ", "--output=", pkgDir, "/", pkg, ".pdf ", "--title=", pkg, " ", pkgDir, "/*.[Rr]d") ret <- system(Rd2pdfCmd) cleanUnpackDir(tarball, unpackDir, "man", ".*\\.(Rd|rd)$") if (ret != 0) { warning("non-zero exit status ", ret, " building ref man: ", pkg) status <- FALSE } } status } tarballs <- list.files(file.path(reposRoot, srcContrib), pattern="\\.tar\\.gz$", full.names=TRUE) if (!file.exists(destDir)) dir.create(destDir, recursive=TRUE) if (!file.info(destDir)$isdir) stop("destDir must specify a directory") if (endsWith(reposRoot, "data/annotation")) { n <- vapply(tarballs, function(tarball, ...) { tryCatch({ buildManualsFromTarball(tarball, ...) 
            }, error = function(e) {
                warning("error extracting manual for: ", tarball,
                        "\n  ", conditionMessage(e))
                FALSE
            })
        }, logical(1), unpackDir=destDir)
    } else {
        n <- 0
    }
    paste(sum(n), "/", length(tarballs), "tarball manuals processed")
}

getRefmanLinks <- function(pkgList, reposRootPath, refman.dir) {
    unlist(lapply(pkgList, function(pkg) {
        refmanSubDir <- "man"
        refmanDir <- file.path(reposRootPath, refman.dir, pkg, refmanSubDir)
        if (file.exists(refmanDir)) {
            refmans <- list.files(refmanDir, pattern=".*\\.pdf$")
            refmans <- paste(refman.dir, pkg, refmanSubDir, refmans,
                             sep="/", collapse=", ")
        } else refmans <- NA_character_
        refmans
    }))
}

extractTopLevelFiles <- function(reposRoot, srcContrib, destDir, fileName) {
    extractFileFromTarball <- function(tarball, unpackDir=".") {
        pkg <- pkgName(tarball)
        cleanUnpackDir(tarball, unpackDir, pattern=fileName)
        message("Attempting to extract ", fileName, " from ", tarball)
        unpack(tarball, unpackDir, file.path(pkg, fileName))
    }
    tarballs <- list.files(file.path(reposRoot, srcContrib),
                           pattern="\\.tar\\.gz$", full.names=TRUE)
    if (!file.exists(destDir))
        dir.create(destDir, recursive=TRUE)
    if (!file.info(destDir)$isdir)
        stop("destDir must specify a directory")
    lapply(tarballs, extractFileFromTarball, unpackDir=destDir)
    invisible(NULL)
}

extractINSTALLfiles <- function(reposRoot, srcContrib, destDir) {
    extractTopLevelFiles(reposRoot, srcContrib, destDir, "INSTALL")
}

### Will return NULL if the citation could not be generated from the CITATION
### file. This typically occurs when the file contains code that relies on
### the package to be installed e.g. it contains calls to things like
### packageVersion() or packageDate() instead of using 'meta$Version'
### or 'meta$Date'. See
### https://cran.r-project.org/doc/manuals/r-release/R-exts.html#CITATION-files
### for the details.
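### For illustration, a CITATION entry built from 'meta' can be evaluated
### without the package being installed ("somePkg" and the other field
### values below are made up):
###
###     citEntry(entry = "Manual",
###              title = "somePkg: an example package",
###              author = "A. Author",
###              year = "2021",
###              note = paste("R package version", meta$Version))
###
### whereas the same entry written with packageVersion("somePkg") errors
### when the package is not installed, which is the failure mode handled
### here.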
.extract_citation <- function(tarball) { pkgname <- pkgName(tarball) tmpdir <- tempdir() ## Remove any stale DESCRIPTION or CITATION file from the tmpdir/pkgname/ ## folder (could happen e.g. if 'tmpdir' somehow already contained a stale ## source tree for 'pkgname'). tmp_pkgdir <- file.path(tmpdir, pkgname) DESCRIPTION_path <- file.path(tmp_pkgdir, "DESCRIPTION") CITATION_path <- file.path(tmp_pkgdir, "inst", "CITATION") paths <- c(DESCRIPTION_path, CITATION_path) status <- unlink(paths) ## Should never happen. if (status != 0L) stop("failed to remove files DESCRIPTION and/or ", "inst/CITATION from folder ", tmp_pkgdir) ## Try to extract files DESCRIPTION and inst/CITATION from tarball. ## Note that the path separator is **always** / in a tarball, even ## on Windows, so do NOT use file.path() here. DESCRIPTION_tpath <- paste0(pkgname, "/DESCRIPTION") CITATION_tpath <- paste0(pkgname, "/inst/CITATION") tpaths <- c(DESCRIPTION_tpath, CITATION_tpath) status <- untar(tarball, tpaths, exdir=tmpdir) ## Unfortunately, there are some rare situations where untar() returns ## a non-zero value even though the requested files get successfully ## extracted. This happens with some package source tarballs generated ## by 'R CMD build' but seem corrupted e.g.: ## > untar("simpleSingleCell_1.13.5.tar.gz", "simpleSingleCell/DESCRIPTION") ## /bin/tar: Skipping to next header ## /bin/tar: Skipping to next header ## /bin/tar: Exiting with failure status due to previous errors ## Warning message: ## In untar("simpleSingleCell_1.13.5.tar.gz", "simpleSingleCell/DESCRIPTION") : ## ‘/bin/tar -xf 'simpleSingleCell_1.13.5.tar.gz' 'simpleSingleCell/DESCRIPTION'’ returned error code 2 ## So instead of checking 'status', we check for the existence of the ## extracted files. 
    if (!file.exists(DESCRIPTION_path))  # should never happen
        stop("failed to extract DESCRIPTION file from ", tarball)
    description <- packageDescription(pkgname, lib.loc=tmpdir)
    ## If tarball contains a CITATION file, use it to generate the citation.
    if (file.exists(CITATION_path)) {
        message("(try to process CITATION file) ", appendLF=FALSE)
        citation <- try(readCitationFile(CITATION_path, meta=description),
                        silent=TRUE)
        if (inherits(citation, "try-error"))
            citation <- NULL
        return(citation)
    }
    ## If there is no CITATION file, auto-generate citation from
    ## DESCRIPTION file.
    message("(auto-generate from DESCRIPTION file) ", appendLF=FALSE)
    citation(pkgname, lib.loc=tmpdir, auto=description)
}

.write_citation_as_HTML <- function(pkgname, citation, destdir) {
    destfile <- file.path(destdir, "citation.html")
    if (dir.exists(destdir)) {
        status <- unlink(destfile)
        if (status != 0L)
            stop("failed to remove previous ", destfile, " file")
    } else {
        if (!dir.create(destdir))
            stop("failed to create ", destdir, " directory")
    }
    if (is.null(citation)) {
        message("(failed! ==> replacing citation with red banner) ",
                appendLF=FALSE)
        html <- c("<p style=\"color: red;\">",
                  "Important note to the ",
                  "maintainer of the <b>", pkgname, "</b> ",
                  "package: An error ",
                  "occurred while trying to generate the citation ",
                  "from the CITATION file. This typically occurs ",
                  "when the file contains R code that relies on ",
                  "the package to be installed e.g. it contains calls ",
                  "to things like <code>packageVersion()</code> or ",
                  "<code>packageDate()</code> instead of using ",
                  "<code>meta$Version</code> or <code>meta$Date</code>. ",
                  "See <a href=\"https://cran.r-project.org/doc/manuals/r-release/R-exts.html#CITATION-files\">R documentation</a> ",
                  "for more information.",
                  "</p>")
    } else {
        ## print() can fail on a citation object. See:
        ##   https://bugs.r-project.org/bugzilla/show_bug.cgi?id=17725
        html <- try(capture.output(print(citation, style="html")),
                    silent=TRUE)
        if (inherits(html, "try-error")) {
            message("(failed! ==> replacing citation with red banner) ",
                    appendLF=FALSE)
            html <- c("<p style=\"color: red;\">",
                      "Important note to the ",
                      "maintainer of the <b>", pkgname, "</b> ",
                      "package: An error ",
                      "occurred while trying to generate the citation ",
                      "from the CITATION file. Please make sure that the ",
                      "CITATION file in your package is valid by calling ",
                      "<code>utils::readCitationFile()</code> on it.",
                      "</p>")
        }
        ## Filter out lines starting with \Sexprs.
        html <- html[grep("^\\\\Sexpr", html, invert=TRUE)]
    }
    cat(html, file=destfile, sep="\n")
}

extractCitations <- function(reposRoot, srcContrib, destDir) {
    tarballs <- list.files(
        file.path(reposRoot, srcContrib),
        pattern="\\.tar\\.gz$", full.names=TRUE)
    if (!dir.exists(destDir)) {
        if (!dir.create(destDir, recursive=TRUE))
            stop("failed to create ", destDir, " directory")
    }
    for (tarball in tarballs) {
        message("Generate citation for ", tarball, " ... ", appendLF=FALSE)
        citation <- .extract_citation(tarball)
        pkgname <- pkgName(tarball)
        .write_citation_as_HTML(pkgname, citation,
                                file.path(destDir, pkgname))
        message("OK")
    }
}

extractReadmes <- function(reposRoot, srcContrib, destDir) {
    ## Extract README files from source package tarballs
    ##
    ## reposRoot - Top level path for CRAN-style repos
    ## srcContrib - Location of source packages
    ## destDir - where to extract.
    ##
    ## Notes:
    ##  Under destDir, for tarball foo_1.2.3.tar.gz, you will
    ##  get destDir/foo/inst/doc/*.pdf
    ##
    extractTopLevelFiles(reposRoot, srcContrib, destDir, "README")
}

extractNEWS <- function(reposRoot, srcContrib, destDir) {
    if (missing(destDir))
        destDir <- file.path(reposRoot, "news")
    extractNewsFromTarball <- function(tarball, unpackDir=".") {
        pkg <- pkgName(tarball)
        cleanUnpackDir(tarball, unpackDir, pattern="NEWS")
        unpack(tarball, unpackDir, "'*NEWS*'")
    }
    convertNEWSToText <- function(tarball, srcDir, destDir) {
        pkg <- pkgName(tarball)
        srcDir <- file.path(srcDir, pkg)
        destDir <- file.path(destDir, pkg)
        if (!file.exists(destDir))
            dir.create(destDir, recursive=TRUE)
        destFile <- file.path(destDir, "NEWS")
        getNEWSFromFile(srcDir, destFile, output="text")
    }
    tarballs <- list.files(file.path(reposRoot, srcContrib),
                           pattern="\\.tar\\.gz$", full.names=TRUE)
    if (!file.exists(destDir))
        dir.create(destDir, recursive=TRUE)
    if (!file.info(destDir)$isdir)
        stop("destDir must specify a directory")
    unpackDir <- tempdir()
    lapply(tarballs, function(tarball) {
        cat("Attempting 
to extract NEWS from", tarball, "\n") extractNewsFromTarball(tarball, unpackDir=unpackDir) res <- try(convertNEWSToText(tarball, srcDir=unpackDir, destDir=destDir)) if (inherits(res, "try-error")) cat("FAILED!\n") }) invisible(NULL) } extractVignettes <- function(reposRoot, srcContrib, destDir) { ## Extract vignettes from source package tarballs ## ## reposRoot - Top level path for CRAN-style repos ## srcContrib - Location of source packages ## destDir - where to extract. ## ## Notes: ## Under destDir, for tarball foo_1.2.3.tar.gz, you will ## get destDir/foo/inst/doc/*.pdf ## if (missing(destDir)) destDir <- file.path(reposRoot, "vignettes") extractVignettesFromTarball <- function(tarball, unpackDir=".") { cleanUnpackDir(tarball, unpackDir, subDir=file.path("inst", "doc")) cat("Extracting vignettes from", tarball, "\n") ret <- unpack(tarball, unpackDir, "'*/inst/doc/*'") if (ret != 0) warning("tar had non-zero exit status for vig extract of: ", tarball) } tarballs <- list.files(file.path(reposRoot, srcContrib), pattern="\\.tar\\.gz$", full.names=TRUE) if (!file.exists(destDir)) dir.create(destDir, recursive=TRUE) if (!file.info(destDir)$isdir) stop("destDir must specify a directory") invisible(lapply(tarballs, extractVignettesFromTarball, unpackDir=destDir)) } extractHTMLDocuments <- function(reposRoot, srcContrib, destDir) { ## Extract HTML documents from source package tarballs ## IF any HTML document is present in inst/doc. ## ## reposRoot - Top level path for CRAN-style repos ## srcContrib - Location of source packages ## destDir - where to extract. 
## ## Notes: ## Under destDir, for tarball foo_1.2.3.tar.gz, you will ## get destDir/foo/inst/doc/*.pdf ## if (missing(destDir)) destDir <- file.path(reposRoot, "vignettes") extractHTMLDocumentsFromTarball <- function(tarball, unpackDir=".") { ## helper function to unpack HTML documents and deps from tarball ## here we untar twice, once (just listing files) to see ## if there are html files in inst/doc, then if there are, ## we untar again (extracting). Optimal? fileList <- untar(tarball, list=TRUE) if (length(grep("inst/doc/.*\\.html$", fileList, ignore.case=TRUE))) { cat("Found HTML document in", tarball, "\n") ## This extracts everything, including ## Rnw and Rmd files...too liberal? Then use vignettes/ dir cat("Extracting HTML documents from", tarball, "\n") ret <- unpack(tarball, unpackDir, "'*/inst/doc/*'") if (ret != 0) warning("tar had non-zero exit status for HTML extract of: ", tarball) } } tarballs <- list.files(file.path(reposRoot, srcContrib), pattern="\\.tar\\.gz$", full.names=TRUE) if (!file.exists(destDir)) dir.create(destDir, recursive=TRUE) if (!file.info(destDir)$isdir) stop("destDir must specify a directory") invisible(lapply(tarballs, extractHTMLDocumentsFromTarball, unpackDir=destDir)) } getDcfValues <- function(values) { if (is.na(values)) return (character(0)) values <- gsub("\n", " ", values, fixed=TRUE) l <- unlist(strsplit(values, ", ", fixed=TRUE)) res <- unlist(lapply(l, function(x) { p <- strsplit(x, " |\\(", fixed=FALSE) unlist(p)[[1]] })) res } getFileExistsAttr <- function(pkgList, reposRootPath, dir, filename) { unlist(lapply(pkgList, function(pkg) { ret <- logical(0) filedir <- file.path(reposRootPath, dir, pkg) file <- file.path(filedir, filename) exists <- file.exists(filedir) && file.exists(file) ret <- c(ret, exists) ret })) } getFileLinks <- function(pkgList, reposRootPath, vignette.dir, ext, ignore.case=FALSE) { if (length(pkgList) == 0L) return(character(0)) unlist(lapply(pkgList, function(pkg) { vigSubDir <- "inst/doc" 
vigDir <- file.path(reposRootPath, vignette.dir, pkg, vigSubDir) vigs <- NA_character_ if (file.exists(vigDir)) { pattern <- paste(".*\\.", ext, "$", sep="") files <- list.files(vigDir, pattern=pattern, ignore.case=ignore.case) if (length(files)) vigs <- paste(vignette.dir, pkg, vigSubDir, files, sep="/", collapse=", ") } vigs })) } getDocumentTitles <- function(docs, ext="pdf", src=c("Rnw", "Rmd"), reposRootPath, fun) { if (length(docs) == 0L) return(character()) filelist <- strsplit(docs, ", ", fixed = TRUE) unlist(lapply(filelist, function(files) { if (all(is.na(files))) { NA_character_ } else { files <- file.path(reposRootPath, files) titles <- unlist(lapply(files, function(file) { title <- NA_character_ src <- paste0(sub(sprintf("\\.%s$", ext), ".", file, ignore.case=TRUE), src) idx <- which(file.exists(src))[1L] ## extract title from source file if (!is.na(idx)) { title <- fun(file, src[idx]) title <- trimws(title) title <- gsub(",", ",,", title, fixed=TRUE) } ## use filename if no source file found, title extraction failed, ## or the extracted title is empty if (is.na(title) || nchar(title)==0L) basename(file) else title })) paste(titles, collapse=", ") } })) } getVignetteIndexEntry <- function(file) { lines <- readLines(file, warn=FALSE) ## use the same regular expression as in tools:::.get_vignette_metadata regex <- "[[:space:]]*%+[[:space:]]*\\\\VignetteIndexEntry\\{([^}]*(\\{[^}]*\\})*[^}]*)\\}.*" ## match to first occurance res <- grep(regex, lines, value = TRUE)[1L] gsub(regex, "\\1", res) } getPdfTitle <- function(doc, src) { getVignetteIndexEntry(src) } getHtmlTitle <- function(doc, src) { ## First look for an old-fashioned VignetteIndexEntry in the source file title <- getVignetteIndexEntry(src) if (is.na(title)) { ## now look for an HTML title doc <- htmlParse(doc) res <- xpathApply(doc, "//title", xmlValue) if (length(res)) title <- res[[1L]] } title } write_REPOSITORY <- function(reposRootPath, contribPaths) { contrib <- as.list(contribPaths) 
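    ## The REPOSITORY file written here is a single DCF record whose
    ## 'provides' field enumerates the other fields; for example (paths
    ## illustrative only):
    ##
    ##     source: src/contrib
    ##     win.binary: bin/windows/contrib/4.1
    ##     provides: source, win.binary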
names(contrib) <- gsub("-", ".", names(contribPaths)) contrib[["provides"]] <- paste(names(contribPaths), collapse=", ") fn <- file.path(reposRootPath, "REPOSITORY") write.dcf(contrib, fn) } read_REPOSITORY <- function(reposRootPath) { reposInfo <- read.dcf(file.path(reposRootPath, "REPOSITORY")) reposInfo[, "provides"] <- gsub("[ \t\r\n\v\f]", "", reposInfo[, "provides"]) provided <- strsplit(reposInfo[, "provides"], ",")[[1L]] m <- match(gsub("-", ".", provided), colnames(reposInfo)) if (anyNA(m)) stop("malformed REPOSITORY file: 'provides' field is inconsistent ", "with other fields)") if (anyDuplicated(m)) stop("malformed REPOSITORY file: several values in 'provides' field ", "are mapped to the same entry in the file") colnames(reposInfo)[m] <- provided reposInfo } .write_repository_db <- function(db, dir, fname) { if ("Bundle" %in% colnames(db)) { noPack <- is.na(db[, "Package"]) db[noPack, "Package"] <- db[noPack, "Bundle"] } gzname <- paste(fname, "gz", sep=".") out <- file(file.path(dir, fname), "wt") ##FIXME: gzfile writing segfaults for me ##outgz <- gzfile(file.path(dir, gzname), "wt") for (i in seq_len(nrow(db))) { dbi <- db[i, !(is.na(db[i, ]) | (db[i, ] == "")), drop = FALSE] write.dcf(dbi, file = out) ##FIXME: writing to the gz file segfaults for me ##write.dcf(dbi, file = outgz) cat("\n", file=out) } close(out) ##FIXME: writing to the gz file segfaults ##close(outgz) invisible(nrow(db)) } ## To manually run/debug write_VIEWS() on the central builder (e.g. 
on
## nebbiolo2), start R-4.1 from the biocpush account and do:
##   library(BiocManager)  # check that the Bioconductor version is correct
##   repositories()        # check that all the repositories are correct
##   library(biocViews)
##   reposRoot <- "~/PACKAGES/3.14/bioc"
##   manifestFile <- "~biocbuild/bbs-3.14-bioc/manifest/software.txt"
##   meatPath <- "~biocbuild/bbs-3.14-bioc/MEAT0"
##   setwd(reposRoot)
##   write_VIEWS(reposRoot, manifestFile=manifestFile, meatPath=meatPath)
write_VIEWS <- function(reposRootPath, fields = NULL,
                        verbose = FALSE, vignette.dir="vignettes",
                        manifestFile=NA, meatPath=NA) {
    ## Copied from tools::write_PACKAGES
    if (is.null(fields))
        fields <- c("Title", "Description", "biocViews", "Author",
                    "Maintainer", "URL", "License", "SystemRequirements",
                    "organism", "manufacturer", "hasReadme",
                    "VignetteBuilder", "Video", "BugReports", "PackageStatus",
                    "git_url", "git_branch", "git_last_commit",
                    "git_last_commit_date", "Date/Publication")
    ## Read REPOSITORY file for contrib path info
    reposInfo <- read_REPOSITORY(reposRootPath)
    provided <- strsplit(reposInfo[, "provides"], ",")[[1L]]
    fields = unique(c(tools:::.get_standard_repository_db_fields("source"),
                      tools:::.get_standard_repository_db_fields("mac.binary"),
                      tools:::.get_standard_repository_db_fields("win.binary"),
                      fields))

    convertToMat <- function(reposRootPath, reposInfo, os, fields, verbose){
        ## Use code from tools to build a matrix of package info
        pkg.dir <- file.path(reposRootPath, reposInfo[, os])
        if(grepl(os, pattern="mac.binary")) os = "mac.binary"
        if(grepl(os, pattern="win.binary")) os = "win.binary"
        db <- tools:::.build_repository_package_db(pkg.dir, fields, os, verbose)
        ## Turn 'db' into a matrix with 1 row per package
        if (length(db) != 0L) {
            dbMatTemp <- do.call(rbind, db)
        } else {
            dbMatTemp <- matrix(nrow=0L, ncol=length(fields))
            colnames(dbMatTemp) <- fields
        }
        dbMatTemp
    }

    # get standard list of fields information for packages
    os = provided[1]
    dbMat = convertToMat(reposRootPath, reposInfo, os,
                         fields, verbose)
    if (length(provided) > 1){
        otheros = provided[-1]
        for(os in otheros){
            dbMat2 = convertToMat(reposRootPath, reposInfo, os,
                                  fields, verbose)
            idx = !(dbMat2[,"Package"] %in% dbMat[, "Package"])
            if (length(which(idx)) != 0){
                tempMat = dbMat2[idx,]
                dbMat = rbind(dbMat, tempMat)
            }
        }
    }

    ## Integrate version and archive file path info for the different contrib
    ## paths in this repos. We duplicate the source path info here, but that
    ## makes things easier to handle later on as there is no special case.
    fldNames <- c(colnames(dbMat), paste(provided, "ver", sep="."))
    dbMat <- cbind(dbMat, matrix(NA, nrow=nrow(dbMat), ncol=length(provided)))
    colnames(dbMat) <- fldNames
    for (ctype in provided) {
        cPath <- reposInfo[, ctype]
        buildPkgPath <- function(pkgs, vers) {
            ext <- switch(ctype,
                          'source'=".tar.gz",
                          'win.binary'=".zip",
                          'mac.binary'=,
                          'mac.binary.mavericks'=,
                          'mac.binary.el-capitan'=".tgz",
                          stop("unknown type"))
            paste(cPath, "/", pkgs, "_", vers, ext, sep="")
        }
        packagesFile <- file.path(reposRootPath, cPath, "PACKAGES")
        if (!file.exists(packagesFile)) {
            warning("No PACKAGES file found at ",
                    file.path(reposRootPath, cPath),
                    "\nSkipping this contrib path.")
            next
        }
        readOk <- tryCatch({
            cDat <- read.dcf(packagesFile)
            TRUE
        }, error=function(e) FALSE)
        if (!readOk)
            next
        if (!length(cDat)) {
            warning("Empty PACKAGES file found at ",
                    file.path(reposRootPath, cPath),
                    "\nSkipping this contrib path.")
            next
        }
        cDatGood <- cDat[, "Package"] %in% dbMat[, "Package"]
        dbMatIdx <- match(cDat[cDatGood, "Package"], dbMat[, "Package"])
        dbMatIdx <- dbMatIdx[!is.na(dbMatIdx)]
        col <- paste(ctype, "ver", sep=".")
        dbMat[dbMatIdx, col] <- buildPkgPath(cDat[cDatGood, "Package"],
                                             cDat[cDatGood, "Version"])
        if (length(grep("^win", ctype, value=TRUE)) > 0 &&
            ("Archs" %in% colnames(cDat))) {
            which1 <- which(dbMat[,"Package"] %in% cDat[,"Package"])
            which2 <- which(cDat[,"Package"] %in% dbMat[,"Package"])
            dbMat[which1, "Archs"] <- cDat[which2, "Archs"]
        }
    }

    ## Add vignette path info
    vigs <-
        getFileLinks(dbMat[, "Package"], reposRootPath, vignette.dir, "pdf")
    vtitles <- getDocumentTitles(vigs, reposRootPath=reposRootPath,
                                 fun=getPdfTitle)
    rfiles <- getFileLinks(dbMat[, "Package"], reposRootPath, vignette.dir, "R")
    htmlDocs <- getFileLinks(dbMat[, "Package"], reposRootPath, vignette.dir,
                             "html", TRUE)
    htmlDocs[grep("\\/index\\.html$", htmlDocs)] <- NA
    htmlTitles <- getDocumentTitles(htmlDocs, ext="html", src=c("Rmd", "Rhtml"),
                                    reposRootPath, getHtmlTitle)
    allVigs <- paste(vigs, htmlDocs, sep=", ")
    allTitles <- paste(vtitles, htmlTitles, sep=", ")
    formatVec <- function(vec){
        vec <- gsub(pattern="NA, NA", replacement=NA, vec)
        vec <- gsub(pattern="^NA, ", replacement="", vec)
        vec <- gsub(pattern=", NA$", replacement="", vec)
        vec
    }
    allVigs <- formatVec(allVigs)
    allTitles <- formatVec(allTitles)
    names(allVigs) <- names(vigs)
    names(allTitles) <- names(vtitles)

    # get any included extra files
    readmes <- getFileExistsAttr(dbMat[, "Package"], reposRootPath,
                                 "readmes", "README")
    news <- getFileExistsAttr(dbMat[, "Package"], reposRootPath,
                              "news", "NEWS")
    install <- getFileExistsAttr(dbMat[, "Package"], reposRootPath,
                                 "install", "INSTALL")
    license <- getFileExistsAttr(dbMat[, "Package"], reposRootPath,
                                 "licenses", "LICENSE")

    # add additional values to matrix for writing
    dbMat <- cbind(dbMat, allVigs)
    dbMat <- cbind(dbMat, allTitles)
    dbMat <- cbind(dbMat, readmes)
    dbMat <- cbind(dbMat, news)
    dbMat <- cbind(dbMat, install)
    dbMat <- cbind(dbMat, license)
    dbMat <- cbind(dbMat, rfiles)
    colnames(dbMat) <- c(fldNames, "vignettes", "vignetteTitles", "hasREADME",
                         "hasNEWS", "hasINSTALL", "hasLICENSE", "Rfiles")

    # get reverse dependency list including CRAN
    all_repos <- repositories()
    all_pkgs <- available.packages(repos = all_repos)
    mm = match(dbMat[,"Package"], all_pkgs[,"Package"])
    dependsOnMe <- getReverseDepends(all_pkgs, "Depends")[mm]
    dbMat <- cbind(dbMat, dependsOnMe)
    importsMe <- getReverseDepends(all_pkgs, "Imports")[mm]
    dbMat <- cbind(dbMat, importsMe)
    suggestsMe
        <- getReverseDepends(all_pkgs, "Suggests")[mm]
    dbMat <- cbind(dbMat, suggestsMe)
    linksToMe <- getReverseDepends(all_pkgs, "LinkingTo")[mm]
    dbMat <- cbind(dbMat, linksToMe)

    # add (recursive) dependency count for badge on landing page
    bioc_pkgs <- available.packages(
        repos = all_repos[setdiff(names(all_repos), "CRAN")])
    deps <- tools::package_dependencies(rownames(bioc_pkgs), db = all_pkgs,
                                        recursive=TRUE)
    numDeps <- lengths(deps)
    dependencyCount <- numDeps[dbMat[, "Package"]]
    dbMat <- cbind(dbMat, dependencyCount)

    # Add placeholder for valid packages, compared to the manifest,
    # that haven't built so they get a shell landing page rather
    # than no landing page
    if (!is.na(manifestFile)){
        if(file.exists(manifestFile)){
            file = readLines(manifestFile)
            fmtFile = vapply(file, FUN = function(vl){
                if(startsWith(vl, "Package")){
                    trimws(gsub(vl, pattern="Package: ", replacement=""))
                }else{
                    ""
                }}, FUN.VALUE=character(1), USE.NAMES=FALSE)
            man_pkgs = fmtFile[-which(fmtFile=="")]
            missing_pkgs = man_pkgs[!(man_pkgs %in% unname(dbMat[,"Package"]))]
            add_mat = matrix(NA, nrow=length(missing_pkgs), ncol=ncol(dbMat))
            rownames(add_mat) = missing_pkgs
            colnames(add_mat) = colnames(dbMat)
            # manually fill info for missing packages
            add_mat[,which(colnames(dbMat)=="Package")] = missing_pkgs
            if (!is.na(meatPath)){
                for(i in seq_along(missing_pkgs)){
                    add_mat = tryCatch({
                        desc <- tools:::.read_description(
                            file.path(meatPath, missing_pkgs[i], "DESCRIPTION"))
                        for (dx in names(desc)){
                            if (dx %in% colnames(add_mat)){
                                add_mat[i, which(colnames(add_mat) == dx)] = desc[dx]
                            }else{
                                # check for Authors@R and parse accordingly
                                if (dx == "Authors@R"){
                                    authMain <- tools:::.expand_package_description_db_R_fields(desc)
                                    add_mat[i,which(colnames(dbMat)=="Maintainer")] = authMain["Maintainer"]
                                    add_mat[i,which(colnames(dbMat)=="Author")] = authMain["Author"]
                                }
                            }
                        }
                        add_mat
                    },
                    error = function(err){
                        add_mat[i,which(colnames(dbMat)=="Maintainer")] = "ERROR"
                        add_mat[i,which(colnames(dbMat)=="Title")] = "ERROR"
                        add_mat
                    },
                    warning = function(err){
                        add_mat[i,which(colnames(dbMat)=="Maintainer")] = "ERROR"
                        add_mat[i,which(colnames(dbMat)=="Title")] = "ERROR"
                        add_mat
                    })
                }
            }
            # make sure necessary columns are not NA
            if (any(is.na(add_mat[,"Title"]))){
                add_mat[which(is.na(add_mat[,"Title"])), "Title"] = "ERROR"
            }
            if (any(is.na(add_mat[,"Maintainer"]))){
                add_mat[which(is.na(add_mat[,"Maintainer"])), "Maintainer"] = "ERROR"
            }
            dbMat = rbind(dbMat, add_mat)
        }
    }

    .write_repository_db(dbMat, reposRootPath, "VIEWS")
}

getReverseDepends <- function(db, fieldName) {
    pkgNames <- db[, "Package"]
    names(pkgNames) <- NULL
    df <- as.data.frame(db, stringsAsFactors=FALSE)
    depCols <- lapply(pkgNames, function(x) {
        pkgRecord <- subset(df, Package==x)
        pkgNames %in% getDcfValues(pkgRecord[fieldName])
    })
    depMat <- do.call(cbind, depCols)
    colnames(depMat) <- rownames(depMat) <- pkgNames
    ret <- character()
    bar <- function(x) {
        deps <- pkgNames[which(depMat[x, ])]
        ret <- c(ret, unlist(paste(deps, collapse=", ")))
    }
    ret <- lapply(pkgNames, bar)
    unlist(ret)
}

writeRFilesFromVignettes <- function(reposRoot, reposUrl="..",
                                     viewUrl="../..", reposFullUrl=reposUrl,
                                     downloadStatsUrl="", devHistoryUrl="") {
    pkgList <- loadPackageDetails(reposRoot, reposUrl, viewUrl, reposFullUrl,
                                  downloadStatsUrl, devHistoryUrl)
    StangleHTMLVignettes(reposRoot)
}

.printf <- function(...)
    print(noquote(sprintf(...)))

StangleHTMLVignettes <- function(reposRoot) {
    viewsFile <- file.path(reposRoot, "VIEWS")
    pkgMat <- readPackageInfo(viewsFile)
    info <- read.dcf(file=viewsFile)
    apply(info, 1, function(x){
        if (!is.na(x["vignettes"])) {
            if (!requireNamespace("knitr")) {
                stop("'knitr' package required to tangle HTML vignettes")
            }
            docs <- strsplit(x["vignettes"], ",\n")[[1]]
            docs <- docs[endsWith(docs, "html")]
            for (doc in docs) {
                vig <- sub("\\.html", ".Rmd", doc, ignore.case=TRUE)
                out <- sub("\\.html", ".R", doc, ignore.case=TRUE)
                if (file.exists(vig))
                    tryCatch(knitr::purl(vig, out),
                             error=function(e){
                                 print(e)
                             })
            }
        }
    })
}

writeRepositoryHtml <- function(reposRoot, title, reposUrl="..",
                                viewUrl="../..", reposFullUrl=reposUrl,
                                downloadStatsUrl="", devHistoryUrl="",
                                link.rel=TRUE,
                                backgroundColor="transparent") {
    ## Writes package description html under reposRoot/html and an index.html
    ## file under reposRoot.
    ##
    ## Links created in the package description html will use reposUrl as
    ## prefix.
    pkgList <- loadPackageDetails(reposRoot, reposUrl, viewUrl, reposFullUrl,
                                  downloadStatsUrl, devHistoryUrl)
    writePackageDetailHtml(pkgList, file.path(reposRoot, "html"),
                           backgroundColor=backgroundColor)
    writeRepositoryIndexHtml(pkgList, reposRoot, title, link.rel=link.rel)
    ## copy the css stylesheet
    cssName <- "repository-detail.css"
    cssPath <- system.file(file.path("css", paste(cssName, ".in", sep="")),
                           package="biocViews")
    res <- try(copySubstitute(cssPath, file.path(reposRoot, cssName),
                              symbolValues=list("BACKGROUND_COLOR"=backgroundColor)),
               silent=TRUE)
    res
}

writePackageDetailHtml <- function(pkgList, htmlDir="html",
                                   backgroundColor="transparent") {
    if (!file.exists(htmlDir))
        dir.create(htmlDir)
    for (pkg in pkgList) {
        f <- file.path(htmlDir, htmlFilename(pkg))
        cat("writing html for", pkg@Package, "\n")
        writeHtmlDoc(htmlDoc(pkg), f)
    }
    ## copy the package detail css stylesheet
    cssName <- "package-detail.css"
    cssPath <- system.file(file.path("css", paste(cssName, ".in", sep="")),
                           package="biocViews")
    res <- try(copySubstitute(cssPath, file.path(htmlDir, cssName),
                              symbolValues=list("BACKGROUND_COLOR"=backgroundColor)),
               silent=TRUE)
    res
}

writeRepositoryIndexHtml <- function(pkgList, reposRoot, title, htmlDir="html",
                                     link.rel=TRUE) {
    if (link.rel)
        linkRoot <- character(0)
    else
        linkRoot <- reposRoot
    repos <- new("RepositoryDetail", Title=title, reposRoot=linkRoot,
                 htmlDir=htmlDir, packageList=pkgList)
    f <- file.path(reposRoot, htmlFilename(repos))
    writeHtmlDoc(htmlDoc(repos), f)
}

write_SYMBOLS <- function(dir, verbose=FALSE, source.dirs=FALSE) {
    con <- file(file.path(dir, "SYMBOLS"), open="w")
    tdir <- tempfile("NAMESPACES")
    dir.create(tdir)
    on.exit(unlink(tdir, recursive=TRUE))

    extractNAMESPACEFromTarball <- function(tarball, unpackDir=tdir) {
        ## helper function to unpack NAMESPACE file from the tarball
        ret <- unpack(tarball, unpackDir, "'*/NAMESPACE'")
        #if (ret != 0)
        #    warning("tar had non-zero exit status for NAMESPACE extract of: ",
        #            tarball)
    }

    writeField
        <- function(field, v) {
        ## Helper function for writing DCF
        if (length(v)) {
            vals <- paste(v, collapse=", ")
            field <- paste(field, ":", sep="")
            writeLines(paste(field, vals), con=con)
        }
    }

    if (!source.dirs) {
        tarballs <- list.files(file.path(dir, "src/contrib"),
                               pattern="\\.tar\\.gz$", full.names=TRUE)
        for (t in tarballs) {
            extractNAMESPACEFromTarball(t)
        }
        dir <- tdir
    }
    pkgs <- list.files(dir)
    for (p in pkgs) {
        syms <- tryCatch(parseNamespaceFile(p, dir),
                         error=function(e) character(0))
        numSyms <- (length(syms$exports) + length(syms$exportMethods)
                    + length(syms$exportClasses))
        if (numSyms > 0) {
            writeField("Package", p)
            writeField("Exports", syms$exports)
            writeField("ExportMethods", syms$exportMethods)
            writeField("ExportClasses", syms$exportClasses)
            writeLines("", con=con)
        }
        if (verbose)
            cat(p, numSyms, "symbols\n")
    }
    close(con)
    NULL
}
biocViews/updateVocab.sh0000755000175000017500000000210114136047116015131 0ustar nileshnilesh#!/usr/bin/env bash

if test -z "${R_HOME}"; then
    echo "usage:"
    echo "  R CMD ./updateVocab.sh"
    exit 1
fi

DOT2GXL=dot2gxl
DOT=inst/dot/biocViewsVocab.dot
GXL=inst/dot/biocViewsVocab.gxl
RDA=data/biocViewsVocab.rda
SQLITE=inst/extdata/biocViewsVocab.sqlite

rm -f $RDA $SQLITE
$DOT2GXL $DOT > $GXL

echo "library('graph')
con <- file('$GXL', open='r')
biocViewsVocab <- fromGXL(con)
save(biocViewsVocab, compress=TRUE, file='$RDA')
close(con)
edges <- t(sapply(strsplit(edgeNames(biocViewsVocab), '~'), c))
colnames(edges) <- c('edgeFrom', 'edgeTo')
if(!require(RSQLite)) {
    warning('DBI and RSQLite are required to dump ontology to database')
} else {
    m <- dbDriver('SQLite')
    con <- dbConnect(m, dbname='$SQLITE')
    res <- dbWriteTable(con, 'biocViews',
                        as.data.frame(edges, stringsAsFactors=FALSE),
                        row.names=FALSE, overwrite=TRUE)
    if(!res) warning('Failed writing data to database')
    res <- dbDisconnect(con)
}" | "${R_HOME}/bin/R" --slave

rm -f $GXL
echo "DONE"
biocViews/NEWS0000644000175000017500000000425014140033373013036 0ustar nileshnileshCHANGES IN VERSION
1.62.0
-------------------------

BUG FIX

    o (1.62.1) Fix parsing of Authors@R for writing VIEWS

CHANGES IN VERSION 1.61.0
-------------------------

ENHANCEMENT

    o (1.61.1) Added Spatial, SpatialData, SpatialWorkflow to distinguish
      from SingleCell

CHANGES IN VERSION 1.59.0
-------------------------

ENHANCEMENT

    o (1.57.3) Add biocViews term DifferentialDNA3DStructure
    o (1.57.2) Add CRAN packages to reverse dependency list
    o (1.57.1) Add biocViews term Chlamydomonas_reinhardtii

CHANGES IN VERSION 1.57.0
-------------------------

NEW FEATURES

    o (1.57.5) New views AnnotationHubSoftware and ExperimentHubSoftware

BUG FIX

    o (1.57.4) In NEWS generation, fix formatting.
    o (1.57.1) In VIEWS generation, fix bug with vignetteTitles. Combining
      different format types could potentially remove vignette titles that
      ended with "RNA,". Do strict start-of-string and end-of-string checks
      for formatting.

CHANGES IN VERSION 1.55.0
-------------------------

NEW FEATURES

    o (1.55.2) printNEWS and getPackageDescriptions now have the argument
      'relativeLink' that allows relative links instead of hard-coded urls
      for the website. Important for the release announcement.

BUG FIX

    o (1.55.1) If a new package fails to build, still include it in the
      release announcement and don't fail out for lack of a version number.

CHANGES IN VERSION 1.51.11
-------------------------

NEW FEATURES

    o (1.51.7) Cross check views with manifest file so all expected
      Bioconductor packages get listed in biocViews. New argument to
      write_VIEWS: manifestFile
    o (1.51.9) New argument to write_VIEWS: meatPath. For packages that fail
      to build on any platform, use the cloned repository DESCRIPTION file
      to fill in as much information as possible for the biocViews entry.
    o (1.51.10) Add new biocViews terms ImmunoOncology and
      ImmunoOncologyWorkflow.
    o (1.51.12) Allow the NEWS file to be in .md format. R 3.6 started
      allowing NEWS, NEWS.Rd and newly NEWS.md formats. Adjust code
      accordingly.
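The manifest cross-check described above (1.51.7) amounts to collecting the `Package:` lines from a Bioconductor manifest file. A minimal base-R sketch, using an invented two-package manifest for illustration (real manifests such as software.txt have the same `Package: <name>` line format):

```r
## Hypothetical manifest contents written to a temp file for illustration
manifest <- tempfile()
writeLines(c("Package: BiocGenerics", "", "Package: biocViews"), manifest)

## Keep only "Package:" lines, strip the prefix, and trim whitespace
lines <- readLines(manifest)
pkgs <- trimws(sub("^Package:", "", grep("^Package:", lines, value = TRUE)))
pkgs
## [1] "BiocGenerics" "biocViews"
```

Packages present in the manifest but absent from the built repository would then be found with something like `setdiff(pkgs, dbMat[, "Package"])`, which is what write_VIEWS() fills in with placeholder rows.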
CHANGES IN VERSION 1.37.2
-------------------------

NEW FEATURES

    o Add new function recommendPackages for finding packages tagged with a
      set of biocViews.
biocViews/inst/0000755000175000017500000000000014136047116013316 5ustar nileshnileshbiocViews/inst/extdata/0000755000175000017500000000000014136047116014750 5ustar nileshnileshbiocViews/inst/extdata/biocViewsVocab.sqlite0000644000175000017500000006000014136047116021074 0ustar nileshnilesh[binary SQLite database content omitted]
%9OrganismDataBacillus_subtilis_Data( %?OrganismDataArabidopsis_thaliana_Data& %;OrganismDataArabidopsis_lyrata_Data" %3OrganismDataApis_mellifera_Data% %9OrganismDataAnopheles_gambiae_Data)SpecimenSourceSomatic)SpecimenSourceGermline)#SpecimenSourceCellCulture)SpecimenSourceStemCellbiocViews/inst/htmlfrags/0000755000175000017500000000000014136047116015305 5ustar nileshnileshbiocViews/inst/htmlfrags/topfrag.html0000644000175000017500000000044314136047116017636 0ustar nileshnilesh Bioconductor Task View: top level views

Bioconductor Task View: top level views

Maintainer: None

Has subviews:

biocViews/inst/css/0000755000175000017500000000000014136047116014106 5ustar  nileshnileshbiocViews/inst/css/package-detail.css.in0000644000175000017500000000121514136047116020057 0ustar  nileshnilesh
body {
  background-color: @BACKGROUND_COLOR@;
  margin-left:15%;
  margin-right:15%;
}

p.description {
  background-color: #CE8;
  padding:10px;
}

tr.row_odd {
  background-color: #FFF;
}

tr.row_even {
  background-color: #DDD;
}

td {
  vertical-align: top;
  padding-right:10px;
  padding-left:4px;
}

th {
  text-align: right;
  padding-right:10px;
  padding-left:4px;
  font-weight:normal;
}

h3 {
  margin-bottom: 2px;
}

table.vigsAndDownloads {
  margin-left: 10%;
  margin-right: 10%;
}

table.author_info {
  margin-bottom: 20px;
}

div.installInstruct {
  background-color: #FFF;
}
biocViews/inst/css/repository-detail.css.in0000644000175000017500000000065214136047116020707 0ustar  nileshnilesh
body {
  background-color: @BACKGROUND_COLOR@;
  margin-left:10%;
  margin-right:10%;
}

p.description {
  background-color: #CE8;
  margin-right:20%;
}

tr.row_odd {
  background-color: #FFF;
}

tr.row_even {
  background-color: #DDD;
}

td {
  padding-right:10px;
  padding-left:4px;
}

th {
  text-align: left;
  padding-right:10px;
  padding-left:4px;
  font-weight:bold;
}
biocViews/inst/doc/0000755000175000017500000000000014140322302014057 5ustar  nileshnileshbiocViews/inst/doc/HOWTO-BCV.Rnw0000644000175000017500000001356114136047116016071 0ustar  nileshnilesh
%\VignetteIndexEntry{biocViews-HOWTO}
%
% NOTE -- ONLY EDIT THE .Rnw FILE!!!  The .tex file is
% likely to be overwritten.
%
\documentclass[12pt]{article}

\usepackage{amsmath}
\usepackage[authoryear,round]{natbib}
\usepackage{hyperref}

\textwidth=6.2in
\textheight=8.5in
%\parskip=.3cm
\oddsidemargin=.1in
\evensidemargin=.1in
\headheight=-.3in

\newcommand{\scscst}{\scriptscriptstyle}
\newcommand{\scst}{\scriptstyle}

\newcommand{\Rfunction}[1]{{\texttt{#1}}}
\newcommand{\Robject}[1]{{\texttt{#1}}}
\newcommand{\Rpackage}[1]{{\textit{#1}}}
\newcommand{\Rmethod}[1]{{\texttt{#1}}}
\newcommand{\Rfunarg}[1]{{\texttt{#1}}}
\newcommand{\Rclass}[1]{{\textit{#1}}}

\textwidth=6.2in

\bibliographystyle{plainnat}

\begin{document}
%\setkeys{Gin}{width=0.55\textwidth}

\title{HOWTO generate biocViews HTML}
\author{S. Falcon and V.J. Carey}
\maketitle

<<>>=
library("biocViews")
library("Biobase")
@

\section{Overview}

The purpose of \Rpackage{biocViews} is to create HTML pages that
categorize packages in a Bioconductor package repository according to
terms, or \textit{views}, in a controlled vocabulary.

The fundamental resource is the VIEWS file placed at the root of a
repository.  This file contains the complete DESCRIPTION file contents
for each package along with additional metadata describing the
location of package artifacts such as archive files for different
platforms and vignettes.

The standard behavior of the view generation program is to query the
repository over the internet.  This package includes a static sample
VIEWS file so that the examples in this document can run without
internet access.

\section{Establishing a vocabulary of terms}

We use \texttt{dot} to describe the vocabulary.  For details on the
\texttt{dot} syntax, see \url{http://www.graphviz.org/doc/info/lang.html}.

<<VocabDefinition>>=
vocabFile <- system.file("dot/biocViewsVocab.dot", package="biocViews")
cat(readLines(vocabFile)[1:20], sep="\n")
cat("...\n")
@

The dot description is transformed to a GXL document using
\texttt{dot2gxl}, a tool included in the graphviz distribution.
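The round trip (dot file to GXL to an R graph object) can be sketched
directly from R.  The following chunk is an illustrative sketch only:
it is not evaluated, is not part of the package's documented workflow,
and assumes the graphviz \texttt{dot2gxl} utility is available on the
\texttt{PATH}.

<<dot2gxlSketch,eval=FALSE>>=
## Sketch only (assumes graphviz is installed): convert the dot
## vocabulary to GXL with the external 'dot2gxl' tool, then read the
## GXL back in as a graph object using graph::fromGXL.
dotFile <- system.file("dot/biocViewsVocab.dot", package="biocViews")
gxlFile <- tempfile(fileext=".gxl")
system(paste("dot2gxl", dotFile, ">", gxlFile))
library("graph")
con <- file(gxlFile, open="r")
vocab <- fromGXL(con)   # graph representation of the vocabulary
close(con)
@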
The GXL is then converted to a \Rclass{graphNEL} instance using
\Rfunction{fromGXL} from the \Rpackage{graph} package.

There is a helper script in the root of the \Rpackage{biocViews}
package called \texttt{updateVocab.sh} that automates the update
process if the required tools are available.  The script will also
attempt to dump the ontology graph into a local SQLite database using
tools from \Rpackage{DBI} and \Rpackage{RSQLite}.  The information in
this database can be used to create a dynamic HTML representation of
the graph by means of a PHP script.

The definition of the vocabulary lacks a notion of order.  Since the
purpose of the vocabulary is primarily for display, a valuable
improvement would be to use graph attributes to allow the ordering of
the terms.  Another missing piece is a place to put a text description
of each term.  This could also be achieved using graph attributes.

\subsection{Use Case: adding a term to the vocabulary}

To add a new term to the vocabulary:
\begin{enumerate}
\item edit the \textit{dot} file \texttt{dot/biocViewsVocab.dot} and
  add the desired term.  Note that terms cannot contain spaces and
  that the underscore character, \verb+_+, should be used instead.
\item ensure that R and dot2gxl are on your PATH.
\item cd into the biocViews working copy directory.
\item run the updateVocab.sh script.
\item reinstall the package and test that the new term is part of the
  vocabulary.  In short, load the data using
  \texttt{data(biocViewsVocab)} and check that the term is a node of
  the graph instance.
\item commit changes to svn.
\end{enumerate}

\subsection{Use Case: updating the BioConductor website}

This is for the BioConductor web administrator:
\begin{enumerate}
\item update the local copy of biocViews using \texttt{svn update}.
\item find the instance of R that is used to generate HTML pages on
  the BioConductor website, and install the updated \texttt{biocViews}.
\item re-generate the related HTML pages by using
  \texttt{/home/biocadmin/bin/prepareRepos-*.sh} and
  \texttt{/home/biocadmin/bin/pushRepos-*.sh}.
\end{enumerate}

\section{Querying a repository}

To generate a list of \Rclass{BiocViews} objects that can be used to
generate HTML views, you will need the repository URL and a graph
representation of the vocabulary.  There are three main Bioconductor
package repositories: a software repository containing analytic
packages, an annotation data repository, and an experiment data
repository.

The vocabulary of terms has a single top-level node; all other nodes
have at least one parent.  The top-level node, \textit{BiocViews}, has
three children that correspond to the three main Bioconductor
repositories: \textit{Software}, \textit{AnnotationData}, and
\textit{ExperimentData}.  Views for each repository are created
separately using \Rfunction{getBiocSubViews}.  Below, we demonstrate
how to build the \textit{Software} set of views.

<<getViews>>=
data(biocViewsVocab)
reposPath <- system.file("doc", package="biocViews")
reposUrl <- paste("file://", reposPath, sep="")
biocViews <- getBiocSubViews(reposUrl, biocViewsVocab, topTerm="Software")
print(biocViews[1:2])
@

To query the currently available vocabulary terms, use the function
\Rfunction{getSubTerms} on the \Rclass{graphNEL} object
\Robject{biocViewsVocab}.  The second argument of this function takes
a character string naming the base term for which all subterms should
be returned.  For a complete list use \Rfunarg{term="BiocViews"}.

<<listTerms>>=
getSubTerms(biocViewsVocab, term="Technology")
@

\section{Generating HTML}

By default, the set of HTML views will link to package description
pages located in the html subdirectory of the remote repository.
<<htmlViewsGen>>=
viewsDir <- file.path(tempdir(), "biocViews")
dir.create(viewsDir)
writeBiocViews(biocViews, dir=viewsDir)
dir(viewsDir)[1:2]
@

\end{document}
biocViews/inst/doc/HOWTO-BCV.R0000644000175000017500000000246614140322300015510 0ustar  nileshnilesh
### R code from vignette source 'HOWTO-BCV.Rnw'

###################################################
### code chunk number 1: HOWTO-BCV.Rnw:43-45
###################################################
library("biocViews")
library("Biobase")


###################################################
### code chunk number 2: VocabDefinition
###################################################
vocabFile <- system.file("dot/biocViewsVocab.dot", package="biocViews")
cat(readLines(vocabFile)[1:20], sep="\n")
cat("...\n")


###################################################
### code chunk number 3: getViews
###################################################
data(biocViewsVocab)
reposPath <- system.file("doc", package="biocViews")
reposUrl <- paste("file://", reposPath, sep="")
biocViews <- getBiocSubViews(reposUrl, biocViewsVocab, topTerm="Software")
print(biocViews[1:2])


###################################################
### code chunk number 4: listTerms
###################################################
getSubTerms(biocViewsVocab, term="Technology")


###################################################
### code chunk number 5: htmlViewsGen
###################################################
viewsDir <- file.path(tempdir(), "biocViews")
dir.create(viewsDir)
writeBiocViews(biocViews, dir=viewsDir)
dir(viewsDir)[1:2]
biocViews/inst/doc/VIEWS0000644000175000017500000001665514136047116014710 0ustar  nileshnilesh
Package: RBGL
Version: 1.7.0
Depends: graph
Title: Test interface to boost C++ graph lib
Description: demo of interface with full copy of all hpp defining boost
biocViews: GraphsAndNetworks
Author: Vince Carey , Li Long
Maintainer: Li Long
URL: http://www.bioconductor.org
License: LGPL
source.ver: src/contrib/RBGL_1.7.0.tar.gz
vignettes:
        vignettes/RBGL/inst/doc/RBGL-004.pdf, vignettes/RBGL/inst/doc/RBGL-005.pdf,
        vignettes/RBGL/inst/doc/RBGL-006.pdf, vignettes/RBGL/inst/doc/RBGL-011.pdf,
        vignettes/RBGL/inst/doc/RBGL-012.pdf, vignettes/RBGL/inst/doc/RBGL-016.pdf,
        vignettes/RBGL/inst/doc/RBGL-017.pdf, vignettes/RBGL/inst/doc/RBGL.pdf,
        vignettes/RBGL/inst/doc/filedep.pdf

Package: Rgraphviz
Version: 1.9.0
Depends: R (>= 2.1.0), graph
Suggests: Biobase, geneplotter, fibroEset, hgu95av2
Title: Provides plotting capabilities for R graph objects
Description: Interfaces R with the AT&T GraphViz library to provide the
        ability for plotting R graph objects from the graph package
biocViews: GraphsAndNetworks
Author: Jeff Gentry
Maintainer: Jeff Gentry
License: LGPL
SystemRequirements: Graphviz version >= 1.12
source.ver: src/contrib/Rgraphviz_1.9.0.tar.gz
vignettes: vignettes/Rgraphviz/inst/doc/Rgraphviz.pdf,
        vignettes/Rgraphviz/inst/doc/layingOutPathways.pdf

Package: Ruuid
Version: 1.9.0
Depends: R (>= 2.1), methods
Title: Ruuid: Provides Universally Unique ID values
Description: A package to provide UUID values in R
Author: R. Gentleman
Maintainer: R. Gentleman
License: LGPL
source.ver: src/contrib/Ruuid_1.9.0.tar.gz
vignettes: vignettes/Ruuid/inst/doc/Ruuid.pdf

Package: aCGH
Version: 1.5.0
Depends: R (>= 1.8.0), cluster, survival, multtest, sma
Title: Classes and functions for Array Comparative Genomic
        Hybridization data.
Description: Functions for reading aCGH data from image analysis output
        files and clone information files, creation of aCGH S3 objects
        for storing these data. Basic methods for accessing/replacing,
        subsetting, printing and plotting aCGH objects.
Author: Jane Fridlyand , Peter Dimitrov
Maintainer: Jane Fridlyand
License: GPL
source.ver: src/contrib/aCGH_1.5.0.tar.gz
win.binary.ver: bin/windows/contrib/2.3/aCGH_1.4.0.zip
vignettes: vignettes/aCGH/inst/doc/aCGH.pdf

Package: affxparser
Version: 1.3.0
Depends: R (>= 2.0), Biobase
Suggests: affy
Title: Affymetrix File Parsing SDK
Description: Package for parsing Affymetrix files (CDF, CEL, CHP, BPMAP, BAR)
biocViews: DataImport
Author: James Bullard, Kasper Daniel Hansen
Maintainer: James Bullard
License: LGPL-2
source.ver: src/contrib/affxparser_1.3.0.tar.gz

Package: affy
Version: 1.9.1
Depends: R (>= 2.0.1), Biobase (>= 1.5.0), methods, utils
Suggests: tkWidgets (>= 1.2.2), affydata
Title: Methods for Affymetrix Oligonucleotide Arrays
Description: The package contains functions for exploratory
        oligonucleotide array analysis. The dependance to tkWidgets only
        concerns few convenience functions. 'affy' is fully functional
        without it.
biocViews: Microarray, OneChannel, Preprocessing
Author: Rafael A. Irizarry , Laurent Gautier , Benjamin Milo Bolstad ,
        and Crispin Miller with contributions from Magnus Astrand ,
        Leslie M. Cope , Robert Gentleman, Jeff Gentry, Conrad Halling ,
        Wolfgang Huber, James MacDonald , Benjamin I. P. Rubinstein,
        Christopher Workman , John Zhang
Maintainer: Rafael A. Irizarry
License: LGPL version 2 or newer
source.ver: src/contrib/affy_1.9.1.tar.gz
win.binary.ver: bin/windows/contrib/2.3/affy_1.8.1.zip
vignettes: vignettes/affy/inst/doc/affy.pdf,
        vignettes/affy/inst/doc/builtinMethods.pdf,
        vignettes/affy/inst/doc/customMethods.pdf,
        vignettes/affy/inst/doc/dealing_with_cdfenvs.pdf,
        vignettes/affy/inst/doc/vim.pdf

Package: affycomp
Version: 1.7.0
Depends: R (>= 1.8.1), Biobase (>= 1.1.0)
Title: Graphics Toolbox for Assessment of Affymetrix Expression Measures
Description: The package contains functions that can be used to compare
        expression measures for Affymetrix Oligonucleotide Arrays.
Author: Rafael A.
        Irizarry and Zhijin Wu with contributions from Simon Cawley ,
Maintainer: Rafael A. Irizarry
License: GPL version 2 or newer
source.ver: src/contrib/affycomp_1.7.0.tar.gz
win.binary.ver: bin/windows/contrib/2.3/affycomp_1.6.0.zip
vignettes: vignettes/affycomp/inst/doc/affycomp.pdf

Package: affydata
Version: 1.7.0
Depends: R (>= 1.6.2), affy (>= 1.2)
Suggests: hgu95av2cdf, hgu133acdf
Title: Affymetrix Data for Demonstration Purpose
Description: Example datasets of a slightly large size. They represent
        'real world examples', unlike the artificial examples included
        in the package affy.
Author: Bioconductor
Maintainer: Laurent
License: GPL version 2 or newer
source.ver: src/contrib/affydata_1.7.0.tar.gz
win.binary.ver: bin/windows/contrib/2.3/affydata_1.6.0.zip
vignettes: vignettes/affydata/inst/doc/affydata.pdf

Package: affylmGUI
Version: 1.5.0
Depends: limma, tcltk, affy
Suggests: tkrplot, affyPLM, R2HTML, xtable
Title: GUI for affy analysis using limma package
Description: A Graphical User Interface for affy analysis using the
        limma Microarray package
biocViews: Microarray, OneChannel, DataImport, QualityControl,
        Preprocessing, Statistics, DifferentialExpression,
        MultipleComparisons
Author: James Wettenhall and Ken Simpson Division of Genetics and
        Bioinformatics, WEHI.
Maintainer: Keith Satterley
URL: http://bioinf.wehi.edu.au/affylmGUI/
License: GPL version 2 or newer
source.ver: src/contrib/affylmGUI_1.5.0.tar.gz
win.binary.ver: bin/windows/contrib/2.3/affylmGUI_1.4.0.zip
vignettes: vignettes/affylmGUI/inst/doc/affylmGUI.pdf,
        vignettes/affylmGUI/inst/doc/extract.pdf

Package: affypdnn
Version: 1.5.0
Depends: R (>= 1.9.0), affy (>= 1.5), affydata, hgu95av2probe
Title: Probe Dependent Nearest Neighbours (PDNN) for the affy package
Description: The package contains functions to perform the PDNN method
        described by Li Zhang et al.
Author: H.
        Bjorn Nielsen and Laurent Gautier (Many thanks to Li Zhang early
        communications about the existence of the PDNN program and
        related publications).
Maintainer: Laurent
License: LGPL
source.ver: src/contrib/affypdnn_1.5.0.tar.gz
win.binary.ver: bin/windows/contrib/2.3/affypdnn_1.4.0.zip

Package: graph
Version: 1.9.0
Depends: R (>= 2.1.0), cluster, Ruuid
Suggests: SparseM (>= 0.36), XML
Title: graph: A package to handle graph data structures
Description: A package that implements some simple graph handling
        capabilities.
biocViews: GraphsAndNetworks
Author: R. Gentleman, Elizabeth Whalen
Maintainer: R. Gentleman
License: LGPL
source.ver: src/contrib/graph_1.9.0.tar.gz
vignettes: vignettes/graph/inst/doc/clusterGraph.pdf,
        vignettes/graph/inst/doc/graph.pdf
biocViews/inst/doc/createReposHtml.pdf0000644000175000017500000032701414140322302017652 0ustar  nileshnilesh
[binary PDF content omitted: createReposHtml.pdf]
CSGBW&GqZ\e**m 1sI & ᛛ5 {t A/+/'"Qg/ˤ¦ ؕ~M9Ib*I#ѭ*;jY<4c̈0h*5ݙsdKU@d]M~MB H mJl!ʊSMDE5!*Y6gHg`6^Ґ|ja2Trٱ^wf +Zͳ ՛{+f9Fa!3S)owd"|>'|Blj4 {~!#Րau{_WJ߁O%HtlL"R2'xռI*˗)Ο!b4/܎.>P]ZAϲfaVW^HTxq#G ä1vi~WDxɋo=g-m=(uF3~#='-IR8?ͭ+{^Dɏ ~L/;-awzKRzoءj~QZA ےzOGoט1T ~3qo 5qUfV&53U#g4} ˏP8"g9M꯴-!8  yt>]!FTV  $üCMcì6nU Gg\Z eQe^CۤJUSPFsuo(mmo\ɤÂKL3_x%C1 xz.C XѐjgkF+ *Rɍ/:| @%H%[KPZYl6SG>%m67xn4"JhO̦Vj, !7ywbdu]Su(d:Ven5E{$n?wzΚc_A)g&vGGW&ZcŽؚBq.L6û5TƢFuÁF|[iW-~=b<<}A8oA?܃F*'zDf^3izM+}F4n(m/g71\/,EB:uy$´8$_ qYt%yb!{tQg{:"ߒS!462o': 0? P)6,ާYʼGȮ̆.ė'M^ L#2a3n[ _&:b\|26*(SSC$kCJ?.MzU.1 0ׂ DH9b0 SUW^e cK#csu7PWnhQ~q1@<֎zk;LOp?0duQ tuĕg} s `%8Z] Pwrr .p?ҝ`F&xvzBsY:As4]x=&ht鐤d·D VF;: {27;h4Iz!l:G4{9 ?15)]5bU~8T46J2y40ڟ9 &(G/ }j?Z9UxTY)1^( sH嘻!4['T *3Ի=3/R75~$Ӆ Cb&!ӛ [.w 2U`Kp=Ϙ4,jV^794K~2 ~^d2H -f%MD⑑o$|kܛj JJ1JUg2b`@d '*bH@)Zs-Lf 97lW](sX;ѝ?4-vo#tvD#cPHorTi<O*C/SxG'uwpIN7#iSc[+d0~~O$k.>of^pSU*Dj(.ql/v<T%BFe3&xj> stream xڍP. ]4;݃3 3;IpwwBpw$x[dwUT|O=3*LSWTcr0#SSkmN _@" sqظxYY쬬|18$L\AEf tB{8,_#֌;@23Mv/'!f }gllb qcj@'+`ʘV ? g7G E` 2^<\@GuY=Ÿz`cf;_޿8AM %d (K)0;;3L Ml /&& [?27HL^ <'3Gw,ütYl.'r݃ϛC^ w.,` PV/?2K3 tͬX~d-~bx)| {9Ύ.@+!AfS%O1Or w> _?eѓPeubbww_Y* LKIk6K BZ gb5{ycf.MH5d g]_yjYE9&/s ');\lf'Yk2[q*&6VѽLpza*`3 c8:x  2@?8 `aC_\/, ȿo"['(x, >߈z  kYA.  ` B_|I_%ӿK1@Pj /c_@3@u]pm(,v:ײc=:B ]Mkє/}kW"+dO^G amIޏF jKSxCEG$HL"OZ60ry.*8nec (e3L1M (H1殮gs'}c8>zm{Wh;R\aMx}K_*<^$mp$GNke=!ߋ q|I =G<o@:ѕvlS%$b6{w};r"7]2Gch}혆L˯S⮾TunY 9(~2+"R Ӆ{&de;F.m􂔆äEW۔{zҝf?5jtn|,me*XDonCBNpShv8]ӻXtlr5Ⱥڣ$ˑ]8yI12Yg(yM+:F(wte {ԣnr-M&VN)ޝIgnc3Btg<9p5k$BemuLJwI*xV%z֎'@+nvHMuaOpf n˯*4ɨ߀d0 P>&%xEJߌDa.3 }A+҂ea0n f`4A9ǒ|T*O7J,,* {^bs0E 7y׎-^]o [6õWRFny#K"[~⴫: Bek&/&'eGf(^滊D_U n Y |̧"V߀~us9pd]E}^Ac{POV-s"J|05)լ,ʇ}^w<#GX$D ;7&&b$(̑CY& MHN{ESDII2g@Z& #q6")e! 
ͧ3++|j 8;FQ Ңק kjm :f@g~Zб*3eyBcͨ>|޲%߯BGV#T/~ BJoL~mH05\64f4]9o4iğ4$D䴺OFFkaE{!Eg} (+SkAgi{6)N<̔/nq#]BncYtOz_&0ˠЮ!B 1pUPn-B4,>8fa_-k :4u޺ц}Dv3\7Noa Ժw8~wQ< }'P`l:W ᤡh%[Gjb` r*.JWauK΄bθFvT8,BM<9Oo3FP!8wWf؋D+ *iP D^$0 Jҕө-Z 6/C]}vh=AdAز`|(=[[f}߲÷W4V_b8f-DBr bTLkd`) ϭ/!ɚ+KtH:Xv7cdXE=o=rpi+p?UNN~POՊQ艋P謁TI0@r|X\\AÚkf56S}`tU_Jwޒ0츮Qk  ojk |wꣿΓ5OW XByAp {̰mjsި!/] NqjܸF%E~ϐuUMf*0lCב$RCQErFL|&zpa.nꩤ(_HۉJO b3I4Ado(geB=zq`AeO:,v)j:v[kFwAoӸtܛKq[D]>O e1wBkIt"9Ʌ]a=ϖj`B`}9gFp |[12L@`26iv}8=EU_~ |31mGl ɻL10>vX1ق}klea*,NcuBZ<)6^r +zP0Zb00nvSƞW볳J^ay\g&kVL蚯*$7# NVjRc3'vFI35٤rP~DNJ=) ~LIMC`z\' dI/Gv8 >qnS @ޖx[ml߸V1g@\y$;-"}X S=&bS=): 4NOD+ G Z+\M FHC"jT-(E#,ґ,#|˧0dƳ8r;7 듒]?yLFHGs-Fq c&HTv)`Q+ՏO!hln赜#Q~h޽5Vݓehm0M2,WwVj #w*"kb]V|U 7nka$׭fd.{ yZW*T,_bXȿ`6W(;hw\m y&,;S+c >kS3کtjZAY9PM:}K]Bl ^-Yw`#B~7m/{Q7Ab+ א] R Ғ8*aIԡ:SٷWu]Q(ܨf/w)6I֬L3mԏm[02>-q&̖<ϯXĮ³oK\L3?KԃjoĨtN}(MqԃEOoxoףīC)#k/ ,?-I$w qTL1'2 5fQ-~N7t;߲*P lE꘮g#sgF?ta#+j=W-k`7y`H7~ ӟ8< eȝw+ߨ2(滹Ρhcф祕G'5yeFvipQD=9J.SX0c)ƗI"v)a!æЂpx>7ځ\oJ{+uacL s#ȤlV42Bh!;p>̫ ];~~Ūmȥ0t9ǓTi8f,Oj; Q2# 6/ k/OF4hATk^I )#M󼺷IF49l+`F^ ZHΛNjHq.@.Y&R/O\2D!_P HA ٫I;EN4y~<|1Aȉdu7Y,_I.ws.WӤK +C!L$/ƜWgXt3=J49zYlL!3 80b#K:~~6PS!Cs(%j\D,ޓjZ/I(Prk@=G>b&ӎ"I‐v+fG=eP`4x< {Zl؊ˁ{9Au3j!ϩNgϑ-u gu 3&<' $2AO5âˁ0qL%'DL;6{Ϟ1B ;]{{a e͖֩J|1T []9@IB0cRߟe~u(}h4p}Wk~:fJuP^]mq-jgDNlg#(9G]M.圱qEQKpHlɥFHrfQAr,BL ~+UIf7~AQY_בX;mױUǬ٘p~Lt#JK,jQKČfyMY\9h`wdtck:m(7z$+e<5T= WG 0I#5&,X~ԍ'C}/V#GIFV9&'׎ȝQQ-!yPk,N#}~r!I`j+FP@PM3hsj.uqӀ~- JJe?&ɐq6;z`.P1 Inw;Sp!O61otѻ ])4oW4rr2Lbƣ?m$5}S5$ұS놌rd`!6ªTn]ݓB&ΆVz`_}j#d3#Ioz]tZ˄nB2q܊ȴ~_+F4G]SG*6gm`jqkNa5E~ޙ{ L*VR+ۈkkmQ@';ZLīd>ZsT-GCU"pebIXv Ls>UڏAr(jW^2lGB8h#YX#RV&:š!ep='TsSS8` 7T__ut{// QȅC+L|bwHWZwH96k&5XUv-!UX|{DZ9@^`/J#bJ2;_l%r>G {-TY'DrZ?faf5eF nzv bR~"eڸf>odKYCWfֳWLw>ITZg=E ẽ WH^ *fәZ(&<ABrH07sp.Xg.N''0+ZXy 9ˈFdPhq»ԋD]Ȼf>贔ˊoP5p0n&ʕG8yui'Ql伞L+oE`0jr=Q3\2 :s7HW}l|X{ Sީy#znTfAFߝE.zh@̚rJ"hUUj}Q~ H0IX?گث]:DI|r/Zum0E۲kHTsI-1@̥JD [)2yҕv𘌒XM|TsD闷PJ>LEgو Y%@>7 ŊTe^\ǘk=|+>r:a`<1 Pw&-ԙD;CTd:\W(ygۑ9C'gܩT"}i_ELd,lYkaH'=r؏3͈G{ˈsEt(#][+kv'יӱx` 7, ł:_Q^ 
3D#KS֛!>zҰ=U#p0s=˯Gl0Ζ1ꉈh ΰP~v$V-~#8{'fP.9hig(=͓Bkf]kʫSynơ1}dp/~]uy-d endstream endobj 85 0 obj << /Length1 2270 /Length2 13975 /Length3 0 /Length 15329 /Filter /FlateDecode >> stream xڍveT\۲.N݂64AwwM!{ Aܝ}{?1WΪLEI(bbcf++hhXY9YY4Av?*-8.@ЛLFUttȹ8ll<vVV!:$Lܭ9G+-<hl||<D.f&E-@/V ? +0#dP]܁eLnj@аv[h0qvf@7#7s ->@]Vt7ؘqߎ2613sw2qvXXR O#7W&)QU[ jmJn-`.hot"Ohy/v0]PV?7%bee};X3if;/%o[~>NNR~Ww  0ZZ; &Zߦ6lߟ fQSQcbb&vV; oG*&I_}kAϞۗ݀{2 o/srA?abomAm)7U+ͭVdoUhb 2kLk^;;ktoffvM_**wTI3G; 0qq1B`}*v..r=g #VrsXDF<?"aHA?NzS٩A?qXA|o>M4Q0ecbbZ%iǘo-_|>ˀb|Kc1rahgg/[Lj~#gCxk?7rt,8x[~MyKOo[y9Yo2Dm:n/? rh{ߺb=,BqErp7}eX+k_?)pz{?9ױpG߇_,No'o=rs#[GA@sMDŽ v&V"ET],nom̽/o-݀c37 A]oK?) 4CXw4i%`ǭՍ!|m9!g-oZʨp>"$xx"d-X73,fL)s?%kAR,^zޑ׀9YO]*aq pG{CgeBjxy:>2v0=6r[h[N01濉!s̆K#.L uy{??Zu˿XxnI2ٟRȅG35DBjlV6NCxb>)#JGw.Op yg1c=<A|YE #0SvI{RIuRA(CrZg@FDhHecyB3_IڬCy,I) }.v9~ozU5^Sl I:ާSM$/Bh$3rtd(hև}l'gH32~z`c^]P5fU "P+4fXkQGh:!'Xu˗Ye.y8o0S$KBOT!43AOc"{kUy4v05D6 #L~rcaވɉ{ iOW{%9L㛊(rmh,a7,ψy lFc)7NBkd<; }Ѽe1bƧSg6sTDa}9!#- gt:"$pHL&t 4p^SɅ_z"ݟƯGj(Ȏ&;Jc6u~b(m m( "3}oj ;fwD\K/w޻'2 ұ34NӸiS̢!sI2eqQx @qS3eVMr~}Ǝ/Rt0I`jl+%b=ӱ= A-6gJ]D"-/\p%.B,cV1(.3k Oc Mq|gar*&NBpxT:0G)'?UC~8f4eC?Ѓ4ZRUm7D>ʬL+0@mKE3V`Ӿ!_PU 7HBszӏq?дڙYꉟMTI7:{w-JO08ns𠂡?L⮡e"nSVr0NŒTdl{JNI $]=GI/O)_{ޜ0w|b1a6G?0$ϛ2@,2P3gt7+iv‹8j_ 'YiC"^J<$5^s!..nAmVq0f'~_'Qh*Lp)mDN3l7 fϜ;5NyeNhtIbV9mYaX P27h7#K]Ce=<} Zxviф %Mr-#4&Zûzրȹ'7I$)M^Y.rSe^#"'UEyetoEeS{LNUVx#tG4ywǎa`>Y/[oie=L7kF]Z k]禗#kS:F)_7Y=S2x>I^NVG#|n+- b-[nrj=坝hJY#LLD&C^v)߭T4-Wc 8ɩma$]ʏ[Gg gTq@e(~^T1S!I$dAH`9(=E24J`s!ԵK[bV+/(т}j̵.Mܩ;sMɴy7U3}򩜝HCj ӟ hDo !&a 8RǧD|6ATb bL蠸7J§RӴz r ]H9оo_daھǒIÙOߏA# f JtVʐD`0 Y6VĹʸ+ 9:zp-w\nr42_JJDSx`nBM If9}Aq)"SW)7aFܩf.7(xQ%6!~Qut90vPTQ 9)sGD1qtp8vgME+@s*?9e \8EeK:Y-3($^pGntWqGhKA'L&ڴ/džc^k[O0 k|Sdnu9]ꑱIlj㖴􄯯fƒ9{pߑt+w67:')0ÔϬ}bXu Dy lL>=" wxxV{AַbEF5mc=egTh[0%A 2 Z4]Qq枿Zwg< ]+q󮎸!V v>\9_G0c{S8.d/zc׾ϓjA3; L%y)y]^-0% \{O7F| /f'3ջ\0*psˬH$osqUT: Cm#dt3m(֌_kB2$6fDŽ25)CEa5&m"G~֍=oP6,lT,:Ce$:ʕ;IF2y oV+.X:=B35_^Em|Bxoi*HGu < y64eҟo\g%k Gwߞށ(؈ffm<dپY1^\*<=eiҋQ*)v >:-,7T4(C lwKgIgEH}{ 
w=E;J޴/`3k*9n͞Ȼ\›17C+ú\kcjaNSw+QYE *}>&aK9b漺b |}(A9}VX\:^v7Hb$|%]Us%a^uW(%H*q+zLPk8.6cM{W˙R{"Y[I1C=8Qh 5P"Ck2*\IL0eɑ#_*, !pmB'|_%]UVףCHq@ݎ-39h2Gd_5 ld1zܑ J$4I2FL'Wػ/GߋM|[bw<&6TڮI3&[+> ,=x*3yMH;IO?̌ȹ0?i&N1'ӰWwϛ!D:IBKYIOP+v\l:1g}d i#EڏoP,g[׵,6>YjsL;̻Y"o@(aprZ6,ϧzI/`7D&3< 2vb 8 e/b@x&!VDJ5& <+ka8t uXzz#|x>=&P!srV0v}wY?.Cq Qt/ dKL}JōkC"NnTIL)m\+[KJb;I +e~/(l'B _v %`(PͦbHx5?lcOue HYb47Q ªE'RL=XdEF+DGPqS}2qU$:TnCw=`R9wdzSr!~7c=+wP{^ң1ڦ⦡Um|-wxIDg 檽DV8 |g =̝ތ !$%Z] S0vqWL,౩&Ad1'avVTұFMp6aK(ŪS#E+UlJa A!zHeD^ {_r- pEA-.{ǧ̰K)8!y" aP{ .or9Gj28iDB%(<үrIHH}!;U"'c{ԏ㚏?jb?&rx.O Pw 9N6|?a(S{g%͢V}j5tR[V5\̸ͯ%DK>͔!0sf{{#hۓ2F|gZ@u9֡7%f^u|%*3bLW6E%2g gҪܨ5rrGJ^d&yv)Fǎ:/\L)صA,dNxW=|wKOiudTԼ [tЍ}8VOpX[zɷ5(s|Cu·?!l|SNB `9&C%*ju) DH%1i|x  !X(Z+i1E> kHCM&`aahLs^*&NĹ4Hf]e{Ι}" նfjp9VZF?&0aa2)`C<'@7*W85qHj%@Lcz"Tl^wB#k ekBs9p;,+|=^O~6pΉ,+dٛ>|bW;*:VfS왜Fc6v<((>FWI$7Q·_Bin3 H+vJ!');Ό ĕ;}opvy3pf~-Çt%^`=)2RyPUSET6nkb7xǤWg„ **<3f[udZr*Bn/ ï̧B.lR1]C1IkHp[ =礏vC-_qŷ8-5`pՕWDثw ] x>ABPK sT_P,!1W{O^Wr\M!P*t 5OLJέAv볞kV:~sYO_^k!ir#,AYݕ1[\kgxwJD݂V^, CϛmrS=*38G`X MLެ K?#Fƥ\B}pqaSUNEgK[@;=wsuAU~0Jk{_K(5g6cyEm{tW6j3cHq}A.\pP9›󴉛*x Q,V1z1=ߤcbt/ߥ @(Hܪ 9cG8:qAMG+hU ϹfJ"~?' :4#%G^1#qf9ˤUxچBl$mqMf bUb6-_qЩwR;*aU6k.\ bŶylai(pxSvMMVK2NUMƅ''[UW<'@?de 7U{,;yG zظl`K&}6hF1 e6A7Lvr矣sG`@F?J۷;)⮪e6w~GN OqHsO cLmAbONc:ByNn({6l z^y$W| :i .:uE u'i!أ$2n bŔq7މ]ޤqNrswt=4:~Z *c9' JŗN^WD#Z]6!Ms$_ WgY+Y DM)qMYÎd-I:u"A~U@ʯ3/æ_jvhCvݔrϤm]ҵ4-9 ]~ mltb{.L;<_D/5_R?+f xYpVz4"CoU㝧#iع]/v˃c.oZ .Kl>1Ñ:s/%rƋa=|X@Sc;Ko^:h4(q7\pkc+B%ua ŃO(x]t*;/ Btt7 $m?nhJqYTAkc-SI~9vp! ,aENG⤇j?0?^3 $́N1$|du&5׫& n(1qH6Ln'>/;tW䉸e﫬DZ Ԧ\ט_ e:آqՀkRh /jHDr%<\D/dBh 6t45VgLi/ywHHC:I QZ3^ k^P{^QTA9W)f k"5^pJ} H!empoEu۝#^3%qO;y&̪XJѿxMHw|sV2~dp2Y$2 +&bRy/?F^Uj$Q+XRxjq bd j{vN)trԅ+BljOO^$i51 3/`۞rPEH nIWoek(ϸt]ۧb5۱e|QNx":h'c*mnC{RVd+1N>HRp? 
kt h} v+SeMM,W.U^iS3ئx7G9NgHZ+)VQePF/tڠIݤ⅕5YbbkK0a> A]#KC'@\BTahl@.,crc#39 arFU\uZ$q5 V2=tGx@Z&z<,)|\a$/£!^ݩpw1誦b6R[akPҏ mM$BɢLIg|ӒVۛo%.BlÈƚDҔç yjXI t;&i=5bF+vVe8w9!VrCvSofAUA+urߺ`wR^q71usȈdrzqR>n.ἴ-ߒX#Vx[`3|uQBppCX;EEɷ0?I4G;eG+tZY|cEa=!.]F ɻ\ԫt{SEF-t%tcZ(ȀE˚Xs stZR,!u!*|df9ȸkؔYlT"*+Aă+;J , ELbj_1W҅mԝ#GY]Y4U3霗AaڤP^P5?n;<>^hXfz~k5"2(*`tyf$noq HkO[EJ%1w Hz% (0$xf-Fwj:kz~08,Wt]J>IrMl %=!"iP^Ssߐ^\C!)]CՏۨXe8 iڈ/L2pdtVu(߫/OU4B'Mő[RUK͐5<`x-y$JUሧٌk]$吽v0=H_>YJ's/Y觲~(z$ԧ@#2ޘc̍|SFkq˲w$:".] dh[9FU+֜+J# QCi.Hĺ"ǨA+q^BA:a(_,T0dQ]; +\=ior~|rC`nTR]Bz&ƈH9Y#qp*gz(x-!$[mWQJŜܺ% tG)h ,+$UƯdgkYeٶy!EoSd'I;4nOޕN͡p#z)e7sVFln7e(ꓲrT#Ip! [G0|7}\Pq]3chVh99}>/Dҷ0_Y9z/6Y0!vX݂5 xE0Sg! !ں4D|a>M_PMQLsG=CͩOEnVŵcs3/z4,G}n%L0&Gn5'$7> ͱ~d6226VQqTMi0K[m"*eTu:?k-+eq)zl߱벸d?FfmlX `IP+H-8Hڐ)@NSQ>}0X&5˽u퇚|E2C@߾4E2G^{O"#Ujl{Osz3s7-ٻ)Sp41uܛ/AaFSV'8&MzM50FZy{$ðĴ`:~?r ^t'߯i/KGVO[ďeV40P3̝JgGFIAV$_-VW¹A̱0Mn &"nu78E߯=M]'n\wiBy0dtP]K:k@_錄^]V͍+O`}"M0.E Ofo$& 5J_~-,8y"rh%[ÿ9뱯VNFs,&۵<+*O/fn`Sv*[u3#M@ýQJY<0imqxVr'l$Y~K!29h?{\mCt?Ȉ)P ]x+epUgXX])U ZœLwx&IQ%Ohgp>|$G2Di?p~Gԑ־.% nlsu6 -̶RRZd7  Yo&ֲ8S-y!Ŗ۹]ЊFQ[2 "e=#x)d,_~[-o{?2vFk`1,BngLԀ-g~ZA&ޤ}AG= g<(we3QZiY'9s>h+{ƍ Es:婝T > stream xڍT] wC)\ww(Z $)Ŋ[q(w).-^_yy߷ֽ+k%=g9{N uf1s)H qffgaH(iȱs؀,ll44`g[4@N`(_ GM]l@;7?;??D#?@lPbC! 'T #f v>>?bv G dl{hfb PA^͍ٞΉh)p;[@N GW9&v[cAhXrC-LA- qz q/r _dſL7tG 3 jgoC,`[ུ"3b oj51}!t*ÿs2s;;8m4/,1ف NO2{wֿuxY!abϪ ;$漘PY\lll|nfG {ПN?/=xC/m|T/'Wo"Tvv9` CPbY_c{; ^fzCYe4dPw3`ggx>G:X@|-ed׿5@072E ?Bgb3{yc,?C,B+vOob\)PR@br&/ |Q43;' _v4dv6K557[0uqüDːټ"N/z+11l\GGTEqpq_,KGJapXeAVU鿈 za~/{3LA|V?g5dVBN.=|l/R_?R*+_ec9/u_v_?%Vs^N?+r.C׈ucޝBڑz㺯(!x{ ,Z݋9(}n&`1Έ'XjhG0f;vRtS5B- N ]029}<RuSiuDt‚;D D Q%^+63)Gt!:#T6r9 x+# SRDWNc_Xێ|weH|d沿dšH.U8̲<{U0_c!SS1`eVuuۡ8sgfCz#ޮF<_?Թ^UKÞve ,1K( $32S:cV4lv_8IoDbۤK UOX;gTXM. 4hvG5c'JiP>H H& +) w[N#%"%N ?E=[J$J}I܉)x\*Ճd #_Ɲĕ bQn~_GJo{`%9wMl=t<:q. 
&ƇP]]۪fhN.Cc#!7Hc-غ~ͥH^W[ 2@֊"j~dEu4O֦JN:/gmϜzM7:mI)0qF#k*(H.7ƨ ySY,av+|||#O`=f~EYHU]$F",8h`)^٦r{h|T]fl/CkQ+xzSdjU.Qd*&3l'D9N?XWZϖdggJfEz*:;l< `H ߤ~ mVD4)&\XQkݿm!&E{l[4"v(v[\<"u/Rh' !BYmdž181EO><[U&% cʼnT١t|Hl0ؠ2A WW/'hmO {iVD߈Ù lyJj|y3gcϼب>>Gn CE1%ł5$>ErE;F d{4Ň3ot.,o_Kc$[*sDAѰIz%j]2+TTy4R"[׏8E~(P>c{);C"7ɒɌHl+GH霶N[τyb2uTK[z/viuUKrImڵ їxl(of"s8Bk[nNfW5;MdV lpdƆ84m ]2wB7IؖkY]=+sRW|0cvb+UL9ix$u_6`IY2fJl (2VP@n}StdMZSKvNUnrOb Og;aRkdu7V]KVtp^؜zn%A No-јt&' ^O],XB|`ڜFĎjpvai>h^׭ɗdEx8D/\`F*s2 r֘DfYՏc^gT K0G_XFT*"!O_M?%<'tEk*?P$r(&|`|­wn]/a ~ydA]C0yOZ53eS\={wB^fُ(q-G]Dbj~: ߚ_؇+{˭a^9]m*yJGRܟ: fn;iA"ߊ!_*rp$_~p'eҜᏃOQTp}J>S &[}MzڝZ?,]16V‰`B&Tp e\21:S#G!Sl'*4OQ}<w p]ZjwQ`SO;4AwF=ww߹"n;3yc pxyXʘMHvxtoN Vig.3"ô?$]\|݌oߎךL fD1|Ջun ,ކJrÒbI@!͎4<3)>4ݚ1;EɎsLTʟaWeıI.(A~m S塰,wNY0? W9{ӤeN)Rs':pzGnuvPapW/^#|qY ?W2o3$I6SkBP1!hxuJ?15GF=a7Z jCiB1BQ*+M1LwmхX`dUޑG70@͡y-B5Dg \~]ϯ|#}W !0)OW+o)位"ިLzIخvS[^JU Mm5?|䍰A`0t<*1rYP/ xd:S>mڢÕIXtf]d5c-mBQ9 ]t};ڇ:Tk;*_r|@Z Webi=";E!&,Bje>,AKoEvaBPpz`^J^ YLKq:3s7Sq9q֔S[ί{MZ:n{9ޘ_ꁟƥ;' @R!/V'\&x3뺂dL,gBҋFo^}\w ^*g04Xdn+%2z -Ń'%E0~w̭ $V1u!iWuIPV;􋻆lrL ﮃ}+2RsuA+L~zq"kR9FúMI(>zn_CN⊆jCrXH@7^B'kY.W#bE DM.״4_vԈ"Ngȟ|;H# b?^l(!' C}4r8yrA;Iq^A)m€8b%CN8K'HiӻaMt' ?MjZϊhȲg.Ȋ=tVbp&j-z yGWNUJp.`ήx*e)p8E2v33,^oE2t ^fa̭<]xpJ\ [ڎKX& 'r1)8/=hX+&\ekd>5'~zsrVw֠m{)ͬ]RWDpaq^G(Xs IXS=FCG=gU%mf!wP9M әon'7űX$P$nU[)'. m|tSS"`-: -鈉,42軹'8'Wm#}ΕJe;d_5k]~m~>F8VP+7|ItC&r8B`Z5G'*^B۞D+Z)o%ג {8Ο"e0=[X\xpCZ9TS2`TqR9` H_÷M9*e^^pM}o!NKs)݂EQ-0J8D yԀF~g)$6Se< OMCwWIqLEVsGg )Ҙ6O$⶯G8v\+znvK2>hpx._8-;<"k2(~U#{D|}Ń?, K *慻 ?ŒƤy+*^Y7/Ąs 4%oNr0`m,48K:n]NIU`򰲨,Z)Q{bF׮NI &S;:@I֭DPc#M v&\lh |Jp/8:<{MeOW벓^ &u QGA ț᜵.N-g3~q)#$ۊ# Jb{Q^'ejSZ%Zm,v%!9āis&cMQ2VflSt=ԍtW?PvNMͥ[H+(l_As" :Wgi 1]ud?QŽө`qyBa[391$S(A7DJ;A(/xai639FO9bzY4{,:6_Shr9B\;'חl^l.Tl'Ƽd3bs PM-g^3T;4a4k ^k0g4֘O.3. 
c^PEcs`{?]Mg4jThȣΫ)W@hw@Ю3,p"#L@ ~t1cfE\SkQ|ꞯIi^l>l Ox s|BZD\t /THu 7h|8ϡ뚴+)k?&=-sgyHb3Tt;4FY1>vad|=iȽ1Br]R/0Nw)%R Y.Ӛ|\LΆ[52 _LDKPf9_f)2Lya!\a7N6[Fj'Ե(Y`7Ah^T6gvϼ>#OrMmB]*LdV)RȤӷ?('ܖr4}nM0~h,C٢i-1v,ʖLмl䁄f]pJK^FSIak#: *JHdo/[&uDUD6`Oy#9.|qxeh5KAM7i#?T?l]CٳTpfweaՏoOqw LKuE6B45ؘEh(b'7Ek k,Cv#$42Qn2z \opoa:OvZWRIIyYSHÃ;Lo:CrBUufQB%/ʨ7jY9*K"bԐ2/ =6$kt_v?O|DZT؛#ќ3ne6[lH櫠DڲqÉfogkdQ X[:C2*8i3VkY=Co UtU ƳY|LUv RIP}+wJJ= sYP=34$LdT sEkJŪ|ꮡuv|T-}4t h}ZXnEӘ+ӀԸZ-^G{V2j$e(n(j\˧{{82 Xcps{ɾolg5t#/֋/1}KyD$x?%KPqИNJ8<[v ?c1n׾ Y'Hg+*0ChN"$=4 nmrڐtPyڗo9)}X厴Qb/uc^YfP1"p%*z /6f#w.{U|@6lѢ1H"9C_oJ@nZ+?"zr,]{'+K gӒy ~er&Lʆ|ʫe?;(?4OF@u%<39Mad)zk 8E/hBVzEjYoLwy\O-;,U2,Ujq0t+v B湪6 D[^;\2R cmAOg%0M8~}pl],OIXҼ tQ8Fgsml#ݭ>OEJE0nc4[D}"uHwFo7rq\t~`HDH{bX3EUucUQb|HmZf0[[<ovOyrfe.$ӯK [kky]*+x_vȰClƬhmCi ĴkR`K3nSϭ;PUJ!ϰX9tueū* }.!_;=C@̪/5 P9Stby(KނklxR%%֞$ƜT,>W-R0M۵?s74ǬT2?..|x3gln>+&0 c/ˤP18b/aSxN)x=O;b !pʻޗo<9vb4n@ [KZ c2.Ba=YwYzie)2Hם`y ݗn[2jt"DZ YNۨ/DO杘a..+>{f/*0^+7C;w1X0IpINoT: 0; ldJ3qu!ϋU=Lfm1%l~ElˌcЖl^0Kq/,QEV7QExş,#5AS^3<wfS%J|1\]\-l3Fhp(GjSߐ:c'O{|f~/cNKF1蘝נ~mQ ɦ҃Y@""<`"a Z ஐzʚkQqXGע_vfym)m~#el"p؞Rn#}謰g=cC 0 BH6ZvgBO@lC{?Dl۫7ڨlss'w"Ӈ$-u%qG,xq;6:@i<Ơwc[N>.N@l?3ХtbMKc2nam"*/§D;hIy=RтvZ% dYfZlsٮYâ(_0յ2K ;&Ib~x ICN"\ Viak*H2ɰ:L|-M77Xqx3O ;[wӍ$^dn9eA%[2\G[Ff>X[g%/`AMfr.G=V/WoSO L9*S H9:̖ύohl@tkdƓc"#w 5ǮIQ]^g㘅R+8N;ps-SN܋u#_NrQF0>#sDZjIJ:f;6iiǽwX(QJK(Zz(<ҤAV(@{[zHS51ۉ,!sVYg>2>jI1ZfwDbim:lOT@[ /$-p"*MyVɅ?[;xB }b>oxݐàzI|XcˊՈ}cf=)j]MmU: o?Ov͟g24[ O| +7jմ Vk#_~a%p=>ҟ* vM4XWTe*˛M|EYiCt٠' #J¦F~Yj!15L,lZz_foQǘ!e9;Hn%Rge;CZm*9 ꘾OW.lSuqpDARE%ӼnUBz$ \1ٿ6C&k0Qiɼy2@#khb@)cX$+@2:}a1Yʍ8'{_90@ݔJUC aB-_RʆO:oa iIO_:;ѫz\~AVvJl_ c5{::4R,8aUvkNr-WSS.s1]ؼ%\IR"72[]0vx_T)X*Y1>V9^15 ȒM@񈰟-//(0+DLH=E%R WD\ð0^ I~2"+~'v\]ڒtpL6456-r]PFhO?qGϪXz͇,$<]( M@GECqDOM=]`WU8vt g~ 0M23oJ r9(i[Vsn<Ҕ0./WDeGǝS¿WX i-~K8GW;F՚/xK?{. U ozV zry6^N}z¼AT%2Ag endstream endobj 89 0 obj << /Length1 2152 /Length2 8353 /Length3 0 /Length 9625 /Filter /FlateDecode >> stream xڍP.L4R+HIҍt#%,. 
]ҡHwI7H Rҝ w }7s0sv`|!i0!N\ !67qaN68Ϡ0\섔ɀDlp q @nH"A vT88QatB# `e: `8@d EFmZ ,"NNvB@+'֑`!p9Y4PgU-8qږ0ǿZs'W P#nu jvP_d쀿?࿌ w-0(@MN͉$mH{ f6E~Ij srt Ͳp3i-3?;ǵ#\ sgfv@8 7)-:@ uXvRr#kC̑e@aPOG  S_0AP wjF `BOF 3Cm=1PQU^IRUJI!\n^7/__/`Y~[*Ev韄]׃__B,9A_&3u7#9g_z=f79NPA 7T]_59V IͿ9ܠf0'Ksl`p:8@! bȑ"we%8 $q)(Sz@o F<o!Fڿ ҧo!}jFo> "#d<7ҍ;![HkL/EE.> _ȃ acvB'wFoL? 2A? 2 iZY"oR"s"aDvwl~d6?Yoȅ$ y? 2s߹!v/ :Efil6!+wF8ALRG7NY 0țttB mȆ1ȚHw ,%D?LO VUԮkܼ]G'X."<9$_].os[Y/z}%t+ն3R[ Ϯ'kߞ05Zc2p F¤x-E2LRrtUn<ܷ˷VfQaoNRĒ6;^ FW;d)cXOXLcx̵vV忼+:"~#0vތVYӊwV:K>l萏EY8D^!3Ĥ +7 8.wx]|t5ue'kc?:=bh]μ^`q[0OEgge W< ! w|d6o/@8IU4D$\#kǒTbƣ׈Q$aiNLKFJ=OЫ{_mbã4|E?={xƉ7CU٨uBP*m"E: ,}kK{+iRF&䛜)I{Rl}כ+"U6u"dejo]vV2uI~J|*59Ӣ2_n$zέ,7?ۛD_E2uOTMjf$(GoHT1&(W9.ρ(3$;e2ΧH>T:y-zo\)~Vx%KIʄuϹIA"_.O_Q,jPHe+1Lr2 fAzJ_1k;RĠ/gFS4?I=3,e-,Tdb"jsk&n+^.X1`&zR K*E?V3!U*N#G1B-2:7 i)+P^eJv168,Kɿ]Th,^Aԫ*xILtV^"’89㌞} `.f} ֒IPY+z0wj?DOl,1˔#n0SrG7a%Td9kޚ{DltrA9k1fz-ɱEMG9̼㉑gk9Y/o=6:Xk1#-SsҜy0IId:M4q`S%syijUXQ 3lkUMUtUكfZVk;H"5?)9L<.ՎX:/GZKn|=?\[8z+[v+ l$WE2$^*Ӷ4~aK C cl;^@,ɼԪӔ\>cziS]qX3Fd1ܞ{/0&uJ0y&*~n'-~<F;DBqN"|ӊm6;&>.Np)Rq<}'*V8a51`o%/&Ó܋}0R\>qVܬQ X@?q4>b3?^M׉b˴'nJRvAW%k~|fl>/ JUF$tO6Qt)\*(XgUyZLGPWu7u9UN:z:x vRj0F\KU.ML*ɎI 9ZMļ98:8Қ T ( 3T!glYʗOJ;s-AQtpCݓEJگ,&F.+Q ۆIu$"J VI=LrD6+r2:ehwL \{n1NZj(x"E[3ԫng!W"t!4wI0z}Wm4Nk yRg(76~ 췓'm2#LBa:8]7}S)6i2Yt>SjAmA2ϋ9Z6ۼќL\-oS)>ЀyL=]gyByI/hy'K!=}+t lBu+JR潘۸{;Ѩ^vZ.DZP_ۺl`B\J- $|KZ0I!1v&ō8yiS- L;S!nT}3E?jsE臈-FcYgg Q99MN[Ί"|(ڌ93^!1Ӎ&zr+F Vy/$kKZ9 |~,|#hza͡ f$# Ndݟ嘲Cy3ͿoV pB)i'y]Ok1&#J= k0lE{Z9{Uo&LS 9&0Bked} }"R|Ht`kiVtM6KrGq*N{@`ay?is[@cJx_݀\lΣs5÷I0vz&58 3o&1&vrvEM6BOjE9Pzn$a#m07rMv$;{yS c<)#Ss>kXoĆ&G7#itt6WTJ% xw2]f&.1 p< W{ ę7:`9E{OyMx 1 q 8ChdOiub`G=9D r_CHow>Ꮋ`Pi<8j/qMgls(gh ;߼`Kvy/.^?71O83~eJjȊ5搏qnJ:nn01@c76O=NVO ͙-\ksC4#mO%!L"qJLӂ7p沬^0_L9:eNt3zt3~އ2D_t]1P'ޑm2PM)&SP&qDŽG֘</# SL%K >S^.vX]h=q[o ۔ROR dQ n5`'Xn15|Nq g~6fn;2B:C{Ŋ&#h.8T'OCsĉĔ%y\iju /~ yN 6Aijq8ę:L'S cp[R_%Q($ʻ6BuC>̫[dg aZ w߲j8à-뭂}irΖՈț1I]fq%xyB'rH[&HdvS/[8üCY[c+]E1h>`U:ͷ\ 
Gh<\O[B8Bq}5AtȢk6_&-(+7x^zY~e[PB&܄sۘ]v?`\-?^No(O2(W7"GI,)WImO@;%˓nbŤ_< R,lfV_>Lգ8u͡;лY_r7GB:S2N1jtUht Us5kS 'i,a$?SK F8j8#xWef1%cJrFƜ/l@W[5_n&ekk94IHK pp^fpKCZďVQݤ9 Fqz?cKK꺱V_,j(:޳v}g3^я0r.rOr]҄Ȱ@mDQb*$iMBy`#Zf 52Í{9Wǎ*g0M+eg~:z'EILH HIkVbJvֿ58LK@il >/H-hsмrmY+~IL^ ^ΧrYM2N/*j&Nd~2L<{^̐u1M/.~W^piVQgԊ6&z)jܔs𖐺>֎X~S8\%A%dlBw4?O*w:oG0Eq* B]UbE#G puX?5Qo{E;8Ȉ -1È3?(D2`Id;PP uoGSzb>*U8ZǶo)uxzY..jlc L }3m(,l|lib> q*5 2~ ?xW6,kq{"~|shOO"!M.wA#FgjBZM\p/">pz$du:(Y9Qu):R[NTԪD(hGfȒsui@ ׬22N2E.%/a RB=ƯW;郪:#זݱmjݴBMGN&nP'ȝ*k?⑄ +0dٯIvs~Wq#*֓d>G3YQ<& 6Nu"9{a~0ty"U NTh"&SufJUfW7O͹^sz:m#·hWίQ\?RuCj=b拡S>m!rb<{Q$x endstream endobj 99 0 obj << /Producer (pdfTeX-1.40.20) /Author()/Title()/Subject()/Creator(LaTeX with hyperref)/Keywords() /CreationDate (D:20211102165114-04'00') /ModDate (D:20211102165114-04'00') /Trapped /False /PTEX.Fullbanner (This is pdfTeX, Version 3.14159265-2.6-1.40.20 (TeX Live 2019/Debian) kpathsea version 6.3.1) >> endobj 2 0 obj << /Type /ObjStm /N 85 /First 637 /Length 2856 /Filter /FlateDecode >> stream xZYO~Ǡ+\&E#0$! 
LVng_Sն6ftF˵:ujL 昑,2/,L*O`JXymO)TcZ:"VEh}Ìúe&,0Ӳ 6[[# y<~̳hAñ#$= ihHjS*ZĔ$.ҊY,i(H-s ^xf"@ D܂كXz#XڤD #k2*Dt`!e䍨AИ^) FG8*0Ht p3~٘}hZW/w/VN ;U|zw;~KvXϪ:c]oQYUt3HQ9)RuU񨚌`x]fq)c肕?r4:ppʀEe-x^bF$Uu=|] g%.@`Ldi7X}{paw4?nt1=6V@kT_#3n4^%v@݁Q} PɾB~  -8QtuT0ޑը~<H?` - !!dB;P 0}v@+$NG[pM?8spt--;߯tF卸 @oz#t!e_PZ76ol:s8KC6@:5*>ȨŨi4j|NQk(Տ=,J @PVSTC΂44Ռ%0560=x]S+>.HiNSρ/} =V4/ 3C0D5'iRO̹66qbN}c3khip꣒&Hͺ*Xށmmom)R8DLRȔthB.: $my& _ٖ3ԒJۆ&: Xes$3~j)Lo5&=zXKG$I !XtD1۲AnqzΥ۩MmDn[3T,ҿ3m k3<;6K7͙grLxj]zRL.)wߛSąy908e^ە*d ZׯA( y 4 K~_c~k^O_U%W?o/|ȯxTr2_ xu5)K^}ƿNV`Qml50Rs*ņ6J3 #j4b3Go8J(S#a:]e⽜`;(s7IbGq̣ {+OSbZ!r_Rȟ"uV i4pV UMHʿUyi F{qkAkOsLYhz-gWxɤ7U;u}=O?> zmJJ@t!醦-UiPU)lqqK 6).&y#r8PKY- 狈Ť9T/'eZ a9ͮ?H-|o"hs:e7$z lsw9 +*덮گz8/םXj^oG{B&8d:稠TTڇ]riu63h)To[p[ Vag4Ih2"M=LY{ .)]ߟw0]7 `w6 VvsCZa.[VWKަo?c~%iK컩|%`wMrmQ]sʜ\t2}p1]\sn REټUH$XğHL-8cG_Ϥ^}{3: S9rk1RF5vZQ6y6-: kT5UrN^kH̪/Ș[Ȩ&Rl&TjZTvi4.ט}auFkwDiUL~b ŇэڡocCU-fZ}fUcW4vD114N4Ɛ4]y@+4> m`*e{ |mO2QwQ+6kBFjĂdNR1:,mm~Dѭǧe/'"-8i <6F2E86BE880DD417359014A57BB0D6B2>] /Length 242 /Filter /FlateDecode >> stream x.QTUPZښg5\EJ$z:pQqɛWV̚Y`RŨA h\B s.3C7@"P(B$eu!%(  CF`N`d5p G0Sȷ_}8Eşt`V` 됓%bbCXڔ=ze^[/ymK׎Ty*H. Tjp #iTK endstream endobj startxref 109596 %%EOF biocViews/inst/doc/HOWTO-BCV.pdf0000644000175000017500000027410314140322300016057 0ustar nileshnilesh%PDF-1.5 % 30 0 obj << /Length 1763 /Filter /FlateDecode >> stream xڕXKF+\d*jf",%UAeX] >F.QNlM&E7I]y6&S&_Ng. r?@RʤbHlQKsݷBDɋBDE {IjŒDPZd(SÐ~X93- safC׷]f2;bDykAnj%eDg  9Kq? 
G;2սLit:h:OH #xI&; K簾s_,~%̡1(>$G'5_,&5~Ru)a16h(WBEt+)g$NEvC1[ufjZh' >z#rhIp1E8QN/.HL7Sy}6x9ѷGGUF8B.;̵rxOOTݰ6{ŝx҄v eYieYJ/- GQOni}bXDڳ=$LEj􏔣J*=yE#C(uiAjkl< r;U_nxbS}0z#N@=L$m)ߵnBN LyD*ỈU}3U ܞl3$mU"&ƭ4,6v5˪8bhܰVZhq[.Sr<Ѣ-Pn#m;ὭػZӜRRT{d$40ShaèTN[(+qx.~4.u}ON 1yJMa v9}Z+V,iPH6lY3 ͩZU,?m_OkߺfpBйXkZ endstream endobj 43 0 obj << /Length 2256 /Filter /FlateDecode >> stream xYn6}0=Hˢ( $lf`2[ޯߺZ$yɃպUUW.^$sEY'EQI^?OUE],Ly/߃ pipi[YWN#@] kiyw" o.JDa \-k8]:zү/*hy]lOp }>Ax ޴n MGqjf!8 _ K&UxbTR!R'DOXʈ4:aAƂLWwuyV[jy;ǵHPw'j\ݲH?39.h;)NT^g(1*ώϊ|x2g\y/+ FUͦ $ŽN\6S͌$BR6Ao6gg p(M&$dI"gbYq?д'a컽6%eh002"T{"%:q~kA5{/-tK"ᓽъvqN6Z>7 g78Qw' W+xqO[VŽ<7Su+_P5%fY*Dy@c тRJ.dr8"b*×S\xz#a4X2: 4q-8݊>5(&LU6-( vmi;cC樂tH^ydО_ xFcτ.?zܡ9E HoiIa;gФDyne||4AsͬFN?>ϗk1"P$"IEM E`W'L#/#:.CrRHe(!<;İHc&ИIl"=F@2jkǍ=΃Lsel:*ϵWE endstream endobj 53 0 obj << /Length 2142 /Filter /FlateDecode >> stream xڭXYsD~_r SftoU] 6 rbp,$rd(lf>Gg/NeՉNUbuRTʜeޗ'EFMud3qf:K$zúktItL-^3|iߴ[x,w㪉9×N%֪2->tV\¥cfъl(e=Se̙7V J{R$%M*D> )+b( fH0 ItBRoxvH튟s\ + Ȩ_1yhR0IHDn;ejGMk%V;UgҸԁ?3%(NdJê"BV|.]!XU#5{)|GqpEǥcqu*<>>'ل= RM#75rU/ΰKĩ59*y{nYRZ'|h[mfG[e [O00seD n}F$dQA@n3 a\ ׶Sr]t$vqLJSU$F"g_#^0H-1]*-3T~'DRyR(@UeJUz ו_n<Ԏ{+{K[&8R+rIjŴ $ɆWj~l|w`%gs.JA.Ee64g} a;ޑd' Mi[dB: ޱ(࢞K6L ?g< z ޶DZ^ak1hRu>GR &89\@!mtfv-8#pzāԁᑮv=NjIvcX^oT]a]m{[:)Hc%WB?8 -zjx[0z:XF)hL/ kiv#}M:l4giP8vvDAAR.y# lך)'6vG֖y0V3 Jp?J5̈3# 75'ح^.v>Y62wBWNª7VXt?2@wY E14K3מ--P4{H`C6wAul1 t} w}z0g0[WW]ij#W4Qi9P:IShm_۷5ȡQqVfִ&N_;#7ͶFo=';8Rqe'̓:܋ k0<ٺ\Z?yS:6<ݿ;8 o~#0  S]…lYeJy~ VAx~pޱ+uHl5_MWkHLݟc*=ʝ(ƯՊʭ{{e)LVNk hΥo5D'Ž\wg9'vzΒLH+I-b`C u뻥 E7R:BE.hÛX?:6 8TއY嵏 A}]42p;%)4r;_|o[5 endstream endobj 60 0 obj << /Length 1664 /Filter /FlateDecode >> stream xڕXKsF WhtfBvFQ'X@ѴFRX>D*4I>ȶHg~00CK?LMIvnv7RN a{ ,KE^?o3XU=lx,~]>F홞+]K@;dV(gU-# |0k[y{F%V⛶'hQ(o:'Az*  w9&cز'pAOY19Y[; x 9Ys"h[ rZX dd2B|aJ OQw,m7( )1& iSJS46 S* LN_6nʒ/wK wq 6ĤJvLt3_O1' .e?w7's=j㸦oJ q;0hZl;DwQ9UəB\8tՎLBL-=g&uʵ kI~C=<ʏzBR:"Tg\oiSoIo]oZkN I0L+Qu3/JEI>0ErI? 
T@\J;b;L[t>B&CQ6X |am:+I՞lxPd`8(BIA80 ]&V;0&OT8 GW"3p5jk~Ś(#Ƌ+u[[Qhv[=EͰдDحmp\oK<%O=Qw &=Ynsuz4:l֘rq;ݞZYO$%Vz,B:NZe- &2Xxv-{l5crMu1Mq%i[@c0QAQ\Uf䶃*6XsnFS͠9O@ʥhD/4qH[frv)+FŲKF2vRMf}-s0vb9Zhq_KsnӺQȅn@Fe>@qϾP]}ۧʠvQa. C0gd>ii􌅥%J ew9Ɂu|ǂ;%9!jM'YF!)'AF- ؕP>W]=#ONy520T=Xf_ endstream endobj 64 0 obj << /Length 533 /Filter /FlateDecode >> stream xڕTQ0~`Df\Giӹr)ԂvMw{ؐovaotIÕԑUID(/[x~)ߗL?pف]G% 0?dgl^߄“Z+^bԡM.#U`]u`Wx:Qmim\SMgwֽׄv59 &hvLϱr-v% F`i]|EN!]}X_#*ɈGQTZ1X @ƖXM(z,$ܻ'i k(%箵'5> stream xڍPk.\Cq( ,S܊Kq hP|3LfZr-׺P(1M`{Ff^( 3Bb [@ trye 4qy)6 '/ /33 Nq79@ :#P<@V.oa@mF` btL\voLlj`3(\\xMN4w@ tr(hb2F / x؂̀o@'[p<@h_{`adt6139{- [ @IRÅ`bo3dkbfg&I[lrpqftQ"4o]7]O4{k'_'kcvX-(ՁI M`ffd@3+?=*UvXY>M܀'WXX 3)dh~;|'@mXd6^`{[<_&iUe9*NTfd0r0Xs=7 4+co['c7"mj\yt?XoCtSM6zm6.o ~[mkiE['b"i"Y4WY5+51[=P R003/bټ]o 7R lǂrpLL<ގ qY6.o.|`'?Γ$/`$XLR ַiXAo,JAo,* v?b088tf`۷N2 E %o/=_-K4AomR!ӿz<& ¿o8~ۂU [.ʔ-7 { b'1dZsy k\?zf|5wU" cZi4 ދNm(4߳םnDV6%`ÚUZ}U'[& Ej{ ԅw|}4l e)]Q1{~հL$S*I@F7 +xM֐z&;2S@r-W-q ;`qnfNTp'* HpVr %AX1L-/~ݤ61w#rFnt;bRM˸!sO]=߾Yqj J?JmV֝MF\tG,٧_EZ"?e/7LLzekqJM=J{{4$T 3dw=(mArpIJhL?e~[ FrLa&lC4M.; ^+̘s^Q㢆.uqI⸲E~wtV:U;Z3U2"")UwL텘K[K'eE=5Ue?Wr_5i gI0n`{*)}wB)]rGz5!baCX;Y8!xdwsSndᅣOQ&fb:'N@)>M[=*Al ygG6iessIJSJˢVkr 5"6@&к=oNjZ5g ~hP.mavLdvcY2i"ޝg)|km3fl~mIڇF{qw:NK]|f[}CMBj`$O r'^GO8)srv~,xȸ`]|k=M}# z6w.Ge&Z&IfxEιҠ,YL(wh@GpqG~MneԻDN)2&W-dQ!+Wlǥ=9wUʕ9B5N@3/7ݦPW3Y!^ ?:o"X=49pAoke;խGNi3Jk92nm_ٺpU[s':1Sk%;(sQbW KG Єu|]5m'uF,L/T@uz(e-K6%!.5򿞁Cp`ڵ2] MS!YI.BEG ZFښTN4,ciJ/m }=NNbjnA I SH9*b;v6a3UiƞNN98 _;TB^;Fbʶ ~]-{hpMzvR:a/<8   IDy26^7寉q|[8j耱MM 88Y`H+s|N=A!<_?2z0$~1z I\I9DLttք_ ^ mcc] 7[-npX/Ї!8FAĵ۔gK/94ܵ 眴 gŰӟ*i0VlU"K0*|GWFߜ_ tpڽ.աhVseypL[Z"+T^ R[ͷMnOkT3Pʋ X`C%{1Oۗ5yF;3$>is(G?LTL}b;06Ă_mm ~:&ŧEl]oNNÏGR_ DX! 
*g=ܥa7_QX %c:Hv'ذHLu?'%slerȬ 5 o= oYĬn95$%<j"n+Jzd=y\v(u3]~ .AK@kO #Ӆdwg$RwH2KR S Eu+Fy J-{}٢Ibtt ȼ>iE\H~ pk&HxO\?!\H ChBw&ƙ g&{k|'YBu?7n1lC;աVVXN(Ā-7 1ݶD9Q2e !__(t_ ۙ=QÐ ܴPtEDo,V?JQἏ[U,Gd Ē[Pjp$Қg;h\gl#U4=@T]DFCEнU/)sPFow3)M<%7G-@3Sj B"5NOX{m咹- >jdL)6ȑ`2%sjH@a:xmn?I<}E~.y9@{I,}NiA|<Ʉ=A9%D[߯a;υR1 {r#}8(,hMfZ<;j Lg$|CHnJ*ĉuRBYՠ]~^3iD w1ɕ Rdn{1PD,NxW?ʷ7Onշf#{YܜVk 6`E.uTpk$triytV 5*ܿeYL @44x?06 c+xXq.+uJ%YCriE9:ԙd1LU09#ly:$wlA8Faɯ}ƭmMyAK| ʼn Q=\u9ƘoOr.'%n)S/itGHGP:91r20\k%GJսvGdR}B: ` } D}سjNMx,3D0>.V osKݨ+؛kqx{Gtxa2Gv[ 84;.Y0R)n%. 5ѝ(j:07*5RnB\#pA8wD8>%-/Eh%zY \`eץ KRժʢ'T&r~í,g$}X9+qn[[mo,"8q7ɺiü k=B=Z6;tH%xQ[pZ@ ܅H{ZY)""]4-ْI}/z\_E !(7N%hAx&zh|Ohyy" 9T[xe/<׀`hw%Kqyx:3M{^^!J m$uL~PxC^&QSNחP\o/."5Vl~fEor0GI$i¼p`R1A Hqz;bɱTD:>@A9("Ing_a4e%N|ˀ?LJx׺͏| #E,/U3oٓic/)"7?X ¥|;I)@K|@K"qeu%/co`0Gon^^'uiW+dZL$Q3%UoNh dictA{˚Fq.^Qqflep{9p!pXAF=PFz~uaĆJ!%TiP|5?KϧpFZ!#n:*SopM$?`3I/?GN~(f4фapDݷ-[`8T}ۀ@އ|ҋ dMeS_S%wYSX$8fSQD|a>e}Ch|] ŝyЎ6A#SH1^Q X\ _Fi~׆ (2H-!鏲DO߄$G.MvoRB\02zcU ʌāz٠1NZì0s -LS#q`ЮD#rv!eYAƞnJ ]pqNu{zOoe1w0w*+,} ;DȦ.T]/Ȭ~8J\ t,Ácpt>^ڶңn-곈>y7l7% kzq /`zwCqm@`\ 1n_q%I-jI75~3Fa5E` ,fcUN[2G-3acvx>SV a*lƣ_4H`46N6g>3[{ ۙlSw6:!2<Ƥɧq4^"|.gR %Ɓҷxp vJ1#jeUoaJ?WAou1+^SLFa!bjd>c+cػW{2Q188~RSN6>gU~JڪFBk3N(38T$p:u'W&|3=K̝!0Ьuo5X.ͫERU(ٛ)nUFXK#Sg)gݳ? ?3?H/_MuFhJwpR\]bEӰjݻ+a0%фU#OG@у#9L+$(Rij떯Vo #Bn5ǘ,3؝3Ӗ$R߻ 곈E;q Væ5>`k@"|ߌM5rw/XOtZmxg/h%=øX9G,dFQm(Nc҇8,} %vjg q*Ѵyf ?-]aj3Gvme _[H*`Jpy+I{-5?唰OrkhjÂ|:E%7GhzXRsP#,/|.~hg@(a$Ik{ŷ!AF< +5>uM`v~b5~&4DXXP҇BFeS\5Z27̘Wc`|kw:D3;_]y;-\Jph@-2֯1[vjg]nrbf,"Ёo|?oySU7`EHuϘp >Igb!O*I߷uϝ:vX${9 Z ʭ(P^P#Ml3~*2wSkTɬefjו?$+>Ob<;^7ː4UyEQ?pS$Ss*Kү{vkuu/~葌{֟1;90 Ѣ$TsI)s4Sٚg٬ء Bλ[7w,6< c2hf&風BA4-i Tųv++lF~;cat2IKZU9H~h+avLCDRs ?JYk5ԃEIQAp5d>J <QDUi !$=4r4[]`|~{JcK讳PА7N\s31 NnGx=GImKaJk[.jk*. 
wI "RP{v1?uY4S.z2G>-i9w6$B\LѰ)\;,.\_&GfVO3>gcS`Z/G :_zRtV{~4.ѥh4NJ6ߺsgL@KƱ>OJf8?@dV"mR&(jnX3Ij;F{k,;nī/ѥjTk[PVU+,KWm*‚w(#5I80 UnlZqND׳SȲA@9w!YQH0 ˵I Z/|/?|6|cɳͣ,X@L\k%8͞=Pk|Jdj)I^}>M}!M- C/Sg :f߇g 0PgV&txqS•Fxb:w3ށiɐ\Ғ}R=.t4Mg+KLM<fo0]ϔG R~sdג LjC@W[2tәGG$:ad10cf6@By}K$H3ˤ\6Ɇ7BJĕQQJ%JlR[a,i=>M0 ^Lx kE|&]Zr..{ gYP|SRsCF3[(;R>&bZn:}1zy+|7t-7zC$h)fȶb6\B>FxWORZAo W&<%^{rQ %ؚAœR1R8.2PFu .xd$9S ߣPXxNS #IY- 2 T'&.#WuUEoS;w_dAHbxqd<@ No+3(48u4<924[ 8:w_-BBx<<)N?B[ImTnEu}w{Q 6U@>ˍa]ӭE"v;Bff& n1ug ]0{ h#f&ZIAC*n쮝3 .Yb6`~{'Aնl7ׅGqFdj-w[kL-MlDAjj3] TάImvwiwYx2l̞! 1{0Uԭ)G˅;!;XQ3Հnñeb$s$UPrhulԚ1`HV.{&` ݃ۓd)5cЪ . rڡ$7HSLUՔR3gݙCtȝEg=H{aŭ3M Y FBS㶆3KJտVJ<= ۵rV`@6:K- NS*F-q3){JOb>"j7@+3:7ERlT4Kw_y>] BsS_r_J̒o=G-]tf"=?xa?Ē@, *f󏪈m=nM]/Y[ӵN\׹YerC痱Ⱦz0nOlw_KH"~X|ã.l%HbbiCz d f {FvͰ_IȯDŽbF.ΣKdnSeF+B9Ap/c~oxɯ9IҫY8*&S,QY#L0Ҹ):U>Y~ϵu1mF}rAYju'og{e}Q}z'%>#G˦Ősjn;Gd7eEt%οpc->FkRN5^UBB qt8Pv\VIllK_\Nl?f,5@{+OakSMx?]Sy~[fYrJLۭZ ; ݺb pm.؟Rw.z0b)_y2ݲ[$ VHħ. W^h { Х7 >ZQ: .(x|ԨMOo0-V4Jͯ+)a,#Z$|$vq8+1n0mm#ۇp$?s~ gj YMuH2!v㇛KU%jMs"w!UǼx5Cd<&4 e U@=+TU&:Hs$B<\z](SwZ  e2n!F}w w-Rjf# {aD8xCf7Pf"rd . =J<;V>*kfaj梈"||!%ֳ)3uk + ԫi,BH Q'GNM2;Kﮟ1FaJu>ZoH늺QhKkC30, x(0gpM)q'0PK$ |c veP_Պji4VOũ5yzN CMנoGƾQRJ.;6zMdbS _.TJL`"^ݳO"?$wW ,'ڝrpO$߬/ $P۩>o3~IΩKu%cbP $~"=ݍLLJ ,.K޵@3hʕ=<J8UbăI=fP=<gAţr33ADe9(Gi#l[Ey-,D' W05;^gDMh 9=GG}LX*A5dviZO*> stream xڌt[ F|b6ۍ'm'ƶil66v}c;2Fk'#RP254:13rDd,pdd*N&Ñ88Ze `b!5pL,&vn&nFF3##:pD \,i[#G@iD` dc`ad8|d42(Y8NNv 6fTW 's1ெr6&tFGP1pGlkj`X[?<&e) c h W FF6v@w  /.CD 0eh`hob`am`awq!EGiɑ 1e1 D-L>Zm]S _M;1-MDc!#33q122rpL&nF Wq3[oO;[;G&&< \LN&ޞVocb[9 M,pMLwph3~^ƶ@k?/vcem?nsgeM-mVͬ;F Gq 7c '#_h`h׫cbd?edr8~l*Slalw8ƏEbfcx2}\Ecw@up| 0uD B!v0!N08    X #.>%Ώ #Ȯ>+*c?QE3C:?QCgdkq#aeKbc`0~tc/h~(M)-c ]xuvWG9 _2ǔ,~k/Ϳjh_OۏOQx"y8]u#gߏ;?nyֈ'Ȳ.Fוnپz*C3t2UufýPhߧ]1;WϓжD^/_fᖦ1 OaT^[Ar9??HN,+T)V/'3Z"rÇFtCC͙z'>f)byZXPav&C!>&{ѦDFMǾң97%68TdfdM Qk5G#|Ż5G=<~sN4%zu9zr=!-4)`l%sCgh}l I_ o,x]htmDjJc6x(v`[41都-9G7> '$93=Q~n NXPL%`Jy+ Qž$oO! 
6CLXp5֒Dkx]W,pY{ӅPq݌^s>TF:E2M`T4[5woԅ>Oӱ3Q.EojSG|[ޡnAvUǑJ`=ƮtvTPUPX t|~WHh,;tՇÌm?P|(<$.F=Q@MfF% WΣԫrw*@d]-\1lYh-Xe#u*їJHf3!ގUFv4O"\O r^OqֲWF|?AjdfwyDYtvHmX8B-/2*$˶Y|p1A_ & c|q<RSb9,p1CFRSp^ _JS9XAR&u& g)*fwiܒ]&5C"'P$YPDSbr[Yvܢ_L rc̑3NXAMx4s lo ֶC6{EF8$qm*SeZqCW$q K2$4(HbuUA-6^ \a@o(0MD^ 9ok _sʪf9FSPC]LYU ޸ezv!Quι}'Ud1 gPSB6jr$v _ -Kr-0 x83>P. $91BZiP#+kƈ!}ûŋz^byO)^#${4N @%yÛ Ժ\J(!{ ̷ohҟjoP|>fYӷVAh*GfgSZ<&bq4V0V6" 7e`4]vEOď6wɯJފmحjdb%^n95>:ՎZ+ׁ 7 7sղ;9x+=YM8'V6O#0Je೦锡σ乤xQ!?LՀqo-kUABQB9s~v "Dr3iZ;{>laʊ`X%~Jveeֿ\|JZh8-_%g.ΝBƩeX"% If@-(ŘS 4)7,*ʿ{ SjPR?N!iͳpH7Vos6 I0J"Zfhv")z 9_X+$n c*0jtDҍZ]߉t=~ o+}BG`; Mi`,\ioPR@pXlZ,1 Pa[[Ы`XT^:e5wm~X~쳂aPT0> "A(|ڌ|t{J%7C63blq*.\_@'HD2҇N?LP.t(VLv2o[äd8NcNB476GɏЩ~EA5uf^5M"цZ$*֐+gl {p_ԘC 17`PYbK8+xK!zD8fT:UrGKrKX3T7$#\S>J"WxsՂoɈ> @,CKFؠ @;S?('7eTD^ 4X#{0vh*"o M^]=1-{/{ߣ_xDB[CN(+!Y<(A4 azJՄgo0ܻLnOI#{Pj=Wbb`9MFDVS88I!+e]6&wPŮA4`F4 NtE R/Ǥ>| >MeF]o/ FE#uփ뼭&5U ary+ KPPqP#n{1}1pICh#Txu&$_!MFS#h;k00 ~P6ߙ%J/AM''oEѱ"n"8Q H?b]o&qe1#6J},'D; Gʙj UP_ːJRԍr9qvKnKZ>Q{ (ĴA\|ñLLwS\C?s9? JpL,ɨh{"#ۄHyNj9&Q|'z1}'a@c>,K!Y/׶)b,#B:2HXFRfB;q."('u`Z=ȼQSg1B-bf#s,ȯ6t$g3KՎFN&R-Hmվr gIcJN)9z &&Q")Roq|@eWIh-ɭݞiaL!zO9H)FS^iZUIfCpq l[ z,O#"#3 !&=ng\rĦ$~cV?߅ ro!R5ݥ2iюFfwK>6V.E8dž? "ri N Zp537%(ʔs 6c R) h 0Bz_g4H8 K"S_`Giϛk0z;>OfQzs1u29. @=8Cb8H b\KRoȆBNtC@uNO\s؉TȔI 2ݣe(<ؼTº(P}~W"C3%wmŲ6C] n$n :\YzF~ O4yq_J<\ .tEuhgڂ Ǟeq[;B~LAشhw@,/]P :Ĭt-7ɥ?v8KKWܧ@r3]@=ͪl4O.Lp8 '*~%7-@_~~782oc1\/YYf:(ܭjRg~3kMsv5m;ԼNkbsKb_ݭJ5Z3!Z~|+!,I%~8ϟJnBG)b@ed%<\md>@\˩.& []g4-uCD .:NUS@4|4Q#Px֒UQ(X3!qgKоG1%' wHhUEz吟m63? tH kw ȬismM–; |:x:;%5YvXcRr*3J2z}kDD8}߰X Ӿ91jG|Cې{[rsw~Dq3J{W+媐a-Baq}w!6?_9fS3"^g`F'P}3(_nnݠ14:v>'&`}$}Us? 
biocViews/inst/doc/createReposHtml.Rnw0000644000175000017500000001565414136047116017666 0ustar nileshnilesh%\VignetteIndexEntry{biocViews-CreateRepositoryHTML} % % NOTE -- ONLY EDIT THE .Rnw FILE!!! The .tex file is % likely to be overwritten.
% \documentclass[12pt]{article} \usepackage{amsmath} \usepackage[authoryear,round]{natbib} \usepackage{hyperref} \textwidth=6.2in \textheight=8.5in \oddsidemargin=.1in \evensidemargin=.1in \headheight=-.3in \newcommand{\scscst}{\scriptscriptstyle} \newcommand{\scst}{\scriptstyle} \newcommand{\Rfunction}[1]{{\texttt{#1}}} \newcommand{\Robject}[1]{{\texttt{#1}}} \newcommand{\Rpackage}[1]{{\textit{#1}}} \newcommand{\Rmethod}[1]{{\texttt{#1}}} \newcommand{\Rfunarg}[1]{{\texttt{#1}}} \newcommand{\Rclass}[1]{{\textit{#1}}} \bibliographystyle{plainnat} \begin{document} %\setkeys{Gin}{width=0.55\textwidth} \title{HOWTO generate repository HTML} \author{S. Falcon} \maketitle <<>>= library("biocViews") @ \section{Overview} This document assumes you have a collection of R packages on local disk that you would like to prepare for publishing to the web. The end result we are aiming for is: \begin{enumerate} \item Packages organized per the CRAN-style repository standard \item PACKAGES files created for install.packages access \item VIEWS file created for generating biocViews \item A vignette directory created containing the extracted vignette pdf files from each source package in the repository. \item An html directory created containing html descriptions of each package with links for downloading available artifacts. \item A simple alphabetical listing index.html file \end{enumerate} \section{CRAN-style Layout} Establish a top-level directory for the repository; we will refer to this directory as reposRoot. Place your packages as follows: \begin{description} \item[src/contrib] Contains all source packages (*.tar.gz). \item[bin/windows/contrib/x.y] Contains all win.binary packages (*.zip), where x.y is the major.minor version number of R. \item[bin/macosx/contrib/x.y] Contains the mac.binary (High Sierra) (*.tgz) packages.
\end{description} You will need the following parameters: <<params>>= reposRoot <- "path/to/reposRoot" ## The names are essential contribPaths <- c(source="src/contrib", win.binary="bin/windows/contrib/4.0", mac.binary="bin/macosx/contrib/4.0") @ \section{Extracting vignettes} The \Rfunction{extractVignettes} function extracts pdf files from inst/doc. The default is to extract to reposRoot/vignettes. <<extractVigs, eval=FALSE>>= extractVignettes(reposRoot, contribPaths["source"]) @ \section{Generating the control files} The \Rfunction{genReposControlFiles} function will generate the PACKAGES files for each contrib path and also create a VIEWS file with complete information for later use by biocViews. <<controlFiles, eval=FALSE>>= genReposControlFiles(reposRoot, contribPaths) @ \section{Generating the HTML} The \Rfunction{writeRepositoryHtml} function will generate HTML detail files for each package in reposRoot/html. The function will also create an index.html file at the top level. Two CSS files are included with \Rpackage{biocViews} that are automatically copied alongside the appropriate HTML files during the HTML generation process. These CSS files are: \begin{verbatim} reposRoot/repository-detail.css reposRoot/html/package-detail.css \end{verbatim} \section{Design and extension notes} The basic idea is that, using the VIEWS file and the known repository structure (location of packages and extracted vignettes), we represent the details for each package in the repository in a \Rclass{PackageDetail-class} instance. \Rclass{PackageDetail-class} objects know how to write themselves to HTML using the \Rmethod{htmlValue} method. We used the \Rpackage{XML} package's \Rfunction{xmlOutputDOM} function to build up the HTML documents. Each HTML-producing class extends \Rclass{Htmlized-class}, which contains a slot to hold the DOM tree and provides a place for methods that are not specific to any given HTML-outputting class.
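To make the pattern concrete, here is a minimal sketch of building such an HTML fragment with the legacy XML package's xmlOutputDOM() and xmlNode() interfaces named above. This sketch is not taken from the package source; the tag names and strings are illustrative only.

```r
## Minimal sketch of the DOM-building pattern (assumes the legacy XML
## package is installed). xmlOutputDOM() returns a closure-based builder
## with addTag()/addNode()/value() methods.
library(XML)

dom <- xmlOutputDOM(tag = "html")

## addTag() appends a child element with text content to the current node
dom$addTag("h3", "Description")

## addNode() appends an already-constructed XMLNode, here a <p> with a
## class attribute, mirroring the style used by the htmlValue methods
dom$addNode(xmlNode("p", "A short description of the package",
                    attrs = c(class = "description")))

## value() returns the accumulated tree as an XMLNode rooted at <html>
html <- dom$value()
```

The resulting `html` object can then be serialized or embedded into a larger page, which is how the per-package detail files are assembled.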
In terms of extending this to generate the biocViews, have a look at \Rfunction{setDependsOnMeImportsMeSuggestsMe}, which builds up an adjacency matrix representing package dependencies, importations, and suggestions. The matrix is square with rows and columns labeled with the names of the packages. The entries are 0/1 with $a_{ij}=1$ meaning that package $j$ depends on package $i$. \subsection{Details on HTML generation} I started by breaking the \Rmethod{htmlValue} method for \Rclass{PackageDetail-class} into one helper function for each logical section of the HTML we produce (author, description, details, downloads, and vignettes). That made the long method short enough to be readable. In order to be able to mix and match the different chunks and to more easily create new renderings, it seemed easiest to be able to render each chunk to HTML with a method. One possibility is a function \Rfunction{htmlChunk(object, ``descriptions'')} where the dispatch would be done using a switch statement or similar. A more flexible approach is to create a dummy class for each output ``chunk''. Each dummy class simply contains (subclasses) \Rclass{PackageDetail}. We can then take advantage of the behavior of the \Rmethod{as} method to convert. <<exampleOfHtmlDesign, eval=FALSE>>= ## Define classes like this for each logical document chunk setClass("pdAuthorMaintainerInfo", contains="PackageDetail") setClass("pdDescriptionInfo", contains="PackageDetail") setClass("pdVignetteInfo", contains="PackageDetail") ## Then define an htmlValue method setMethod("htmlValue", signature(object="pdDescriptionInfo"), function(object) { node <- xmlNode("p", cleanText(object@Description), attrs=c(class="description")) node }) ## Then you can make use of all this... ## Assume object contains a PackageDetail instance authorInfo <- as(object, "pdAuthorMaintainerInfo") dom$addNode(htmlValue(authorInfo)) @ One advantage of this setup is that we can now define a method to generate complete HTML documents that will work for all the dummy classes.
Hence mix and match. \subsection{A note on the htmlValue method for PackageDetail} We could parameterize as follows. Not sure this makes things easier to follow, but it does demonstrate how you could start building up documents in a more programmatic fashion. \begin{verbatim} details <- list(heading=list(tag="h3", text="Details"), content="pdDetailsInfo") downloads <- list(heading=list(tag="h3", text="Download Package"), content="pdDownloadInfo") vignettes <- list(heading=list(tag="h3", text="Vignettes (Documentation)"), content="pdVignetteInfo") doSection <- function(sec) { dom$addTag(sec$heading$tag, sec$heading$text) secObj <- as(object, sec$content) dom$addNode(htmlValue(secObj)) } lapply(list(details, downloads, vignettes), doSection) \end{verbatim} \end{document} biocViews/inst/doc/createReposHtml.R0000644000175000017500000000350114140322302017302 0ustar nileshnilesh### R code from vignette source 'createReposHtml.Rnw' ################################################### ### code chunk number 1: createReposHtml.Rnw:41-42 ################################################### library("biocViews") ################################################### ### code chunk number 2: params ################################################### reposRoot <- "path/to/reposRoot" ## The names are essential contribPaths <- c(source="src/contrib", win.binary="bin/windows/contrib/4.0", mac.binary="bin/macosx/contrib/4.0") ################################################### ### code chunk number 3: extractVigs (eval = FALSE) ################################################### ## extractVignettes(reposRoot, contribPaths["source"]) ################################################### ### code chunk number 4: controlFiles (eval = FALSE) ################################################### ## genReposControlFiles(reposRoot, contribPaths) ################################################### ### code chunk number 5: exampleOfHtmlDesign (eval = FALSE)
################################################### ## ## Define classes like this for each logical document chunk ## setClass("pdAuthorMaintainerInfo", contains="PackageDetail") ## setClass("pdVignetteInfo", contains="PackageDetail") ## ## ## Then define a htmlValue method ## setMethod("htmlValue", signature(object="pdDescriptionInfo"), ## function(object) { ## node <- xmlNode("p", cleanText(object@Description), ## attrs=c(class="description")) ## node ## }) ## ## ## Then you can make use of all this... ## ## Assume object contains a PackageDetail instance ## authorInfo <- as(object, "pdAuthorMaintainerInfo") ## dom$addNode(htmlValue(authorInfo)) ## biocViews/inst/unitTests/0000755000175000017500000000000014136047116015320 5ustar nileshnileshbiocViews/inst/unitTests/test_biocViews.R0000644000175000017500000000350214136047116020434 0ustar nileshnileshtest_findBranchReadDot <- function() { checkException(.findBranchReadDot(current=c("ChipName"), branch="Software")) checkException(.findBranchReadDot(current=c("RNASeq","ChipName"), branch="Software")) checkException(.findBranchReadDot(current=c("Software"))) checkException(.findBranchReadDot( current=c("GUI, DNAMethylation, MethylationArray, IlluminaChip"), branch=c("Software","AnnotationData","ExperimentData"))) } test_recommendPackages <- function() { checkException(recommendPackages("")) checkException(recommendPackages(c("foo"))) checkException(recommendPackages(c("aCGH","Agilentchip"))) checkException(recommendPackages(c("aCGH","Agilentchip", "CancerData"))) pca <- recommendPackages(c("PrincipalComponent")) dr <- recommendPackages(c("DimensionReduction")) ans <- intersect(dr,pca) test <- recommendPackages(c("PrincipalComponent", "DimensionReduction")) checkEquals(length(test), length(ans)) checkIdentical(test, ans) test2 <- recommendPackages(c("PrincipalComponent", "DimensionReduction"), intersect.views=FALSE) checkEquals(length(unique(c(pca,dr))), length(test2)) ans <- recommendPackages(c("Principal")) 
checkEquals(length(ans), 0L) } test_guessPackageType <- function(){ checkIdentical(guessPackageType(character()), "Software") checkIdentical(guessPackageType(c("Clustering", "Classification")), "Software") checkIdentical(guessPackageType(c("Organism", "Homo_sapien")), "AnnotationData") checkIdentical(guessPackageType(c("TechnologyData", "SequencingData")), "ExperimentData") checkIdentical(guessPackageType(c("TechnologyData", "SequencingData", "Software")), "ExperimentData") } biocViews/inst/unitTests/test_citations.R0000644000175000017500000000530214136047116020477 0ustar nileshnilesh setup1 <- function() { t <- file.path(tempdir(), "testing") if (file.exists(t)) unlink(t, recursive=TRUE) dir.create(t) dir.create(file.path(t, "testrepos", "src", "contrib"), recursive=TRUE) dir.create(file.path(t, "maketarballs", "biocViews"), recursive=TRUE) dir.create(file.path(t, "destdir")) file.copy(system.file("DESCRIPTION", package="biocViews"), file.path(t, "maketarballs", "biocViews")) vers <- as.character(packageVersion("biocViews")) oldwd <- getwd() on.exit(setwd(oldwd)) setwd(file.path(t, "maketarballs")) tar(file.path(t, "testrepos", "src", "contrib", paste0("biocViews_", vers, ".tar.gz")), "biocViews", compression="gzip" ) } test_citation_from_description <- function() { setup1() t <- file.path(tempdir(), "testing") unlink(file.path(t, "destdir"), recursive=TRUE) dir.create(file.path(t, "destdir")) extractCitations(file.path(t, "testrepos"), "src/contrib", file.path(t, "destdir")) checkTrue(file.exists(file.path(t, "destdir", "biocViews", "citation.html"))) lines <- readLines(file.path(t, "destdir", "biocViews", "citation.html")) #browser() checkTrue(any(grepl("Categorized views", lines))) } setup2 <- function() { t <- file.path(tempdir(), "testing") if (file.exists(t)) unlink(t, recursive=TRUE) dir.create(t) dir.create(file.path(t, "testrepos", "src", "contrib"), recursive=TRUE) dir.create(file.path(t, "maketarballs", "biocViews2", "inst"), recursive=TRUE) 
dir.create(file.path(t, "destdir")) file.copy(system.file("DESCRIPTION", package="biocViews"), file.path(t, "maketarballs", "biocViews2")) file.copy(system.file("unitTests", "CITATION-tmpl", package="biocViews"), file.path(t, "maketarballs", "biocViews2", "inst", "CITATION")) vers <- as.character(packageVersion("biocViews")) oldwd <- getwd() on.exit(setwd(oldwd)) setwd(file.path(t, "maketarballs")) tar(file.path(t, "testrepos", "src", "contrib", paste0("biocViews2_", vers, ".tar.gz")), "biocViews2", compression="gzip" ) } test_citation_from_citation <- function() { setup2() t <- file.path(tempdir(), "testing") unlink(file.path(t, "destdir"), recursive=TRUE) dir.create(file.path(t, "destdir")) extractCitations(file.path(t, "testrepos"), "src/contrib", file.path(t, "destdir")) checkTrue(file.exists(file.path(t, "destdir", "biocViews2", "citation.html"))) lines <- readLines(file.path(t, "destdir", "biocViews2", "citation.html")) checkTrue(any(grepl("Open software development", lines))) } biocViews/inst/unitTests/CITATION-tmpl0000644000175000017500000000121214136047116017423 0ustar nileshnileshcitEntry(entry="Article", author ="Robert C Gentleman and Vincent J. Carey and Douglas M. Bates and others", title ="Bioconductor: Open software development for computational biology and bioinformatics", journal = "Genome Biology", volume = "5", year = "2004", pages = "R80", url = "http://genomebiology.com/2004/5/10/R80", textVersion = paste( "Bioconductor: Open software development for computational biology", "and bioinformatics", "R. Gentleman, V. J. Carey, D. M. Bates, B.Bolstad, M. Dettling, S. Dudoit, B. Ellis, L. Gautier, Y. Ge, and others", "2004, Genome Biology, Vol. 5, R80") ) biocViews/inst/dot/0000755000175000017500000000000014136047116014104 5ustar nileshnileshbiocViews/inst/dot/biocViewsVocab.dot0000644000175000017500000004256414136047116017530 0ustar nileshnilesh/* Bioc Views Vocabulary Definition in dot format */ /* How To Process this file: 1.
Use dot2gxl from graphviz to transform into GXL format. dot2gxl biocViewsVocab.dot > biocViewsVocab.gxl 2. use graph::fromGXL to obtain a graphNEL object */ digraph G { /**************************************************************************** * Software * ****************************************************************************/ BiocViews -> Software; /* Software -> AssayDomain */ Software -> AssayDomain; AssayDomain -> aCGH; AssayDomain -> CellBasedAssays; AssayDomain -> ChIPchip; AssayDomain -> CopyNumberVariation; AssayDomain -> CpGIsland; AssayDomain -> DNAMethylation; AssayDomain -> ExonArray; AssayDomain -> GeneExpression; AssayDomain -> GeneticVariability; AssayDomain -> SNP; AssayDomain -> Transcription; /* Software -> Technology */ Software -> Technology; Technology -> Sequencing; Sequencing -> ATACSeq; Sequencing -> DNASeq; Sequencing -> DNaseSeq; Sequencing -> RiboSeq; Sequencing -> RNASeq; Sequencing -> ChIPSeq; Sequencing -> RIPSeq; Sequencing -> MethylSeq; Sequencing -> ExomeSeq; Sequencing -> miRNA; Sequencing -> SangerSeq; Sequencing -> SmallRNA; Sequencing -> Microbiome; Sequencing -> WholeGenome; Sequencing -> DenovoGenome; Sequencing -> TargetedResequencing; Sequencing -> DenovoTranscriptome; Sequencing -> MicrobialStrain; Sequencing -> HiC; Sequencing -> PooledScreens; Sequencing -> MNaseSeq; Technology -> Microarray; Microarray -> MultiChannel; Microarray -> OneChannel; Microarray -> TwoChannel; Microarray -> MethylationArray; Microarray -> GenotypingArray; Microarray -> MicroRNAArray; Microarray -> mRNAMicroarray; Microarray -> ChipOnChip; Microarray -> ReversePhaseProteinArray; Microarray -> TissueMicroarray; Microarray -> ProprietaryPlatforms; Technology -> FlowCytometry; Technology -> MassSpectrometry; MassSpectrometry -> ImagingMassSpectrometry; Technology -> qPCR; Technology -> MicrotitrePlateAssay; Technology -> SAGE; Technology -> CRISPR; Technology -> SingleCell; Technology -> Spatial; Technology -> ddPCR; Technology -> 
AnnotationHubSoftware; Technology -> ExperimentHubSoftware; /* Software -> ResearchFields */ Software -> ResearchField; ResearchField -> CellBiology; ResearchField -> Genetics; ResearchField -> Metabolomics; ResearchField -> Metagenomics; ResearchField -> Proteomics; ResearchField -> Lipidomics; ResearchField -> Epigenetics; ResearchField -> Phylogenetics; ResearchField -> Pharmacogenomics; ResearchField -> Pharmacogenetics; ResearchField -> Cheminformatics; ResearchField -> StructuralGenomics; ResearchField -> StructuralPrediction; ResearchField -> Biophysics; ResearchField -> MathematicalBiology; ResearchField -> BiomedicalInformatics; ResearchField -> ComparativeGenomics; ResearchField -> FunctionalGenomics; ResearchField -> SystemsBiology; ResearchField -> ComputationalChemistry; ResearchField -> Agroinformatics; ResearchField -> Transcriptomics; ResearchField -> ImmunoOncology; ResearchField -> Epitranscriptomics; /* Software -> BiologicalQuestion */ Software -> BiologicalQuestion; BiologicalQuestion -> AlternativeSplicing; BiologicalQuestion -> Coverage; BiologicalQuestion -> DemethylateRegionDetection; BiologicalQuestion -> DenovoAssembler; BiologicalQuestion -> DifferentialDNA3DStructure; BiologicalQuestion -> DifferentialExpression; BiologicalQuestion -> DifferentialMethylation; BiologicalQuestion -> DifferentialPeakCalling; BiologicalQuestion -> DifferentialSplicing; BiologicalQuestion -> DNA3DStructure; BiologicalQuestion -> DriverMutation; BiologicalQuestion -> FunctionalPrediction; BiologicalQuestion -> GeneFusionDetection; BiologicalQuestion -> GenePrediction; BiologicalQuestion -> GeneRegulation; BiologicalQuestion -> GeneSetEnrichment; BiologicalQuestion -> GeneSignaling; BiologicalQuestion -> GeneTarget; BiologicalQuestion -> GenomeAssembly; BiologicalQuestion -> GenomeWideAssociation; BiologicalQuestion -> GenomicVariation; BiologicalQuestion -> GenomeAnnotation; BiologicalQuestion -> GermlineMutation; BiologicalQuestion -> HistoneModification; 
BiologicalQuestion -> IndelDetection; BiologicalQuestion -> LinkageDisequilibrium; BiologicalQuestion -> MetagenomeAssembly; BiologicalQuestion -> MicrosatelliteDetection; BiologicalQuestion -> MotifAnnotation; BiologicalQuestion -> MotifDiscovery; BiologicalQuestion -> NetworkEnrichment; BiologicalQuestion -> NetworkInference; BiologicalQuestion -> NucleosomePositioning; BiologicalQuestion -> PeakDetection; BiologicalQuestion -> QuantitativeTrailLocus; BiologicalQuestion -> Scaffolding; BiologicalQuestion -> SequenceMatching; BiologicalQuestion -> SomaticMutation; BiologicalQuestion -> SplicedAlignment; BiologicalQuestion -> StructuralVariation; BiologicalQuestion -> TranscriptomeVariant; BiologicalQuestion -> VariantAnnotation; BiologicalQuestion -> VariantDetection; /* Software -> WorkflowStep */ Software -> WorkflowStep; WorkflowStep -> ExperimentalDesign; WorkflowStep -> Alignment; Alignment -> MultipleSequenceAlignment; WorkflowStep -> Annotation; WorkflowStep -> BatchEffect; WorkflowStep -> MultipleComparison; WorkflowStep -> Normalization; WorkflowStep -> Pathways; Pathways -> GO; Pathways -> KEGG; Pathways -> Reactome; Pathways -> BioCarta; Pathways -> NCINatureCurated; WorkflowStep -> Preprocessing; WorkflowStep -> QualityControl; WorkflowStep -> ReportWriting; WorkflowStep -> Visualization; Visualization -> Network; WorkflowStep -> GenomeBrowsers; /* Software -> StatisticalMethod */ Software -> StatisticalMethod; StatisticalMethod -> Bayesian; StatisticalMethod -> Classification; StatisticalMethod -> Clustering; StatisticalMethod -> DecisionTree; StatisticalMethod -> DimensionReduction; StatisticalMethod -> FeatureExtraction; StatisticalMethod -> GraphAndNetwork; StatisticalMethod -> HiddenMarkovModel; StatisticalMethod -> MultidimensionalScaling; StatisticalMethod -> NeuralNetwork; StatisticalMethod -> PatternLogic; StatisticalMethod -> PrincipalComponent; StatisticalMethod -> Regression; StatisticalMethod -> StructuralEquationModels; StatisticalMethod 
-> SupportVectorMachine; StatisticalMethod -> Survival; StatisticalMethod -> TimeCourse; /* Software -> Infrastructure */ Software -> Infrastructure; Infrastructure -> ThirdPartyClient Infrastructure -> DataImport; Infrastructure -> DataRepresentation; Infrastructure -> GUI; /* Software -> ShinyApps */ Software -> ShinyApps; /**************************************************************************** * AnnotationData * ****************************************************************************/ BiocViews -> AnnotationData; /* AnnotationData -> Organism */ AnnotationData -> Organism; Organism -> Anopheles_gambiae; Organism -> Apis_mellifera; Organism -> Arabidopsis_lyrata; Organism -> Arabidopsis_thaliana; Organism -> Asparagus_officinalis; Organism -> Bacillus_subtilis; Organism -> Bos_taurus; Organism -> Caenorhabditis_elegans; Organism -> Callithrix_jacchus; Organism -> Canis_familiaris; Organism -> Cicer_arietinum; Organism -> Ciona_intestinalis; Organism -> Chlamydomonas_reinhardtii; Organism -> Danio_rerio; Organism -> Drosophila_melanogaster; Organism -> Drosophila_virilis; Organism -> Eremothecium_gossypii; Organism -> Escherichia_coli; Organism -> Gallus_gallus; Organism -> Gasterosteus_aculeatus; Organism -> Glycine_max; Organism -> Homo_sapiens; Organism -> Hordeum_vulgare; Organism -> Kluyveromyces_lactis; Organism -> Macaca_fascicularis; Organism -> Macaca_mulatta; Organism -> Magnaporthe_grisea; Organism -> Medicago_truncatula; Organism -> Monodelphis_domestica; Organism -> Mus_musculus; Organism -> Neurospora_crassa; Organism -> Oncorhynchus_mykiss; Organism -> Oryza_sativa; Organism -> Pan_paniscus; Organism -> Pan_troglodytes; Organism -> Plasmodium_falciparum; Organism -> Pseudomonas_aeruginosa; Organism -> Rattus_norvegicus; Organism -> Saccharomyces_cerevisiae; Organism -> Saccharum_officinarum; Organism -> Schizosaccharomyces_pombe; Organism -> Staphylococcus_aureus; Organism -> Sus_scrofa; Organism -> Taeniopygia_guttata; Organism -> 
Toxoplasma_gondii; Organism -> Triticum_aestivum; Organism -> Vitis_vinifera; Organism -> Xenopus_laevis; Organism -> Xenopus_tropicalis; Organism -> Zea_mays; /* AnnotationData -> ChipManufacturer */ AnnotationData -> ChipManufacturer; ChipManufacturer -> AffymetrixChip; ChipManufacturer -> AgilentChip; ChipManufacturer -> ClonetechChip; ChipManufacturer -> GEChip; ChipManufacturer -> INDACChip; ChipManufacturer -> IlluminaChip; ChipManufacturer -> QiagenChip; ChipManufacturer -> RNG_MRCChip; ChipManufacturer -> RocheChip; ChipManufacturer -> UniversityHealthNetwork; ChipManufacturer -> CodelinkChip; /* AnnotationData -> CustomCDF */ AnnotationData -> CustomCDF; CustomCDF -> GACustomCDF; CustomCDF -> MBNICustomCDF; /* AnnotationData -> CustomArray */ AnnotationData -> CustomArray; /* AnnotationData -> CustomDBSchema */ AnnotationData -> CustomDBSchema; CustomDBSchema -> GeneCardsCustomSchema; /* AnnotationData -> FunctionalAnnotation */ AnnotationData -> FunctionalAnnotation; /* AnnotationData -> SequenceAnnotation */ AnnotationData -> SequenceAnnotation; SequenceAnnotation -> GenomicSequence; /* AnnotationData -> ChipName */ AnnotationData -> ChipName; ChipName -> adme16cod; ChipName -> ag; ChipName -> ath1121501; ChipName -> celegans; ChipName -> drosgenome1; ChipName -> drosophila2; ChipName -> h10kcod; ChipName -> h20kcod; ChipName -> hcg110; ChipName -> hgfocus; ChipName -> hgu133a2; ChipName -> hgu133a; ChipName -> hgu133b; ChipName -> hgu133plus2; ChipName -> hgu95a; ChipName -> hgu95av2; ChipName -> hgu95b; ChipName -> hgu95c; ChipName -> hgu95d; ChipName -> hgu95e; ChipName -> hguatlas13k; ChipName -> hgug4100a; ChipName -> hgug4101a; ChipName -> hgug4110b; ChipName -> hgug4111a; ChipName -> hgug4112a; ChipName -> hguqiagenv3; ChipName -> hi16cod; ChipName -> hs25kresogen; ChipName -> hu35ksuba; ChipName -> hu35ksubb; ChipName -> hu35ksubc; ChipName -> hu35ksubd; ChipName -> hu6800; ChipName -> HuO22; ChipName -> hwgcod; ChipName -> indac; ChipName -> 
illuminaHumanv1; ChipName -> illuminaHumanv2; ChipName -> illuminaMousev1; ChipName -> illuminaMousev1p1; ChipName -> illuminaRatv1; ChipName -> JazaerimetaData; ChipName -> lumiHumanV1; ChipName -> lumiMouseV1; ChipName -> lumiHumanV2; ChipName -> lumiRatV1; ChipName -> m10kcod; ChipName -> m20kcod; ChipName -> mi16cod; ChipName -> mm24kresogen; ChipName -> mgu74a; ChipName -> mgu74av2; ChipName -> mgu74b; ChipName -> mgu74bv2; ChipName -> mgu74c; ChipName -> mgu74cv2; ChipName -> mguatlas5k; ChipName -> mgug4121a; ChipName -> mgug4122a; ChipName -> moe430a; ChipName -> moe430b; ChipName -> mouse4302; ChipName -> mouse430a2; ChipName -> mpedbarray; ChipName -> mu11ksuba; ChipName -> mu11ksubb; ChipName -> mu19ksuba; ChipName -> mu19ksubb; ChipName -> mu19ksubc; ChipName -> Mu15v1; ChipName -> Mu22v3; ChipName -> mwgcod; ChipName -> Norway981; ChipName -> OperonHumanV3; ChipName -> pedbarrayv9; ChipName -> pedbarrayv10; ChipName -> PartheenMetaData; ChipName -> r10kcod; ChipName -> rae230a; ChipName -> rae230b; ChipName -> rat2302; ChipName -> rgu34a; ChipName -> rgu34b; ChipName -> rgu34c; ChipName -> rgug4130a; ChipName -> ri16cod; ChipName -> rnu34; ChipName -> Roberts2005Annotation; ChipName -> rtu34; ChipName -> rwgcod; ChipName -> SHDZ; ChipName -> u133x3p; ChipName -> xenopuslaevis; ChipName -> yeast2; ChipName -> ygs98; ChipName -> zebrafish; ChipName -> hcgi12k; ChipName -> hcgi8k; /* AnnotationData -> PackageType */ AnnotationData -> PackageType; PackageType -> BSgenome; PackageType -> cdf; PackageType -> ChipDb; PackageType -> db0; PackageType -> InparanoidDb; PackageType -> OrganismDb; PackageType -> OrgDb; PackageType -> PolyPhen; PackageType -> probe; PackageType -> SIFT; PackageType -> SNPlocs; PackageType -> XtraSNPlocs; PackageType -> TxDb; PackageType -> MeSHDb; PackageType -> FRMA; PackageType -> AnnotationHub; PackageType -> EuPathDB; /**************************************************************************** * ExperimentData * 
****************************************************************************/ BiocViews -> ExperimentData; ExperimentData -> ReproducibleResearch; ExperimentData -> SpecimenSource; SpecimenSource -> Tissue; SpecimenSource -> Proteome; SpecimenSource -> Genome; SpecimenSource -> StemCell; SpecimenSource -> CellCulture; SpecimenSource -> Germline; SpecimenSource -> Somatic; ExperimentData -> OrganismData; OrganismData -> Anopheles_gambiae_Data; OrganismData -> Apis_mellifera_Data; OrganismData -> Arabidopsis_lyrata_Data; OrganismData -> Arabidopsis_thaliana_Data; OrganismData -> Bacillus_subtilis_Data; OrganismData -> Bos_taurus_Data; OrganismData -> Caenorhabditis_elegans_Data; OrganismData -> Callithrix_jacchus_Data; OrganismData -> Canis_familiaris_Data; OrganismData -> Ciona_intestinalis_Data; OrganismData -> Danio_rerio_Data; OrganismData -> Drosophila_melanogaster_Data; OrganismData -> Drosophila_virilis_Data; OrganismData -> Eremothecium_gossypii_Data; OrganismData -> Escherichia_coli_Data; OrganismData -> Gallus_gallus_Data; OrganismData -> Gasterosteus_aculeatus_Data; OrganismData -> Glycine_max_Data; OrganismData -> Homo_sapiens_Data; OrganismData -> Hordeum_vulgare_Data; OrganismData -> Kluyveromyces_lactis_Data; OrganismData -> Macaca_mulatta_Data; OrganismData -> Magnaporthe_grisea_Data; OrganismData -> Medicago_truncatul_Data; OrganismData -> Monodelphis_domestica_Data; OrganismData -> Mus_musculus_Data; OrganismData -> Neurospora_crassa_Data; OrganismData -> Oncorhynchus_mykiss_Data; OrganismData -> Oryza_sativa_Data; OrganismData -> Pan_paniscus_Data; OrganismData -> Pan_troglodytes_Data; OrganismData -> Plasmodium_falciparum_Data; OrganismData -> Pseudomonas_aeruginosa_Data; OrganismData -> Rattus_norvegicus_Data; OrganismData -> Saccharomyces_cerevisiae_Data; OrganismData -> Saccharum_officinarum_Data; OrganismData -> Schizosaccharomyces_pombe_Data; OrganismData -> Staphylococcus_aureus_Data; OrganismData -> Sus_scrofa_Data; OrganismData -> 
Taeniopygia_guttata_Data; OrganismData -> Triticum_aestivum_Data; OrganismData -> Vitis_vinifera_Data; OrganismData -> Xenopus_laevis_Data; OrganismData -> Xenopus_tropicalis_Data; OrganismData -> Zea_mays_Data; ExperimentData -> DiseaseModel; DiseaseModel -> CancerData; CancerData -> BreastCancerData; CancerData -> ColonCancerData; CancerData -> KidneyCancerData; CancerData -> LeukemiaCancerData; CancerData -> LungCancerData; CancerData -> OvarianCancerData; CancerData -> ProstateCancerData; CancerData -> LeukemiaCancerData; DiseaseModel -> HIVData; DiseaseModel -> COPDData; ExperimentData -> TechnologyData; TechnologyData -> FlowCytometryData; TechnologyData -> HighThroughputImagingData; TechnologyData -> MassSpectrometryData; MassSpectrometryData -> ImagingMassSpectrometryData; TechnologyData -> qPCRData; TechnologyData -> MicrotitrePlateAssayData; TechnologyData -> SAGEData; TechnologyData -> CGHData; TechnologyData -> SequencingData; SequencingData -> DNASeqData; SequencingData -> RNASeqData; SequencingData -> ChIPSeqData; SequencingData -> RIPSeqData; SequencingData -> MethylSeqData; SequencingData -> ExomeSeqData; SequencingData -> miRNAData; SequencingData -> SangerSeqData; SequencingData -> SmallRNAData; SequencingData -> MicrobiomeData; SequencingData -> SingleCellData; SequencingData -> SpatialData; TechnologyData -> MicroarrayData; MicroarrayData -> MultiChannelData; MicroarrayData -> OneChannelData; MicroarrayData -> TwoChannelData; MicroarrayData -> MethylationArrayData; MicroarrayData -> GenotypingArrayData; MicroarrayData -> MicroRNAArrayData; MicroarrayData -> mRNAArrayData; MicroarrayData -> ChipOnChipData; MicroarrayData -> ReversePhaseProteinArrayData; MicroarrayData -> TissueMicroarrayData; MicroarrayData -> ProprietaryPlatformsData; ExperimentData -> AssayDomainData; AssayDomainData -> CopyNumberVariationData; AssayDomainData -> CpGIslandData; AssayDomainData -> SNPData; AssayDomainData -> ExpressionData; ExperimentData -> RepositoryData; 
RepositoryData -> HapMap; RepositoryData -> GEO; RepositoryData -> ArrayExpress; RepositoryData -> NCI; RepositoryData -> PathwayInteractionDatabase; RepositoryData -> Project1000genomes; RepositoryData -> ENCODE; ExperimentData -> PackageTypeData; PackageTypeData -> ExperimentHub; PackageTypeData -> ImmunoOncologyData; /**************************************************************************** * Workflow * ****************************************************************************/ BiocViews -> Workflow; Workflow -> BasicWorkflow; Workflow -> AnnotationWorkflow; Workflow -> GeneExpressionWorkflow; Workflow -> SingleCellWorkflow; Workflow -> SpatialWorkflow; Workflow -> GenomicVariantsWorkflow; Workflow -> EpigeneticsWorkflow; Workflow -> ProteomicsWorkflow; Workflow -> ResourceQueryingWorkflow; Workflow -> DifferentialSplicingWorkflow; Workflow -> ImmunoOncologyWorkflow; } biocViews/inst/script/0000755000175000017500000000000014136047116014622 5ustar nileshnileshbiocViews/inst/script/revise_expt_biocViews.R0000644000175000017500000001062114136047116021314 0ustar nileshnilesh## Rscript for updating Experiment Data biocViews ## Dec 5th 2014. library(biocViews) rm(list=ls()) dirname <- "pkgs" ## read in all files and recommend new biocViews. pkgnames <- list.files(dirname) pkgnames <- pkgnames[!grepl("add_data.py",pkgnames)] pkgnames <- pkgnames[!grepl(".manifest$",pkgnames)] pkgnames <- pkgnames[!grepl("README.txt",pkgnames)] pkgnames <- pkgnames[!grepl("-meat.sh$",pkgnames)] genome_tbl <- rtracklayer::ucscGenomes(organism=TRUE) result <- lapply(pkgnames, function(x) { pkgdir <- file.path(dirname,x) message(x) ## tryCatch keeps one failing package from aborting the loop
tryCatch({ a <- recommendBiocViews(pkgdir, branch = "ExperimentData") message(a) a }, error=function(err) { warning(x, ": ", conditionMessage(err)) }) }) current <- sapply(result, "[[", "current") recommended <- sapply(result, "[[", "recommended") remove <- sapply(result, "[[", "remove") df<- data.frame(pkgnames=pkgnames, current=current, recommended=recommended, remove=remove, stringsAsFactors =FALSE) webmap <- c( FlowCytometry="FlowCytometryData", RNAExpressionData="RNASeqData", miRNAoverexpression="miRNAData", NormalTissue="Tissue" ) final <- apply(df,1, function(z){ c1 <- unlist(strsplit(as.character(z[2]),", ")) c1 <- c(c1, as.character(webmap[c1][complete.cases(webmap[c1])])) rec <- unlist(strsplit(as.character(z[3]),", ")) rem <- unlist(strsplit(as.character(z[4]),", ")) fi <- unique(setdiff(c(c1,rec),rem)) paste(fi, collapse=", ") }) df2 <- data.frame(df, final=final, stringsAsFactors = FALSE ) terms <- getCurrentbiocViews() expt <- terms$ExperimentData nf <- lapply(as.character(df2$final), function(z) unlist(strsplit(z,", "))) qr <- table(unlist(nf)) mat <- data.frame(names(qr), as.integer(qr)) mat <- mat[order(mat[,2]), ] colnames(mat) <- c("Expt_biocViews", "occurrence_in_pkgs") write.table(mat, "count_of_biocViews_dec5.txt", sep="\t", quote=FALSE, row.names=FALSE) ### update in svn rm(list=ls()) df2 <- read.table("df2_dec5.txt", sep="\t", header=TRUE, stringsAsFactors = FALSE) df2[64,5]<-"GEO" dirname <- file.path(getwd(),"pkgs") pkgnames <- list.files(dirname) pkgnames <- pkgnames[!grepl("add_data.py",pkgnames)] pkgnames <- pkgnames[!grepl(".manifest$",pkgnames)] pkgnames <- pkgnames[!grepl("README.txt",pkgnames)] pkgnames <- pkgnames[!grepl("-meat.sh$",pkgnames)] changemat <- matrix(ncol=5, nrow=length(pkgnames)) reviseVersions <- function(v) { vsp <- strsplit(v,"[.]") vsp$Version[3] <- as.integer(vsp$Version[3])+1 paste(vsp$Version,collapse=".") } change=TRUE for (i in 1:length(pkgnames)){ pkg <- pkgnames[i] pkgdir <- file.path(dirname,
pkg,"DESCRIPTION") data <- read.dcf(pkgdir, keep.white = TRUE) fi <- colnames(data) rm(data) data <- read.dcf(pkgdir, keep.white = fi) b_ind <- which(colnames(data)=="biocViews") if(length(b_ind)==0){ oldbiocView <-"" message(pkg) message("No biocViews in this package!!") mat=matrix(df2[i,"final"],nrow=1,ncol=1) newbiocView <- mat colnames(mat)<-"biocViews" data <- cbind(data,mat) oldVersion <- data[,"Version"] newVersion <- reviseVersions(oldVersion) data[,"Version"] <- newVersion } else{ oldbiocView <- gsub("\n","",data[,"biocViews"]) newbiocView <- df2[i,"final"] if(oldbiocView!=newbiocView){ data[,"biocViews"] <- newbiocView } oldVersion <- data[,"Version"] newVersion <- reviseVersions(oldVersion) data[,"Version"] <- newVersion } changemat[i,1] <- pkg changemat[i,2] <- oldbiocView changemat[i,3] <- newbiocView changemat[i,4] <- oldVersion changemat[i,5] <- newVersion if(change) write.dcf(data, pkgdir, keep.white=fi) } write.table(changemat, "changemat_dec5.txt",sep="\t", col.names=c("pkg","oldbiocView","newbiocView","oldVer","newVer"), quote=FALSE, row.names=FALSE) ##get email id of maintainer to email emailist <- lapply(pkgnames, function(p){ message(p) pkgdir <- file.path(dirname, p,"DESCRIPTION") data <- read.dcf(pkgdir) data[,"Maintainer"] }) em <- unique(unlist(emailist)) em <- gsub("\n","",em) em2 <- paste(em, collapse=", ") biocViews/inst/script/reviseBiocViews.R0000644000175000017500000002523514136047116020064 0ustar nileshnilesh## this script contains functions used in devel -2.14 version ##-------------helper functions rm(list=ls()) biocViewMap <- function() { webmap <- c( AssayDomains=NA_character_, AssayTechnologies="Technology", Bioinformatics=NA_character_, BiologicalDomains=NA_character_, ConnectTools="ThirdPartyClient", Enrichment=NA_character_, GraphsAndNetworks="GraphAndNetwork", HighThroughputSequencing="Sequencing", Methylseq="MethylSeq", MultipleComparisons="MultipleComparison", NetworkAnalysis="Network", Networks="Network", 
NetworkVisualization="Visualization", Regulation=NA_character_, RNAseq="RNASeq", Sequences=NA_character_, Signaling= NA_character_ ) usermap <- c( AffymetrixChip="OneChannel", Affymetrix="OneChannel", BatchEffectAssessment="BatchEffect", ChiPseq="ChIPSeq", ChIPseq="ChIPSeq", ClusterValidation="Clustering", CopyNumberVariants="CopyNumberVariation", CNV="CopyNumberVariation", DataPreprocessing="Preprocessing", Design="ExperimentalDesign", DNAmethylation="DifferentialMethylation", DualChannel="TwoChannel", Flowcytometry="FlowCytometry", FlowCytData="FlowCytometry", `Flow cytometry`="FlowCytometry", `High Throughput Sequencing`="Sequencing", genetics="Genetics", HighTroughputSequencingData="Sequencing", HighThroughputSequencingData="Sequencing", Microarrays="Microarray", MicroArray="Microarray", microRNA="miRNA", MRNAMicroarray="mRNAMicroarray", `Multiple Comparisons`="MultipleComparison", RIPseq="RIPSeq", RNAExpressionData="DifferentialExpression", SequenceAnnotation="GenomeAnnotation", SequencingMatching="SequenceMatching", `SNP.`="SNP", Statistics="StatisticalMethod", Technology=NA_character_, Visualisation="Visualization", visualization="Visualization" ) c(webmap,usermap) } readPathFromManifest <- function(rpacks, manifest) { pkgs <- readLines(file.path(rpacks, manifest)) pkgs <- sub("Package:[[:space:]]*([[:alnum:]\\.]+)[[:space:]]*$", "\\1", pkgs[grepl("Package:", pkgs)]) fls <- sprintf(file.path(rpacks, "%s/DESCRIPTION"), pkgs) names(fls) <- pkgs fls <- fls[file.exists(fls)] } readbiocViewsFromRpacks <- function(fls) { otermsl <- lapply(fls, function(fl) { term <- read.dcf(fl, c("biocViews","BiocViews")) term <- term[!is.na(term)] if(length(term!=0)) strsplit(term, "[[:space:]]*,[[:space:]]*")[[1]] else NA_character_ }) pkgterm <- data.frame(pkg = rep(names(otermsl), sapply(otermsl, length)), term = unlist(unname(otermsl)), stringsAsFactors=FALSE) } generatebiocViewsMap <- function(pkgterm, map) { pkgterm$newterm <- pkgterm$term idx <- match(pkgterm$newterm, 
names(map)) pkgterm$newterm[!is.na(idx)] <- unname(map[pkgterm$newterm[!is.na(idx)]]) pkgterm } readVersionFromRpacks <- function(versionPath) { otermslVersion <- lapply(versionPath, function(ver) { dcf <- read.dcf(ver ) v <- package_version(dcf[, "Version"]) v0 = unclass(v) v0$Version[3] = v0$Version[3] +1 class(v0) = class(v) c(as.character(v),as.character(v0)) }) ver <- data.frame(matrix(unlist(otermslVersion), nrow=length(otermslVersion), byrow=T, dimnames=list(names(otermslVersion),c("oldVer","newVer")))) ver <- cbind(rownames(ver),ver ) names(ver)<- c("pkg","oldVer","newVer") rownames(ver) <- NULL ver } readDot <- function(fl) { dot <- readLines(fl) dot <- dot[seq(grep("BiocViews -> Software", dot), grep("BiocViews -> AnnotationData", dot) - 1)] sub(" *; *$", "", dot[grepl("^[[:space:][:alpha:]]+->", dot)]) } getPathfromPkgName<- function(fls, pkglist) { fls[which(names(fls) %in% pkglist)] } suggestbiocViews <- function(pkgterm, mer, biocViewdotfile, flag=TRUE,fls) { ##read in dot file to get new terms dot <- readDot(biocViewdotfile) dotterms <- unique(unlist(strsplit(dot, " *-> *"))) ##no biocViews? xx = sapply(split(is.na(pkgterm$newterm), pkgterm$pkg), function(elt) sum(elt) == length(elt)) any(xx) nobiocView <- xx[xx] names(nobiocView) pkgterm[which(pkgterm$pkg %in% names(xx[xx])),] #get the path for packages that do not have biocViews nobiocViewPath <- getPathfromPkgName(fls, names(nobiocView)) sugbiocView <- lapply(nobiocViewPath, function(x){ words <- unique(unlist(strsplit(read.dcf(x,c("Description","Title","Package"))," "))) idx <- which(tolower(dotterms) %in% tolower(words)) dotterms[idx] }) if(flag) { ##packages that have biocViews now! found <- sugbiocView[lapply(sugbiocView,length)>0] found <- lapply(found, function(x) paste(unlist(x),collapse=", " )) #add the suggested biocViews to mer. idx <- match(names(found), mer$pkg) mer[idx,3]<- as.character(found) }else{ #still do not have biocViews! 
realbad <- sugbiocView[lapply(sugbiocView,length)==0] #these files have no biocViews - manually add biocViews for them. mer <- mer[which(mer[,1] %in% names(realbad)),] } mer } ##--------main function newBiocViews <- function(manifest,rpacks,biocViewdotfile, makeChanges=FALSE, resfilename) { #The manifest file contains the full package list. # Read in all package names from here. fls <- readPathFromManifest(rpacks, manifest) cat("Total no of packages :",length(fls) ) #get the biocViews from all packages in the repository pkgterm <- readbiocViewsFromRpacks(fls) ##read in changes map <- biocViewMap() ##map the new/suggested biocViews to existing biocViews pkgterm <- generatebiocViewsMap(pkgterm, map) ## comma separated biocViews yy = lapply(split(pkgterm, pkgterm$pkg), function(elt) { elt$term <- paste(elt$term,collapse=", ") elt$newterm <- paste(na.omit(elt$newterm),collapse=", ") unique(elt) }) #represent as a data.frame yes <- do.call(rbind.data.frame,yy) ## which packages had no change in their biocViews? nochange2 <- yes[which(yes$term==yes$newterm),] cat("no of packages not changed at all :",length(nochange2[,1]) ) ## which packages had changes in their biocViews modified2 <- yes[which(yes$term!=yes$newterm),] cat("no of packages changed :",length(modified2[,2]) ) #get packages whose version has to be bumped versionfls<- modified2[,1] #get the path for each of these packages versionPath <- getPathfromPkgName(fls, versionfls) # data.frame with package name, old followed by new version number. versiondf <- readVersionFromRpacks(versionPath) #merging mer <- merge(modified2,versiondf, by="pkg") ##suggest biocViews for packages with no biocViews ## returns a data.frame for modified mer <- suggestbiocViews(pkgterm, mer, biocViewdotfile,flag=TRUE,fls) ## which packages are realbad?
still do not have biocViews - just write to file badmer <- suggestbiocViews(pkgterm,mer, biocViewdotfile,flag=FALSE,fls) write.table(badmer,"badbiocViews.txt",sep="\t",quote=FALSE,row.names=FALSE) if(makeChanges) { ##how do we make the changes here? }else{ write.table(mer, resfilename, sep="\t",quote=FALSE,row.names=FALSE) } } makechanges<- function(filename) { #filename <- "revisebiocViews.txt" revisemat <- read.table(filename, sep="\t",header=TRUE, stringsAsFactors=FALSE) # no of packages to be changed pkglist <- nrow(revisemat) # first get the path for each package in file pkgpath <- file.path(rpacks,revisemat[,1],"DESCRIPTION") for (x in 1:nrow(revisemat)){ cat(x,"\n") # open the description file data <- read.dcf(pkgpath[x]) #bump the version number data[,"Version"] <- revisemat[x,"newVer"] ## four cases possible #1 - no biocViews eg: which(revisemat[,1]=="vtpnet") -476 #2 - BiocViews eg: which(revisemat[,1]=="PSICQUIC") -348 #3 - biocViews eg: which(revisemat[,1]=="a4") - 1 #4 - bioViews eg: which(revisemat[,1]=="EBSeq") -139 wrongidx <- which(colnames(data) %in% c("BiocViews","bioViews","biocViews")) ## contains BiocViews or bioViews (remove it!)
if(length(wrongidx) != 0){ cat("I am in !") data <- data[1, -wrongidx,drop=FALSE] } ## add biocViews to pkg data <- cbind(data,"biocViews"=revisemat[x,"newterm"]) ##write to package write.dcf(data,file=pkgpath[x]) } } # usage # ## on rhino01 # ## devel # # rpacks <- file.path("~/biosrc/Rpacks") # manifest <- "bioc_2.14.manifest" # biocViewdotfile <- "biocViewsVocab.dot" # newBiocViews(manifest, rpacks, biocViewdotfile, # makeChanges=FALSE,"revisebiocViews-devel.txt") # # makechanges("revisebiocViews-devel.txt") # # ## on rhino01 # ## release # # rpacks <- file.path("~/Rpacks") # manifest <- "bioc_2.14.manifest" # biocViewdotfile <- "biocViewsVocab.dot" # newBiocViews(manifest, rpacks, biocViewdotfile, # makeChanges=FALSE,"revisebiocViews-release.txt") # # makechanges("revisebiocViews-release.txt") ##Modify biocViews to remove duplicate biocViews duplicatedbiocViews <- function(rpacks, filename) { revisemat <- read.table(filename, sep="\t", header=TRUE, stringsAsFactors=FALSE) pkglist <- nrow(revisemat) pkgpath <- file.path(rpacks,revisemat[,1],"DESCRIPTION") result <- lapply(pkgpath, function(fl) { u <- unique(unlist(strsplit(read.dcf(fl,"biocViews"),", "))) o <- unlist(strsplit(read.dcf(fl,"biocViews"),", ")) identical(o,u) }) pkgpath[which(result==FALSE)] } ##This function reads a character containing old biocViews and returns ## the corresponding new biocView terms.
old2newbiocViews <- function(file) { terms <- read.dcf(file, c("biocViews","BiocViews")) old <- strsplit(terms, "[[:space:]]*,[[:space:]]*")[[1]] map <- biocViewMap() idx <- match(old, names(map)) newbiocView <- old newbiocView[!is.na(idx)] <- unname(map[newbiocView[!is.na(idx)]]) paste(newbiocView[complete.cases(newbiocView)],collapse=", ") } biocViews/inst/script/revisebiocViews2014.R0000644000175000017500000002012414136047116020423 0ustar nileshnilesh ## ----style, eval=TRUE, echo=FALSE, results="asis"------------------------ BiocStyle::latex() ## ----preliminaries, echo=FALSE------------------------------------------- rm(list=ls()) biocViewMap <- function() { webmap <- c( AssayDomains=NA_character_, AssayTechnologies="Technology", Bioinformatics=NA_character_, BiologicalDomains=NA_character_, ConnectTools="ThirdPartyClient", Enrichment=NA_character_, GraphsAndNetworks="GraphAndNetwork", HighThroughputSequencing="Sequencing", Methylseq="MethylSeq", MultipleComparisons="MultipleComparison", NetworkAnalysis="Network", Networks="Network", NetworkVisualization="Visualization", Regulation=NA_character_, RNAseq="RNASeq", Sequences=NA_character_, Signaling= NA_character_ ) usermap <- c( AffymetrixChip="OneChannel", Affymetrix="OneChannel", BatchEffectAssessment="BatchEffect", ChiPseq="ChIPSeq", ChIPseq="ChIPSeq", ClusterValidation="Clustering", CopyNumberVariants="CopyNumberVariation", CNV="CopyNumberVariation", DataPreprocessing="Preprocessing", Design="ExperimentalDesign", DNAmethylation="DifferentialMethylation", DualChannel="TwoChannel", Flowcytometry="FlowCytometry", FlowCytData="FlowCytometry", `Flow cytometry`="FlowCytometry", `High Throughput Sequencing`="Sequencing", genetics="Genetics", HighTroughputSequencingData="Sequencing", HighThroughputSequencingData="Sequencing", Microarrays="Microarray", MicroArray="Microarray", microRNA="miRNA", MRNAMicroarray="mRNAMicroarray", `Multiple Comparisons`="MultipleComparison", RIPseq="RIPSeq", 
RNAExpressionData="DifferentialExpression", SequenceAnnotation="GenomeAnnotation", SequencingMatching="SequenceMatching", `SNP.`="SNP", Statistics="StatisticalMethod", Technology=NA_character_, Visualisation="Visualization", visualization="Visualization" ) c(webmap,usermap) } readPathFromManifest <- function(rpacks, manifest) { pkgs <- readLines(file.path(rpacks, manifest)) pkgs <- sub("Package:[[:space:]]*([[:alnum:]\\.]+)[[:space:]]*$", "\\1", pkgs[grepl("Package:", pkgs)]) fls <- sprintf(file.path(rpacks, "%s/DESCRIPTION"), pkgs) names(fls) <- pkgs fls <- fls[file.exists(fls)] } readbiocViewsFromRpacks <- function(fls) { otermsl <- lapply(fls, function(fl) { term <- read.dcf(fl, c("biocViews","BiocViews")) term <- term[!is.na(term)] if(length(term!=0)) strsplit(term, "[[:space:]]*,[[:space:]]*")[[1]] else NA_character_ }) pkgterm <- data.frame(pkg = rep(names(otermsl), sapply(otermsl, length)), term = unlist(unname(otermsl)), stringsAsFactors=FALSE) } generatebiocViewsMap <- function(pkgterm, map) { pkgterm$newterm <- pkgterm$term idx <- match(pkgterm$newterm, names(map)) pkgterm$newterm[!is.na(idx)] <- unname(map[pkgterm$newterm[!is.na(idx)]]) pkgterm } readVersionFromRpacks <- function(versionPath) { otermslVersion <- lapply(versionPath, function(ver) { dcf <- read.dcf(ver ) v <- package_version(dcf[, "Version"]) v0 = unclass(v) v0$Version[3] = v0$Version[3] +1 class(v0) = class(v) c(as.character(v),as.character(v0)) }) ver <- data.frame(matrix(unlist(otermslVersion), nrow=length(otermslVersion), byrow=T, dimnames=list(names(otermslVersion),c("oldVer","newVer")))) ver <- cbind(rownames(ver),ver ) names(ver)<- c("pkg","oldVer","newVer") rownames(ver) <- NULL ver } readDot <- function(fl) { dot <- readLines(fl) dot <- dot[seq(grep("BiocViews -> Software", dot), grep("BiocViews -> AnnotationData", dot) - 1)] sub(" *; *$", "", dot[grepl("^[[:space:][:alpha:]]+->", dot)]) } getPathfromPkgName<- function(pkglist) { fls[which(names(fls) %in% pkglist)] } ## 
----code-1-------------------------------------------------------------- ##get the list of packages from manifest file in Rpacks rpacks <- file.path("C:","Users","sarora.FHCRC","Documents","Rpacks") manifest <- "bioc_2.14.manifest" fls <- readPathFromManifest(rpacks, manifest) ## ----code2--------------------------------------------------------------- #this will read in all the biocViews from each package pkgterm <- readbiocViewsFromRpacks(fls) ## ----code3--------------------------------------------------------------- ## read in biocViews map map <- biocViewMap() as.data.frame(map) ## ----code-4-------------------------------------------------------------- ## revise biocViews pkgterm <- generatebiocViewsMap(pkgterm, map) ## ----Danfile------------------------------------------------------------- ## comma separated biocViews yy = lapply(split(pkgterm, pkgterm$pkg), function(elt) { elt$term <- paste(elt$term,collapse=", ") elt$newterm <- paste(na.omit(elt$newterm),collapse=", ") unique(elt) }) #represent as a data.frame yes <- do.call(rbind.data.frame,yy) ## ----code-5-------------------------------------------------------------- ## which packages had no change in their biocViews? nochange2 <- yes[which(yes$term==yes$newterm),] length(nochange2[,1]) ## ----modified------------------------------------------------------------ ## which packages had changes in their biocViews modified2 <- yes[which(yes$term!=yes$newterm),] length(modified2[,2]) ## ----version------------------------------------------------------------- #get packages whose version has to be bumped versionfls<- modified2[,1] #get the path for each of these packages versionPath <- getPathfromPkgName(versionfls) # data.frame with package name, old followed by new version number.
versiondf <- readVersionFromRpacks(versionPath)
# merge the revised biocViews with the version information
mer <- merge(modified2, versiondf, by="pkg")

## ----nobiocViews------------------------------------------------------------------
xx <- sapply(split(is.na(pkgterm$newterm), pkgterm$pkg),
             function(elt) sum(elt) == length(elt))
any(xx)
nobiocView <- xx[xx]
names(nobiocView)
pkgterm[which(pkgterm$pkg %in% names(xx[xx])), ]

## ----suggest-------------------------------------------------------------
## read in the biocViews vocabulary from the dot file
dirpath <- file.path("C:", "Users", "sarora.FHCRC", "Documents", "sandbox",
                     "project biocviews", "19feb2014")
dot <- readDot(file.path(dirpath, "biocViewsVocab.dot"))
dotterms <- unique(unlist(strsplit(dot, " *-> *")))
nobiocViewPath <- getPathfromPkgName(names(nobiocView))

getDescription <- function(package)
{
    lapply(package, function(x) read.dcf(x, "Description"))
}

## suggest biocViews by matching DESCRIPTION words against the vocabulary
sugbiocView <- lapply(nobiocViewPath, function(x) {
    words <- unique(unlist(strsplit(
        read.dcf(x, c("Description", "Title", "Package")), " ")))
    idx <- which(tolower(dotterms) %in% tolower(words))
    dotterms[idx]
})

## packages that have biocViews now!
found <- sugbiocView[sapply(sugbiocView, length) > 0]
found <- lapply(found, function(x) paste(unlist(x), collapse=", "))

# add the suggested biocViews to mer
idx <- match(names(found), mer$pkg)
mer[idx, 3] <- as.character(found)

# these still do not have biocViews!
realbad <- sugbiocView[sapply(sugbiocView, length) == 0]
# these files have no biocViews - manually add biocViews for them.
badmer <- mer[which(mer[, 1] %in% names(realbad)), ]

## ----write---------------------------------------------------------------
# write files for manual analysis
setwd(file.path("C:", "Users", "sarora.FHCRC", "Documents", "sandbox",
                "project biocviews", "19feb2014"))
write.table(nochange2[1:2], "nochangebiocViews.txt", quote=FALSE,
            sep="\t", row.names=FALSE)
write.table(badmer, "badbiocViews.txt", quote=FALSE, sep="\t", row.names=FALSE)
write.table(mer, "revisebiocViews.txt", quote=FALSE, sep="\t", row.names=FALSE)

## ----sessionInfo---------------------------------------------------------
sessionInfo()
biocViews/inst/script/recommend_biocViews_rpacks.R0000644000175000017500000001255314136047116022301 0ustar nileshnilesh
rm(list=ls())

readPathFromManifest <- function(rpacks, manifest)
{
    pkgs <- readLines(file.path(rpacks, manifest))
    pkgs <- sub("Package:[[:space:]]*([[:alnum:]\\.]+)[[:space:]]*$", "\\1",
                pkgs[grepl("Package:", pkgs)])
    fls <- sprintf(file.path(rpacks, "%s/DESCRIPTION"), pkgs)
    names(fls) <- pkgs
    fls[file.exists(fls)]
}

readDot <- function(fl)
{
    dot <- readLines(fl)
    dot <- dot[seq(grep("BiocViews -> Software", dot),
                   grep("BiocViews -> AnnotationData", dot) - 1)]
    sub(" *; *$", "", dot[grepl("^[[:space:][:alpha:]]+->", dot)])
}

findbiocViews <- function(file, dotterms, terms)
{
    ## strategy 1 - parse the words in the DESCRIPTION file to get biocViews
    dcf <- read.dcf(file, c("Description", "Title", "Package"))
    words1 <- unique(unlist(strsplit(dcf, " ")))

    ## strategy 2 - get the biocViews of packages in the Depends field.
    pkgs <- read.dcf(file, "Depends")
    pkgs <- unlist(strsplit(gsub("[0-9.()>= ]", "", pkgs), ","))
    x <- readLines(url("http://bioconductor.org/js/versions.js"))
    dv <- x[grep("develVersion", x)]
    devel_version <- strsplit(dv, '"')[[1]][2]
    repos <- c("bioc", "data/annotation", "data/experiment")
    ## NOTE: only the software ("bioc") repository is queried below
    urls <- paste0("http://bioconductor.org/packages/", devel_version,
                   "/bioc/VIEWS")
    words2 <- character()
    con <- url(urls)
    biocpkgs <- read.dcf(con, "Package")
    idx <- which(biocpkgs %in% pkgs)
    if (length(idx) != 0) {
        wrd <- read.dcf(con, "biocViews")[idx]
        wrd <- unique(unlist(strsplit(wrd, ", ")))
        words2 <- c(words2, wrd)
    }
    close(con)

    ## strategy 3 - parse the man pages
    manfls <- list.files(file.path(gsub("/DESCRIPTION", "", file), "man"),
                         full.names=TRUE, pattern="\\.Rd$")

    ## strategy 4 - parse the vignettes
    vinfls <- list.files(file.path(gsub("/DESCRIPTION", "", file), "vignettes"),
                         full.names=TRUE, pattern="\\.Rnw$")

    allfls <- c(manfls, vinfls)
    if (length(allfls) == 0) {
        all_words <- NA
    } else {
        q <- lapply(allfls, readLines)
        ## use words from *all* man pages and vignettes, not just the first file
        temp <- unlist(strsplit(unlist(q), "[[:punct:]]", perl = TRUE))
        temp <- unlist(strsplit(temp, "[[:space:]]", perl = TRUE))
        all_words <- unique(temp[temp != ""])
    }

    # combine words from all sources and map
    if (length(words2) != 0) {
        words <- c(words1, words2, all_words)
    } else {
        words <- c(words1, all_words)
    }
    words <- unique(unlist(strsplit(words, "\n")))

    ## match against biocViews. 
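
An aside, not part of the original script: the case-insensitive match used in the next statement keeps the vocabulary terms whose lowercased form appears among the lowercased words harvested from the package, as this toy example shows.

```r
dotterms_eg <- c("Clustering", "Visualization", "SNP")  # toy vocabulary
words_eg <- c("clustering", "plots", "snp")             # toy harvested words
dotterms_eg[which(tolower(dotterms_eg) %in% tolower(words_eg))]
# [1] "Clustering" "SNP"
```
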
    idx <- which(tolower(dotterms) %in% tolower(words))
    temp <- dotterms[idx]

    ## only if both "decision" and "tree" are found, add the biocView "DecisionTree"
    split_word <- mapply(FUN = function(x, y) {
        i <- which(tolower(x) %in% tolower(words))
        ifelse(length(i) == length(x), y, NA)
    }, terms, names(terms), USE.NAMES = FALSE)

    suggest_bioc <- unique(c(split_word[complete.cases(split_word)], temp))

    ## terms too generic to be worth suggesting
    commonbiocViews <- c("Infrastructure", "Software",
                         "AssayDomain", "BiologicalQuestion",
                         "ResearchField", "StatisticalMethod", "Technology",
                         "Annotation", "Visualization", "DataRepresentation",
                         "miRNA", "SNP", "qPCR", "SAGE", "Genetics")
    suggest_bioc <- setdiff(suggest_bioc, commonbiocViews)

    ## existing biocViews in the test package?
    current <- read.dcf(file, c("biocViews", "BiocViews"))
    current <- current[!is.na(current)]

    ## setdiff between current and suggested biocViews
    if (length(current) != 0) {
        current <- strsplit(current, "[[:space:]]*,[[:space:]]*")[[1]]
        new_bioc <- setdiff(suggest_bioc, current)
    } else {
        new_bioc <- NA_character_
    }

    ## some packages have terms which do not belong to the software branch
    remove <- setdiff(current, dotterms)

    ## maintainer email
    email <- read.dcf(file, "Maintainer")

    list(current = paste(current, collapse=", "),
         new = paste(new_bioc, collapse=", "),
         remove = paste(remove, collapse=", "),
         email = paste(unlist(strsplit(email, "\n")), collapse=" "))
}

library(BiocParallel)

rpacks <- file.path("~/Rpacks")
#rpacks <- file.path("home", "sarora", "Rpacks")  # on rhino
manifest <- "bioc_3.0.manifest"
biocViewdotfile <- system.file("dot", "biocViewsVocab.dot",
                               package="biocViews")

# read paths from Rpacks
fls <- readPathFromManifest(rpacks, manifest)
cat("Total no. of packages:", length(fls), "\n")

# read biocViews from the dot file.
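
An aside, not part of the original script: `readDot()` returns edge lines of the form "parent -> child"; splitting on the arrow and deduplicating yields the vocabulary terms, as these toy edges show.

```r
edges_eg <- c("Software -> Clustering", "Software -> StatisticalMethod")  # toy edges
unique(unlist(strsplit(edges_eg, " *-> *")))
# [1] "Software"          "Clustering"        "StatisticalMethod"
```
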
dot <- readDot(biocViewdotfile)
dotterms <- unique(unlist(strsplit(dot, " *-> *")))

### split "DecisionTree" into "Decision", "Tree"
terms <- sapply(dotterms, function(x) {
    m <- gregexpr(pattern = "[[:upper:]]", text = x, ignore.case = FALSE)
    s1 <- unlist(regmatches(x, m))
    s2 <- unlist(strsplit(x, "[[:upper:]]"))
    s2 <- s2[-1]
    word <- function(s1, s2) paste0(s1, s2)
    mapply(word, s1, s2, USE.NAMES = FALSE)
}, simplify = TRUE)

## suggest biocViews (only the first 20 packages, as a test run)
system.time(result <- lapply(fls[1:20], findbiocViews, dotterms, terms))
newbioc2 <- do.call(rbind.data.frame, result)
write.table(newbioc2, "suggestedbiocViews_rpacks.txt", sep="\t", quote=FALSE)
biocViews/data/0000755000175000017500000000000014136047116013252 5ustar nileshnileshbiocViews/data/biocViewsVocab.rda0000644000175000017500000001260314136047116016651 0ustar nileshnilesh
[binary RData payload omitted]
biocViews/NAMESPACE0000644000175000017500000000326114136047116013562 0ustar nileshnilesh
import(methods)

importMethodsFrom(graph, acc, adj, edges, inEdges, nodes, subGraph)

importFrom(Biobase, copySubstitute)
importFrom(RBGL, sp.between)
importFrom(tools, write_PACKAGES, package_dependencies)
importFrom(BiocManager, repositories)
importFrom(stats, complete.cases, setNames)
importFrom(utils, download.file, Stangle, available.packages,
           capture.output, contrib.url, data, file_test, head,
           packageDescription, readCitationFile, untar)
importMethodsFrom(XML, saveXML)
importFrom(XML, xmlNode, xmlOutputDOM, xmlTree, htmlParse, xpathApply,
           xmlValue)
importFrom(RCurl, getURL)
importFrom(RUnit, checkTrue)

exportClasses("Htmlized", "PackageDetail",
              "pdAuthorMaintainerInfo", "pdVignetteInfo", "pdDownloadInfo",
              "pdDetailsInfo", "pdDescriptionInfo", "pdVigsAndDownloads",
              "RepositoryDetail", "rdPackageTable",
              "BiocView", "bvTitle", "bvPackageTable", "bvSubViews",
              "bvParentViews")

exportMethods("coerce", "show", "htmlDoc", "htmlValue", "htmlFilename")

export("writeBiocViews", "getBiocViews", "write_VIEWS", "write_REPOSITORY",
       "genReposControlFiles", "extractVignettes", "extractManuals",
       "extractCitations", "getCurrentbiocViews", 
"extractNEWS", "extractHTMLDocuments", "extractTopLevelFiles", "writeRepositoryHtml", "writePackageDetailHtml", "getSubTerms", "getBiocSubViews", "validate_bioc_views", "writeTopLevelView", "writeHtmlDoc", "write_SYMBOLS", "writeRFilesFromVignettes", "getPackageNEWS", "printNEWS","recommendBiocViews", "recommendPackages", "guessPackageType")