downloader/0000755000175000017500000000000012547466555013743 5ustar sebastiansebastiandownloader/man/0000755000175000017500000000000012547346233014504 5ustar sebastiansebastiandownloader/man/download.Rd0000644000175000017500000000325012547076364016607 0ustar sebastiansebastian% Generated by roxygen2 (4.1.1): do not edit by hand % Please edit documentation in R/download.r \name{download} \alias{download} \title{Download a file, using http, https, or ftp} \usage{ download(url, ...) } \arguments{ \item{url}{The URL to download.} \item{...}{Other arguments that are passed to \code{\link{download.file}}.} } \description{ This is a wrapper for \code{\link{download.file}} and takes all the same arguments. The only difference is that, if the protocol is https, it changes some settings to make it work. How exactly the settings are changed differs among platforms. } \details{ This function should also follow http redirects on all platforms, which is something that does not happen by default when \code{curl} is used, as on Mac OS X. With Windows, it either uses the \code{"wininet"} method (for R 3.2) or uses the \code{"internal"} method after first ensuring that \code{setInternet2} is active (which tells R to use \code{internet2.dll}). On other platforms, it will try to use \code{libcurl}, then \code{wget}, then \code{curl}, and then \code{lynx} to download the file. R 3.2 will typically have the \code{libcurl} method; on earlier versions of R, Linux platforms will usually have \code{wget} installed, and Mac OS X will have \code{curl}. Note that for many (perhaps most) types of files, you will want to use \code{mode="wb"} so that the file is downloaded in binary mode. } \examples{ \dontrun{ # Download the downloader source, in binary mode download("https://github.com/wch/downloader/zipball/master", "downloader.zip", mode = "wb") } } \seealso{ \code{\link{download.file}} for more information on the arguments that can be used with this function. 
} downloader/man/source_url.Rd0000644000175000017500000000463612547346233017166 0ustar sebastiansebastian% Generated by roxygen2 (4.1.1): do not edit by hand % Please edit documentation in R/source_url.r \name{source_url} \alias{source_url} \title{Download an R file from a URL and source it} \usage{ source_url(url, sha = NULL, ..., prompt = TRUE, quiet = FALSE) } \arguments{ \item{url}{The URL to download.} \item{sha}{A SHA-1 hash of the file at the URL.} \item{...}{Other arguments that are passed to \code{\link{source}()}.} \item{prompt}{Prompt the user if no value for \code{sha} is provided.} \item{quiet}{If \code{FALSE} (the default), print out status messages about checking SHA.} } \description{ This will download a file and source it. Because it uses the \code{\link{download}()} function, it can handle https URLs. } \details{ By default, \code{source_url()} checks the SHA-1 hash of the file. If it differs from the expected value, it will throw an error. The default expectation is that a hash is provided; if not, \code{source_url()} will prompt the user, asking if they are sure they want to continue, unless \code{prompt=FALSE} is used. In other words, if you use \code{prompt=FALSE}, it will run the remote code without checking the hash, and without asking the user. The purpose of checking the hash is to ensure that the file has not changed. If a \code{source_url} command with a hash is posted in a public forum, then others who source the URL (with the hash) are guaranteed to run the same code every time. This means that the author doesn't need to worry about the security of the server hosting the file. It also means that the users don't have to worry about the file being replaced with a damaged or maliciously-modified version. To find the hash of a local file, use \code{\link{digest}()}. For a simple way to find the hash of a remote file, use \code{\link{sha_url}()}. 
} \examples{ \dontrun{ # Source a sample file # This is a very long URL; break it up so it can be seen more easily in the # examples. test_url <- paste0("https://gist.github.com/wch/dae7c106ee99fe1fdfe7", "/raw/db0c9bfe0de85d15c60b0b9bf22403c0f5e1fb15/test.r") downloader::source_url(test_url, sha = "9b8ff5213e32a871d6cb95cce0bed35c53307f61") # Find the hash of a file downloader::sha_url(test_url) } } \seealso{ \code{\link{source}()} for more information on the arguments that can be used with this function. The \code{\link{sha_url}()} function can be used to find the SHA-1 hash of a remote file. } downloader/man/sha_url.Rd0000644000175000017500000000244212547346233016432 0ustar sebastiansebastian% Generated by roxygen2 (4.1.1): do not edit by hand % Please edit documentation in R/sha_url.r \name{sha_url} \alias{sha_url} \title{Download a file from a URL and find a SHA-1 hash of it} \usage{ sha_url(url, cmd = TRUE) } \arguments{ \item{url}{The URL of the file to find a hash of.} \item{cmd}{If \code{TRUE} (the default), print out a command for sourcing the URL with \code{\link{source_url}()}, including the hash.} } \description{ This will download a file and find a SHA-1 hash of it, using \code{\link{digest}()}. The primary purpose of this function is to provide an easy way to find the value of \code{sha} which can be passed to \code{\link{source_url}()}. } \examples{ \dontrun{ # Get the SHA hash of a file. It will print the text below and return # the hash as a string. This is a very long URL; break it up so it can be # seen more easily in the examples. 
test_url <- paste0("https://gist.github.com/wch/dae7c106ee99fe1fdfe7", "/raw/db0c9bfe0de85d15c60b0b9bf22403c0f5e1fb15/test.r") sha_url(test_url) # Command for sourcing the URL: # downloader::source_url("https://gist.github.com/wch/dae7c106ee99fe1fdfe7 # /raw/db0c9bfe0de85d15c60b0b9bf22403c0f5e1fb15/test.r", # sha="9b8ff5213e32a871d6cb95cce0bed35c53307f61") # [1] "9b8ff5213e32a871d6cb95cce0bed35c53307f61" } } downloader/man/downloader.Rd0000644000175000017500000000124512547076364017140 0ustar sebastiansebastian% Generated by roxygen2 (4.1.1): do not edit by hand % Please edit documentation in R/downloader-package.r \docType{package} \name{downloader} \alias{downloader} \alias{downloader-package} \title{downloader: a package for making it easier to download files over https} \description{ This package provides a wrapper for the download.file function, making it possible to download files over https on Windows, Mac OS X, and other Unix-like platforms. The RCurl package provides this functionality (and much more) but can be difficult to install because it must be compiled with external dependencies. This package has no external dependencies, so it is much easier to install. 
} downloader/inst/0000755000175000017500000000000012547076364014713 5ustar sebastiansebastiandownloader/inst/tests/0000755000175000017500000000000012547076364016055 5ustar sebastiansebastiandownloader/inst/tests/test-sha.r0000644000175000017500000000414512547076364017774 0ustar sebastiansebastiancontext("sha") test_that('sha_url', { # Create a temp file with simple R code temp_file <- tempfile() str <- 'a <<- a + 1' # Write str to a file writeLines(str, sep = '', con = temp_file) url <- paste('file://', temp_file, sep = '') # Compare result from sha_url() to result directly from digest() expect_equal(sha_url(url), digest(str, algo = 'sha1', serialize = FALSE)) }) test_that('Check SHA hash with source_url', { # Create a temp file with simple R code temp_file <- tempfile() writeLines('a <<- a + 1', con = temp_file) url <- paste('file://', temp_file, sep = '') # Calculate the correct and incorrect SHA right_sha <- sha_url(url) wrong_sha <- '0000000000000000000000000000000000000000' # Counter - should be incremented by the code in the URL, which is a <<- a + 1 .GlobalEnv$a <- 0 # There are a total of 2x3x2=12 conditions, but we don't need to test them all # prompt=TRUE, right SHA, quiet=FALSE: print message expect_message(source_url(url, sha = right_sha), 'matches expected') expect_equal(a, 1) # prompt=TRUE, wrong SHA, quiet=FALSE: error expect_error(source_url(url, sha = wrong_sha)) expect_equal(a, 1) # prompt=TRUE, no SHA, quiet=FALSE: should prompt and respond to y/n # (no way to automatically test this) #source_url(url) # prompt=FALSE, no SHA, quiet=FALSE: do it, with message about not checking expect_message(source_url(url, prompt = FALSE), 'Not checking') expect_equal(a, 2) # prompt=FALSE, right SHA, quiet=FALSE: should just do it, with message about match expect_message(source_url(url, sha = right_sha, prompt = FALSE), 'matches expected') expect_equal(a, 3) # prompt=FALSE, wrong SHA, quiet=FALSE: should error expect_error(source_url(url, sha = wrong_sha, 
prompt = FALSE)) expect_equal(a, 3) # prompt=FALSE, no SHA, quiet=TRUE: should just do it, with no message about not checking source_url(url, prompt = FALSE, quiet = TRUE) expect_equal(a, 4) # prompt=FALSE, right SHA, quiet=TRUE: should just do it, with no message source_url(url, sha = right_sha, prompt = FALSE, quiet = TRUE) expect_equal(a, 5) }) downloader/inst/tests/test-download.r0000644000175000017500000000265112547076364021030 0ustar sebastiansebastiancontext("download") # Download from a url, and return the contents of the file as a string download_result <- function(url) { tfile <- tempfile() download(url, tfile, mode = "wb") # Read the file tfile_fd <- file(tfile, "r") dl_text <- readLines(tfile_fd, warn = FALSE) dl_text <- paste(dl_text, collapse = "\n") close(tfile_fd) unlink(tfile) dl_text } # CRAN has intermittent problems with these tests, since they rely on a # particular website being accessible. This makes it run with devtools::test() # but not on CRAN if (Sys.getenv('NOT_CRAN') == "true") { test_that("downloading http and https works properly", { # Download http from httpbin.org result <- download_result("http://httpbin.org/ip") # Check that it has the string "origin" in the text expect_true(grepl("origin", result)) # Download https from httpbin.org result <- download_result("https://httpbin.org/ip") # Check that it has the string "origin" in the text expect_true(grepl("origin", result)) }) test_that("follows redirects", { # Download http redirect from httpbin.org result <- download_result("http://httpbin.org/redirect/3") # Check that it has the string "origin" in the text expect_true(grepl("origin", result)) # Download https redirect from httpbin.org result <- download_result("https://httpbin.org/redirect/3") # Check that it has the string "origin" in the text expect_true(grepl("origin", result)) }) } downloader/NAMESPACE0000644000175000017500000000024412547076364015155 0ustar sebastiansebastian# Generated by roxygen2 (4.1.1): do not edit by hand 
export(download) export(sha_url) export(source_url) importFrom(digest,digest) importFrom(utils,download.file) downloader/MD50000644000175000017500000000134712547466555014260 0ustar sebastiansebastian4dfdf8b343119e2e138980b3297c7673 *DESCRIPTION 3628fb18c1e260b2ab16f58df07da7a1 *NAMESPACE dad3d7600527fe5e0abfaf2655f741a1 *NEWS 277477bb3ee86166b64697932f895018 *R/download.r de21e78821c43a2c90f2cfb9fec2dc6d *R/downloader-package.r 91dd3acb776aa923270c99e737ed5581 *R/sha_url.r eb7e970f16ac9134e791ea619d39f356 *R/source_url.r 6faf10fe18d73cd9b260a08156583f02 *README.md 445bb9feecafd409496db71611f9a17c *inst/tests/test-download.r 9ae6c32f1be586c0e9444c3daa6e1059 *inst/tests/test-sha.r 0634e47e5cdae1aa4395f34eb40f340e *man/download.Rd d519e8ec62892b654d83487b15eefd41 *man/downloader.Rd 69b351b86555ff0cd47ff8139ea03df2 *man/sha_url.Rd f9ee45459506c8d46bb0e13102777a9d *man/source_url.Rd 760b5bb5a5b44be51e89cde3827cb738 *tests/test-all.R downloader/DESCRIPTION0000644000175000017500000000151612547466555015454 0ustar sebastiansebastianPackage: downloader Maintainer: Winston Chang Author: Winston Chang Version: 0.4 License: GPL-2 Title: Download Files over HTTP and HTTPS Description: Provides a wrapper for the download.file function, making it possible to download files over HTTPS on Windows, Mac OS X, and other Unix-like platforms. The 'RCurl' package provides this functionality (and much more) but can be difficult to install because it must be compiled with external dependencies. This package has no external dependencies, so it is much easier to install. 
URL: https://github.com/wch/downloader Imports: utils, digest Suggests: testthat BugReports: https://github.com/wch/downloader/issues NeedsCompilation: no Packaged: 2015-07-09 01:22:22 UTC; winston Repository: CRAN Date/Publication: 2015-07-09 14:47:41 downloader/README.md0000644000175000017500000000130312547076364015212 0ustar sebastiansebastiandownloader ========== This package provides a wrapper for the download.file function, making it possible to download files over https on Windows, Mac OS X, and other Unix-like platforms. The RCurl package provides this functionality (and much more) but can be difficult to install because it must be compiled with external dependencies. This package has no external dependencies, so it is much easier to install. Example usage ============= This will download the source code for the downloader package: ```R # First install downloader from CRAN install.packages("downloader") library(downloader) download("https://github.com/wch/downloader/zipball/master", "downloader.zip", mode = "wb") ``` downloader/R/0000755000175000017500000000000012547346233014132 5ustar sebastiansebastiandownloader/R/sha_url.r0000644000175000017500000000300112547346233015744 0ustar sebastiansebastian#' Download a file from a URL and find a SHA-1 hash of it #' #' This will download a file and find a SHA-1 hash of it, using #' \code{\link{digest}()}. The primary purpose of this function is to provide #' an easy way to find the value of \code{sha} which can be passed to #' \code{\link{source_url}()}. #' #' @param url The URL of the file to find a hash of. #' @param cmd If \code{TRUE} (the default), print out a command for sourcing the #' URL with \code{\link{source_url}()}, including the hash. #' #' @export #' @examples #' \dontrun{ #' # Get the SHA hash of a file. It will print the text below and return #' # the hash as a string. This is a very long URL; break it up so it can be #' # seen more easily in the examples. 
#' test_url <- paste0("https://gist.github.com/wch/dae7c106ee99fe1fdfe7", #' "/raw/db0c9bfe0de85d15c60b0b9bf22403c0f5e1fb15/test.r") #' sha_url(test_url) #' # Command for sourcing the URL: #' # downloader::source_url("https://gist.github.com/wch/dae7c106ee99fe1fdfe7 #' # /raw/db0c9bfe0de85d15c60b0b9bf22403c0f5e1fb15/test.r", #' # sha="9b8ff5213e32a871d6cb95cce0bed35c53307f61") #' # [1] "9b8ff5213e32a871d6cb95cce0bed35c53307f61" #' } #' #' #' @importFrom digest digest sha_url <- function(url, cmd = TRUE) { temp_file <- tempfile() download(url, temp_file) on.exit(unlink(temp_file)) sha <- digest(file = temp_file, algo = 'sha1') if (cmd) { message('Command for sourcing the URL:\n', ' downloader::source_url("', url, '", sha="', sha, '")') } sha } downloader/R/download.r0000644000175000017500000000766112547076364016137 0ustar sebastiansebastian#' Download a file, using http, https, or ftp #' #' This is a wrapper for \code{\link{download.file}} and takes all the same #' arguments. The only difference is that, if the protocol is https, it changes #' some settings to make it work. How exactly the settings are changed differs #' among platforms. #' #' This function should also follow http redirects on all platforms, which is #' something that does not happen by default when \code{curl} is used, as on Mac #' OS X. #' #' With Windows, it either uses the \code{"wininet"} method (for R 3.2) or uses #' the \code{"internal"} method after first ensuring that \code{setInternet2} #' is active (which tells R to use \code{internet2.dll}). #' #' On other platforms, it will try to use \code{libcurl}, then \code{wget}, then #' \code{curl}, and then \code{lynx} to download the file. R 3.2 will typically #' have the \code{libcurl} method; on earlier versions of R, Linux platforms will #' usually have \code{wget} installed, and Mac OS X will have \code{curl}. 
#' #' Note that for many (perhaps most) types of files, you will want to use #' \code{mode="wb"} so that the file is downloaded in binary mode. #' #' @param url The URL to download. #' @param ... Other arguments that are passed to \code{\link{download.file}}. #' #' @seealso \code{\link{download.file}} for more information on the arguments #' that can be used with this function. #' #' @export #' @examples #' \dontrun{ #' # Download the downloader source, in binary mode #' download("https://github.com/wch/downloader/zipball/master", #' "downloader.zip", mode = "wb") #' } #' #' @importFrom utils download.file download <- function(url, ...) { # First, check protocol. If http or https, check platform: if (grepl('^https?://', url)) { # Check whether we are running R 3.2 isR32 <- getRversion() >= "3.2" # Windows if (.Platform$OS.type == "windows") { if (isR32) { method <- "wininet" } else { # If we directly use setInternet2, R CMD CHECK gives a Note on Mac/Linux seti2 <- `::`(utils, 'setInternet2') # Check whether we are already using internet2 for internal internet2_start <- seti2(NA) # If not then temporarily set it if (!internet2_start) { # Store initial settings, and restore on exit on.exit(suppressWarnings(seti2(internet2_start))) # Needed for https. Will get warning if setInternet2(FALSE) already run # and internet routines are used. But the warnings don't seem to matter. suppressWarnings(seti2(TRUE)) } method <- "internal" } # download.file will complain about file size with something like: # Warning message: # In download.file(url, ...) : downloaded length 19457 != reported length 200 # because apparently it compares the length with the status code returned (?) # so we suppress that suppressWarnings(download.file(url, method = method, ...)) } else { # If non-Windows, check for libcurl/wget/curl/lynx, then call download.file with # appropriate method. 
if (isR32 && capabilities("libcurl")) { method <- "libcurl" } else if (nzchar(Sys.which("wget")[1])) { method <- "wget" } else if (nzchar(Sys.which("curl")[1])) { method <- "curl" # curl needs to add a -L option to follow redirects. # Save the original options and restore when we exit. orig_extra_options <- getOption("download.file.extra") on.exit(options(download.file.extra = orig_extra_options)) options(download.file.extra = paste("-L", orig_extra_options)) } else if (nzchar(Sys.which("lynx")[1])) { method <- "lynx" } else { stop("no download method found") } download.file(url, method = method, ...) } } else { download.file(url, ...) } } downloader/R/downloader-package.r0000644000175000017500000000103012547076364020043 0ustar sebastiansebastian#' downloader: a package for making it easier to download files over https #' #' This package provides a wrapper for the download.file function, #' making it possible to download files over https on Windows, Mac OS X, and #' other Unix-like platforms. The RCurl package provides this functionality #' (and much more) but can be difficult to install because it must be compiled #' with external dependencies. This package has no external dependencies, so #' it is much easier to install. #' #' @name downloader #' @docType package NULL downloader/R/source_url.r0000644000175000017500000000663412547346233016510 0ustar sebastiansebastian#' Download an R file from a URL and source it #' #' This will download a file and source it. Because it uses the #' \code{\link{download}()} function, it can handle https URLs. #' #' By default, \code{source_url()} checks the SHA-1 hash of the file. If it #' differs from the expected value, it will throw an error. The default #' expectation is that a hash is provided; if not, \code{source_url()} will #' prompt the user, asking if they are sure they want to continue, unless #' \code{prompt=FALSE} is used. 
In other words, if you use \code{prompt=FALSE}, #' it will run the remote code without checking the hash, and without asking #' the user. #' #' The purpose of checking the hash is to ensure that the file has not changed. #' If a \code{source_url} command with a hash is posted in a public forum, then #' others who source the URL (with the hash) are guaranteed to run the same #' code every time. This means that the author doesn't need to worry about the #' security of the server hosting the file. It also means that the users don't #' have to worry about the file being replaced with a damaged or #' maliciously-modified version. #' #' To find the hash of a local file, use \code{\link{digest}()}. For a simple #' way to find the hash of a remote file, use \code{\link{sha_url}()}. #' #' @param url The URL to download. #' @param sha A SHA-1 hash of the file at the URL. #' @param prompt Prompt the user if no value for \code{sha} is provided. #' @param quiet If \code{FALSE} (the default), print out status messages about #' checking SHA. #' @param ... Other arguments that are passed to \code{\link{source}()}. #' #' @seealso \code{\link{source}()} for more information on the arguments #' that can be used with this function. The \code{\link{sha_url}()} function #' can be used to find the SHA-1 hash of a remote file. #' #' @export #' @examples #' \dontrun{ #' # Source a sample file #' #' # This is a very long URL; break it up so it can be seen more easily in the #' # examples. 
#' test_url <- paste0("https://gist.github.com/wch/dae7c106ee99fe1fdfe7", #' "/raw/db0c9bfe0de85d15c60b0b9bf22403c0f5e1fb15/test.r") #' downloader::source_url(test_url, #' sha = "9b8ff5213e32a871d6cb95cce0bed35c53307f61") #' #' # Find the hash of a file #' downloader::sha_url(test_url) #' } #' #' #' @importFrom digest digest source_url <- function(url, sha = NULL, ..., prompt = TRUE, quiet = FALSE) { if (prompt && (is.null(sha) || sha == '')) { resp <- readline(prompt = paste(sep = '', ' No SHA-1 hash specified for the file. The hash is needed to ensure that\n', ' the file at the URL has not changed. See ?source_url for information on\n', ' why this is useful. Are you sure you want to continue? [y/n] ')) sha <- NULL # Set to NULL for simpler check later on if (tolower(resp) != "y") { message("Quitting") return(invisible()) } } temp_file <- tempfile() download(url, temp_file) on.exit(unlink(temp_file)) if (!is.null(sha)) { url_sha <- digest(file = temp_file, algo = 'sha1') if (url_sha == sha) { if (!quiet) { message('Hash ', url_sha, ' matches expected value.') } } else { stop('Hash ', url_sha, ' does not match expected value!') } } else { if (!quiet) { message('Not checking SHA-1 of downloaded file.') } } source(temp_file, ...) } downloader/tests/0000755000175000017500000000000012547076364015100 5ustar sebastiansebastiandownloader/tests/test-all.R0000644000175000017500000000010212547076364016741 0ustar sebastiansebastianlibrary(testthat) library(downloader) test_package("downloader") downloader/NEWS0000644000175000017500000000215612547076364014441 0ustar sebastiansebastianVersion 0.4 -------------------------------------------------------------------------------- * Use new R 3.2 download methods ("wininet" and "libcurl") when available. Version 0.3 -------------------------------------------------------------------------------- * `source_url()` function now checks the SHA-1 hash of the downloaded file. 
* Add `sha_url()` function, for finding the SHA-1 hash of a remote file. Version 0.2.2 -------------------------------------------------------------------------------- * Disable all network tests when running on CRAN, because the connection to the remote test website may not be reliable. Version 0.2.1 -------------------------------------------------------------------------------- * Change https redirection test to not run on CRAN because their Windows build machine has more stringent security settings. Version 0.2 -------------------------------------------------------------------------------- * Switched to using `Sys.which` to find external programs. * Added tests. * When using curl, follow redirects with http. (It already worked with https.) * Add `source_url` function.
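The pieces above fit together into a short workflow: hash a file once with `sha_url()`, then pin that hash in every subsequent `source_url()` call. Below is a minimal sketch using a local `file://` URL, the same trick the package's own tests (`inst/tests/test-sha.r`) use to avoid the network; the script contents and message text are made up for illustration.

```R
library(downloader)

# Write a tiny script to a temp file and point a file:// URL at it
temp_file <- tempfile(fileext = ".r")
writeLines('message("hello from sourced code")', con = temp_file)
url <- paste0("file://", temp_file)

# sha_url() downloads the file and returns its SHA-1 hash; with the default
# cmd = TRUE it also prints a ready-made source_url() command to copy-paste.
sha <- sha_url(url)

# source_url() downloads the file again, verifies the hash, and sources it.
# prompt = FALSE is safe here because a sha is supplied.
source_url(url, sha = sha, prompt = FALSE)

unlink(temp_file)
```

The printed `source_url(..., sha=...)` command is what you would post publicly: anyone running it is guaranteed to execute exactly the code that was hashed, or get an error.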