downlit/ 0000755 0001762 0000144 00000000000 14447404110 011731 5 ustar ligges users downlit/NAMESPACE 0000644 0001762 0000144 00000001464 14137022063 013154 0 ustar ligges users # Generated by roxygen2: do not edit by hand
S3method(is_low_change,default)
S3method(replay_html,"NULL")
S3method(replay_html,character)
S3method(replay_html,error)
S3method(replay_html,html)
S3method(replay_html,htmlwidget)
S3method(replay_html,list)
S3method(replay_html,message)
S3method(replay_html,recordedplot)
S3method(replay_html,shiny.tag)
S3method(replay_html,shiny.tag.function)
S3method(replay_html,shiny.tag.list)
S3method(replay_html,source)
S3method(replay_html,warning)
export(autolink)
export(autolink_url)
export(classes_chroma)
export(classes_pandoc)
export(downlit_html_node)
export(downlit_html_path)
export(downlit_md_path)
export(downlit_md_string)
export(evaluate_and_highlight)
export(highlight)
export(href_article)
export(href_package)
export(href_topic)
export(is_low_change)
import(rlang)
downlit/LICENSE 0000644 0001762 0000144 00000000045 13731234742 012744 0 ustar ligges users YEAR: 2020
COPYRIGHT HOLDER: RStudio
downlit/README.md 0000644 0001762 0000144 00000007376 14250475251 013233 0 ustar ligges users # downlit
[![R-CMD-check](https://github.com/r-lib/downlit/actions/workflows/R-CMD-check.yaml/badge.svg)](https://github.com/r-lib/downlit/actions/workflows/R-CMD-check.yaml)
[![Codecov test coverage](https://codecov.io/gh/r-lib/downlit/branch/main/graph/badge.svg)](https://app.codecov.io/gh/r-lib/downlit?branch=main)
[![CRAN status](https://www.r-pkg.org/badges/version/downlit)](https://CRAN.R-project.org/package=downlit)
The goal of downlit is to provide syntax highlighting and automatic linking of R code in a way that is easily used from RMarkdown packages like [pkgdown](https://pkgdown.r-lib.org/), [bookdown](https://bookdown.org), and [hugodown](https://hugodown.r-lib.org/).
## Installation
Install downlit from CRAN with:
```r
install.packages("downlit")
```
## Features
downlit has two slightly different highlighting/linking engines:
* `highlight()` works with multiline code blocks and does syntax highlighting,
function linking, and comment styling.
* `autolink()` works with inline code and only does linking.
Multiline code blocks have:
* Code syntax highlighted using R's parser.
* Function calls automatically linked to their corresponding documentation.
* Comments styled by transforming ANSI escape sequences to their HTML
  equivalents (thanks to the [fansi](https://github.com/brodieG/fansi) package).
The following forms of inline code are recognized and automatically linked:
* `fun()`, `pkg::fun()`.
* `?fun`, `pkg::fun`, `type?topic`.
* `help("fun")`, `help("fun", package = "package")`, `help(package = "package")`.
* `vignette("name")`, `vignette("name", package = "package")`.
* `library(package)`, `require(package)`, `requireNamespace("package")`.
* `{package}` gets linked (if possible) _and formatted as plain text_.
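
For example (a quick sketch; the exact output depends on which packages are
installed and whether their pkgdown sites are reachable):

```r
library(downlit)

# Inline code: returns an <a> tag, or NA if the text isn't linkable
autolink("stats::median()")

# Code blocks: returns HTML spans, with function calls linked
cat(highlight("library(MASS)\naddterm()"))
```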
### Cross-package links
If downlit can find a pkgdown site for the remote package, it will link to it;
otherwise it will link to the package's documentation on [rdrr.io](https://rdrr.io).
\item{code}{If \code{TRUE}, wrap output in a \verb{<code>} block}
}
\value{
A string containing syntax highlighted HTML or \code{NA} (if \code{text}
isn't parseable).
}
\description{
This function:
\itemize{
\item syntax highlights code
\item links function calls to their documentation (where possible)
\item in comments, translates ANSI escapes into HTML equivalents.
}
}
\section{Options}{
downlit provides a number of options to control the details of the linking.
They are particularly important if you want to generate "local" links.
\itemize{
\item \code{downlit.package}: name of the current package. Determines when
\code{topic_index} and \code{article_index} are used.
\item \code{downlit.topic_index} and \code{downlit.article_index}: named character
vector that maps from topic/article name to path.
\item \code{downlit.rdname}: name of current Rd file being documented (if any);
used to avoid self-links.
\item \code{downlit.attached}: character vector of currently attached R packages.
\item \code{downlit.local_packages}: named character vector providing relative
paths (value) to packages (name) that can be reached with relative links
from the target HTML document.
\item \code{downlit.topic_path} and \code{downlit.article_path}: paths to reference
topics and articles/vignettes relative to the "current" file.
}
}
\examples{
cat(highlight("1 + 1"))
cat(highlight("base::t(1:3)"))
# Unparseable R code returns NA
cat(highlight("base::t("))
}
downlit/man/downlit_html_path.Rd 0000644 0001762 0000144 00000003101 13750320240 016503 0 ustar ligges users % Generated by roxygen2: do not edit by hand
% Please edit documentation in R/downlit-html.R
\name{downlit_html_path}
\alias{downlit_html_path}
\alias{downlit_html_node}
\title{Syntax highlight and link an HTML page}
\usage{
downlit_html_path(in_path, out_path, classes = classes_pandoc())
downlit_html_node(x, classes = classes_pandoc())
}
\arguments{
\item{in_path, out_path}{Input and output paths for HTML file}
\item{classes}{A mapping between token names and CSS class names.
Bundled \code{classes_pandoc()} and \code{classes_chroma()} provide mappings
that (roughly) match Pandoc and chroma (used by hugo) classes so you
can use existing themes.}
\item{x}{An \code{xml2::xml_node}}
}
\value{
\code{downlit_html_path()} invisibly returns \code{out_path};
\code{downlit_html_node()} modifies \code{x} in place and returns nothing.
}
\description{
\itemize{
\item Code blocks, identified by \verb{<pre>} tags with class \verb{sourceCode r}
or any \verb{<pre>} tag inside of \verb{<div class="downlit">}, are
processed with \code{\link[=highlight]{highlight()}}.
\item Inline code, identified by \verb{<code>} tags that contain only text
(and don't have a header tag (e.g. \verb{<h1>}) or \verb{<a>} as an ancestor)
are processed with \code{\link[=autolink]{autolink()}}.
}
Use \code{downlit_html_path()} to process an \code{.html} file on disk;
use \code{downlit_html_node()} to process an in-memory \code{xml_node} as part of a
larger pipeline.
}
\examples{
node <- xml2::read_xml("<p><code>base::t()</code></p>")
node
# node is modified in place
downlit_html_node(node)
node
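# A sketch (not from the original examples): <pre class="sourceCode r"> blocks
# in a full page are highlighted and linked too
page <- xml2::read_xml(
  "<body><pre class='sourceCode r'><code>library(downlit)</code></pre></body>"
)
downlit_html_node(page)
page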
}
downlit/man/autolink.Rd 0000644 0001762 0000144 00000003052 14122441316 014620 0 ustar ligges users % Generated by roxygen2: do not edit by hand
% Please edit documentation in R/link.R
\name{autolink}
\alias{autolink}
\alias{autolink_url}
\title{Automatically link inline code}
\usage{
autolink(text)
autolink_url(text)
}
\arguments{
\item{text}{String of code to highlight and link.}
}
\value{
If \code{text} is linkable, an HTML link for \code{autolink()}, or just
the URL for \code{autolink_url()}. Both return \code{NA} if the text is not
linkable.
}
\description{
Automatically link inline code
}
\section{Options}{
downlit provides a number of options to control the details of the linking.
They are particularly important if you want to generate "local" links.
\itemize{
\item \code{downlit.package}: name of the current package. Determines when
\code{topic_index} and \code{article_index} are used.
\item \code{downlit.topic_index} and \code{downlit.article_index}: named character
vector that maps from topic/article name to path.
\item \code{downlit.rdname}: name of current Rd file being documented (if any);
used to avoid self-links.
\item \code{downlit.attached}: character vector of currently attached R packages.
\item \code{downlit.local_packages}: named character vector providing relative
paths (value) to packages (name) that can be reached with relative links
from the target HTML document.
\item \code{downlit.topic_path} and \code{downlit.article_path}: paths to reference
topics and articles/vignettes relative to the "current" file.
}
}
\examples{
autolink("stats::median()")
autolink("vignette('grid', package = 'grid')")
autolink_url("stats::median()")
}
downlit/man/is_low_change.Rd 0000644 0001762 0000144 00000000565 14042044644 015605 0 ustar ligges users % Generated by roxygen2: do not edit by hand
% Please edit documentation in R/evaluate.R
\name{is_low_change}
\alias{is_low_change}
\title{Compare two recorded plots}
\usage{
is_low_change(p1, p2)
}
\arguments{
\item{p1, p2}{Plot results}
}
\value{
Logical value indicating whether \code{p2} is a low-level update of \code{p1}.
}
\description{
Compare two recorded plots
}
downlit/man/downlit-package.Rd 0000644 0001762 0000144 00000001603 14447366212 016056 0 ustar ligges users % Generated by roxygen2: do not edit by hand
% Please edit documentation in R/downlit-package.R
\docType{package}
\name{downlit-package}
\alias{downlit}
\alias{downlit-package}
\title{downlit: Syntax Highlighting and Automatic Linking}
\description{
Syntax highlighting of R code, specifically designed for the needs of 'RMarkdown' packages like 'pkgdown', 'hugodown', and 'bookdown'. It includes linking of function calls to their documentation on the web, and automatic translation of ANSI escapes in output to the equivalent HTML.
}
\seealso{
Useful links:
\itemize{
\item \url{https://downlit.r-lib.org/}
\item \url{https://github.com/r-lib/downlit}
\item Report bugs at \url{https://github.com/r-lib/downlit/issues}
}
}
\author{
\strong{Maintainer}: Hadley Wickham \email{hadley@posit.co}
Other contributors:
\itemize{
\item RStudio [copyright holder, funder]
}
}
\keyword{internal}
downlit/man/downlit_md_path.Rd 0000644 0001762 0000144 00000002457 14042074111 016152 0 ustar ligges users % Generated by roxygen2: do not edit by hand
% Please edit documentation in R/downlit-md.R
\name{downlit_md_path}
\alias{downlit_md_path}
\alias{downlit_md_string}
\title{Syntax highlight and link a md document}
\usage{
downlit_md_path(in_path, out_path, format = NULL)
downlit_md_string(x, format = NULL)
}
\arguments{
\item{in_path, out_path}{Input and output paths for markdown file.}
\item{format}{Pandoc format; defaults to "gfm" if you have pandoc 2.0.0 or
greater, otherwise "markdown_github".}
\item{x}{A string containing markdown.}
}
\value{
\code{downlit_md_path()} invisibly returns \code{out_path};
\code{downlit_md_string()} returns a string containing markdown.
}
\description{
\verb{downlit_md_*} works by traversing the markdown AST generated by Pandoc.
It applies \code{\link[=highlight]{highlight()}} to \code{CodeBlock}s and \code{\link[=autolink]{autolink()}} to inline \code{Code}.
Use \code{downlit_md_path()} to transform a file on disk; use
\code{downlit_md_string()} to transform a string containing markdown as part
of a larger pipeline.
Needs pandoc 1.19 or later.
}
\examples{
if (rmarkdown::pandoc_available("1.19")) {
downlit_md_string("`base::t()`")
downlit_md_string("`base::t`")
downlit_md_string("* `base::t`")
# But don't highlight in headings
downlit_md_string("## `base::t`")
}
}
downlit/man/href_topic.Rd 0000644 0001762 0000144 00000001447 14255166156 015137 0 ustar ligges users % Generated by roxygen2: do not edit by hand
% Please edit documentation in R/link.R
\name{href_topic}
\alias{href_topic}
\alias{href_article}
\alias{href_package}
\title{Generate url for topic/article/package}
\usage{
href_topic(topic, package = NULL, is_fun = FALSE)
href_article(article, package = NULL)
href_package(package)
}
\arguments{
\item{topic, article}{Topic/article name}
\item{package}{Optional package name. If not supplied, will search
in all attached packages.}
\item{is_fun}{Only return topics that are (probably) for functions.}
}
\value{
URL to the topic, article, or package; \code{NA} if one can't be found.
}
\description{
Generate url for topic/article/package
}
\examples{
href_topic("t")
href_topic("DOESN'T EXIST")
href_topic("href_topic", "downlit")
href_package("downlit")
}
\keyword{internal}
downlit/man/evaluate_and_highlight.Rd 0000644 0001762 0000144 00000004412 14250475251 017461 0 ustar ligges users % Generated by roxygen2: do not edit by hand
% Please edit documentation in R/evaluate.R
\name{evaluate_and_highlight}
\alias{evaluate_and_highlight}
\title{Evaluate code and syntax highlight the results}
\usage{
evaluate_and_highlight(
code,
fig_save,
classes = downlit::classes_pandoc(),
env = NULL,
output_handler = evaluate::new_output_handler(),
highlight = TRUE
)
}
\arguments{
\item{code}{Code to evaluate (as a string).}
\item{fig_save}{A function with arguments \code{plot} and \code{id} that is
responsible for saving \code{plot} to a file (using \code{id} to disambiguate
multiple plots in the same chunk). It should return a list with
components \code{path}, \code{width}, and \code{height}.}
\item{classes}{A mapping between token names and CSS class names.
Bundled \code{classes_pandoc()} and \code{classes_chroma()} provide mappings
that (roughly) match Pandoc and chroma (used by hugo) classes so you
can use existing themes.}
\item{env}{Environment in which to evaluate code; if not supplied,
defaults to a child of the global environment.}
\item{output_handler}{Custom output handler for \code{\link[evaluate:evaluate]{evaluate::evaluate()}}.}
\item{highlight}{Optionally suppress highlighting. This is useful for tests.}
}
\value{
A string containing HTML with a \code{dependencies} attribute
giving any additional htmltools dependencies required to render the HTML.
}
\description{
This function runs \code{code} and captures the output using
\code{\link[evaluate:evaluate]{evaluate::evaluate()}}. It syntax highlights code with \code{\link[=highlight]{highlight()}}, and
intermingles it with output.
}
\examples{
cat(evaluate_and_highlight("1 + 2"))
cat(evaluate_and_highlight("x <- 1:10\nmean(x)"))
# -----------------------------------------------------------------
# evaluate_and_highlight() powers pkgdown's documentation formatting so
# here I include a few examples to make sure everything looks good
# -----------------------------------------------------------------
blue <- function(x) paste0("\033[34m", x, "\033[39m")
f <- function(x) {
cat("This is some output. My favourite colour is ", blue("blue"), ".\n", sep = "")
message("This is a message. My favourite fruit is ", blue("blueberries"))
warning("Now at stage ", blue("blue"), "!")
}
f()
plot(1:10)
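# A sketch of a fig_save callback (the naming scheme is hypothetical):
# it must write `plot` to a file and return its path and dimensions.
fig_save <- function(plot, id) {
  path <- paste0(id, ".png")
  grDevices::png(path, width = 480, height = 360)
  print(plot)  # recorded plots replay when printed
  grDevices::dev.off()
  list(path = path, width = 480, height = 360)
}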
}
downlit/DESCRIPTION 0000644 0001762 0000144 00000002334 14447404110 013441 0 ustar ligges users Package: downlit
Title: Syntax Highlighting and Automatic Linking
Version: 0.4.3
Authors@R: c(
person("Hadley", "Wickham", , "hadley@posit.co", role = c("aut", "cre")),
person("RStudio", role = c("cph", "fnd"))
)
Description: Syntax highlighting of R code, specifically designed for the
needs of 'RMarkdown' packages like 'pkgdown', 'hugodown', and
'bookdown'. It includes linking of function calls to their
documentation on the web, and automatic translation of ANSI escapes in
output to the equivalent HTML.
License: MIT + file LICENSE
URL: https://downlit.r-lib.org/, https://github.com/r-lib/downlit
BugReports: https://github.com/r-lib/downlit/issues
Depends: R (>= 3.4.0)
Imports: brio, desc, digest, evaluate, fansi, memoise, rlang, vctrs,
withr, yaml
Suggests: covr, htmltools, jsonlite, MASS, MassSpecWavelet, pkgload,
rmarkdown, testthat (>= 3.0.0), xml2
Config/Needs/website: tidyverse/tidytemplate
Config/testthat/edition: 3
Encoding: UTF-8
RoxygenNote: 7.2.3
NeedsCompilation: no
Packaged: 2023-06-29 22:06:14 UTC; hadleywickham
Author: Hadley Wickham [aut, cre],
RStudio [cph, fnd]
Maintainer: Hadley Wickham <hadley@posit.co>
Repository: CRAN
Date/Publication: 2023-06-29 22:40:08 UTC
downlit/tests/ 0000755 0001762 0000144 00000000000 14447400126 013076 5 ustar ligges users downlit/tests/testthat/ 0000755 0001762 0000144 00000000000 14447404110 014733 5 ustar ligges users downlit/tests/testthat/markdown-definition.md 0000644 0001762 0000144 00000000065 13721205213 021223 0 ustar ligges users Term 1 (`base::t()`)
: Definition 1 (`base::t()`)
downlit/tests/testthat/autolink.html 0000644 0001762 0000144 00000000540 14122441320 017440 0 ustar ligges users
# This is a comment
stats::median()
stats::median()
stats::median()
stats::median()
stats::median()
downlit/tests/testthat/test-downlit-md-v20.txt 0000644 0001762 0000144 00000001231 14255166156 021145 0 ustar ligges users > cat(downlit_md_string("* `base::t`"))
- [`base::t`](https://rdrr.io/r/base/t.html)
> cat(downlit_md_string("```\nbase::t(1)\n```"))
base::t(1)
> cat(downlit_md_string(brio::read_lines(test_path("markdown-table.md"))))
Table: Caption [`base::t`](https://rdrr.io/r/base/t.html)
| [`base::t`](https://rdrr.io/r/base/t.html) | xx |
| -----------------------------------------: | :- |
| [`base::t`](https://rdrr.io/r/base/t.html) | yy |
downlit/tests/testthat/test-downlit-html.R 0000644 0001762 0000144 00000001705 14122441316 020457 0 ustar ligges users test_that("can highlight html file", {
# verify_output() seems to be generating the wrong line endings
skip_on_os("windows")
verify_output(test_path("test-downlit-html.txt"), {
out <- downlit_html_path(test_path("autolink.html"), tempfile())
cat(brio::read_lines(out), sep = "\n")
})
})
test_that("highlight all pre inside div.downlit", {
html <- xml2::read_xml("<body>
<div class='downlit'><pre>1 + 2</pre><pre>3 + 4</pre></div>
<pre>No hightlight</pre>
</body>"
)
downlit_html_node(html)
expect_snapshot_output(show_xml(html))
})
test_that("special package string gets linked", {
html <- xml2::read_xml("<p>before <code>{downlit}</code>
after</p>
")
downlit_html_node(html)
expect_snapshot_output(show_xml(html))
# But only when it's a real package
html <- xml2::read_xml("<p>before <code>{notapkg}</code>
after</p>
")
downlit_html_node(html)
expect_snapshot_output(show_xml(html))
})
downlit/tests/testthat/test-topic-index.R 0000644 0001762 0000144 00000002263 14256115056 020267 0 ustar ligges users test_that("NULL package uses context", {
local_options("downlit.topic_index" = c(foo = "bar"))
expect_equal(topic_index(NULL), c(foo = "bar"))
})
test_that("can capture index from in-development package", {
local_devtools_package(test_path("index"))
expect_equal(topic_index("index"), c("a" = "a", "b" = "b", "c" = "b"))
})
test_that("can capture index from installed package", {
skip_on_cran()
grid_index <- topic_index("grid")
expect_equal(grid_index[["unit"]], "unit")
})
# find_rdname -------------------------------------------------------------
test_that("can find topic in specified package", {
skip_on_cran()
grid_index <- topic_index("grid")
expect_equal(find_rdname("grid", "unit"), "unit")
expect_equal(find_rdname("grid", "DOESNOTEXIST"), NULL)
})
test_that("can find topic in attached packages", {
local_options("downlit.attached" = "grid")
expect_equal(find_rdname_attached("unit"), list(rdname = "unit", package = "grid"))
expect_equal(find_rdname_attached("DOESNOTEXIST"), NULL)
})
test_that("doesn't error if 'attached' package not installed", {
local_options(downlit.attached = "uninstalled")
expect_equal(find_rdname_attached("foo"), NULL)
})
downlit/tests/testthat/test-article-index.R 0000644 0001762 0000144 00000002320 13731234742 020567 0 ustar ligges users test_that("NULL package uses context", {
local_options("downlit.article_index" = c(a = "a"))
expect_equal(article_index(NULL), c(a = "a"))
})
test_that("can capture index from in-development package", {
local_devtools_package(test_path("index"))
expect_equal(
article_index("index"),
c("test-1" = "test-1.html", "test-2" = "test-2.html")
)
expect_equal(find_article("index", "test-1"), "test-1.html")
expect_equal(find_article("index", "missing"), NULL)
})
test_that("can capture index of installed package (no vignettes)", {
# Ensure we skip this test if MASS ever gets vignettes
skip_if_not(nrow(vignette(package = "MASS")$results) == 0)
expect_equal(
article_index("base"),
character()
)
})
test_that("can capture index of installed package (vignettes + pkgdown)", {
# Requires internet + number of vignettes might change
skip_on_cran()
expect_true("custom-expectation.html" %in% article_index("testthat"))
})
test_that("can capture index of installed package (vignettes + pkgdown)", {
n_vignettes <- nrow(vignette(package = "grid")$results)
expect_length(article_index("grid"), n_vignettes)
})
# find_article ------------------------------------------------------------
downlit/tests/testthat/test-link.R 0000644 0001762 0000144 00000021101 14447373707 017004 0 ustar ligges users test_that("can link function calls", {
local_options(
"downlit.package" = "test",
"downlit.topic_index" = c(foo = "bar")
)
expect_equal(href_expr_(foo()), "bar.html")
# even if namespaced
expect_equal(href_expr_(test::foo()), "bar.html")
# but functions with arguments are ignored
expect_equal(href_expr_(foo(1, 2, 3)), NA_character_)
# as are function factories
expect_equal(href_expr_(foo()(1, 2, 3)), NA_character_)
expect_equal(href_expr_(test::foo()(1, 2, 3)), NA_character_)
})
test_that("base function calls linked", {
expect_equal(href_expr_(median()), href_topic_remote("median", "stats"))
})
test_that("respects options that define current location", {
local_options(
"downlit.topic_index" = c(bar = "bar"),
"downlit.topic_path" = "myref/"
)
# when not in an Rd file, link with topic_path
local_options("downlit.rdname" = "")
expect_equal(href_expr_(bar()), "myref/bar.html")
# don't link to self
local_options("downlit.rdname" = "bar")
expect_equal(href_expr_(bar()), NA_character_)
})
test_that("can link remote objects", {
expect_equal(href_expr_(MASS::abbey), href_topic_remote("abbey", "MASS"))
expect_equal(href_expr_(MASS::addterm()), href_topic_remote("addterm", "MASS"))
expect_equal(href_expr_(MASS::addterm.default()), href_topic_remote("addterm", "MASS"))
expect_equal(href_expr_(base::`::`), href_topic_remote("::", "base"))
# Doesn't exist
expect_equal(href_expr_(MASS::blah), NA_character_)
})
test_that("can link to topics in registered packages", {
local_options("downlit.attached" = "MASS")
expect_equal(href_expr_(addterm()), href_topic_remote("addterm", "MASS"))
expect_equal(href_expr_(?abbey), href_topic_remote("abbey", "MASS"))
# but has to be a function
expect_equal(href_expr_(abbey()), NA_character_)
})
test_that("can link to package names in registered packages", {
expect_equal(
autolink_curly("{downlit}"),
"downlit"
)
expect_equal(autolink_curly("{package}"), NA_character_)
# No curly = no link
expect_equal(autolink_curly(""), NA_character_)
})
test_that("can link to functions in base packages", {
expect_equal(href_expr_(abbreviate()), href_topic_remote("abbreviate", "base"))
expect_equal(href_expr_(median()), href_topic_remote("median", "stats"))
})
test_that("links to home of re-exported functions", {
expect_equal(href_expr_(testthat::`%>%`), href_topic_remote("%>%", "magrittr"))
})
test_that("fails gracely if can't find re-exported function", {
local_options(
"downlit.package" = "downlit",
"downlit.topic_index" = c(foo = "reexports")
)
expect_equal(href_expr_(foo()), NA_character_)
})
test_that("can link to remote pkgdown sites", {
# use autolink() to avoid R CMD check NOTE
expect_equal(autolink_url("pkgdown::add_slug"), href_topic_remote("pkgdown", "add_slug"))
expect_equal(autolink_url("pkgdown::add_slug(1)"), href_topic_remote("pkgdown", "add_slug"))
})
test_that("or local sites, if registered", {
local_options("downlit.local_packages" = c("MASS" = "MASS"))
expect_equal(href_expr_(MASS::abbey), "MASS/reference/abbey.html")
})
test_that("bare bare symbols are not linked", {
expect_equal(autolink_url("%in%"), NA_character_)
expect_equal(autolink_url("foo"), NA_character_)
})
test_that("returns NA for bad inputs", {
expect_equal(autolink_url(""), NA_character_)
expect_equal(autolink_url("a; b"), NA_character_)
expect_equal(autolink_url("1"), NA_character_)
expect_equal(autolink_url("ls *t??ne.pb"), NA_character_)
})
# help --------------------------------------------------------------------
test_that("can link ? calls", {
local_options(
"downlit.package" = "test",
"downlit.topic_index" = c(foo = "foo", "foo-package" = "foo-package")
)
expect_equal(href_expr_(?foo), "foo.html")
expect_equal(href_expr_(?"foo"), "foo.html")
expect_equal(href_expr_(?test::foo), "foo.html")
expect_equal(href_expr_(package?foo), "foo-package.html")
})
test_that("can link help calls", {
local_options(
"downlit.package" = "test",
"downlit.topic_index" = c(foo = "foo", "foo-package" = "foo-package")
)
expect_equal(href_expr_(help("foo")), "foo.html")
expect_equal(href_expr_(help("foo", "test")), "foo.html")
expect_equal(href_expr_(help(package = "MASS")), "https://rdrr.io/pkg/MASS/man")
expect_equal(href_expr_(help()), "https://rdrr.io/r/utils/help.html")
expect_equal(href_expr_(help(a$b)), NA_character_)
})
# library and friends -----------------------------------------------------
test_that("library() linked to package reference", {
skip_on_cran() # in case URLs change
skip_on_os("solaris")
expect_equal(href_expr_(library(rlang)), "https://rlang.r-lib.org")
expect_equal(href_expr_(library(MASS)), "http://www.stats.ox.ac.uk/pub/MASS4/")
})
test_that("except when not possible", {
topic_library <- href_topic("library", "base")
expect_equal(href_expr_(library()), topic_library)
expect_equal(href_expr_(library(doesntexist)), topic_library)
expect_equal(href_expr_(library(package = )), topic_library)
expect_equal(href_expr_(library("x", "y", "z")), topic_library)
})
test_that("requireNamespace doesn't use NSE", {
require_topic <- href_topic("requireNamespace", "base")
expect_equal(href_expr_(requireNamespace(rlang)), require_topic)
expect_equal(href_expr_(requireNamespace("rlang")), "https://rlang.r-lib.org")
})
# vignette ----------------------------------------------------------------
test_that("can link to local articles", {
local_options(
"downlit.package" = "test",
"downlit.article_index" = c(x = "y.html"),
"downlit.article_path" = "my_path/",
)
expect_equal(href_expr_(vignette("x")), "my_path/y.html")
expect_equal(href_expr_(vignette("x", package = "test")), "my_path/y.html")
expect_equal(href_expr_(vignette("y")), NA_character_)
})
test_that("can link to bioconductor vignettes", {
skip_if_not_installed("MassSpecWavelet")
skip_on_cran()
# Varies between .html and .pdf based on R version
expect_equal(
tools::file_path_sans_ext(href_expr_(vignette("MassSpecWavelet", "MassSpecWavelet"))),
"https://bioconductor.org/packages/release/bioc/vignettes/MassSpecWavelet/inst/doc/MassSpecWavelet"
)
})
test_that("can link to remote articles", {
skip_on_cran()
expect_equal(
href_expr_(vignette("sha1", "digest")),
"https://cran.rstudio.com/web/packages/digest/vignettes/sha1.html"
)
expect_equal(href_expr_(vignette("blah1", "digest")), NA_character_)
expect_equal(
href_expr_(vignette(package = "digest", "sha1")),
"https://cran.rstudio.com/web/packages/digest/vignettes/sha1.html"
)
expect_equal(
href_expr_(vignette("custom-expectation", "testthat")),
"https://testthat.r-lib.org/articles/custom-expectation.html"
)
})
test_that("or local sites, if registered", {
local_options("downlit.local_packages" = c("digest" = "digest"))
expect_equal(href_expr_(vignette("sha1", "digest")), "digest/articles/sha1.html")
})
test_that("looks in attached packages", {
local_options("downlit.attached" = c("grid", "digest"))
expect_equal(
href_expr_(vignette("sha1")),
"https://cran.rstudio.com/web/packages/digest/vignettes/sha1.html"
)
expect_equal(
href_expr_(vignette("moveline")),
"https://cran.rstudio.com/web/packages/grid/vignettes/moveline.pdf"
)
})
test_that("fail gracefully with non-working calls", {
expect_equal(href_expr_(vignette()), "https://rdrr.io/r/utils/vignette.html")
expect_equal(href_expr_(vignette(package = package)), NA_character_)
expect_equal(href_expr_(vignette(1, 2)), NA_character_)
expect_equal(href_expr_(vignette(, )), NA_character_)
})
test_that("spurious functions are not linked (#889)", {
expect_equal(href_expr_(Authors@R), NA_character_)
expect_equal(href_expr_(content-home.html), NA_character_)
expect_equal(href_expr_(toc: depth), NA_character_)
})
test_that("autolink generates HTML if linkable", {
expect_equal(
autolink("stats::median()"),
"stats::median()"
)
expect_equal(autolink("1 +"), NA_character_)
})
test_that("href_package can handle non-existing packages", {
expect_equal(href_package("NotAPackage"), NA_character_)
})
# find_reexport_source ----------------------------------------------------
test_that("can find functions", {
expect_equal(find_reexport_source(is.null), "base")
expect_equal(find_reexport_source(mean), "base")
})
test_that("can find other objects", {
expect_equal(find_reexport_source(na_cpl, "downlit", "na_cpl"), "rlang")
expect_equal(find_reexport_source(na_cpl, "downlit", "MISSING"), NA_character_)
})
downlit/tests/testthat/test-highlight.txt 0000644 0001762 0000144 00000004746 14447371667 020457 0 ustar ligges users > # explicit package
> cat(highlight("MASS::addterm()"))
MASS::addterm()
> cat(highlight("MASS::addterm"))
MASS::addterm
> cat(highlight("?MASS::addterm"))
?MASS::addterm
> # implicit package
> cat(highlight("library(MASS)"))
library(MASS)
> cat(highlight("addterm()"))
addterm()
> cat(highlight("median()"))
median()
> # local package
> cat(highlight("test::foo()"))
test::foo()
> # operators / special syntax
> cat(highlight("1 + 2 * 3"))
1 + 2 * 3
> cat(highlight("x %in% y"))
x %in% y
> cat(highlight("if (FALSE) 1"))
if (FALSE) 1
> cat(highlight("f <- function(x = 'a') {}"))
f <- function(x = 'a') {}
> # ansi escapes + unicode
> cat(highlight("# \033[34mblue\033[39m"))
# blue
downlit/tests/testthat/test-metadata.R 0000644 0001762 0000144 00000002660 14447367567 017647 0 ustar ligges users test_that("can extract urls for package", {
# since the package urls might potentially change
skip_on_cran()
expect_equal(package_urls("base"), character())
expect_equal(package_urls("packagethatdoesn'texist"), character())
expect_equal(package_urls(""), character())
expect_equal(package_urls("MASS"), "http://www.stats.ox.ac.uk/pub/MASS4/")
})
test_that("can extract urls for uninstalled packages from CRAN", {
# Pretend that rlang isn't installed
local_mocked_bindings(is_installed = function(...) FALSE)
rlang_urls <- c("https://rlang.r-lib.org", "https://github.com/r-lib/rlang")
expect_equal(package_urls("rlang"), rlang_urls)
# Always adds CRAN
expect_equal(package_urls("rlang", repos = c()), rlang_urls)
# But prefers user specified repo
fake_repo <- paste0("file:", test_path("fake-repo"))
expect_equal(package_urls("rlang", repos = fake_repo), "https://trick-url.com/")
# even if CRAN comes first
cran_repo <- "https://cran.rstudio.com"
expect_equal(
package_urls("rlang", repos = c(CRAN = cran_repo, fake_repo)),
"https://trick-url.com/"
)
})
test_that("handle common url formats", {
ab <- c("https://a.com", "https://b.com")
expect_equal(parse_urls("https://a.com,https://b.com"), ab)
expect_equal(parse_urls("https://a.com, https://b.com"), ab)
expect_equal(parse_urls("https://a.com https://b.com"), ab)
expect_equal(parse_urls("https://a.com (comment) https://b.com"), ab)
})
downlit/tests/testthat/test-evaluate.R 0000644 0001762 0000144 00000005462 14136771000 017650 0 ustar ligges users test_that("handles parsing failures gracefully", {
expect_snapshot(test_evaluate("1 + ", highlight = TRUE))
})
test_that("highlights when requested", {
expect_snapshot(test_evaluate("1 + \n 2 + 3", highlight = TRUE))
})
test_that("handles basic cases", {
expect_snapshot({
test_evaluate("# comment")
test_evaluate("message('x')")
test_evaluate("warning('x')")
test_evaluate("stop('x', call. = FALSE)")
test_evaluate("f <- function() stop('x'); f()")
})
})
test_that("each line of input gets span", {
expect_snapshot({
test_evaluate("1 +\n 2 +\n 3 +\n 4 +\n 5")
})
})
test_that("output always gets trailing nl", {
# These two calls should produce the same output
expect_snapshot({
test_evaluate('cat("a")\ncat("a\\n")')
})
})
test_that("combines plots as needed", {
expect_snapshot({
f1 <- function() plot(1)
f2 <- function() lines(0:2, 0:2)
test_evaluate("f1()\nf2()\n")
})
expect_snapshot({
f3 <- function() { plot(1); plot(2) }
test_evaluate("f3()")
})
})
test_that("handles other plots", {
# Check that we can drop the inclusion of the first one
registerS3method("is_low_change", "fakePlot", function(p1, p2) TRUE,
envir = asNamespace("downlit")
)
registerS3method("replay_html", "fakePlot", function(x, ...) {
paste0("")
}, envir = asNamespace("downlit"))
registerS3method("print", "fakePlot", function(x, ...) x)
expect_snapshot_output({
f3 <- function() structure(3, class = c("fakePlot", "otherRecordedplot"))
f4 <- function() structure(4, class = c("fakePlot", "otherRecordedplot"))
test_evaluate("f3()\nf4()")
})
})
test_that("ansi escapes are translated to html", {
expect_snapshot({
blue <- function(x) paste0("\033[34m", x, "\033[39m")
f <- function(x) {
cat("Output: ", blue("blue"), "\n", sep = "")
message(paste0("Message: ", blue("blue")))
warning(blue("blue"), call. = FALSE)
stop(blue("blue"), call. = FALSE)
}
test_evaluate("f()\n")
})
})
# html --------------------------------------------------------------------
test_that("can include literal HTML", {
output <- evaluate::new_output_handler(value = identity)
env <- env(foo = function() htmltools::div("foo"))
html <- evaluate_and_highlight("foo()", env = env, output_handler = output, highlight = FALSE)
expect_equal(as.character(html), "foo()\nfoo")
})
test_that("captures dependencies", {
output <- evaluate::new_output_handler(value = identity)
dummy_dep <- htmltools::htmlDependency("dummy", "1.0.0", "dummy.js")
env <- env(foo = function() htmltools::div("foo", dummy_dep))
html <- evaluate_and_highlight("foo()", env = env, output_handler = output, highlight = FALSE)
expect_equal(attr(html, "dependencies"), list(dummy_dep))
})
downlit/tests/testthat/fake-repo/ 0000755 0001762 0000144 00000000000 14447400126 016607 5 ustar ligges users downlit/tests/testthat/fake-repo/src/ 0000755 0001762 0000144 00000000000 14447400126 017376 5 ustar ligges users downlit/tests/testthat/fake-repo/src/contrib/ 0000755 0001762 0000144 00000000000 14137016133 021033 5 ustar ligges users downlit/tests/testthat/fake-repo/src/contrib/PACKAGES 0000644 0001762 0000144 00000000071 14447367216 022151 0 ustar ligges users Package: rlang
Type: Package
URL: https://trick-url.com/
downlit/tests/testthat/index/ 0000755 0001762 0000144 00000000000 14447400126 016045 5 ustar ligges users downlit/tests/testthat/index/index.Rproj 0000644 0001762 0000144 00000000634 13664031627 020203 0 ustar ligges users Version: 1.0
RestoreWorkspace: No
SaveWorkspace: No
AlwaysSaveHistory: Default
EnableCodeIndexing: Yes
UseSpacesForTab: Yes
NumSpacesForTab: 2
Encoding: UTF-8
RnwWeave: knitr
LaTeX: XeLaTeX
AutoAppendNewline: Yes
StripTrailingWhitespace: Yes
LineEndingConversion: Posix
BuildType: Package
PackageUseDevtools: Yes
PackageInstallArgs: --no-multiarch --with-keep.source
PackageRoxygenize: rd,collate,namespace
downlit/tests/testthat/index/NAMESPACE 0000644 0001762 0000144 00000000102 13664031675 017266 0 ustar ligges users # Generated by roxygen2: do not edit by hand
export(a)
export(b)
downlit/tests/testthat/index/LICENSE 0000644 0001762 0000144 00000000054 13664026071 017054 0 ustar ligges users YEAR: 2020
COPYRIGHT HOLDER: Hadley Wickham
downlit/tests/testthat/index/LICENSE.md 0000644 0001762 0000144 00000002061 13664026071 017453 0 ustar ligges users # MIT License
Copyright (c) 2020 Hadley Wickham
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
downlit/tests/testthat/index/man/ 0000755 0001762 0000144 00000000000 13664031650 016622 5 ustar ligges users downlit/tests/testthat/index/man/a.Rd 0000644 0001762 0000144 00000000223 13664026074 017332 0 ustar ligges users % Generated by roxygen2: do not edit by hand
% Please edit documentation in R/basic.R
\name{a}
\alias{a}
\title{A}
\usage{
a()
}
\description{
A
}
downlit/tests/testthat/index/man/b.Rd 0000644 0001762 0000144 00000000245 13664031650 017333 0 ustar ligges users % Generated by roxygen2: do not edit by hand
% Please edit documentation in R/basic.R
\name{b}
\alias{b}
\alias{c}
\title{B + C}
\usage{
b()
}
\description{
B + C
}
downlit/tests/testthat/index/DESCRIPTION 0000644 0001762 0000144 00000001011 13664026146 017552 0 ustar ligges users Package: index
Title: Test package for document/article indices
Version: 0.0.0.9000
Authors@R:
c(person(given = "Hadley",
family = "Wickham",
role = c("aut", "cre"),
email = "hadley@rstudio.com"),
person(given = "RStudio",
role = "cph"))
Description: What the package does (one paragraph).
License: MIT + file LICENSE
Encoding: UTF-8
LazyData: true
Roxygen: list(markdown = TRUE)
RoxygenNote: 7.1.0
Suggests:
knitr,
rmarkdown
VignetteBuilder: knitr
downlit/tests/testthat/index/vignettes/ 0000755 0001762 0000144 00000000000 14447400126 020055 5 ustar ligges users downlit/tests/testthat/index/vignettes/test-1.Rmd 0000644 0001762 0000144 00000000245 13664025336 021645 0 ustar ligges users ---
title: "test-1"
output: rmarkdown::html_vignette
vignette: >
%\VignetteIndexEntry{test-1}
%\VignetteEngine{knitr::rmarkdown}
%\VignetteEncoding{UTF-8}
---
downlit/tests/testthat/index/vignettes/test-2.Rmd 0000644 0001762 0000144 00000000245 13664025334 021644 0 ustar ligges users ---
title: "test-2"
output: rmarkdown::html_vignette
vignette: >
%\VignetteIndexEntry{test-2}
%\VignetteEngine{knitr::rmarkdown}
%\VignetteEncoding{UTF-8}
---
downlit/tests/testthat/index/R/ 0000755 0001762 0000144 00000000000 13664025304 016247 5 ustar ligges users downlit/tests/testthat/index/R/basic.R 0000644 0001762 0000144 00000000131 13664031647 017455 0 ustar ligges users #' A
#' @export
a <- function() {}
#' B + C
#' @aliases c
#' @export
b <- function() {}
downlit/tests/testthat/_snaps/ 0000755 0001762 0000144 00000000000 14255166156 016232 5 ustar ligges users downlit/tests/testthat/_snaps/highlight.md 0000644 0001762 0000144 00000002563 14447371670 020533 0 ustar ligges users # custom infix operators are linked, but regular are not
x %in% y
---
x + y
# syntax can span multiple lines
Code
cat(highlight("f(\n\n)"))
Output
f(
)
---
Code
cat(highlight("'\n\n'"))
Output
'
'
# ansi escapes are converted to html
[1] "# hello"
---
[1] "# hello"
# placeholder in R pipe gets highlighted and not linked
Code
highlight("1:10 |> mean(x = _)", classes = classes_pandoc())
Output
[1] "1:10 |> mean(x = _)"
downlit/tests/testthat/_snaps/evaluate.md 0000644 0001762 0000144 00000010165 14447371666 020374 0 ustar ligges users # handles parsing failures gracefully
Code
test_evaluate("1 + ", highlight = TRUE)
Output
1 +
#> Error: <text>:2:0: unexpected end of input
#> 1: 1 +
#> ^
# highlights when requested
Code
test_evaluate("1 + \n 2 + 3", highlight = TRUE)
Output
1 +
2 + 3
#> [1] 6
# handles basic cases
Code
test_evaluate("# comment")
Output
# comment
Code
test_evaluate("message('x')")
Output
message('x')
#> x
Code
test_evaluate("warning('x')")
Output
warning('x')
#> Warning: x
Code
test_evaluate("stop('x', call. = FALSE)")
Output
stop('x', call. = FALSE)
#> Error: x
Code
test_evaluate("f <- function() stop('x'); f()")
Output
f <- function() stop('x'); f()
#> Error in f(): x
# each line of input gets span
Code
test_evaluate("1 +\n 2 +\n 3 +\n 4 +\n 5")
Output
1 +
2 +
3 +
4 +
5
#> [1] 15
# output always gets trailing nl
Code
test_evaluate("cat(\"a\")\ncat(\"a\\n\")")
Output
cat("a")
#> a
cat("a\n")
#> a
# combines plots as needed
Code
f1 <- (function() plot(1))
f2 <- (function() lines(0:2, 0:2))
test_evaluate("f1()\nf2()\n")
Output
f1()
f2()
---
Code
f3 <- (function() {
plot(1)
plot(2)
})
test_evaluate("f3()")
Output
f3()
# handles other plots
f3()
f4()
# ansi escapes are translated to html
Code
blue <- (function(x) paste0("\033[34m", x, "\033[39m"))
f <- (function(x) {
cat("Output: ", blue("blue"), "\n", sep = "")
message(paste0("Message: ", blue("blue")))
warning(blue("blue"), call. = FALSE)
stop(blue("blue"), call. = FALSE)
})
test_evaluate("f()\n")
Output
f()
#> Output: blue
#> Message: blue
#> Warning: blue
#> Error: blue
downlit/tests/testthat/_snaps/downlit-html.md 0000644 0001762 0000144 00000001221 14447371664 021177 0 ustar ligges users # highlight all pre inside div.downlit
1 + 2
3 + 4
No hightlight
# special package string gets linked
before downlit after
---
before {notapkg}
after
downlit/tests/testthat/test-packages.R 0000644 0001762 0000144 00000002672 14447373707 017641 0 ustar ligges users test_that("extracts typical library()/require() calls", {
expect_equal(extract_package_attach_(library("blah")), "blah")
expect_equal(extract_package_attach_(library(blah)), "blah")
expect_equal(extract_package_attach_(require("blah")), "blah")
expect_equal(extract_package_attach_(require(blah)), "blah")
})
test_that("detects in nested code", {
expect_equal(extract_package_attach_({
library(a)
x <- 2
{
library(b)
y <- 3
{
library(c)
z <- 4
}
}
}), c("a", "b", "c"))
})
test_that("handles expressions", {
# which will usually come from parse()d code
#
x <- expression(
x <- 1,
library("a"),
y <- 2,
library("b")
)
expect_equal(extract_package_attach(x), c("a", "b"))
})
test_that("detects with non-standard arg order", {
withr::local_options(warnPartialMatchArgs = FALSE)
expect_equal(extract_package_attach_(library(quiet = TRUE, pa = "a")), "a")
expect_equal(extract_package_attach_(library(quiet = TRUE, a)), "a")
})
test_that("doesn't include if character.only = TRUE", {
expect_equal(
extract_package_attach_(library(x, character.only = TRUE)),
character()
)
})
test_that("adds packages attached by dependencies", {
skip_if_not_installed("ggraph")
expect_true("ggplot2" %in% add_depends("ggraph"))
})
test_that("adds tidyverse packages", {
skip_if_not_installed("tidyverse")
expect_true("ggplot2" %in% add_depends("tidyverse"))
})
downlit/tests/testthat/test-downlit-md-v21.txt 0000644 0001762 0000144 00000001234 14255166156 021151 0 ustar ligges users > cat(downlit_md_string("* `base::t`"))
- [`base::t`](https://rdrr.io/r/base/t.html)
> cat(downlit_md_string("```\nbase::t(1)\n```"))
base::t(1)
> cat(downlit_md_string(brio::read_lines(test_path("markdown-table.md"))))
Table: Caption [`base::t`](https://rdrr.io/r/base/t.html)
| [`base::t`](https://rdrr.io/r/base/t.html) | xx |
|-------------------------------------------:|:----|
| [`base::t`](https://rdrr.io/r/base/t.html) | yy |
downlit/tests/testthat/test-downlit-md.R 0000644 0001762 0000144 00000003560 14042074111 020110 0 ustar ligges users # Sys.setenv("RSTUDIO_PANDOC" = "")
# rmarkdown::find_pandoc(cache = FALSE)
test_that("common across multiple versions", {
skip_if_not(rmarkdown::pandoc_version() > "2.0.0")
verify_output(test_path("test-downlit-md.txt"), {
"Bare code"
cat(downlit_md_string("`base::t`"))
"No transforms"
cat(downlit_md_string("## `base::t`"))
cat(downlit_md_string("[`base::t`](http://google.com)"))
"Nested"
cat(downlit_md_string("*`base::t`*"))
cat(downlit_md_string("`base::t`"))
cat(downlit_md_string("1. `base::t`"))
"Markdown extensions not in GFM"
cat(downlit_md_string("| `base::t`", format = "markdown"))
md <- brio::read_lines(test_path("markdown-definition.md"))
cat(downlit_md_string(md, "markdown"))
})
})
test_that("pandoc AST v1.20", {
skip_if_not(rmarkdown::pandoc_version() > "2.0.0")
skip_if_not(rmarkdown::pandoc_version() < "2.10")
verify_output(test_path("test-downlit-md-v20.txt"), {
cat(downlit_md_string("* `base::t`"))
cat(downlit_md_string("```\nbase::t(1)\n```"))
cat(downlit_md_string(brio::read_lines(test_path("markdown-table.md"))))
})
})
test_that("pandoc AST v1.21", {
skip_if_not(rmarkdown::pandoc_version() >= "2.10")
skip_if_not(rmarkdown::pandoc_version() < "2.11")
verify_output(test_path("test-downlit-md-v21.txt"), {
cat(downlit_md_string("* `base::t`"))
cat(downlit_md_string("```\nbase::t(1)\n```"))
cat(downlit_md_string(brio::read_lines(test_path("markdown-table.md"))))
})
})
test_that("Special package string gets linked", {
# needed for eof setting on windows
skip_if_not(rmarkdown::pandoc_version() > "2.0.0")
expect_equal(
downlit_md_string("`{downlit}`"),
"[downlit](https://downlit.r-lib.org/)\n"
)
expect_equal(
downlit_md_string("`{thisisrealltnotapackagename}`"),
"`{thisisrealltnotapackagename}`\n"
)
})
downlit/tests/testthat/test-utils.R 0000644 0001762 0000144 00000000275 14136771000 017177 0 ustar ligges users test_that("converts Latin1 encoded text to utf8", {
x <- "'\xfc'"
Encoding(x) <- "latin1"
y <- safe_parse(x)[[1]]
expect_equal(Encoding(y), "UTF-8")
expect_equal(y, "\u00fc")
})
downlit/tests/testthat/markdown-table.md 0000644 0001762 0000144 00000000124 13665174616 020201 0 ustar ligges users Table: Caption `base::t`
| `base::t` | xx |
|----------:|:---|
| `base::t` | yy |
downlit/tests/testthat/test-downlit-md.txt 0000644 0001762 0000144 00000001653 14447371665 020556 0 ustar ligges users > # Bare code
> cat(downlit_md_string("`base::t`"))
[`base::t`](https://rdrr.io/r/base/t.html)
> # No transforms
> cat(downlit_md_string("## `base::t`"))
## `base::t`
> cat(downlit_md_string("[`base::t`](http://google.com)"))
[`base::t`](http://google.com)
> # Nested
> cat(downlit_md_string("*`base::t`*"))
*[`base::t`](https://rdrr.io/r/base/t.html)*
> cat(downlit_md_string("`base::t`"))
[`base::t`](https://rdrr.io/r/base/t.html)
> cat(downlit_md_string("1. `base::t`"))
1. [`base::t`](https://rdrr.io/r/base/t.html)
> # Markdown extensions not in GFM
> cat(downlit_md_string("| `base::t`", format = "markdown"))
| [`base::t`](https://rdrr.io/r/base/t.html)
> md <- brio::read_lines(test_path("markdown-definition.md"))
> cat(downlit_md_string(md, "markdown"))
Term 1 ([`base::t()`](https://rdrr.io/r/base/t.html))
: Definition 1 ([`base::t()`](https://rdrr.io/r/base/t.html))
downlit/tests/testthat/test-highlight.R 0000644 0001762 0000144 00000010406 14447366104 020015 0 ustar ligges users test_that("can link to external topics that use ::", {
local_options(
"downlit.package" = "test",
"downlit.topic_index" = c(foo = "bar")
)
verify_output(test_path("test-highlight.txt"), {
"explicit package"
cat(highlight("MASS::addterm()"))
cat(highlight("MASS::addterm"))
cat(highlight("?MASS::addterm"))
"implicit package"
cat(highlight("library(MASS)"))
cat(highlight("addterm()"))
cat(highlight("median()")) # base
"local package"
cat(highlight("test::foo()"))
"operators / special syntax"
cat(highlight("1 + 2 * 3"))
cat(highlight("x %in% y"))
cat(highlight("if (FALSE) 1"))
cat(highlight("f <- function(x = 'a') {}"))
"ansi escapes + unicode"
cat(highlight("# \033[34mblue\033[39m"))
})
})
test_that("don't link to non-functions with matching topic name", {
local_options("downlit.attached" = "MASS")
expect_equal(
highlight("abbey()"),
"abbey()"
)
})
test_that("empty string returns empty string", {
expect_equal(highlight(""), "")
})
test_that("unicode is not mangled", {
skip_on_os("windows")
expect_equal(highlight("# \u2714"), "# \u2714")
})
test_that("custom infix operators are linked, but regular are not", {
expect_snapshot_output(cat(highlight("x %in% y\n")))
expect_snapshot_output(cat(highlight("x + y\n")))
})
test_that("distinguish logical and numeric",{
expect_equal(highlight("TRUE"), "TRUE")
expect_equal(highlight("FALSE"), "FALSE")
expect_equal(highlight("1"), "1")
})
test_that("can parse code with carriage returns", {
lines <- strsplit(highlight("1\r\n2"), "\n")[[1]]
expect_equal(lines[[1]], "1")
expect_equal(lines[[2]], "2")
})
test_that("can highlight code in Latin1", {
x <- "'\xfc'"
Encoding(x) <- "latin1"
out <- highlight(x)
expect_equal(Encoding(out), "UTF-8")
expect_equal(out, "'\u00fc'")
})
test_that("syntax can span multiple lines", {
expect_snapshot(cat(highlight("f(\n\n)")))
expect_snapshot(cat(highlight("'\n\n'")))
})
test_that("code with tab is not mangled", {
expect_equal(highlight("\tf()"), " f()")
expect_equal(highlight("'\t'"), "' '")
})
test_that("unparsable code returns NULL", {
expect_equal(highlight("<"), NA_character_)
# but pure comments still highlighted
expect_equal(
highlight("#"),
"#"
)
})
test_that("R6 methods don't get linked", {
expect_equal(
highlight("x$get()"),
"x$get()"
)
expect_equal(
highlight("x$library()"),
"x$library()"
)
})
test_that("R6 instantiation gets linked", {
expect_equal(
highlight("mean$new()"),
"mean$new()"
)
# But not new itself
expect_equal(
highlight("new()"),
"new()"
)
})
test_that("ansi escapes are converted to html", {
expect_snapshot_output(highlight("# \033[31mhello\033[m"))
expect_snapshot_output(highlight("# \u2029[31mhello\u2029[m"))
})
test_that("can highlight vers long strings", {
val <- paste0(rep('very', 200), collapse = " ")
out <- downlit::highlight(sprintf("'%s'", val))
expect_equal(out, paste0("'", val, "'"))
})
test_that("placeholder in R pipe gets highlighted and not linked", {
skip_if_not(getRversion() >= "4.2", message = "The pipe placeholder requires R 4.2")
expect_snapshot(highlight("1:10 |> mean(x = _)", classes = classes_pandoc()))
})
downlit/tests/testthat/test-downlit-html.txt 0000644 0001762 0000144 00000001436 14447371663 021117 0 ustar ligges users > out <- downlit_html_path(test_path("autolink.html"), tempfile())
> cat(brio::read_lines(out), sep = "\n")
# This is a comment
stats::median()
stats::median()
stats::median()
stats::median()
downlit/tests/testthat.R 0000644 0001762 0000144 00000000072 13663736745 015102 0 ustar ligges users library(testthat)
library(downlit)
test_check("downlit")
downlit/R/ 0000755 0001762 0000144 00000000000 14447373707 012153 5 ustar ligges users downlit/R/rd.R 0000644 0001762 0000144 00000001070 13664033224 012664 0 ustar ligges users package_rd <- function(path) {
rd <- dir(path, pattern = "\\.[Rr]d$", full.names = TRUE)
names(rd) <- basename(rd)
lapply(rd, rd_file, pkg_path = dirname(path))
}
rd_file <- function(path, pkg_path = NULL) {
if (getRversion() >= "3.4.0") {
macros <- tools::loadPkgRdMacros(pkg_path)
} else {
macros <- tools::loadPkgRdMacros(pkg_path, TRUE)
}
tools::parse_Rd(path, macros = macros, encoding = "UTF-8")
}
extract_alias <- function(x, tag) {
is_alias <- vapply(x, function(x) attr(x, "Rd_tag") == "\\alias", logical(1))
unlist(x[is_alias])
}
downlit/R/downlit-html.R 0000644 0001762 0000144 00000006053 14136771000 014703 0 ustar ligges users #' Syntax highlight and link an HTML page
#'
#' @description
#' * Code blocks, identified by `<pre>` tags with class `sourceCode r`
#' or any `<pre>` tag inside of `<div class="downlit">`, are
#' processed with [highlight()].
#'
#' * Inline code, identified by `<code>` tags that contain only text
#' (and don't have a header tag (e.g. `<h1>`) or `<a>` as an ancestor)
#' are processed with [autolink()].
#'
#' Use `downlit_html_path()` to process an `.html` file on disk;
#' use `downlit_html_node()` to process an in-memory `xml_node` as part of a
#' larger pipeline.
#'
#' @param in_path,out_path Input and output paths for HTML file
#' @inheritParams highlight
#' @param x An `xml2::xml_node`
#' @return `downlit_html_path()` invisibly returns `out_path`;
#' `downlit_html_node()` modifies `x` in place and returns nothing.
#' @export
#' @examples
#' node <- xml2::read_xml("<p><code>base::t()</code></p>")
#' node
#'
#' # node is modified in place
#' downlit_html_node(node)
#' node
downlit_html_path <- function(in_path, out_path, classes = classes_pandoc()) {
if (!is_installed("xml2")) {
abort("xml2 package required .html transformation")
}
html <- xml2::read_html(in_path, encoding = "UTF-8")
downlit_html_node(html, classes = classes)
xml2::write_html(html, out_path, format = FALSE)
invisible(out_path)
}
#' @export
#' @rdname downlit_html_path
downlit_html_node <- function(x, classes = classes_pandoc()) {
stopifnot(inherits(x, "xml_node"))
xpath <- c(
# Usual block generated by pandoc (after syntax highlighting)
".//pre[contains(@class, 'sourceCode r')]",
# Special case that allows us to override usually rules if needed
".//div[contains(@class, 'downlit')]//pre"
)
xpath_block <- paste(xpath, collapse = "|")
tweak_children(x, xpath_block, highlight,
pre_class = "downlit sourceCode r",
classes = classes,
replace = "node",
code = TRUE
)
# Identify <code> nodes containing only text (i.e. no children)
# that are not descendants of an element where links are undesirable
bad_ancestor <- c("h1", "h2", "h3", "h4", "h5", "a", "summary")
bad_ancestor <- paste0("ancestor::", bad_ancestor, collapse = "|")
xpath_inline <- paste0(".//code[count(*) = 0 and not(", bad_ancestor, ")]")
# replace inline code "{packagename}" with linked text if possible
tweak_children(x, xpath_inline, autolink_curly, replace = "node")
# handle remaining inline code
tweak_children(x, xpath_inline, autolink, replace = "contents")
invisible()
}
tweak_children <- function(node, xpath, fun, ..., replace = c("node", "contents")) {
replace <- arg_match(replace)
nodes <- xml2::xml_find_all(node, xpath)
text <- xml2::xml_text(nodes)
replacement <- map_chr(text, fun, ...)
to_update <- !is.na(replacement)
old <- nodes[to_update]
if (replace == "contents") {
old <- xml2::xml_contents(old)
}
new <- lapply(replacement[to_update], as_xml)
xml2::xml_replace(old, new, .copy = FALSE)
invisible()
}
as_xml <- function(x) {
xml2::xml_contents(xml2::xml_contents(xml2::read_html(x)))[[1]]
}
downlit/R/metadata.R 0000644 0001762 0000144 00000007634 14447366765 014076 0 ustar ligges users remote_urls <- function(package) {
local <- getOption("downlit.local_packages")
if (has_name(local, package)) {
base_url <- local[[package]]
list(
reference = file.path(base_url, "reference"),
article = file.path(base_url, "articles")
)
} else {
remote_metadata(package)$urls
}
}
remote_package_reference_url <- function(package) {
remote_urls(package)$reference
}
remote_package_article_url <- function(package) {
remote_urls(package)$article
}
# Retrieve remote metadata ------------------------------------------------
remote_metadata <- function(package) {
# Is the metadata installed with the package?
meta <- local_metadata(package)
if (!is.null(meta)) {
return(meta)
}
# Otherwise, look in package websites, caching since this is a slow operation
tempdir <- Sys.getenv("RMARKDOWN_PREVIEW_DIR", unset = tempdir())
dir.create(file.path(tempdir, "downlit"), showWarnings = FALSE)
cache_path <- file.path(tempdir, "downlit", package)
if (file.exists(cache_path)) {
readRDS(cache_path)
} else {
meta <- remote_metadata_slow(package)
saveRDS(meta, cache_path)
meta
}
}
local_metadata <- function(package) {
local_path <- system.file("pkgdown.yml", package = package)
if (local_path == "") {
NULL
} else {
yaml::read_yaml(local_path)
}
}
remote_metadata_slow <- function(package) {
urls <- package_urls(package)
for (url in urls) {
url <- paste0(url, "/pkgdown.yml")
yaml <- tryCatch(fetch_yaml(url), error = function(e) NULL)
if (is.list(yaml)) {
if (has_name(yaml, "articles")) {
yaml$articles <- unlist(yaml$articles)
}
if (!has_name(yaml, "urls")) {
base_url <- dirname(url)
yaml$urls <- list(
reference = paste0(base_url, "/reference"),
article = paste0(base_url, "/articles")
)
}
return(yaml)
}
}
NULL
}
fetch_yaml <- function(url) {
path <- tempfile()
if (suppressWarnings(utils::download.file(url, path, quiet = TRUE) != 0)) {
abort("Failed to download")
}
# This call may warn if the URL doesn't have a final LF;
# see pkgdown issue #1419
suppressWarnings(yaml::read_yaml(path))
}
# Helpers -----------------------------------------------------------------
package_urls <- function(package, repos = getOption("repos")) {
if (package == "") { # if e.g. library(a$pkg) then pkg is ""
return(character())
}
if (is_installed(package)) {
path <- system.file("DESCRIPTION", package = package)
# If the package is installed, use its DESCRIPTION
url <- read.dcf(path, fields = "URL")[[1]]
} else {
# Otherwise try repo metadata, always trying CRAN last
user_repos <- repos[names2(repos) != "CRAN"]
meta <- c(lapply(user_repos, repo_urls), list(CRAN_urls()))
urls <- unlist(lapply(meta, function(pkgs) pkgs$URL[match(package, pkgs[["Package"]])]))
# Take first non-NA (if any)
if (all(is.na(urls))) {
url <- NA_character_
} else {
url <- urls[!is.na(urls)][[1]]
}
}
parse_urls(url)
}
parse_urls <- function(x) {
urls <- trimws(strsplit(trimws(x), "[,\\s]+", perl = TRUE)[[1]])
urls <- urls[grepl("^http", urls)]
sub_special_cases(urls)
}
# Both memoised in .onLoad
repo_urls <- function(repo) {
# Only works if the repo exposes the URL field in PACKAGES, and most don't
as.data.frame(
utils::available.packages(repos = repo, fields = "URL"),
stringsAsFactors = FALSE
)
}
CRAN_urls <- function() {
# Substantially faster to use RStudio mirror: in my testing this reduced
# download time from ~2s to 0.6s
withr::local_envvar(R_CRAN_WEB = "https://cran.rstudio.com")
tools::CRAN_package_db()
}
# All rOpenSci repositories have a known pkgdown URL.
# Todo: could generalise this concept for other orgs.
sub_special_cases <- function(urls) {
sub("^https?://github.com/ropensci/(\\w+).*$", "https://docs.ropensci.org/\\1", urls)
}
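# Example (hypothetical package name): a GitHub URL for an rOpenSci package is
# rewritten to its docs.ropensci.org pkgdown site.
# sub_special_cases("https://github.com/ropensci/somepkg")
# #> [1] "https://docs.ropensci.org/somepkg"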
downlit/R/downlit-package.R 0000644 0001762 0000144 00000000333 14122441316 015324 0 ustar ligges users #' @keywords internal
"_PACKAGE"
# The following block is used by usethis to automatically manage
# roxygen namespace tags. Modify with care!
## usethis namespace: start
#' @import rlang
## usethis namespace: end
NULL
downlit/R/utils.R 0000644 0001762 0000144 00000003232 14250475251 013423 0 ustar ligges users devtools_loaded <- function(x) {
if (!x %in% loadedNamespaces()) {
return(FALSE)
}
ns <- .getNamespace(x)
env_has(ns, ".__DEVTOOLS__")
}
invert_index <- function(x) {
stopifnot(is.list(x))
if (length(x) == 0)
return(list())
key <- rep(names(x), lengths(x))
val <- unlist(x, use.names = FALSE)
split(key, val)
}
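# Example: invert a name -> aliases mapping into an alias -> name mapping
# (entries come back ordered by alias).
# invert_index(list(mean.Rd = c("mean", "average"), sd.Rd = "sd"))
# #> $average
# #> [1] "mean.Rd"
# #> $mean
# #> [1] "mean.Rd"
# #> $sd
# #> [1] "sd.Rd"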
standardise_text <- function(x) {
x <- enc2utf8(x)
x <- gsub("\t", " ", x, fixed = TRUE, useBytes = TRUE)
x <- gsub("\r", "", x, fixed = TRUE, useBytes = TRUE)
# \033 can't be represented in xml (and hence is ignored by xml2)
# so we convert to \u2029 in order to survive a round trip
x <- gsub("\u2029", "\033", x, fixed = TRUE, useBytes = TRUE)
x
}
safe_parse <- function(text, standardise = TRUE) {
if (standardise) {
text <- standardise_text(text)
}
lines <- strsplit(text, "\n", fixed = TRUE, useBytes = TRUE)[[1]]
srcfile <- srcfilecopy("test.r", lines)
# https://github.com/gaborcsardi/rencfaq#how-to-parse-utf-8-text-into-utf-8-code
Encoding(text) <- "unknown"
con <- textConnection(text)
on.exit(close(con), add = TRUE)
tryCatch(
parse(con, keep.source = TRUE, encoding = "UTF-8", srcfile = srcfile),
error = function(e) NULL
)
}
extract_curly_package <- function(x) {
# regex adapted from https://github.com/r-lib/usethis/blob/d5857737b4780c3c3d8fe6fb44ef70e81796ac8e/R/description.R#L134
if (! grepl("^\\{[a-zA-Z][a-zA-Z0-9.]+\\}$", x)) {
return(NA)
}
# remove first curly brace
x <- sub("\\{", "", x)
# remove second curly brace and return
x <- sub("\\}", "", x)
x
}
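# Examples:
# extract_curly_package("{downlit}")  #> "downlit"
# extract_curly_package("downlit")    #> NA (not wrapped in curly braces)
# extract_curly_package("{x}")        #> NA (package names need >= 2 characters)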
show_xml <- function(x) {
cat(as.character(x, options = c("format", "no_declaration")))
}
downlit/R/zzz.R 0000644 0001762 0000144 00000000334 14260703760 013120 0 ustar ligges users .onLoad <- function(libname, pkgname) {
repo_urls <<- memoise::memoise(repo_urls)
CRAN_urls <<- memoise::memoise(CRAN_urls)
# Silence R CMD check note, since only used in memoised function
withr::local_envvar
}
downlit/R/test-helpers.R 0000644 0001762 0000144 00000000546 14136776231 014714 0 ustar ligges users local_devtools_package <- function(path, ..., env = parent.frame()) {
pkgload::load_all(path, ..., quiet = TRUE)
defer(pkgload::unload(pkgload::pkg_name(path)), scope = env)
}
defer <- function(expr, scope = parent.frame()) {
expr <- enquo(expr)
call <- expr(on.exit(rlang::eval_tidy(!!expr), add = TRUE))
eval_bare(call, scope)
invisible()
}
downlit/R/packages.R 0000644 0001762 0000144 00000003534 14447373707 014061 0 ustar ligges users extract_package_attach <- function(expr) {
if (is.expression(expr)) {
packages <- lapply(expr, extract_package_attach)
unlist(packages)
} else if (is_call(expr)) {
if (is_call(expr, c("library", "require"))) {
if (is_call(expr, "library")) {
expr <- match.call(library, expr)
} else {
expr <- match.call(require, expr)
}
if (!is_true(expr$character.only)) {
as.character(expr$package)
} else {
character()
}
} else {
args <- as.list(expr[-1])
unlist(lapply(args, extract_package_attach))
}
} else {
character()
}
}
# Helper for testing
extract_package_attach_ <- function(expr) {
extract_package_attach(enexpr(expr))
}
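# Examples using the test helper above (package names are arbitrary; the
# expression is captured, not evaluated):
# extract_package_attach_(library(dplyr))                        #> "dplyr"
# extract_package_attach_(require("purrr"))                      #> "purrr"
# extract_package_attach_({library(a); x + 1})                   #> "a"
# extract_package_attach_(library(pkg, character.only = TRUE))   #> character(0)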
register_attached_packages <- function(packages) {
packages <- add_depends(packages)
options("downlit.attached" = union(packages, getOption("downlit.attached")))
}
add_depends <- function(packages) {
if ("tidyverse" %in% packages && is_installed("tidyverse")) {
core <- getNamespace("tidyverse")$core
packages <- union(packages, core)
}
# add packages attached by depends
depends <- unlist(lapply(packages, package_depends))
union(packages, depends)
}
package_depends <- function(package) {
if (!is_installed(package)) {
return(character())
}
  if (!is.null(devtools_meta(package))) {
    # Package is loaded by devtools/pkgload, so read Depends from its DESCRIPTION
    path_desc <- system.file("DESCRIPTION", package = package)
deps <- desc::desc_get_deps(path_desc)
depends <- deps$package[deps$type == "Depends"]
depends <- depends[depends != "R"]
return(depends)
}
path_meta <- system.file("Meta", "package.rds", package = package)
meta <- readRDS(path_meta)
names(meta$Depends)
}
# from https://github.com/r-lib/pkgdown/blob/8e0838e273462cec420dfa20f240c684a33425d9/R/utils.r#L62
devtools_meta <- function(x) {
ns <- .getNamespace(x)
ns[[".__DEVTOOLS__"]]
}
downlit/R/topic-index.R 0000644 0001762 0000144 00000003234 14257417776 014527 0 ustar ligges users # Compute topic index -----------------------------------------------------
# The topic index is a character vector that maps aliases to Rd file names
# (sans extension). Memoised for performance.
topic_index <- function(package) {
if (is.null(package)) {
getOption("downlit.topic_index")
} else if (devtools_loaded(package)) {
# Use live docs for in-development packages
topic_index_source(package)
} else {
topic_index_installed(package)
}
}
topic_index_source <- function(package) {
path <- file.path(find.package(package), "man")
if (!file.exists(path)) {
return(character())
}
rd <- package_rd(path)
aliases <- lapply(rd, extract_alias)
names(aliases) <- gsub("\\.Rd$", "", names(rd))
unlist(invert_index(aliases))
}
topic_index_installed <- function(package) {
path <- system.file("help", "aliases.rds", package = package)
if (path == "")
return(character())
readRDS(path)
}
find_rdname <- function(package, topic) {
index <- topic_index(package)
if (has_name(index, topic)) {
index[[topic]]
} else {
NULL
}
}
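# Illustrative example (the exact Rd name comes from the package's topic
# index, so treat this as a sketch): find_rdname("base", "mean") is expected
# to return "mean", which downstream code turns into a link ending in
# "mean.html"; unknown topics return NULL.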
find_rdname_attached <- function(topic, is_fun = FALSE) {
packages <- c(
getOption("downlit.attached"),
c("datasets", "utils", "grDevices", "graphics", "stats", "base")
)
for (package in packages) {
if (!is_installed(package)) {
next
}
if (is_fun && !is_exported(topic, package)) {
next
}
rdname <- find_rdname(package, topic)
if (!is.null(rdname)) {
return(list(rdname = rdname, package = package))
}
}
NULL
}
# https://github.com/r-lib/rlang/issues/1434
is_installed <- function(x) {
!identical(system.file(package = x), "")
}
downlit/R/evaluate.R 0000644 0001762 0000144 00000016347 14250475251 014104 0 ustar ligges users #' Evaluate code and syntax highlight the results
#'
#' This function runs `code` and captures the output using
#' [evaluate::evaluate()]. It syntax highlights code with [highlight()], and
#' intermingles it with output.
#'
#' @param code Code to evaluate (as a string).
#' @param fig_save A function with arguments `plot` and `id` that is
#' responsible for saving `plot` to a file (using `id` to disambiguate
#' multiple plots in the same chunk). It should return a list with
#' components `path`, `width`, and `height`.
#' @param env Environment in which to evaluate code; if not supplied,
#' defaults to a child of the global environment.
#' @param output_handler Custom output handler for [evaluate::evaluate()].
#' @param highlight If `FALSE`, skip syntax highlighting of the input code.
#'   This is useful for tests.
#' @return A string containing HTML, with a `dependencies` attribute
#'   giving any additional htmltools dependencies required to render the HTML.
#' @inheritParams highlight
#' @export
#' @examples
#' cat(evaluate_and_highlight("1 + 2"))
#' cat(evaluate_and_highlight("x <- 1:10\nmean(x)"))
#'
#' # -----------------------------------------------------------------
#' # evaluate_and_highlight() powers pkgdown's documentation formatting so
#' # here I include a few examples to make sure everything looks good
#' # -----------------------------------------------------------------
#'
#' blue <- function(x) paste0("\033[34m", x, "\033[39m")
#' f <- function(x) {
#' cat("This is some output. My favourite colour is ", blue("blue"), ".\n", sep = "")
#' message("This is a message. My favourite fruit is ", blue("blueberries"))
#' warning("Now at stage ", blue("blue"), "!")
#' }
#' f()
#'
#' plot(1:10)
evaluate_and_highlight <- function(code,
fig_save,
classes = downlit::classes_pandoc(),
env = NULL,
output_handler = evaluate::new_output_handler(),
highlight = TRUE) {
env <- env %||% child_env(global_env())
expr <- evaluate::evaluate(code, child_env(env), new_device = TRUE,
output_handler = output_handler)
replay_html(expr,
fig_save = fig_save,
fig_id = unique_id(),
classes = classes,
highlight = highlight
)
}
test_evaluate <- function(code, ..., highlight = FALSE) {
fig_save <- function(plot, id) {
list(path = paste0(id, ".png"), width = 10, height = 10)
}
cat(evaluate_and_highlight(
code,
fig_save = fig_save,
env = caller_env(),
highlight = highlight,
...
))
}
replay_html <- function(x, ...) UseMethod("replay_html", x)
#' @export
replay_html.list <- function(x, ...) {
# keep only high level plots
x <- merge_low_plot(x)
html <- lapply(x, replay_html, ...)
dependencies <- unlist(lapply(html, attr, "dependencies"), recursive = FALSE)
structure(
paste0(html, collapse = ""),
dependencies = dependencies
)
}
#' @export
replay_html.NULL <- function(x, ...) ""
#' @export
replay_html.character <- function(x, ...) {
label_output(escape_html(x), "r-out")
}
#' @export
replay_html.source <- function(x, ..., classes, highlight = FALSE) {
if (highlight) {
html <- highlight(x$src, classes = classes)
}
if (!highlight || is.na(html)) {
html <- escape_html(x$src)
}
label_input(html, "r-in")
}
#' @export
replay_html.warning <- function(x, ...) {
message <- paste0(span("Warning: ", class = "warning"), escape_html(conditionMessage(x)))
label_output(message, "r-wrn")
}
#' @export
replay_html.message <- function(x, ...) {
message <- escape_html(paste0(gsub("\n$", "", conditionMessage(x))))
label_output(message, "r-msg")
}
#' @export
replay_html.error <- function(x, ...) {
if (is.null(x$call)) {
prefix <- "Error:"
} else {
prefix <- paste0("Error in ", escape_html(paste0(deparse(x$call), collapse = "")), ":")
}
message <- paste0(span(prefix, class = "error"), " ", escape_html(conditionMessage(x)))
label_output(message, "r-err")
}
#' @export
replay_html.recordedplot <- function(x, fig_save, fig_id, ...) {
fig <- fig_save(x, fig_id())
img <- paste0(
"
"
)
paste0(span(img, class = "r-plt img"), "\n")
}
# htmltools ---------------------------------------------------------------
# These will only be encountered if output_handler has a custom value
# callback that returns HTML.
#' @export
replay_html.html <- function(x, ...) {
rendered <- htmltools::renderTags(x)
structure(
rendered$html,
dependencies = rendered$dependencies,
class = "downlit_html"
)
}
#' @export
replay_html.shiny.tag <- replay_html.html
#' @export
replay_html.shiny.tag.function <- replay_html.html
#' @export
replay_html.shiny.tag.list <- replay_html.html
#' @export
replay_html.htmlwidget <- replay_html.html
# helpers -----------------------------------------------------------------
label_output <- function(x, class) {
prompt <- span("#>", class = "r-pr")
lines <- strsplit(x, "\n")[[1]]
lines <- fansi::sgr_to_html(lines)
lines <- paste0(prompt, " ", lines)
lines <- span(lines, class = paste(class, "co"))
paste0(lines, "\n", collapse = "")
}
label_input <- function(x, class) {
lines <- strsplit(x, "\n")[[1]]
lines <- span(lines, class = class)
paste0(lines, "\n", collapse = "")
}
span <- function(..., class = NULL) {
paste0(
"",
...,
""
)
}
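# Example of how these helpers compose (output shown for illustration, given
# the span() helper above):
# span("#>", class = "r-pr")
# #> "<span class='r-pr'>#></span>"
# label_input("1 + 1", "r-in")
# #> "<span class='r-in'>1 + 1</span>\n"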
unique_id <- function() {
i <- 0
function() {
i <<- i + 1
i
}
}
# Knitr functions ------------------------------------------------------------
# The functions below come from package knitr (Yihui Xie) in file plot.R
# get MD5 digests of recorded plots so that merge_low_plot works
digest_plot = function(x, level = 1) {
if (inherits(x, "otherRecordedplot"))
return(x)
if (!is.list(x) || level >= 3) return(structure(digest::digest(x),
class = "plot_digest"))
lapply(x, digest_plot, level = level + 1)
}
is_plot_output = function(x) {
evaluate::is.recordedplot(x) || inherits(x, 'otherRecordedplot')
}
# merge low-level plotting changes
merge_low_plot = function(x, idx = vapply(x, is_plot_output, logical(1L))) {
idx = which(idx); n = length(idx); m = NULL # store indices that will be removed
if (n <= 1) return(x)
# digest of recorded plots
rp_dg <- lapply(x[idx], digest_plot)
i1 = idx[1]; i2 = idx[2] # compare plots sequentially
for (i in 1:(n - 1)) {
# remove the previous plot and move its index to the next plot
if (is_low_change(rp_dg[[i]], rp_dg[[i+1]])) m = c(m, i1)
i1 = idx[i + 1]
i2 = idx[i + 2]
}
if (is.null(m)) x else x[-m]
}
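# Example of the intent: evaluating `plot(1:10); lines(1:10)` records two
# plots, where the second is the first plus a low-level addition. Because
# is_low_change() reports that the second merely extends the first, the first
# (incomplete) plot is dropped and only the finished plot is replayed.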
#' Compare two recorded plots
#'
#' @param p1,p2 Plot results
#'
#' @return Logical value indicating whether `p2` is a low-level update of `p1`.
#' @export
is_low_change = function(p1, p2) {
UseMethod("is_low_change")
}
#' @export
is_low_change.default = function(p1, p2) {
p1 = p1[[1]]; p2 = p2[[1]] # real plot info is in [[1]]
if ((n2 <- length(p2)) < (n1 <- length(p1))) return(FALSE) # length must increase
identical(p1[1:n1], p2[1:n1])
}
downlit/R/downlit-md.R 0000644 0001762 0000144 00000015766 14042074111 014345 0 ustar ligges users #' Syntax highlight and link a md document
#'
#' @description
#' `downlit_md_*` works by traversing the markdown AST generated by Pandoc.
#' It applies [highlight()] to `CodeBlock`s and [autolink()] to inline `Code`.
#'
#' Use `downlit_md_path()` to transform a file on disk; use
#' `downlit_md_string()` to transform a string containing markdown as part
#' of a larger pipeline.
#'
#' Needs pandoc 1.19 or later.
#'
#' @export
#' @param in_path,out_path Input and output paths for markdown file.
#' @param x A string containing markdown.
#' @param format Pandoc format; defaults to "gfm" if you have pandoc 2.0.0 or
#' greater, otherwise "markdown_github".
#' @return `downlit_md_path()` invisibly returns `output_path`;
#' `downlit_md_string()` returns a string containing markdown.
#' @examples
#' if (rmarkdown::pandoc_available("1.19")) {
#' downlit_md_string("`base::t()`")
#' downlit_md_string("`base::t`")
#' downlit_md_string("* `base::t`")
#'
#' # But don't highlight in headings
#' downlit_md_string("## `base::t`")
#' }
downlit_md_path <- function(in_path, out_path, format = NULL) {
check_packages()
ast_path <- tempfile()
on.exit(unlink(ast_path))
md2ast(in_path, ast_path, format = format)
ast <- jsonlite::read_json(ast_path)
ast$blocks <- transform_code(ast$blocks, ast_version(ast))
jsonlite::write_json(ast, ast_path, auto_unbox = TRUE, null = "null")
ast2md(ast_path, out_path, format = format)
}
#' @export
#' @rdname downlit_md_path
downlit_md_string <- function(x, format = NULL) {
check_packages()
path <- tempfile()
on.exit(unlink(path))
brio::write_lines(x, path)
downlit_md_path(path, path, format = format)
brio::read_file(path)
}
# Markdown <-> pandoc AST -------------------------------------------------
md2ast <- function(path, out_path, format = NULL) {
format <- format %||% md_format()
rmarkdown::pandoc_convert(
input = normalizePath(path, mustWork = FALSE),
output = normalizePath(out_path, mustWork = FALSE),
from = format,
to = "json"
)
invisible(out_path)
}
ast2md <- function(path, out_path, format = NULL) {
format <- format %||% md_format()
options <- c(
if (rmarkdown::pandoc_available("2.0")) "--eol=lf",
if (rmarkdown::pandoc_version() < "2.11.2") "--atx-headers", # 1.19-2.11.2
"--wrap=none" # 1.16
)
rmarkdown::pandoc_convert(
input = normalizePath(path, mustWork = FALSE),
output = normalizePath(out_path, mustWork = FALSE),
from = "json",
to = format,
options = options
)
invisible(out_path)
}
ast_version <- function(ast) {
string <- paste(unlist(ast$`pandoc-api-version`), collapse = ".")
package_version(string)
}
md_format <- function() {
if (rmarkdown::pandoc_available("2.0.0")) {
"gfm"
} else {
"markdown_github"
}
}
# Code transformation -----------------------------------------------------
# Data types at
# https://hackage.haskell.org/package/pandoc-types-1.20/docs/Text-Pandoc-Definition.html
transform_code <- function(x, version) {
stopifnot(is.list(x))
# Blocks that are a list of blocks
block_list <- c(
# Block
"Plain", "Para", "LineBlock", "BlockQuote", "BulletList",
# Inline
"Emph", "Strong", "Strikeout", "Superscript", "Subscript",
"SmallCaps", "Note", "Underline"
)
# Blocks that have a list of blocks as second child
block_list2 <- c(
"OrderedList", "Quoted",
"Div", "Span",
"Caption", "TableHead", "TableFoot", "Row"
)
skip <- c(
"Header", "CodeBlock", "RawBlock", "HorizontalRule", "Null",
"Math", "RawInline", "Link", "Image", "Cite",
"Str", "Space", "SoftBreak", "LineBreak"
)
if (!is_named(x)) {
lapply(x, transform_code, version = version)
} else {
if (x$t == "Code") {
package_name <- extract_curly_package(x$c[[2]])
# packages à la {pkgname}
      if (!is.na(package_name)) {
href <- href_package(package_name)
if (!is.na(href)) {
x <- list(t = "Str", c = package_name)
x <- pandoc_link(pandoc_attr(), list(x), pandoc_target(href))
} # otherwise we do not touch x
} else {
# other cases
href <- autolink_url(x$c[[2]])
if (!is.na(href)) {
x <- pandoc_link(pandoc_attr(), list(x), pandoc_target(href))
}
}
} else if (x$t == "CodeBlock") {
out <- highlight(x$c[[2]], pre_class = "chroma")
if (!is.na(out)) {
x <- pandoc_raw_block("html", out)
}
} else if (x$t %in% block_list) {
# Plain [Inline]
# Para [Inline]
# LineBlock [[Inline]]
# BlockQuote [Block]
# BulletList [[Block]]
# Emph [Inline]
# Strong [Inline]
# Strikeout [Inline]
# Superscript [Inline]
# Subscript [Inline]
# SmallCaps [Inline]
# Note [Block]
# Underline [Inline]
x$c <- lapply(x$c, transform_code, version = version)
} else if (x$t %in% block_list2) {
# OrderedList ListAttributes [[Block]]
# Quoted QuoteType [Inline]
# Div Attr [Block]
# Span Attr [Inline]
# TableHead Attr [Row]
# TableFoot Attr [Row]
# Caption (Maybe ShortCaption) [Block]
x$c[[2]] <- lapply(x$c[[2]], transform_code, version = version)
} else if (x$t %in% "Table") {
if (version >= "1.21") {
# Attr Caption [ColSpec] TableHead [TableBody] TableFoot
x$c[c(2, 4, 5, 6)] <- lapply(x$c[c(2, 4, 5, 6)], transform_code, version = version)
} else {
# [Inline] [Alignment] [Double] [TableCell] [[TableCell]]
x$c[c(1, 4, 5)] <- lapply(x$c[c(1, 4, 5)], transform_code, version = version)
}
} else if (x$t %in% "TableBody") {
# Attr RowHeadColumns [Row] [Row]
x$c[c(3, 4)] <- lapply(x$c[c(3, 4)], transform_code, version = version)
} else if (x$t %in% "Cell") {
# Attr Alignment RowSpan ColSpan [Block]
x$c[[5]] <- lapply(x$c[[5]], transform_code, version = version)
} else if (x$t %in% "DefinitionList") {
# DefinitionList [([Inline], [[Block]])]
x$c <- lapply(x$c,
function(x) list(
transform_code(x[[1]], version = version),
transform_code(x[[2]], version = version)
)
)
} else if (x$t %in% skip) {
} else {
inform(paste0("Unknown type: ", x$t))
}
x
}
}
# Pandoc AST constructors -------------------------------------------------
pandoc_node <- function(type, ...) {
list(t = type, c = list(...))
}
pandoc_raw_block <- function(format, text) {
# Format Text
pandoc_node("RawBlock", format, text)
}
pandoc_link <- function(attr, contents, target) {
# Attr [Inline] Target
pandoc_node("Link", attr, contents, target)
}
pandoc_attr <- function(id = "", classes = list(), keyval = list()) {
list(id, classes, keyval)
}
pandoc_target <- function(url, title = "") {
list(url, title)
}
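# Example (illustrative values): wrapping an inline code node in a link, as
# transform_code() does for autolinkable code. The URL here is just a sketch.
# pandoc_link(
#   pandoc_attr(),
#   list(list(t = "Str", c = "median()")),
#   pandoc_target("https://rdrr.io/r/stats/median.html")
# )
# builds list(t = "Link", c = list(attr, contents, target)), which serialises
# back into the Pandoc JSON AST.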
# Helpers -----------------------------------------------------------------
check_packages <- function() {
if (!is_installed("rmarkdown") || !is_installed("jsonlite")) {
abort("rmarkdown and jsonlite required for .md transformation")
}
}
downlit/R/highlight.R 0000644 0001762 0000144 00000026012 14255166156 014241 0 ustar ligges users #' Highlight and link a code block
#'
#' @description
#' This function:
#' * syntax highlights code
#' * links function calls to their documentation (where possible)
#' * in comments, translates ANSI escapes in to HTML equivalents.
#'
#' # Options
#'
#' downlit provides a number of options to control the details of the linking.
#' They are particularly important if you want to generate "local" links.
#'
#' * `downlit.package`: name of the current package. Determines when
#'   `topic_index` and `article_index` are used.
#'
#' * `downlit.topic_index` and `downlit.article_index`: named character
#' vector that maps from topic/article name to path.
#'
#' * `downlit.rdname`: name of current Rd file being documented (if any);
#' used to avoid self-links.
#'
#' * `downlit.attached`: character vector of currently attached R packages.
#'
#' * `downlit.local_packages`: named character vector providing relative
#' paths (value) to packages (name) that can be reached with relative links
#' from the target HTML document.
#'
#' * `downlit.topic_path` and `downlit.article_path`: paths to reference
#' topics and articles/vignettes relative to the "current" file.
#' @export
#' @param text String of code to highlight and link.
#' @param classes A mapping between token names and CSS class names.
#' Bundled `classes_pandoc()` and `classes_chroma()` provide mappings
#' that (roughly) match Pandoc and chroma (used by hugo) classes so you
#' can use existing themes.
#' @param pre_class Class(es) to give output `<pre>`.
#' @param code If `TRUE`, wrap output in a `<code>` block
#' @return If `text` is valid R code, an HTML `<pre>` tag. Otherwise,
#'   `NA`.
#' @return A string containing syntax highlighted HTML or `NA` (if `text`
#' isn't parseable).
#' @examples
#' cat(highlight("1 + 1"))
#' cat(highlight("base::t(1:3)"))
#'
#' # Unparseable R code returns NA
#' cat(highlight("base::t("))
highlight <- function(text, classes = classes_chroma(), pre_class = NULL, code = FALSE) {
parsed <- parse_data(text)
if (is.null(parsed)) {
return(NA_character_)
}
# Figure out which packages are attached to the search path. This is a
# hack because loading a package will affect code _before_ the library()
# call. But it should lead to relatively few false positives and is simple.
packages <- extract_package_attach(parsed$expr)
register_attached_packages(packages)
# Highlight, link, and escape
out <- parsed$data
out$class <- token_class(out$token, out$text, classes)
out$href <- token_href(out$token, out$text)
out$escaped <- token_escape(out$token, out$text)
# Update input - basic idea from prettycode
changed <- !is.na(out$href) | !is.na(out$class) | out$text != out$escaped
changes <- out[changed, , drop = FALSE]
loc <- line_col(parsed$text)
start <- vctrs::vec_match(data.frame(line = changes$line1, col = changes$col1), loc)
end <- vctrs::vec_match(data.frame(line = changes$line2, col = changes$col2), loc)
new <- style_token(changes$escaped, changes$href, changes$class)
out <- replace_in_place(parsed$text, start, end, replacement = new)
# Add per-line span to match pandoc
lines <- strsplit(out, "\n")[[1]]
if (length(lines) > 0) {
lines <- paste0("", lines, "")
}
out <- paste0(lines, collapse = "\n")
if (!is.null(pre_class)) {
out <- paste0(
"\n",
if (code) paste0(""),
out,
if (code) paste("
"),
"
"
)
}
Encoding(out) <- "UTF-8"
out
}
style_token <- function(x, href = NA, class = NA) {
# Split tokens in to lines
lines <- strsplit(x, "\n")
n <- lengths(lines)
xs <- unlist(lines)
href <- rep(href, n)
class <- rep(class, n)
# Add links and class
  xs <- ifelse(is.na(href), xs, paste0("<a href='", href, "'>", xs, "</a>"))
  xs <- ifelse(is.na(class), xs, paste0("<span class='", class, "'>", xs, "</span>"))
# Re-combine back into lines
new_lines <- split(xs, rep(seq_along(x), n))
map_chr(new_lines, paste0, collapse = "\n")
}
# From prettycode:::replace_in_place
replace_in_place <- function(str, start, end, replacement) {
stopifnot(
length(str) == 1, length(start) == length(end),
length(end) == length(replacement)
)
keep <- substring(str, c(1, end + 1), c(start - 1, nchar(str)))
pieces <- character(length(replacement) * 2 + 1)
even <- seq_along(replacement) * 2
odd <- c(1, even + 1)
pieces[even] <- replacement
pieces[odd] <- keep
paste0(pieces, collapse = "")
}
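# Example: splice replacements into a single string by character positions.
# replace_in_place("abcdef", start = c(2, 5), end = c(3, 5), replacement = c("XY", "Z"))
# #> [1] "aXYdZf"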
line_col <- function(x) {
char <- strsplit(x, "")[[1]]
nl <- char == "\n"
line <- cumsum(c(TRUE, nl[-length(char)]))
col <- sequence(rle(line)$lengths)
data.frame(line, col)
}
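# Example: map each character of "ab\ncd" to its line/column position.
# line_col("ab\ncd")
# #>   line col
# #> 1    1   1
# #> 2    1   2
# #> 3    1   3   (the newline itself)
# #> 4    2   1
# #> 5    2   2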
# utils::getParseData will truncate very long strings or tokens;
# this function checks for that and uses the slow
# utils::getParseText function when necessary.
getFullParseData <- function(x) {
res <- utils::getParseData(x)
truncated <- res$terminal &
substr(res$text, 1, 1) == "[" &
nchar(res$text) > 5 # 5 is arbitrary, 2 would probably be enough
if (any(truncated))
res$text[truncated] <- utils::getParseText(res, res$id[truncated])
res
}
parse_data <- function(text) {
text <- standardise_text(text)
stopifnot(is.character(text), length(text) == 1)
expr <- safe_parse(text, standardise = FALSE)
if (is.null(expr)) {
return(NULL)
}
list(text = text, expr = expr, data = getFullParseData(expr))
}
# Highlighting ------------------------------------------------------------
token_class <- function(token, text, classes) {
token <- token_type(token, text)
unname(classes[token])
}
# Collapse token types to a smaller set of categories that we care about
# for syntax highlighting
# https://github.com/wch/r-source/blob/trunk/src/main/gram.c#L511
token_type <- function(x, text) {
special <- c(
"FUNCTION",
"FOR", "IN", "BREAK", "NEXT", "REPEAT", "WHILE",
"IF", "ELSE"
)
rstudio_special <- c(
"return", "switch", "try", "tryCatch", "stop",
"warning", "require", "library", "attach", "detach",
"source", "setMethod", "setGeneric", "setGroupGeneric",
"setClass", "setRefClass", "R6Class", "UseMethod", "NextMethod"
)
x[x %in% special] <- "special"
x[x == "SYMBOL_FUNCTION_CALL" & text %in% rstudio_special] <- "special"
infix <- c(
# algebra
"'-'", "'+'", "'~'", "'*'", "'/'", "'^'",
# comparison
"LT", "GT", "EQ", "GE", "LE", "NE",
# logical
"'!'", "AND", "AND2", "OR", "OR2",
# assignment / equals
"LEFT_ASSIGN", "RIGHT_ASSIGN", "EQ_ASSIGN", "EQ_FORMALS", "EQ_SUB",
# miscellaneous
"'$'", "'@'","'~'", "'?'", "':'", "SPECIAL",
# pipes
"PIPE", "PIPEBIND"
)
x[x %in% infix] <- "infix"
parens <- c("LBB", "'['", "']'", "'('", "')'", "'{'", "'}'")
x[x %in% parens] <- "parens"
# Matches treatment of constants in RStudio
constant <- c(
"NA", "Inf", "NaN", "TRUE", "FALSE",
"NA_integer_", "NA_real_", "NA_character_", "NA_complex_"
)
x[x == "NUM_CONST" & text %in% constant] <- "constant"
x[x == "SYMBOL" & text %in% c("T", "F")] <- "constant"
x[x == "NULL_CONST"] <- "constant"
x[x == "NULL_CONST"] <- "constant"
# Treats pipe's placeholder '_' as a SYMBOL
x[x == "PLACEHOLDER"] <- "SYMBOL"
x
}
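# Example: for the code `x <- mean(1:10)`, the parser tokens are collapsed as
#   SYMBOL ("x")               -> SYMBOL               (unchanged)
#   LEFT_ASSIGN ("<-")         -> "infix"
#   SYMBOL_FUNCTION_CALL       -> SYMBOL_FUNCTION_CALL (unchanged)
#   "'('", "')'"               -> "parens"
#   "':'"                      -> "infix"
#   NUM_CONST ("1", "10")      -> NUM_CONST            (not in the constant list)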
# Pandoc styles are based on KDE default styles:
# https://docs.kde.org/stable5/en/applications/katepart/highlight.html#kate-highlight-default-styles
# But in HTML use two letter abbreviations:
# https://github.com/jgm/skylighting/blob/a1d02a0db6260c73aaf04aae2e6e18b569caacdc/skylighting-core/src/Skylighting/Format/HTML.hs#L117-L147
# Summary at
# https://docs.google.com/spreadsheets/d/1JhBtQSCtQ2eu2RepLTJONFdLEnhM3asUyMMLYE3tdYk/edit#gid=0
#
# Default syntax highlighting def for R:
# https://github.com/KDE/syntax-highlighting/blob/master/data/syntax/r.xml
#' @export
#' @rdname highlight
classes_pandoc <- function() {
c(
"constant" = "cn",
"NUM_CONST" = "fl",
"STR_CONST" = "st",
"special" = "kw",
"parens" = "op",
"infix" = "op",
"SLOT" = "va",
"SYMBOL" = "va",
"SYMBOL_FORMALS" = "va",
"NS_GET" = "fu",
"NS_GET_INT" = "fu",
"SYMBOL_FUNCTION_CALL" = "fu",
"SYMBOL_PACKAGE" = "fu",
"COMMENT" = "co"
)
}
# Derived from https://github.com/ropensci/roweb2/blob/master/themes/ropensci/static/css/pygments.css
#' @export
#' @rdname highlight
classes_chroma <- function() {
c(
"constant" = "kc",
"NUM_CONST" = "m",
"STR_CONST" = "s",
"special" = "kr",
"parens" = "o",
"infix" = "o",
"SLOT" = "nv",
"SYMBOL" = "nv",
"SYMBOL_FORMALS" = "nv",
"NS_GET" = "nf",
"NS_GET_INT" = "nf",
"SYMBOL_FUNCTION_CALL" = "nf",
"SYMBOL_PACKAGE" = "nf",
"COMMENT" = "c"
)
}
classes_show <- function(x, classes = classes_pandoc()) {
text <- paste0(deparse(substitute(x)), collapse = "\n")
out <- parse_data(text)$data
out$class <- token_class(out$token, out$text, classes)
out$class[is.na(out$class)] <- ""
out <- out[out$terminal, c("token", "text", "class")]
rownames(out) <- NULL
out
}
# Linking -----------------------------------------------------------------
token_href <- function(token, text) {
href <- rep(NA, length(token))
to_end <- length(token) - seq_along(token) + 1
# Highlight namespaced function calls. In the parsed tree, these are
# SYMBOL_PACKAGE then NS_GET/NS_GET_INT then SYMBOL_FUNCTION_CALL/SYMBOL
ns_pkg <- which(token %in% "SYMBOL_PACKAGE" & to_end > 2)
ns_fun <- ns_pkg + 2L
href[ns_fun] <- map2_chr(text[ns_fun], text[ns_pkg], href_topic)
# Then highlight all remaining calls, using loaded packages registered
  # above. These are maintained at a higher level because (e.g.) in .Rmd files
  # you want earlier library() statements to affect the highlighting of later blocks
fun <- which(token %in% "SYMBOL_FUNCTION_CALL")
fun <- setdiff(fun, ns_fun)
fun <- fun[token[fun-1] != "'$'"]
# Include custom infix operators
fun <- c(fun, which(token %in% "SPECIAL"))
# Highlight R6 instantiation
r6_new_call <- which(
text == "new" & token == "SYMBOL_FUNCTION_CALL"
)
r6_new_call <- r6_new_call[token[r6_new_call - 1] == "'$'"]
r6_new_call <- r6_new_call[token[r6_new_call - 3] == "SYMBOL"]
fun <- c(fun, r6_new_call - 3)
href[fun] <- map_chr(text[fun], href_topic_local, is_fun = TRUE)
# Highlight packages
lib_call <- which(
token == "SYMBOL_FUNCTION_CALL" &
text %in% c("library", "require") &
to_end > 3
)
pkg <- lib_call + 3 # expr + '(' + STR_CONST
href[pkg] <- map_chr(gsub("['\"]", "", text[pkg]), href_package)
href
}
map_chr <- function(.x, .f, ...) {
vapply(.x, .f, ..., FUN.VALUE = character(1), USE.NAMES = FALSE)
}
map2_chr <- function(.x, .y, .f, ...) {
vapply(seq_along(.x), function(i) .f(.x[[i]], .y[[i]], ...), character(1))
}
# Escaping ----------------------------------------------------------------
token_escape <- function(token, text) {
text <- escape_html(text)
is_comment <- token == "COMMENT"
text[is_comment] <- fansi::sgr_to_html(text[is_comment])
text
}
escape_html <- function(x) {
x <- gsub("&", "&", x)
x <- gsub("<", "<", x)
x <- gsub(">", ">", x)
x
}
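# Example:
# escape_html("if (x < 1 & y > 2) \"yes\"")
# #> "if (x &lt; 1 &amp; y &gt; 2) \"yes\""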
downlit/R/article-index.R 0000644 0001762 0000144 00000002462 13750523135 015016 0 ustar ligges users article_index <- function(package) {
if (is.null(package)) {
getOption("downlit.article_index")
} else if (devtools_loaded(package)) {
# Use live docs for in-development packages
article_index_source(package)
} else {
article_index_remote(package)
}
}
article_index_source <- function(package) {
path <- file.path(find.package(package), "vignettes")
if (!file.exists(path)) {
return(character())
}
vig_path <- dir(path, pattern = "\\.[rR]md$", recursive = TRUE)
out_path <- gsub("\\.[rR]md$", ".html", vig_path)
vig_name <- gsub("\\.[rR]md$", "", basename(vig_path))
set_names(out_path, vig_name)
}
article_index_remote <- function(package) {
# Ideally will use published metadata because that includes all articles
# not just vignettes
metadata <- remote_metadata(package)
if (!is.null(metadata) && !is.null(metadata$articles)) {
return(metadata$articles)
}
# Otherwise, fallback to vignette index
path <- system.file("Meta", "vignette.rds", package = package)
if (path == "") {
return(character())
}
meta <- readRDS(path)
name <- tools::file_path_sans_ext(meta$File)
set_names(meta$PDF, name)
}
find_article <- function(package, name) {
index <- article_index(package)
if (has_name(index, name)) {
index[[name]]
} else {
NULL
}
}
downlit/R/link.R 0000644 0001762 0000144 00000022553 14447367602 013237 0 ustar ligges users #' Automatically link inline code
#'
#' @param text String of code to highlight and link.
#' @return
#' If `text` is linkable, an HTML link for `autolink()`, or just
#' the URL for `autolink_url()`. Both return `NA` if the text is not
#' linkable.
#' @inheritSection highlight Options
#' @export
#' @examples
#' autolink("stats::median()")
#' autolink("vignette('grid', package = 'grid')")
#'
#' autolink_url("stats::median()")
autolink <- function(text) {
href <- autolink_url(text)
if (identical(href, NA_character_)) {
return(NA_character_)
}
paste0("", escape_html(text), "")
}
#' @export
#' @rdname autolink
autolink_url <- function(text) {
expr <- safe_parse(text)
if (length(expr) == 0) {
return(NA_character_)
}
href_expr(expr[[1]])
}
autolink_curly <- function(text) {
package_name <- extract_curly_package(text)
if (is.na(package_name)) {
return(NA_character_)
}
href <- href_package(package_name)
if (is.na(href)) {
return(NA_character_)
}
paste0("", package_name, "")
}
# Helper for testing
href_expr_ <- function(expr, ...) {
href_expr(substitute(expr), ...)
}
href_expr <- function(expr) {
if (!is_call(expr)) {
return(NA_character_)
}
fun <- expr[[1]]
if (is_call(fun, "::", n = 2)) {
pkg <- as.character(fun[[2]])
fun <- fun[[3]]
} else {
pkg <- NULL
}
if (!is_symbol(fun))
return(NA_character_)
fun_name <- as.character(fun)
n_args <- length(expr) - 1
if (n_args == 0) {
href_topic(fun_name, pkg, is_fun = TRUE)
} else if (fun_name %in% c("library", "require", "requireNamespace")) {
simple_call <- n_args == 1 &&
is.null(names(expr)) &&
(is_string(expr[[2]]) || (fun_name != "requireNamespace") && is_symbol(expr[[2]]))
if (simple_call) {
pkg <- as.character(expr[[2]])
topic <- href_package(pkg)
if (is.na(topic)) {
href_topic(fun_name)
} else {
topic
}
} else {
href_topic(fun_name, is_fun = TRUE)
}
} else if (fun_name == "vignette" && n_args >= 1) {
# vignette("foo", "package")
expr <- match.call(utils::vignette, expr)
topic_ok <- is.character(expr$topic)
package_ok <- is.character(expr$package) || is.null(expr$package)
if (topic_ok && package_ok) {
href_article(expr$topic, expr$package)
} else {
NA_character_
}
} else if (fun_name == "?" && n_args == 1) {
topic <- expr[[2]]
if (is_call(topic, "::")) {
# ?pkg::x
href_topic(as.character(topic[[3]]), as.character(topic[[2]]))
} else if (is_symbol(topic) || is_string(topic)) {
# ?x
href_topic(as.character(expr[[2]]))
} else {
NA_character_
}
} else if (fun_name == "?" && n_args == 2) {
# package?x
href_topic(paste0(expr[[3]], "-", expr[[2]]))
} else if (fun_name == "help" && n_args >= 1) {
expr <- match.call(utils::help, expr)
if (is_help_literal(expr$topic) && is_help_literal(expr$package)) {
href_topic(as.character(expr$topic), as.character(expr$package))
} else if (is_help_literal(expr$topic) && is.null(expr$package)) {
href_topic(as.character(expr$topic))
} else if (is.null(expr$topic) && is_help_literal(expr$package)) {
href_package_ref(as.character(expr$package))
} else {
NA_character_
}
} else if (fun_name == "::" && n_args == 2) {
href_topic(as.character(expr[[3]]), as.character(expr[[2]]))
} else {
NA_character_
}
}
is_help_literal <- function(x) is_string(x) || is_symbol(x)
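# Illustrative examples (actual URLs depend on installed packages and on the
# metadata downlit can discover, so these are sketches rather than guaranteed
# output; "mypkg" is hypothetical):
# href_expr_(stats::median())                       # reference page for stats::median
# href_expr_(vignette("intro", package = "mypkg"))  # that package's vignette, if found
# href_expr_(library(downlit))                      # the package's website, if one is found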
# Topics ------------------------------------------------------------------
#' Generate url for topic/article/package
#'
#' @param topic,article Topic/article name
#' @param package Optional package name. If not supplied, will search
#' in all attached packages.
#' @param is_fun Only return topics that are (probably) for functions.
#' @keywords internal
#' @export
#' @return URL topic or article; `NA` if can't find one.
#' @examples
#' href_topic("t")
#' href_topic("DOESN'T EXIST")
#' href_topic("href_topic", "downlit")
#'
#' href_package("downlit")
href_topic <- function(topic, package = NULL, is_fun = FALSE) {
if (length(topic) != 1L) {
return(NA_character_)
}
if (is_package_local(package)) {
href_topic_local(topic, is_fun = is_fun)
} else {
href_topic_remote(topic, package)
}
}
is_package_local <- function(package) {
if (is.null(package)) {
return(TRUE)
}
cur <- getOption("downlit.package")
if (is.null(cur)) {
return(FALSE)
}
package == cur
}
href_topic_local <- function(topic, is_fun = FALSE) {
rdname <- find_rdname(NULL, topic)
if (is.null(rdname)) {
# Check attached packages
loc <- find_rdname_attached(topic, is_fun = is_fun)
if (is.null(loc)) {
return(NA_character_)
} else {
return(href_topic_remote(topic, loc$package))
}
}
if (rdname == "reexports") {
return(href_topic_reexported(topic, getOption("downlit.package")))
}
cur_rdname <- getOption("downlit.rdname", "")
if (rdname == cur_rdname) {
return(NA_character_)
}
if (cur_rdname != "") {
paste0(rdname, ".html")
} else {
paste0(getOption("downlit.topic_path"), rdname, ".html")
}
}
href_topic_remote <- function(topic, package) {
rdname <- find_rdname(package, topic)
if (is.null(rdname)) {
return(NA_character_)
}
if (is_reexported(topic, package)) {
href_topic_reexported(topic, package)
} else {
paste0(href_package_ref(package), "/", rdname, ".html")
}
}
is_reexported <- function(name, package) {
if (package == "base") {
return(FALSE)
}
is_imported <- env_has(ns_imports_env(package), name)
is_imported && is_exported(name, package)
}
is_exported <- function(name, package) {
name %in% getNamespaceExports(ns_env(package))
}
# If it's a re-exported function, we need to work a little harder to
# find out its source so that we can link to it.
href_topic_reexported <- function(topic, package) {
ns <- ns_env(package)
if (!env_has(ns, topic, inherit = TRUE)) {
return(NA_character_)
}
obj <- env_get(ns, topic, inherit = TRUE)
ex_package <- find_reexport_source(obj, ns, topic)
# Give up if we're stuck in an infinite loop
if (package == ex_package) {
return(NA_character_)
}
href_topic_remote(topic, ex_package)
}
find_reexport_source <- function(obj, ns, topic) {
if (is.primitive(obj)) {
# primitive functions all live in base
"base"
} else if (is.function(obj)) {
## For functions, we can just take their environment.
ns_env_name(get_env(obj))
} else {
## For other objects, we need to check the import env of the package,
## to see where 'topic' is coming from. The import env has redundant
## information. It seems that we just need to find a named list
## entry that contains `topic`.
imp <- getNamespaceImports(ns)
imp <- imp[names(imp) != ""]
wpkgs <- vapply(imp, `%in%`, x = topic, FUN.VALUE = logical(1))
if (!any(wpkgs)) {
return(NA_character_)
}
pkgs <- names(wpkgs)[wpkgs]
# Take the last match, in case imports have name clashes.
pkgs[[length(pkgs)]]
}
}
# Articles ----------------------------------------------------------------
#' @export
#' @rdname href_topic
href_article <- function(article, package = NULL) {
if (is_package_local(package)) {
path <- find_article(NULL, article)
if (!is.null(path)) {
return(paste0(getOption("downlit.article_path"), path))
}
}
if (is.null(package)) {
package <- find_vignette_package(article)
if (is.null(package)) {
return(NA_character_)
}
}
path <- find_article(package, article)
if (is.null(path)) {
return(NA_character_)
}
base_url <- remote_package_article_url(package)
if (!is.null(base_url)) {
paste0(base_url, "/", path)
} else if (is_bioc_pkg(package)) {
paste0("https://bioconductor.org/packages/release/bioc/vignettes/", package, "/inst/doc/", path)
} else {
paste0("https://cran.rstudio.com/web/packages/", package, "/vignettes/", path)
}
}
# Returns NA if package is not installed.
# Returns TRUE if `package` is from Bioconductor, FALSE otherwise
is_bioc_pkg <- function(package) {
if (!rlang::is_installed(package)) {
return(FALSE)
}
biocviews <- utils::packageDescription(package, fields = "biocViews")
!is.na(biocviews) && biocviews != ""
}
# Try to figure out package name from attached packages
find_vignette_package <- function(x) {
for (pkg in getOption("downlit.attached")) {
if (!is_installed(pkg)) {
next
}
info <- tools::getVignetteInfo(pkg)
if (x %in% info[, "Topic"]) {
return(pkg)
}
}
NULL
}
# Packages ----------------------------------------------------------------
#' @export
#' @rdname href_topic
href_package <- function(package) {
urls <- package_urls(package)
if (length(urls) == 0) {
NA_character_
} else {
urls[[1]]
}
}
href_package_ref <- function(package) {
reference_url <- remote_package_reference_url(package)
if (!is.null(reference_url)) {
reference_url
} else {
# Fall back to rdrr.io
if (is_base_package(package)) {
paste0("https://rdrr.io/r/", package)
} else {
paste0("https://rdrr.io/pkg/", package, "/man")
}
}
}
is_base_package <- function(x) {
x %in% c(
"base", "compiler", "datasets", "graphics", "grDevices", "grid",
"methods", "parallel", "splines", "stats", "stats4", "tcltk",
"tools", "utils"
)
}
downlit/NEWS.md 0000644 0001762 0000144 00000011315 14447400120 013025 0 ustar ligges users # downlit 0.4.3
* Fix for upcoming R-devel (#169).
# downlit 0.4.2
* `highlight()` no longer errors if a package imputed to have been attached
isn't installed.
* Correctly link `requireNamespace(MASS)` (#151).
# downlit 0.4.1
## Syntax highlighting
* Supports new base pipe `|>` syntax (#126).
* Every line gets its own `<span>` to match pandoc (#122).
* Multi-line tokens (e.g. strings) now get a `<span>` per line (#139).
* Very long strings or other tokens are no longer truncated (@dmurdoch, #128).
## Auto-linking
* Function calls (in inline code and code blocks) no longer link to non-function
  topics (#135).
* Re-exports detection no longer relies on name of `.Rd` file (#134).
* Link to correct topic with `::()` and `utils::help()` (@IndrajeetPatil, #131).
* Generate correct link for Bioconductor vignettes (@zeehio, #145).
# downlit 0.4.0
## Syntax highlighting
* Messages, warnings, and errors now get a much more minimal style.
Messages are styled the same way as output; and warnings and errors
only style the "Warning" or "Error" text. This makes these types of
output easier to read, and easier for package developers to style
themselves.
* Messages, warnings, and errors now retrieve their text using
`conditionMessage()`, which supports more advanced types of conditions
(@davidchall, #100).
* The overall structure of the syntax highlighting has been overhauled.
  Now each line is wrapped in a `<span>` with class `r-in` (input code),
  `r-out` (output printed to console), `r-plt` (plots), `r-msg` (messages),
  `r-wrn` (warnings), and `r-err` (errors). Additionally, the prompt (`#>`)
  is wrapped in a `<span>` with class `r-pr`. Altogether, these changes
should give much more flexibility for styling with CSS (#90).
* ANSI escapes no longer generate invalid HTML (#79).
* Changes to better support for HTML widgets and rgl in pkgdown
(@dmurdoch, #78). In particular, `evaluate_and_highlight()` now returns
an additional attribute containing all extra dependencies needed to render
the returned HTML.
## Auto-linking
* Packages attached when you call `library(package)` (including those attached
  by the tidyverse) are now taken into account when autolinking (#87).
* In code blocks, custom infix operators are now autolinked (#89).
  In inline code, infix operators are no longer autolinked; this led to too
many false positives. You can still link with (e.g.) `?"-"` (#86).
* Inline calls with arguments (e.g. `foo(1, 2)`) are no longer auto-linked,
as intended and documented (#82).
* Autolinking `vignette(foo, package = "pkg")` no longer errors if `pkg` is not
installed (#85).
* Unusual help calls (e.g. `help(a$b)`), no longer generate errors (#77).
* Rare warning about incomplete final line in `autolink_url("pkgname::foo")`
is now suppressed (@dmurdoch, pkgdown#1419).
* `href_package()` is now exported (#103).
* Auto-linking is more likely to succeed when the remote package is not
  installed, as downlit now looks for the URL using `tools::CRAN_package_db()`
  for CRAN packages, and `available.packages()` for packages installed
from non-CRAN repos (@ARawles, #108).
* Functions in HTML `` elements are no longer autolinked
(@gadenbuie, #105).
# downlit 0.2.1
* When auto-linking `vignette(foo)`, downlit now looks for a vignette named
foo in the packages it knows to be attached (#61).
* Can now force highlighting of any `<pre>` by wrapping it inside a `<div>`
  with `class = "downlit"`. This is useful in cases where it may otherwise
  be hard to set the class of the `<pre>`.
* In comments, `\u2029` is converted to `\033` to make it possible to preserve
ANSI escapes when passed through xml2.
* No longer errors on library calls with named but empty arguments.
# downlit 0.2.0
* Autolinking can use metadata stored in the package itself with the pkgdown setting
`deploy.install_metadata`; this is useful for packages that only have
private websites (@matthewstrasiotto, #29)
* Autolinking guesses reference and article URLs for pkgdown sites that haven't
  set `url` (@krlmlr, #44).
* R6 classes are autolinked when a new object is created, i.e. in
`r6_object$new()`, `r6_object` will link to its docs (#59, @maelle).
* R6 methods are no longer autolinked as if they were functions of the same
name (#54, @maelle).
* `classes_pandoc()` and `classes_chroma()` have been thoroughly reviewed to
produce syntax highlighting as similar as possible to RStudio.
* `downlit_html_path()` has a more flexible XPath identifying R code blocks,
and a `classes` argument (#53, @maelle, @cderv)
* Trailing `/` are no longer stripped from URLs (#45, @krlmlr).
* Removed extra newline in `<pre>` output (#42, @krlmlr).
# downlit 0.1.0
* Added a `NEWS.md` file to track changes to the package.
downlit/MD5 0000644 0001762 0000144 00000007024 14447404110 012244 0 ustar ligges users eddb1762a6edaebf455869fee31621dd *DESCRIPTION
5174dfc514f0941d2edd4b0b4c9941dd *LICENSE
5f93eb2a34eccb91bb5379c84dba1935 *NAMESPACE
95d72ebdf8ae39f21abf05b912bca1d0 *NEWS.md
da7b2ae5c8bbe42fa41d6a0dc8c12628 *R/article-index.R
5877ac0a0e09840784081acfc0a9c833 *R/downlit-html.R
f25d217ea1af9b0c2849ca7474a37482 *R/downlit-md.R
3d9b3aee602ef40039102704bf6943d6 *R/downlit-package.R
1222eb77e1a14a705266a604c35f182e *R/evaluate.R
f56891d9088454f0896a0f29b5a30883 *R/highlight.R
862af0012e5f333a6545f4fdb5f8d75f *R/link.R
ea2e8dd1b7c0f078d77e31adfc73e97b *R/metadata.R
8c6310047b0d8d3a2fc87fe771abdcbb *R/packages.R
2797b5e117659cbf98c03cc126c091c7 *R/rd.R
e9e788b6c703d7c5dc19d855d3898a4f *R/test-helpers.R
6e459cd330c4500728ef240cbb0fa3c9 *R/topic-index.R
dbc5e17fb06c399c8d777ab5e8d5444a *R/utils.R
54260ad0b69c0d945b654f03124f520b *R/zzz.R
a859213845b1bd2fe5103c6c94bc6a8e *README.md
2f5ded4720a7ee335439f52c56978e56 *inst/WORDLIST
2f8ad11d48a00bc5c546498f66647399 *man/autolink.Rd
201f00d46732504e00ad7eda19a951c4 *man/downlit-package.Rd
386d3ce7a51b5730015576d9515e722c *man/downlit_html_path.Rd
a5e785e4b0d72e64d028bea09f69e5a0 *man/downlit_md_path.Rd
d8a95da1a4b1acaca630fc1006409fa1 *man/evaluate_and_highlight.Rd
1da661de3745613b4292cc76acc65ee5 *man/highlight.Rd
62cfd33db37ea134463d39f20ec6f9bc *man/href_topic.Rd
5897699c6c5697490bd8e96a2552200b *man/is_low_change.Rd
57552b43f20ac10efa0ea1c0d326fd25 *tests/testthat.R
d5636e9a01f46b5018eaa6a85b695351 *tests/testthat/_snaps/downlit-html.md
7307b352258855d80370d252e7bf350b *tests/testthat/_snaps/evaluate.md
5cb84cbc1983602902d93661d2234f63 *tests/testthat/_snaps/highlight.md
1e058d3a949c29a03d8a6d6e68cae954 *tests/testthat/autolink.html
6d8c7188032c6043863c475c0763e4b8 *tests/testthat/fake-repo/src/contrib/PACKAGES
a9414277155b5cb45700110e1975f57a *tests/testthat/index/DESCRIPTION
92f87050d2894835fbd50002fc4a28b0 *tests/testthat/index/LICENSE
2cb41eabeb5cfba6a8ba143c4bb472dc *tests/testthat/index/LICENSE.md
a75c6241d729146e8dccca76ba49b2d4 *tests/testthat/index/NAMESPACE
e948a22eabd9accca8b07baad7cc695d *tests/testthat/index/R/basic.R
b4e842ed76441c222cd7b1fd24a3ecf8 *tests/testthat/index/index.Rproj
c645215df63b558e0388839ce4783544 *tests/testthat/index/man/a.Rd
576ad046290a96fd2a54e9bb258037ee *tests/testthat/index/man/b.Rd
b6cf9ccb2335d528d46fd530e420b29e *tests/testthat/index/vignettes/test-1.Rmd
dc56a11410d7de3cc75bef53b3b8d7fc *tests/testthat/index/vignettes/test-2.Rmd
d07e39cd2f266c6a2c31b9ba0b09a36f *tests/testthat/markdown-definition.md
897a79e3540dd87cafd229077eb4548e *tests/testthat/markdown-table.md
811364c4270da5fb381510f780198d2f *tests/testthat/test-article-index.R
2c6508fdef2b3701e063b8ec6d96bb5e *tests/testthat/test-downlit-html.R
62cec245efaf9cb62b6c8d376f85d1ac *tests/testthat/test-downlit-html.txt
3099dd991ee9f63453a349c6550505bc *tests/testthat/test-downlit-md-v20.txt
f948f865dfeff71e91cf5b2cff9fefcb *tests/testthat/test-downlit-md-v21.txt
d90193a2bdf2fd626b6b21af6477326c *tests/testthat/test-downlit-md.R
2dea158374a8df3a3ea316487edbfe29 *tests/testthat/test-downlit-md.txt
d0cabd8ab44db99ac62c99265a103573 *tests/testthat/test-evaluate.R
1ab3e137a38d3cb4acd6b7f71ffadb4c *tests/testthat/test-highlight.R
1eae92541c8a5be0f6a7027797415829 *tests/testthat/test-highlight.txt
89797c2e247366dd2688760dd2bbfecb *tests/testthat/test-link.R
24913ce190cca89c8fdc229db570d900 *tests/testthat/test-metadata.R
cc46254ea8c668499f14de29dfb6d540 *tests/testthat/test-packages.R
8e8b4619a99799d99e2d97d454980535 *tests/testthat/test-topic-index.R
38079bf196a257cfc6a325c7d872dfef *tests/testthat/test-utils.R
downlit/inst/ 0000755 0001762 0000144 00000000000 13731234742 012715 5 ustar ligges users downlit/inst/WORDLIST 0000644 0001762 0000144 00000000144 13731234742 014106 0 ustar ligges users AST
bookdown
chroma
dplyr
dplyr's
fansi
findable
hugo
hugodown
linkable
md
Pandoc
pkgdown
RMarkdown