downlit/0000755000175000017500000000000014137047030012063 5ustar nileshnileshdownlit/MD50000644000175000017500000000702414137047030012376 0ustar nileshnilesh96c940b831a9a88be136b431c2c4a224 *DESCRIPTION 5174dfc514f0941d2edd4b0b4c9941dd *LICENSE 5f93eb2a34eccb91bb5379c84dba1935 *NAMESPACE efa8222ec23a389a1e840a5afedacf62 *NEWS.md da7b2ae5c8bbe42fa41d6a0dc8c12628 *R/article-index.R 5877ac0a0e09840784081acfc0a9c833 *R/downlit-html.R f25d217ea1af9b0c2849ca7474a37482 *R/downlit-md.R 3d9b3aee602ef40039102704bf6943d6 *R/downlit-package.R 8c6f83ec8b8c6799525c9b39e67db884 *R/evaluate.R 06308dcc53091dac99e3f99520322a1b *R/highlight.R 5db95bb90713ac437177862e8143c057 *R/link.R c9a52c3082ca3e233141a3581ef32b12 *R/metadata.R b5ee7c2f0fc66dab601d701ea409cd15 *R/packages.R 2797b5e117659cbf98c03cc126c091c7 *R/rd.R e9e788b6c703d7c5dc19d855d3898a4f *R/test-helpers.R 8b6b004c4423cb52e5dca0a95d87dc65 *R/topic-index.R 230bc1df65ed8461a429c65a1e053f0e *R/utils.R 2ec11320046c0421694f5b7e727d112b *R/zzz.R 2de5575bc1aa745881f2381c2dd0c2c8 *README.md 2f5ded4720a7ee335439f52c56978e56 *inst/WORDLIST 2f8ad11d48a00bc5c546498f66647399 *man/autolink.Rd 5c6d02e15a211516a6902357541b5269 *man/downlit-package.Rd 386d3ce7a51b5730015576d9515e722c *man/downlit_html_path.Rd a5e785e4b0d72e64d028bea09f69e5a0 *man/downlit_md_path.Rd 8643aec467973d1023746923bf6b086d *man/evaluate_and_highlight.Rd 1da661de3745613b4292cc76acc65ee5 *man/highlight.Rd 19c6211bdca051f37bf19dfbe8e7bb8e *man/href_topic.Rd 5897699c6c5697490bd8e96a2552200b *man/is_low_change.Rd 57552b43f20ac10efa0ea1c0d326fd25 *tests/testthat.R a4b75f09868462f2f43b5c7098588ab2 *tests/testthat/_snaps/downlit-html.md ae13abd9da946e66e8e0ffd7f0bf8d11 *tests/testthat/_snaps/evaluate.md 7b56a7bf3c95fdc7261d952942b4bfcf *tests/testthat/_snaps/highlight.md 1e058d3a949c29a03d8a6d6e68cae954 *tests/testthat/autolink.html 1f03de7136ee7f8748e120b46a5114ad *tests/testthat/fake-repo/src/contrib/PACKAGES a9414277155b5cb45700110e1975f57a 
*tests/testthat/index/DESCRIPTION 92f87050d2894835fbd50002fc4a28b0 *tests/testthat/index/LICENSE 2cb41eabeb5cfba6a8ba143c4bb472dc *tests/testthat/index/LICENSE.md a75c6241d729146e8dccca76ba49b2d4 *tests/testthat/index/NAMESPACE e948a22eabd9accca8b07baad7cc695d *tests/testthat/index/R/basic.R b4e842ed76441c222cd7b1fd24a3ecf8 *tests/testthat/index/index.Rproj c645215df63b558e0388839ce4783544 *tests/testthat/index/man/a.Rd 576ad046290a96fd2a54e9bb258037ee *tests/testthat/index/man/b.Rd b6cf9ccb2335d528d46fd530e420b29e *tests/testthat/index/vignettes/test-1.Rmd dc56a11410d7de3cc75bef53b3b8d7fc *tests/testthat/index/vignettes/test-2.Rmd d07e39cd2f266c6a2c31b9ba0b09a36f *tests/testthat/markdown-definition.md 897a79e3540dd87cafd229077eb4548e *tests/testthat/markdown-table.md 811364c4270da5fb381510f780198d2f *tests/testthat/test-article-index.R 2c6508fdef2b3701e063b8ec6d96bb5e *tests/testthat/test-downlit-html.R 73450f6786b42594852aef3f73bdc2c7 *tests/testthat/test-downlit-html.txt c2de778b9cb9e3f7e17627fceea88b76 *tests/testthat/test-downlit-md-v20.txt 209a7cc1aebf59d07d459e3cf1dbce3c *tests/testthat/test-downlit-md-v21.txt d90193a2bdf2fd626b6b21af6477326c *tests/testthat/test-downlit-md.R 2dea158374a8df3a3ea316487edbfe29 *tests/testthat/test-downlit-md.txt d0cabd8ab44db99ac62c99265a103573 *tests/testthat/test-evaluate.R ffe1c73e5ed49b46e767e8024b53b3ef *tests/testthat/test-highlight.R 8e6a78dc1deac23fefee82dd03daf1c8 *tests/testthat/test-highlight.txt 2b17e6740c1db3e204c365c8a01d77d6 *tests/testthat/test-link.R 36ce317d4c7df18d68602a9f121db4de *tests/testthat/test-metadata.R 37e5e358914123915a68da1288fb467a *tests/testthat/test-packages.R e5250d266402d0cbb40ecccd1f8e9dbe *tests/testthat/test-topic-index.R 38079bf196a257cfc6a325c7d872dfef *tests/testthat/test-utils.R downlit/NEWS.md0000644000175000017500000000762114137033425013172 0ustar nileshnilesh# downlit 0.4.0 # downlit 0.3.0 ## Syntax highlighting * Messages, warnings, and errors now get a much more minimal style. 
  Messages are styled the same way as output, and warnings and errors only style the "Warning" or "Error" text. This makes these types of output easier to read, and easier for package developers to style themselves.

* Messages, warnings, and errors now retrieve their text using `conditionMessage()`, which supports more advanced types of conditions (@davidchall, #100).

* The overall structure of the syntax highlighting has been overhauled. Now each line is wrapped in a `<span>` with class `r-in` (input code), `r-out` (output printed to console), `r-plot` (plots), `r-msg` (messages), `r-wrn` (warnings), and `r-err` (errors). Additionally, the prompt (`#>`) is wrapped in a `<span>`. Altogether, these changes should give much more flexibility for styling with CSS (#90).

* ANSI escapes no longer generate invalid HTML (#79).

* Changes to better support HTML widgets and rgl in pkgdown (@dmurdoch, #78). In particular, `evaluate_and_highlight()` now returns an additional attribute containing all extra dependencies needed to render the returned HTML.

## Auto-linking

* Packages attached when you call `library(package)` (including by the tidyverse) are now taken into account when autolinking (#87).

* In code blocks, custom infix operators are now autolinked (#89). In inline code, infix operators are no longer autolinked; this led to too many false positives. You can still link with (e.g.) `?"-"` (#86).

* Inline calls with arguments (e.g. `foo(1, 2)`) are no longer auto-linked, as intended and documented (#82).

* Autolinking `vignette(foo, package = "pkg")` no longer errors if `pkg` is not installed (#85).

* Unusual help calls (e.g. `help(a$b)`) no longer generate errors (#77).

* Rare warning about incomplete final line in `autolink_url("pkgname::foo")` is now suppressed (@dmurdoch, pkgdown#1419).

* `href_package()` is now exported (#103).
* Auto-linking is more likely to succeed when the remote package is not
  installed, as downlit now looks for the URL using `tools::CRAN_package_db()`
  for CRAN packages, and `available.packages()` for packages installed from
  non-CRAN repos (@ARawles, #108).

* Functions in HTML `<a>` elements are no longer autolinked (@gadenbuie, #105).

# downlit 0.2.1

* When auto-linking `vignette(foo)`, downlit now looks for a vignette named
  foo in the packages it knows to be attached (#61).

* Can now force highlighting of any `<pre>` by wrapping it inside a `<div>`
  with `class = "downlit"`. This is useful in cases where it may otherwise be
  hard to set the class of the `<pre>`.

* In comments, `\u2029` is converted to `\033` to make it possible to preserve
  ANSI escapes when passed through xml2.

* No longer errors on library calls with named but empty arguments.

# downlit 0.2.0

* Autolinking can use metadata stored in the package itself with the pkgdown
  setting `deploy.install_metadata`; this is useful for packages that only
  have private websites (@matthewstrasiotto, #29).

* Autolinking guesses reference and article URLs for pkgdown sites that
  haven't set `url` (@krlmlr, #44).

* R6 classes are autolinked when a new object is created i.e. in 
  `r6_object$new()`, `r6_object` will link to its docs (#59, @maelle). 

* R6 methods are no longer autolinked as if they were functions of the same 
  name (#54, @maelle).

* `classes_pandoc()` and `classes_chroma()` have been thoroughly reviewed to
  produce syntax highlighting as similar as possible to RStudio.

* `downlit_html_path()` has a more flexible XPath identifying R code blocks,
  and a `classes` argument (#53, @maelle, @cderv).

* Trailing `/` are no longer stripped from URLs (#45, @krlmlr).

* Removed extra newline in `<pre>` output (#42, @krlmlr).

# downlit 0.1.0

* Added a `NEWS.md` file to track changes to the package.
downlit/DESCRIPTION0000644000175000017500000000222014137047030013565 0ustar  nileshnileshPackage: downlit
Title: Syntax Highlighting and Automatic Linking
Version: 0.4.0
Authors@R: c(
    person("Hadley", "Wickham", , "hadley@rstudio.com", role = c("aut", "cre")),
    person("RStudio", role = "cph")
  )
Description: Syntax highlighting of R code, specifically designed for the
    needs of 'RMarkdown' packages like 'pkgdown', 'hugodown', and
    'bookdown'. It includes linking of function calls to their
    documentation on the web, and automatic translation of ANSI escapes in
    output to the equivalent HTML.
License: MIT + file LICENSE
URL: https://downlit.r-lib.org/, https://github.com/r-lib/downlit
BugReports: https://github.com/r-lib/downlit/issues
Depends: R (>= 3.4.0)
Imports: brio, desc, digest, evaluate, fansi, memoise, rlang, vctrs,
        yaml
Suggests: covr, htmltools, jsonlite, leaflet, MASS, pkgload, rmarkdown,
        testthat (>= 3.0.0), xml2
Config/testthat/edition: 3
Encoding: UTF-8
RoxygenNote: 7.1.2
NeedsCompilation: no
Packaged: 2021-10-29 18:02:03 UTC; hadley
Author: Hadley Wickham [aut, cre],
  RStudio [cph]
Maintainer: Hadley Wickham <hadley@rstudio.com>
Repository: CRAN
Date/Publication: 2021-10-29 19:40:08 UTC
downlit/README.md0000644000175000017500000000732614137022042013346 0ustar  nileshnilesh# downlit


[![R-CMD-check](https://github.com/r-lib/downlit/workflows/R-CMD-check/badge.svg)](https://github.com/r-lib/downlit/actions)
[![Codecov test coverage](https://codecov.io/gh/r-lib/downlit/branch/main/graph/badge.svg)](https://app.codecov.io/gh/r-lib/downlit?branch=main)
[![CRAN status](https://www.r-pkg.org/badges/version/downlit)](https://CRAN.R-project.org/package=downlit)


The goal of downlit is to provide syntax highlighting and automatic linking of R code in a way that is easily used from RMarkdown packages like [pkgdown](https://pkgdown.r-lib.org/), [bookdown](https://bookdown.org), and [hugodown](https://hugodown.r-lib.org/).

## Installation

Install downlit from CRAN with:

```r
install.packages("downlit")
```

## Features

downlit has two slightly different highlighting/linking engines:

* `highlight()` works with multiline code blocks and does syntax highlighting,
  function linking, and comment styling.
* `autolink()` works with inline code and only does linking.

Multiline code blocks have:

* Code syntax highlighted using R's parser.
* Function calls automatically linked to their corresponding documentation.
* Comments styled by transforming ANSI escape sequences to their HTML 
  equivalents (thanks to the [fansi](https://github.com/brodieG/fansi) package).
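
A minimal sketch of `highlight()` applied to such a block (assuming downlit is installed; the exact CSS classes in the output depend on the `classes` mapping you pass):

```r
library(downlit)

# Highlight a small block: the call to stats::median() is linked to its
# documentation, and the ANSI escape in the comment becomes HTML styling.
cat(highlight("x <- stats::median(1:10)  # \033[34mblue\033[39m"))

# Unparseable R code returns NA instead of erroring
highlight("base::t(")
```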

The following forms of inline code are recognized and automatically linked:

* `fun()`, `pkg::fun()`.
* `?fun`, `pkg::fun`, `type?topic`.
* `help("fun")`, `help("fun", package = "package")`, `help(package = "package")`.
* `vignette("name")`, `vignette("name", package = "package")`.
* `library(package)`, `require(package)`, `requireNamespace("package")`.
* `{package}` gets linked (if possible) _and formatted as plain text_. 
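
For instance (a sketch, assuming downlit is installed; resolving links for non-base packages may need a network connection):

```r
library(downlit)

# Linkable inline code returns an HTML <a> tag...
autolink("stats::median()")

# ...or just the URL, via autolink_url()
autolink_url("stats::median()")

# Text that can't be linked returns NA
autolink_url("foo")
```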

### Cross-package links

If downlit can find a pkgdown site for the remote package, it will link to it; otherwise it will link to <https://rdrr.io> for documentation, and CRAN for vignettes. In order for a pkgdown site to be findable, it needs to be listed in two places:

*   In the `URL` field in the `DESCRIPTION`, as in
    [dplyr](https://github.com/tidyverse/dplyr/blob/85faf79c1fd74f4b4f95319e5be6a124a8075502/DESCRIPTION#L15):
  
    ```
    URL: https://dplyr.tidyverse.org, https://github.com/tidyverse/dplyr
    ```

*   In the `url` field in `_pkgdown.yml`, as in 
    [dplyr](https://github.com/tidyverse/dplyr/blob/master/_pkgdown.yml#L1)
    
    ```yaml
    url: https://dplyr.tidyverse.org
    ```
    
    When this field is defined, pkgdown generates a public facing
    [`pkgdown.yml` file](https://dplyr.tidyverse.org/pkgdown.yml) that 
    provides metadata about the site:
    
    ```yaml
    pandoc: '2.2'
    pkgdown: 1.3.0
    pkgdown_sha: ~
    articles:
      compatibility: compatibility.html
      dplyr: dplyr.html
    urls:
      reference: https://dplyr.tidyverse.org/reference
      article: https://dplyr.tidyverse.org/articles
    ```

So when you build a pkgdown site that links to the dplyr documentation (e.g., `dplyr::mutate()`), pkgdown looks first in dplyr's `DESCRIPTION` to find its website, then it looks for `pkgdown.yml`, and uses the metadata to generate the correct links.
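
You can watch this resolution from R (a sketch; the exact URL returned depends on the metadata the dplyr site publishes, and the lookup needs a network connection the first time):

```r
# Ask downlit where dplyr::mutate() documentation lives: it consults dplyr's
# DESCRIPTION URL field, then the site's pkgdown.yml metadata, and falls back
# to rdrr.io if no pkgdown site can be found.
downlit::autolink_url("dplyr::mutate()")
```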

## Usage

downlit is designed to be used by other packages, and I expect most people will use it via another package (e.g. [hugodown](https://github.com/r-lib/hugodown)). If you want to use it in your own package, you'll typically want to apply it as part of some bigger transformation process. You can get some sense of how this might work by reading the source code of [`downlit_html()`](https://github.com/r-lib/downlit/blob/master/R/downlit-html.R) and [`downlit_md()`](https://github.com/r-lib/downlit/blob/master/R/downlit-md.R), which transform HTML and markdown documents respectively.
downlit/man/0000755000175000017500000000000014136771000012636 5ustar  nileshnileshdownlit/man/downlit_md_path.Rd0000644000175000017500000000245714042074111016304 0ustar  nileshnilesh% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/downlit-md.R
\name{downlit_md_path}
\alias{downlit_md_path}
\alias{downlit_md_string}
\title{Syntax highlight and link a md document}
\usage{
downlit_md_path(in_path, out_path, format = NULL)

downlit_md_string(x, format = NULL)
}
\arguments{
\item{in_path, out_path}{Input and output paths for markdown file.}

\item{format}{Pandoc format; defaults to "gfm" if you have pandoc 2.0.0 or
greater, otherwise "markdown_github".}

\item{x}{A string containing markdown.}
}
\value{
\code{downlit_md_path()} invisibly returns \code{output_path};
\code{downlit_md_string()} returns a string containing markdown.
}
\description{
\verb{downlit_md_*} works by traversing the markdown AST generated by Pandoc.
It applies \code{\link[=highlight]{highlight()}} to \code{CodeBlock}s and \code{\link[=autolink]{autolink()}} to inline \code{Code}.

Use \code{downlit_md_path()} to transform a file on disk; use
\code{downlit_md_string()} to transform a string containing markdown as part
of a larger pipeline.

Needs pandoc 1.19 or later.
}
\examples{
if (rmarkdown::pandoc_available("1.19")) {
downlit_md_string("`base::t()`")
downlit_md_string("`base::t`")
downlit_md_string("* `base::t`")

# But don't highlight in headings
downlit_md_string("## `base::t`")
}
}
downlit/man/href_topic.Rd0000644000175000017500000000123114136771000015244 0ustar  nileshnilesh% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/link.R
\name{href_topic}
\alias{href_topic}
\alias{href_article}
\alias{href_package}
\title{Generate url for topic/article/package}
\usage{
href_topic(topic, package = NULL)

href_article(article, package = NULL)

href_package(package)
}
\arguments{
\item{topic, article}{Topic/article name}

\item{package}{Optional package name}
}
\value{
URL for the topic or article; \code{NA} if one can't be found.
}
\description{
Generate url for topic/article/package
}
\examples{
href_topic("t")
href_topic("DOESN'T EXIST")
href_topic("href_topic", "downlit")

href_package("downlit")
}
\keyword{internal}
downlit/man/autolink.Rd0000644000175000017500000000305214122441316014752 0ustar  nileshnilesh% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/link.R
\name{autolink}
\alias{autolink}
\alias{autolink_url}
\title{Automatically link inline code}
\usage{
autolink(text)

autolink_url(text)
}
\arguments{
\item{text}{String of code to highlight and link.}
}
\value{
If \code{text} is linkable, an HTML link for \code{autolink()}, or just
the URL for \code{autolink_url()}. Both return \code{NA} if the text is not
linkable.
}
\description{
Automatically link inline code
}
\section{Options}{
downlit provides a number of options to control the details of the linking.
They are particularly important if you want to generate "local" links.
\itemize{
\item \code{downlit.package}: name of the current package. Determines when
\code{topic_index} and \code{article_index} are used.
\item \code{downlit.topic_index} and \code{downlit.article_index}: named character
vector that maps from topic/article name to path.
\item \code{downlit.rdname}: name of current Rd file being documented (if any);
used to avoid self-links.
\item \code{downlit.attached}: character vector of currently attached R packages.
\item \code{downlit.local_packages}: named character vector providing relative
paths (value) to packages (name) that can be reached with relative links
from the target HTML document.
\item \code{downlit.topic_path} and \code{downlit.article_path}: paths to reference
topics and articles/vignettes relative to the "current" file.
}
}

\examples{
autolink("stats::median()")
autolink("vignette('grid', package = 'grid')")

autolink_url("stats::median()")
}
downlit/man/highlight.Rd0000644000175000017500000000426514122441316015102 0ustar  nileshnilesh% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/highlight.R
\name{highlight}
\alias{highlight}
\alias{classes_pandoc}
\alias{classes_chroma}
\title{Highlight and link a code block}
\usage{
highlight(text, classes = classes_chroma(), pre_class = NULL, code = FALSE)

classes_pandoc()

classes_chroma()
}
\arguments{
\item{text}{String of code to highlight and link.}

\item{classes}{A mapping between token names and CSS class names.
Bundled \code{classes_pandoc()} and \code{classes_chroma()} provide mappings
that (roughly) match Pandoc and chroma (used by hugo) classes so you
can use existing themes.}

\item{pre_class}{Class(es) to give output \verb{<pre>}.}

\item{code}{If \code{TRUE}, wrap output in a \verb{<code>} block.}
}
\value{
If \code{text} is valid R code, an HTML \verb{<pre>} tag. Otherwise,
\code{NA}.

A string containing syntax highlighted HTML or \code{NA} (if \code{text}
isn't parseable).
}
\description{
This function:
\itemize{
\item syntax highlights code
\item links function calls to their documentation (where possible)
\item in comments, translates ANSI escapes into HTML equivalents.
}
}
\section{Options}{
downlit provides a number of options to control the details of the linking.
They are particularly important if you want to generate "local" links.
\itemize{
\item \code{downlit.package}: name of the current package. Determines when
\code{topic_index} and \code{article_index} are used.
\item \code{downlit.topic_index} and \code{downlit.article_index}: named character
vector that maps from topic/article name to path.
\item \code{downlit.rdname}: name of current Rd file being documented (if any);
used to avoid self-links.
\item \code{downlit.attached}: character vector of currently attached R packages.
\item \code{downlit.local_packages}: named character vector providing relative
paths (value) to packages (name) that can be reached with relative links
from the target HTML document.
\item \code{downlit.topic_path} and \code{downlit.article_path}: paths to reference
topics and articles/vignettes relative to the "current" file.
}
}

\examples{
cat(highlight("1 + 1"))
cat(highlight("base::t(1:3)"))

# Unparseable R code returns NA
cat(highlight("base::t("))
}
downlit/man/downlit-package.Rd0000644000175000017500000000157614136771000016207 0ustar  nileshnilesh% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/downlit-package.R
\docType{package}
\name{downlit-package}
\alias{downlit}
\alias{downlit-package}
\title{downlit: Syntax Highlighting and Automatic Linking}
\description{
Syntax highlighting of R code, specifically designed for the needs of 'RMarkdown' packages like 'pkgdown', 'hugodown', and 'bookdown'. It includes linking of function calls to their documentation on the web, and automatic translation of ANSI escapes in output to the equivalent HTML.
}
\seealso{
Useful links:
\itemize{
  \item \url{https://downlit.r-lib.org/}
  \item \url{https://github.com/r-lib/downlit}
  \item Report bugs at \url{https://github.com/r-lib/downlit/issues}
}

}
\author{
\strong{Maintainer}: Hadley Wickham \email{hadley@rstudio.com}

Other contributors:
\itemize{
  \item RStudio [copyright holder]
}

}
\keyword{internal}
downlit/man/evaluate_and_highlight.Rd0000644000175000017500000000454614137021046017614 0ustar  nileshnilesh% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/evaluate.R
\name{evaluate_and_highlight}
\alias{evaluate_and_highlight}
\title{Evaluate code and syntax highlight the results}
\usage{
evaluate_and_highlight(
  code,
  fig_save,
  classes = downlit::classes_pandoc(),
  env = NULL,
  output_handler = evaluate::new_output_handler(),
  highlight = TRUE
)
}
\arguments{
\item{code}{Code to evaluate (as a string).}

\item{fig_save}{A function with arguments \code{plot} and \code{id} that is
responsible for saving \code{plot} to a file (using \code{id} to disambiguate
multiple plots in the same chunk). It should return a list with
components \code{path}, \code{width}, and \code{height}.}

\item{classes}{A mapping between token names and CSS class names.
Bundled \code{classes_pandoc()} and \code{classes_chroma()} provide mappings
that (roughly) match Pandoc and chroma (used by hugo) classes so you
can use existing themes.}

\item{env}{Environment in which to evaluate code; if not supplied,
defaults to a child of the global environment.}

\item{output_handler}{Custom output handler for \code{\link[evaluate:evaluate]{evaluate::evaluate()}}.}

\item{highlight}{Optionally suppress highlighting. This is useful for tests.}
}
\value{
A string containing HTML, with a \code{dependencies} attribute
giving any additional htmltools dependencies required to render the HTML.
}
\description{
This function runs \code{code} and captures the output using
\code{\link[evaluate:evaluate]{evaluate::evaluate()}}. It syntax highlights code with \code{\link[=highlight]{highlight()}}, and
intermingles it with output.
}
\examples{
cat(evaluate_and_highlight("1 + 2"))
cat(evaluate_and_highlight("x <- 1:10\nmean(x)"))

# -----------------------------------------------------------------
# evaluate_and_highlight() powers pkgdown's documentation formatting so
# here I include a few examples to make sure everything looks good
# -----------------------------------------------------------------

blue <- function(x) paste0("\033[34m", x, "\033[39m")
f <- function(x) {
  cat("This is some output. My favourite colour is ", blue("blue"), ".\n", sep = "")
  message("This is a message. My favourite fruit is ", blue("blueberries"))
  warning("Now at stage ", blue("blue"), "!")
}
f()

plot(1:10)

if (requireNamespace("leaflet", quietly = TRUE)) {
leaflet::addTiles(leaflet::leaflet())
}
}
downlit/man/downlit_html_path.Rd0000644000175000017500000000310113750320240016635 0ustar  nileshnilesh% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/downlit-html.R
\name{downlit_html_path}
\alias{downlit_html_path}
\alias{downlit_html_node}
\title{Syntax highlight and link an HTML page}
\usage{
downlit_html_path(in_path, out_path, classes = classes_pandoc())

downlit_html_node(x, classes = classes_pandoc())
}
\arguments{
\item{in_path, out_path}{Input and output paths for HTML file}

\item{classes}{A mapping between token names and CSS class names.
Bundled \code{classes_pandoc()} and \code{classes_chroma()} provide mappings
that (roughly) match Pandoc and chroma (used by hugo) classes so you
can use existing themes.}

\item{x}{An \code{xml2::xml_node}}
}
\value{
\code{downlit_html_path()} invisibly returns \code{output_path};
\code{downlit_html_node()} modifies \code{x} in place and returns nothing.
}
\description{
\itemize{
\item Code blocks, identified by \verb{<pre>} tags with class \verb{sourceCode r}
or any \verb{<pre>} tag inside of \verb{<div class="downlit">}, are processed
with \code{\link[=highlight]{highlight()}}.
\item Inline code, identified by \verb{<code>} tags that contain only text
(and don't have a header tag (e.g. \verb{<h1>}) or \verb{<a>} as an ancestor)
are processed with \code{\link[=autolink]{autolink()}}.
}

Use \code{downlit_html_path()} to process an \code{.html} file on disk; use
\code{downlit_html_node()} to process an in-memory \code{xml_node} as part of
a larger pipeline.
}
\examples{
node <- xml2::read_xml("<p><code>base::t()</code></p>
") node # node is modified in place downlit_html_node(node) node } downlit/man/is_low_change.Rd0000644000175000017500000000056514042044644015737 0ustar nileshnilesh% Generated by roxygen2: do not edit by hand % Please edit documentation in R/evaluate.R \name{is_low_change} \alias{is_low_change} \title{Compare two recorded plots} \usage{ is_low_change(p1, p2) } \arguments{ \item{p1, p2}{Plot results} } \value{ Logical value indicating whether \code{p2} is a low-level update of \code{p1}. } \description{ Compare two recorded plots } downlit/tests/0000755000175000017500000000000013663736745013252 5ustar nileshnileshdownlit/tests/testthat/0000755000175000017500000000000014137047030015065 5ustar nileshnileshdownlit/tests/testthat/test-topic-index.R0000644000175000017500000000213013670512467020420 0ustar nileshnileshtest_that("NULL package uses context", { local_options("downlit.topic_index" = c(foo = "bar")) expect_equal(topic_index(NULL), c(foo = "bar")) }) test_that("can capture index from in-development package", { local_devtools_package(test_path("index")) expect_equal(topic_index("index"), c("a" = "a", "b" = "b", "c" = "b")) }) test_that("can capture index from installed package", { skip_on_cran() grid_index <- topic_index("grid") expect_equal(grid_index[["unit"]], "unit") }) # find_rdname ------------------------------------------------------------- test_that("can find topic in specified package", { skip_on_cran() grid_index <- topic_index("grid") expect_equal(find_rdname("grid", "unit"), "unit") expect_equal(find_rdname("grid", "DOESNOTEXIST"), NULL) expect_warning(find_rdname("grid", "DOESNOTEXIST", TRUE), "Failed to find") }) test_that("can find topic in attached packages", { local_options("downlit.attached" = "grid") expect_equal(find_rdname_attached("unit"), list(rdname = "unit", package = "grid")) expect_equal(find_rdname_attached("DOESNOTEXIST"), NULL) }) downlit/tests/testthat/test-highlight.txt0000644000175000017500000000451214137021617020557 0ustar 
nileshnilesh> # explicit package > cat(highlight("MASS::addterm()")) MASS::
addterm() > cat(highlight("MASS::addterm")) MASS::addterm > cat(highlight("?MASS::addterm")) ?MASS::addterm > # implicit package > cat(highlight("library(MASS)")) library(MASS) > cat(highlight("addterm()")) addterm() > cat(highlight("median()")) median() > # local package > cat(highlight("test::foo()")) test::foo() > # operators / special syntax > cat(highlight("1 + 2 * 3")) 1 + 2 * 3 > cat(highlight("x %in% y")) x %in% y > cat(highlight("if (FALSE) 1")) if (FALSE) 1 > cat(highlight("f <- function(x = 'a') {}")) f <- function(x = 'a') {} > # ansi escapes + unicode > cat(highlight("# \033[34mblue\033[39m")) # blue downlit/tests/testthat/autolink.html0000644000175000017500000000054014122441320017572 0ustar nileshnilesh
# This is a comment

stats::median()

stats::median()

stats::median()

stats::median()
stats::median()
downlit/tests/testthat/test-evaluate.R0000644000175000017500000000546214136771000020002 0ustar nileshnileshtest_that("handles parsing failures gracefully", { expect_snapshot(test_evaluate("1 + ", highlight = TRUE)) }) test_that("highlights when requested", { expect_snapshot(test_evaluate("1 + \n 2 + 3", highlight = TRUE)) }) test_that("handles basic cases", { expect_snapshot({ test_evaluate("# comment") test_evaluate("message('x')") test_evaluate("warning('x')") test_evaluate("stop('x', call. = FALSE)") test_evaluate("f <- function() stop('x'); f()") }) }) test_that("each line of input gets span", { expect_snapshot({ test_evaluate("1 +\n 2 +\n 3 +\n 4 +\n 5") }) }) test_that("output always gets trailing nl", { # These two calls should produce the same output expect_snapshot({ test_evaluate('cat("a")\ncat("a\\n")') }) }) test_that("combines plots as needed", { expect_snapshot({ f1 <- function() plot(1) f2 <- function() lines(0:2, 0:2) test_evaluate("f1()\nf2()\n") }) expect_snapshot({ f3 <- function() { plot(1); plot(2) } test_evaluate("f3()") }) }) test_that("handles other plots", { # Check that we can drop the inclusion of the first one registerS3method("is_low_change", "fakePlot", function(p1, p2) TRUE, envir = asNamespace("downlit") ) registerS3method("replay_html", "fakePlot", function(x, ...) { paste0("") }, envir = asNamespace("downlit")) registerS3method("print", "fakePlot", function(x, ...) x) expect_snapshot_output({ f3 <- function() structure(3, class = c("fakePlot", "otherRecordedplot")) f4 <- function() structure(4, class = c("fakePlot", "otherRecordedplot")) test_evaluate("f3()\nf4()") }) }) test_that("ansi escapes are translated to html", { expect_snapshot({ blue <- function(x) paste0("\033[34m", x, "\033[39m") f <- function(x) { cat("Output: ", blue("blue"), "\n", sep = "") message(paste0("Message: ", blue("blue"))) warning(blue("blue"), call. = FALSE) stop(blue("blue"), call. 
= FALSE) } test_evaluate("f()\n") }) }) # html -------------------------------------------------------------------- test_that("can include literal HTML", { output <- evaluate::new_output_handler(value = identity) env <- env(foo = function() htmltools::div("foo")) html <- evaluate_and_highlight("foo()", env = env, output_handler = output, highlight = FALSE) expect_equal(as.character(html), "foo()\n
foo
") }) test_that("captures dependencies", { output <- evaluate::new_output_handler(value = identity) dummy_dep <- htmltools::htmlDependency("dummy", "1.0.0", "dummy.js") env <- env(foo = function() htmltools::div("foo", dummy_dep)) html <- evaluate_and_highlight("foo()", env = env, output_handler = output, highlight = FALSE) expect_equal(attr(html, "dependencies"), list(dummy_dep)) }) downlit/tests/testthat/test-downlit-md.txt0000644000175000017500000000165314137021616020670 0ustar nileshnilesh> # Bare code > cat(downlit_md_string("`base::t`")) [`base::t`](https://rdrr.io/r/base/t.html) > # No transforms > cat(downlit_md_string("## `base::t`")) ## `base::t` > cat(downlit_md_string("[`base::t`](http://google.com)")) [`base::t`](http://google.com) > # Nested > cat(downlit_md_string("*`base::t`*")) *[`base::t`](https://rdrr.io/r/base/t.html)* > cat(downlit_md_string("`base::t`")) [`base::t`](https://rdrr.io/r/base/t.html) > cat(downlit_md_string("1. `base::t`")) 1. [`base::t`](https://rdrr.io/r/base/t.html) > # Markdown extensions not in GFM > cat(downlit_md_string("| `base::t`", format = "markdown")) | [`base::t`](https://rdrr.io/r/base/t.html) > md <- brio::read_lines(test_path("markdown-definition.md")) > cat(downlit_md_string(md, "markdown")) Term 1 ([`base::t()`](https://rdrr.io/r/base/t.html)) : Definition 1 ([`base::t()`](https://rdrr.io/r/base/t.html)) downlit/tests/testthat/test-link.R0000644000175000017500000001742214136771000017130 0ustar nileshnileshtest_that("can link function calls", { local_options( "downlit.package" = "test", "downlit.topic_index" = c(foo = "bar") ) expect_equal(href_expr_(foo()), "bar.html") # even if namespaced expect_equal(href_expr_(test::foo()), "bar.html") # but functions with arguments are ignored expect_equal(href_expr_(foo(1, 2, 3)), NA_character_) # as are function factories are ignored expect_equal(href_expr_(foo()(1, 2, 3)), NA_character_) expect_equal(href_expr_(test::foo()(1, 2, 3)), NA_character_) }) test_that("base 
function calls linked", { expect_equal(href_expr_(median()), href_topic_remote("median", "stats")) }) test_that("respects options that define current location", { local_options( "downlit.topic_index" = c(bar = "bar"), "downlit.topic_path" = "myref/" ) # when not in an Rd file, link with topic_path local_options("downlit.rdname" = "") expect_equal(href_expr_(bar()), "myref/bar.html") # don't link to self local_options("downlit.rdname" = "bar") expect_equal(href_expr_(bar()), NA_character_) }) test_that("can link remote objects", { expect_equal(href_expr_(MASS::abbey), href_topic_remote("abbey", "MASS")) expect_equal(href_expr_(MASS::addterm()), href_topic_remote("addterm", "MASS")) expect_equal(href_expr_(MASS::addterm.default()), href_topic_remote("addterm", "MASS")) # Doesn't exist expect_equal(href_expr_(MASS::blah), NA_character_) }) test_that("can link to functions in registered packages", { local_options("downlit.attached" = "MASS") expect_equal(href_expr_(addterm()), href_topic_remote("addterm", "MASS")) expect_equal(href_expr_(addterm.default()), href_topic_remote("addterm", "MASS")) }) test_that("can link to package names in registered packages", { expect_equal( autolink_curly("{downlit}"), "downlit" ) expect_equal(autolink_curly("{package}"), NA_character_) # No curly = no link expect_equal(autolink_curly(""), NA_character_) }) test_that("can link to functions in base packages", { expect_equal(href_expr_(abbreviate()), href_topic_remote("abbreviate", "base")) expect_equal(href_expr_(median()), href_topic_remote("median", "stats")) }) test_that("links to home of re-exported functions", { expect_equal(href_expr_(testthat::`%>%`), href_topic_remote("%>%", "magrittr")) }) test_that("fails gracely if can't find re-exported function", { local_options( "downlit.package" = "downlit", "downlit.topic_index" = c(foo = "reexports") ) expect_equal(href_expr_(foo()), NA_character_) }) test_that("can link to remote pkgdown sites", { # use autolink() to avoid R CMD check 
NOTE expect_equal(autolink_url("pkgdown::add_slug"), href_topic_remote("add_slug", "pkgdown")) expect_equal(autolink_url("pkgdown::add_slug(1)"), href_topic_remote("add_slug", "pkgdown")) }) test_that("or local sites, if registered", { local_options("downlit.local_packages" = c("MASS" = "MASS")) expect_equal(href_expr_(MASS::abbey), "MASS/reference/abbey.html") }) test_that("bare symbols are not linked", { expect_equal(autolink_url("%in%"), NA_character_) expect_equal(autolink_url("foo"), NA_character_) }) test_that("returns NA for bad inputs", { expect_equal(autolink_url(""), NA_character_) expect_equal(autolink_url("a; b"), NA_character_) expect_equal(autolink_url("1"), NA_character_) }) # help -------------------------------------------------------------------- test_that("can link ? calls", { local_options( "downlit.package" = "test", "downlit.topic_index" = c(foo = "foo", "foo-package" = "foo-package") ) expect_equal(href_expr_(?foo), "foo.html") expect_equal(href_expr_(?"foo"), "foo.html") expect_equal(href_expr_(?test::foo), "foo.html") expect_equal(href_expr_(package?foo), "foo-package.html") }) test_that("can link help calls", { local_options( "downlit.package" = "test", "downlit.topic_index" = c(foo = "foo", "foo-package" = "foo-package") ) expect_equal(href_expr_(help("foo")), "foo.html") expect_equal(href_expr_(help("foo", "test")), "foo.html") expect_equal(href_expr_(help(package = "MASS")), "https://rdrr.io/pkg/MASS/man") expect_equal(href_expr_(help()), NA_character_) expect_equal(href_expr_(help(a$b)), NA_character_) }) # library and friends ----------------------------------------------------- test_that("library() linked to package reference", { skip_on_cran() # in case URLs change skip_on_os("solaris") expect_equal(href_expr_(library(rlang)), "https://rlang.r-lib.org") expect_equal(href_expr_(library(MASS)), "http://www.stats.ox.ac.uk/pub/MASS4/") }) test_that("except when not possible", { expect_equal(href_expr_(library()), 
"https://rdrr.io/r/base/library.html") expect_equal(href_expr_(library(doesntexist)), "https://rdrr.io/r/base/library.html") expect_equal(href_expr_(library(package = )), "https://rdrr.io/r/base/library.html") expect_equal(href_expr_(library("x", "y", "z")), "https://rdrr.io/r/base/library.html") }) # vignette ---------------------------------------------------------------- test_that("can link to local articles", { local_options( "downlit.package" = "test", "downlit.article_index" = c(x = "y.html"), "downlit.article_path" = "my_path/", ) expect_equal(href_expr_(vignette("x")), "my_path/y.html") expect_equal(href_expr_(vignette("x", package = "test")), "my_path/y.html") expect_equal(href_expr_(vignette("y")), NA_character_) }) test_that("can link to remote articles", { skip_on_cran() expect_equal( href_expr_(vignette("sha1", "digest")), "https://cran.rstudio.com/web/packages/digest/vignettes/sha1.html" ) expect_equal(href_expr_(vignette("blah1", "digest")), NA_character_) expect_equal( href_expr_(vignette(package = "digest", "sha1")), "https://cran.rstudio.com/web/packages/digest/vignettes/sha1.html" ) expect_equal( href_expr_(vignette("custom-expectation", "testthat")), "https://testthat.r-lib.org/articles/custom-expectation.html" ) }) test_that("or local sites, if registered", { local_options("downlit.local_packages" = c("digest" = "digest")) expect_equal(href_expr_(vignette("sha1", "digest")), "digest/articles/sha1.html") }) test_that("looks in attached packages", { local_options("downlit.attached" = c("grid", "digest")) expect_equal( href_expr_(vignette("sha1")), "https://cran.rstudio.com/web/packages/digest/vignettes/sha1.html" ) expect_equal( href_expr_(vignette("moveline")), "https://cran.rstudio.com/web/packages/grid/vignettes/moveline.pdf" ) }) test_that("fail gracefully with non-working calls", { expect_equal(href_expr_(vignette()), "https://rdrr.io/r/utils/vignette.html") expect_equal(href_expr_(vignette(package = package)), NA_character_) 
expect_equal(href_expr_(vignette(1, 2)), NA_character_) expect_equal(href_expr_(vignette(, )), NA_character_) }) test_that("spurious functions are not linked (#889)", { expect_equal(href_expr_(Authors@R), NA_character_) expect_equal(href_expr_(content-home.html), NA_character_) expect_equal(href_expr_(toc: depth), NA_character_) }) test_that("autolink generates HTML if linkable", { expect_equal( autolink("stats::median()"), "stats::median()" ) expect_equal(autolink("1 +"), NA_character_) }) test_that("href_package can handle non-existing packages", { expect_equal(href_package("NotAPackage"), NA_character_) }) # find_reexport_source ---------------------------------------------------- test_that("can find functions", { expect_equal(find_reexport_source(is.null), "base") expect_equal(find_reexport_source(mean), "base") }) test_that("can find other objects", { expect_equal(find_reexport_source(na_cpl, "downlit", "na_cpl"), "rlang") expect_equal(find_reexport_source(na_cpl, "downlit", "MISSING"), NA_character_) }) downlit/tests/testthat/index/0000755000175000017500000000000014137033433016176 5ustar nileshnileshdownlit/tests/testthat/index/DESCRIPTION0000644000175000017500000000101113664026146017704 0ustar nileshnileshPackage: index Title: Test package for document/article indices Version: 0.0.0.9000 Authors@R: c(person(given = "Hadley", family = "Wickham", role = c("aut", "cre"), email = "hadley@rstudio.com"), person(given = "RStudio", role = "cph")) Description: What the package does (one paragraph). 
License: MIT + file LICENSE Encoding: UTF-8 LazyData: true Roxygen: list(markdown = TRUE) RoxygenNote: 7.1.0 Suggests: knitr, rmarkdown VignetteBuilder: knitr downlit/tests/testthat/index/LICENSE.md0000644000175000017500000000206113664026071017605 0ustar nileshnilesh# MIT License Copyright (c) 2020 Hadley Wickham Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
downlit/tests/testthat/index/man/0000755000175000017500000000000013664031650016754 5ustar nileshnileshdownlit/tests/testthat/index/man/b.Rd0000644000175000017500000000024513664031650017465 0ustar nileshnilesh% Generated by roxygen2: do not edit by hand % Please edit documentation in R/basic.R \name{b} \alias{b} \alias{c} \title{B + C} \usage{ b() } \description{ B + C } downlit/tests/testthat/index/man/a.Rd0000644000175000017500000000022313664026074017464 0ustar nileshnilesh% Generated by roxygen2: do not edit by hand % Please edit documentation in R/basic.R \name{a} \alias{a} \title{A} \usage{ a() } \description{ A } downlit/tests/testthat/index/vignettes/0000755000175000017500000000000014137033433020206 5ustar nileshnileshdownlit/tests/testthat/index/vignettes/test-1.Rmd0000644000175000017500000000024513664025336021777 0ustar nileshnilesh--- title: "test-1" output: rmarkdown::html_vignette vignette: > %\VignetteIndexEntry{test-1} %\VignetteEngine{knitr::rmarkdown} %\VignetteEncoding{UTF-8} --- downlit/tests/testthat/index/vignettes/test-2.Rmd0000644000175000017500000000024513664025334021776 0ustar nileshnilesh--- title: "test-2" output: rmarkdown::html_vignette vignette: > %\VignetteIndexEntry{test-2} %\VignetteEngine{knitr::rmarkdown} %\VignetteEncoding{UTF-8} --- downlit/tests/testthat/index/index.Rproj0000644000175000017500000000063413664031627020335 0ustar nileshnileshVersion: 1.0 RestoreWorkspace: No SaveWorkspace: No AlwaysSaveHistory: Default EnableCodeIndexing: Yes UseSpacesForTab: Yes NumSpacesForTab: 2 Encoding: UTF-8 RnwWeave: knitr LaTeX: XeLaTeX AutoAppendNewline: Yes StripTrailingWhitespace: Yes LineEndingConversion: Posix BuildType: Package PackageUseDevtools: Yes PackageInstallArgs: --no-multiarch --with-keep.source PackageRoxygenize: rd,collate,namespace downlit/tests/testthat/index/R/0000755000175000017500000000000013664025304016401 5ustar nileshnileshdownlit/tests/testthat/index/R/basic.R0000644000175000017500000000013113664031647017607 0ustar 
nileshnilesh#' A #' @export a <- function() {} #' B + C #' @aliases c #' @export b <- function() {} downlit/tests/testthat/index/LICENSE0000644000175000017500000000005413664026071017206 0ustar nileshnileshYEAR: 2020 COPYRIGHT HOLDER: Hadley Wickham downlit/tests/testthat/index/NAMESPACE0000644000175000017500000000010213664031675017420 0ustar nileshnilesh# Generated by roxygen2: do not edit by hand export(a) export(b) downlit/tests/testthat/fake-repo/0000755000175000017500000000000014137033433016740 5ustar nileshnileshdownlit/tests/testthat/fake-repo/src/0000755000175000017500000000000014137033433017527 5ustar nileshnileshdownlit/tests/testthat/fake-repo/src/contrib/0000755000175000017500000000000014137016133021165 5ustar nileshnileshdownlit/tests/testthat/fake-repo/src/contrib/PACKAGES0000644000175000017500000000101714137016133022265 0ustar nileshnileshPackage: BMRSr Type: Package Title: Wrapper Functions to the 'BMRS API' Version: 1.0.3 Authors@R: person("Adam", "Rawles", email = "adamrawles@hotmail.co.uk", role = c("aut", "cre")) Description: A set of wrapper functions to better interact with the 'Balancing Mechanism Reporting System API' (). 
License: GPL (>= 2) Encoding: UTF-8 LazyData: true Depends: R (>= 2.10) Imports: httr, xml2, stringr, tibble, readr, methods, purrr, dplyr, rlang RoxygenNote: 7.1.1 URL: https://trick-url.com/ downlit/tests/testthat/test-highlight.R0000644000175000017500000000660514136771000020143 0ustar nileshnileshtest_that("can link to external topics that use ::", { local_options( "downlit.package" = "test", "downlit.topic_index" = c(foo = "bar") ) verify_output(test_path("test-highlight.txt"), { "explicit package" cat(highlight("MASS::addterm()")) cat(highlight("MASS::addterm")) cat(highlight("?MASS::addterm")) "implicit package" cat(highlight("library(MASS)")) cat(highlight("addterm()")) cat(highlight("median()")) # base "local package" cat(highlight("test::foo()")) "operators / special syntax" cat(highlight("1 + 2 * 3")) cat(highlight("x %in% y")) cat(highlight("if (FALSE) 1")) cat(highlight("f <- function(x = 'a') {}")) "ansi escapes + unicode" cat(highlight("# \033[34mblue\033[39m")) }) }) test_that("unicode is not mangled", { skip_on_os("windows") expect_equal(highlight("# \u2714"), "# \u2714") }) test_that("custom infix operators are linked, but regular are not", { expect_snapshot_output(cat(highlight("x %in% y\n"))) expect_snapshot_output(cat(highlight("x + y\n"))) }) test_that("distinguish logical and numeric",{ expect_equal(highlight("TRUE"), "TRUE") expect_equal(highlight("FALSE"), "FALSE") expect_equal(highlight("1"), "1") }) test_that("can parse code with carriage returns", { lines <- strsplit(highlight("1\r\n2"), "\n")[[1]] expect_equal(lines[[1]], "1") expect_equal(lines[[2]], "2") }) test_that("can highlight code in Latin1", { x <- "'\xfc'" Encoding(x) <- "latin1" out <- highlight(x) expect_equal(Encoding(out), "UTF-8") expect_equal(out, "'\u00fc'") }) test_that("syntax can span multiple lines", { expect_equal(highlight("f(\n\n)"), "f(\n\n)") expect_equal(highlight("'\n\n'"), "'\n\n'") }) test_that("code with tab is not mangled", { 
expect_equal(highlight("\tf()"), " f()") expect_equal(highlight("'\t'"), "' '") }) test_that("unparsable code returns NULL", { expect_equal(highlight("<"), NA_character_) # but pure comments still highlighted expect_equal( highlight("#"), "#" ) }) test_that("R6 methods don't get linked", { expect_equal( highlight("x$get()"), "x$get()" ) expect_equal( highlight("x$library()"), "x$library()" ) }) test_that("R6 instantiation gets linked", { expect_equal( highlight("mean$new()"), "mean$new()" ) # But not new itself expect_equal( highlight("new()"), "new()" ) }) test_that("ansi escapes are converted to html", { expect_snapshot_output(highlight("# \033[31mhello\033[m")) expect_snapshot_output(highlight("# \u2029[31mhello\u2029[m")) }) downlit/tests/testthat/test-metadata.R0000644000175000017500000000310614137016133017745 0ustar nileshnileshtest_that("can extract urls for package", { # since the package urls might potentially change skip_on_cran() expect_equal(package_urls("base"), character()) expect_equal(package_urls("packagethatdoesn'texist"), character()) expect_equal(package_urls(""), character()) expect_equal(package_urls("MASS"), "http://www.stats.ox.ac.uk/pub/MASS4/") }) test_that("can extract urls for uninstalled packages from CRAN", { skip_on_cran() pkg <- "BMRSr" skip_if(requireNamespace(pkg, quietly = TRUE), "BMRSr package is installed") # We're testing here that we can find URLs for packages that aren't installed # I'm assuming that BMRSr isn't going to be installed (because why would it), # but this might not always be true expect_equal(package_urls("BMRSr"), "https://bmrsr.arawles.co.uk/") expect_equal(package_urls("BMRSr", repos = c()), "https://bmrsr.arawles.co.uk/") # Prefers user specified repo cran_repo <- "https://cran.rstudio.com" fake_repo <- paste0("file:", test_path("fake-repo")) expect_equal(package_urls("BMRSr", repos = c(fake_repo)), "https://trick-url.com/") # even if CRAN comes first expect_equal( package_urls("BMRSr", repos = c(CRAN = 
cran_repo, fake_repo)), "https://trick-url.com/" ) }) test_that("handle common url formats", { ab <- c("https://a.com", "https://b.com") expect_equal(parse_urls("https://a.com,https://b.com"), ab) expect_equal(parse_urls("https://a.com, https://b.com"), ab) expect_equal(parse_urls("https://a.com https://b.com"), ab) expect_equal(parse_urls("https://a.com (comment) https://b.com"), ab) }) downlit/tests/testthat/test-downlit-html.txt0000644000175000017500000000135314137021614021227 0ustar nileshnilesh> out <- downlit_html_path(test_path("autolink.html"), tempfile()) > cat(brio::read_lines(out), sep = "\n")

# This is a comment

stats::median()

stats::median()

stats::median()

stats::median()
stats::median()
downlit/tests/testthat/test-article-index.R0000644000175000017500000000232013731234742020721 0ustar nileshnileshtest_that("NULL package uses context", { local_options("downlit.article_index" = c(a = "a")) expect_equal(article_index(NULL), c(a = "a")) }) test_that("can capture index from in-development package", { local_devtools_package(test_path("index")) expect_equal( article_index("index"), c("test-1" = "test-1.html", "test-2" = "test-2.html") ) expect_equal(find_article("index", "test-1"), "test-1.html") expect_equal(find_article("index", "missing"), NULL) }) test_that("can capture index of installed package (no vignettes)", { # Ensure we skip this test if MASS ever gets vignettes skip_if_not(nrow(vignette(package = "MASS")$results) == 0) expect_equal( article_index("base"), character() ) }) test_that("can capture index of installed package (vignettes + pkgdown)", { # Requires internet + number of vignettes might change skip_on_cran() expect_true("custom-expectation.html" %in% article_index("testthat")) }) test_that("can capture index of installed package (vignettes + pkgdown)", { n_vignettes <- nrow(vignette(package = "grid")$results) expect_length(article_index("grid"), n_vignettes) }) # find_article ------------------------------------------------------------ downlit/tests/testthat/markdown-definition.md0000644000175000017500000000006513721205213021355 0ustar nileshnileshTerm 1 (`base::t()`) : Definition 1 (`base::t()`) downlit/tests/testthat/test-downlit-md.R0000644000175000017500000000356014042074111020242 0ustar nileshnilesh# Sys.setenv("RSTUDIO_PANDOC" = "") # rmarkdown::find_pandoc(cache = FALSE) test_that("common across multiple versions", { skip_if_not(rmarkdown::pandoc_version() > "2.0.0") verify_output(test_path("test-downlit-md.txt"), { "Bare code" cat(downlit_md_string("`base::t`")) "No transforms" cat(downlit_md_string("## `base::t`")) cat(downlit_md_string("[`base::t`](http://google.com)")) "Nested" cat(downlit_md_string("*`base::t`*")) 
cat(downlit_md_string("`base::t`")) cat(downlit_md_string("1. `base::t`")) "Markdown extensions not in GFM" cat(downlit_md_string("| `base::t`", format = "markdown")) md <- brio::read_lines(test_path("markdown-definition.md")) cat(downlit_md_string(md, "markdown")) }) }) test_that("pandoc AST v1.20", { skip_if_not(rmarkdown::pandoc_version() > "2.0.0") skip_if_not(rmarkdown::pandoc_version() < "2.10") verify_output(test_path("test-downlit-md-v20.txt"), { cat(downlit_md_string("* `base::t`")) cat(downlit_md_string("```\nbase::t(1)\n```")) cat(downlit_md_string(brio::read_lines(test_path("markdown-table.md")))) }) }) test_that("pandoc AST v1.21", { skip_if_not(rmarkdown::pandoc_version() >= "2.10") skip_if_not(rmarkdown::pandoc_version() < "2.11") verify_output(test_path("test-downlit-md-v21.txt"), { cat(downlit_md_string("* `base::t`")) cat(downlit_md_string("```\nbase::t(1)\n```")) cat(downlit_md_string(brio::read_lines(test_path("markdown-table.md")))) }) }) test_that("Special package string gets linked", { # needed for eof setting on windows skip_if_not(rmarkdown::pandoc_version() > "2.0.0") expect_equal( downlit_md_string("`{downlit}`"), "[downlit](https://downlit.r-lib.org/)\n" ) expect_equal( downlit_md_string("`{thisisrealltnotapackagename}`"), "`{thisisrealltnotapackagename}`\n" ) }) downlit/tests/testthat/test-downlit-html.R0000644000175000017500000000170514122441316020611 0ustar nileshnileshtest_that("can highlight html file", { # verify_output() seems to be generating the wrong line endings skip_on_os("windows") verify_output(test_path("test-downlit-html.txt"), { out <- downlit_html_path(test_path("autolink.html"), tempfile()) cat(brio::read_lines(out), sep = "\n") }) }) test_that("highlight all pre inside div.downlit", { html <- xml2::read_xml("
1 + 2
3 + 4
No hightlight
" ) downlit_html_node(html) expect_snapshot_output(show_xml(html)) }) test_that("special package string gets linked", { html <- xml2::read_xml("

before {downlit} after

") downlit_html_node(html) expect_snapshot_output(show_xml(html)) # But only when it's a real package html <- xml2::read_xml("

before {notapkg} after

") downlit_html_node(html) expect_snapshot_output(show_xml(html)) }) downlit/tests/testthat/test-downlit-md-v21.txt0000644000175000017500000000121713750317074021300 0ustar nileshnilesh> cat(downlit_md_string("* `base::t`")) - [`base::t`](https://rdrr.io/r/base/t.html) > cat(downlit_md_string("```\nbase::t(1)\n```"))
base::t(1)
> cat(downlit_md_string(brio::read_lines(test_path("markdown-table.md")))) Table: Caption [`base::t`](https://rdrr.io/r/base/t.html) | [`base::t`](https://rdrr.io/r/base/t.html) | xx | |-------------------------------------------:|:----| | [`base::t`](https://rdrr.io/r/base/t.html) | yy | downlit/tests/testthat/test-downlit-md-v20.txt0000644000175000017500000000121513733213502021265 0ustar nileshnilesh> cat(downlit_md_string("* `base::t`")) - [`base::t`](https://rdrr.io/r/base/t.html) > cat(downlit_md_string("```\nbase::t(1)\n```"))
base::t(1)
> cat(downlit_md_string(brio::read_lines(test_path("markdown-table.md")))) Table: Caption [`base::t`](https://rdrr.io/r/base/t.html) | [`base::t`](https://rdrr.io/r/base/t.html) | xx | | -----------------------------------------: | :- | | [`base::t`](https://rdrr.io/r/base/t.html) | yy | downlit/tests/testthat/markdown-table.md0000644000175000017500000000012413665174616020333 0ustar nileshnileshTable: Caption `base::t` | `base::t` | xx | |----------:|:---| | `base::t` | yy | downlit/tests/testthat/test-utils.R0000644000175000017500000000027514136771000017331 0ustar nileshnileshtest_that("converts Latin1 encoded text to utf8", { x <- "'\xfc'" Encoding(x) <- "latin1" y <- safe_parse(x)[[1]] expect_equal(Encoding(y), "UTF-8") expect_equal(y, "\u00fc") }) downlit/tests/testthat/test-packages.R0000644000175000017500000000260414042051604017742 0ustar nileshnileshtest_that("extracts typical library()/require() calls", { expect_equal(extract_package_attach_(library("blah")), "blah") expect_equal(extract_package_attach_(library(blah)), "blah") expect_equal(extract_package_attach_(require("blah")), "blah") expect_equal(extract_package_attach_(require(blah)), "blah") }) test_that("detects in nested code", { expect_equal(extract_package_attach_({ library(a) x <- 2 { library(b) y <- 3 { library(c) z <- 4 } } }), c("a", "b", "c")) }) test_that("handles expressions", { # which will usually come from parse()d code # x <- expression( x <- 1, library("a"), y <- 2, library("b") ) expect_equal(extract_package_attach(x), c("a", "b")) }) test_that("detects with non-standard arg order", { expect_equal(extract_package_attach_(library(quiet = TRUE, pa = "a")), "a") expect_equal(extract_package_attach_(library(quiet = TRUE, a)), "a") }) test_that("doesn't include if character.only = TRUE", { expect_equal( extract_package_attach_(library(x, character.only = TRUE)), character() ) }) test_that("adds packages attached by dependencies", { skip_if_not_installed("ggraph") expect_true("ggplot2" 
%in% add_depends("ggraph")) }) test_that("adds tidyverse packages", { skip_if_not_installed("tidyverse") expect_true("ggplot2" %in% add_depends("tidyverse")) }) downlit/tests/testthat/_snaps/0000755000175000017500000000000014137021524016350 5ustar nileshnileshdownlit/tests/testthat/_snaps/downlit-html.md0000644000175000017500000000116714137021615021322 0ustar nileshnilesh# highlight all pre inside div.downlit
    1 + 2
    3 + 4
No hightlight
# special package string gets linked

before downlit after

---

before {notapkg} after

downlit/tests/testthat/_snaps/evaluate.md0000644000175000017500000001013314137021617020501 0ustar nileshnilesh# handles parsing failures gracefully Code test_evaluate("1 + ", highlight = TRUE) Output 1 + #> Error: <text>:2:0: unexpected end of input #> 1: 1 + #> ^ # highlights when requested Code test_evaluate("1 + \n 2 + 3", highlight = TRUE) Output 1 + 2 + 3 #> [1] 6 # handles basic cases Code test_evaluate("# comment") Output # comment Code test_evaluate("message('x')") Output message('x') #> x Code test_evaluate("warning('x')") Output warning('x') #> Warning: x Code test_evaluate("stop('x', call. = FALSE)") Output stop('x', call. = FALSE) #> Error: x Code test_evaluate("f <- function() stop('x'); f()") Output f <- function() stop('x'); f() #> Error in f(): x # each line of input gets span Code test_evaluate("1 +\n 2 +\n 3 +\n 4 +\n 5") Output 1 + 2 + 3 + 4 + 5 #> [1] 15 # output always gets trailing nl Code test_evaluate("cat(\"a\")\ncat(\"a\\n\")") Output cat("a") #> a cat("a\n") #> a # combines plots as needed Code f1 <- (function() plot(1)) f2 <- (function() lines(0:2, 0:2)) test_evaluate("f1()\nf2()\n") Output f1() f2() --- Code f3 <- (function() { plot(1) plot(2) }) test_evaluate("f3()") Output f3() # handles other plots f3() f4() # ansi escapes are translated to html Code blue <- (function(x) paste0("\033[34m", x, "\033[39m")) f <- (function(x) { cat("Output: ", blue("blue"), "\n", sep = "") message(paste0("Message: ", blue("blue"))) warning(blue("blue"), call. = FALSE) stop(blue("blue"), call. 
= FALSE) }) test_evaluate("f()\n") Output f() #> Output: blue #> Message: blue #> Warning: blue #> Error: blue downlit/tests/testthat/_snaps/highlight.md0000644000175000017500000000073714137021620020645 0ustar nileshnilesh# custom infix operators are linked, but regular are not x %in% y --- x + y # ansi escapes are converted to html [1] "# hello" --- [1] "# hello" downlit/tests/testthat.R0000644000175000017500000000007213663736745015234 0ustar nileshnileshlibrary(testthat) library(downlit) test_check("downlit") downlit/R/0000755000175000017500000000000014137016133012264 5ustar nileshnileshdownlit/R/zzz.R0000644000175000017500000000020214137016133013236 0ustar nileshnilesh.onLoad <- function(libname, pkgname) { repo_urls <<- memoise::memoise(repo_urls) CRAN_urls <<- memoise::memoise(CRAN_urls) } downlit/R/metadata.R0000644000175000017500000000761714137016133014202 0ustar nileshnileshremote_urls <- function(package) { local <- getOption("downlit.local_packages") if (has_name(local, package)) { base_url <- local[[package]] list( reference = file.path(base_url, "reference"), article = file.path(base_url, "articles") ) } else { remote_metadata(package)$urls } } remote_package_reference_url <- function(package) { remote_urls(package)$reference } remote_package_article_url <- function(package) { remote_urls(package)$article } # Retrieve remote metadata ------------------------------------------------ remote_metadata <- function(package) { # Is the metadata installed with the package? 
meta <- local_metadata(package) if (!is.null(meta)) { return(meta) } # Otherwise, look in package websites, caching since this is a slow operation tempdir <- Sys.getenv("RMARKDOWN_PREVIEW_DIR", unset = tempdir()) dir.create(file.path(tempdir, "downlit"), showWarnings = FALSE) cache_path <- file.path(tempdir, "downlit", package) if (file.exists(cache_path)) { readRDS(cache_path) } else { meta <- remote_metadata_slow(package) saveRDS(meta, cache_path) meta } } local_metadata <- function(package) { local_path <- system.file("pkgdown.yml", package = package) if (local_path == "") { NULL } else { yaml::read_yaml(local_path) } } remote_metadata_slow <- function(package) { urls <- package_urls(package) for (url in urls) { url <- paste0(url, "/pkgdown.yml") yaml <- tryCatch(fetch_yaml(url), error = function(e) NULL) if (is.list(yaml)) { if (has_name(yaml, "articles")) { yaml$articles <- unlist(yaml$articles) } if (!has_name(yaml, "urls")) { base_url <- dirname(url) yaml$urls <- list( reference = paste0(base_url, "/reference"), article = paste0(base_url, "/articles") ) } return(yaml) } } NULL } fetch_yaml <- function(url) { path <- tempfile() if (suppressWarnings(utils::download.file(url, path, quiet = TRUE) != 0)) { abort("Failed to download") } # This call may warn if the URL doesn't have a final LF; # see pkgdown issue #1419 suppressWarnings(yaml::read_yaml(path)) } # Helpers ----------------------------------------------------------------- package_urls <- function(package, repos = getOption("repos")) { if (package == "") { # if e.g. 
library(a$pkg) then pkg is "" return(character()) } path <- system.file("DESCRIPTION", package = package) if (path != "") { # If the package is installed, use its DESCRIPTION url <- read.dcf(path, fields = "URL")[[1]] } else { # Otherwise try repo metadata, always trying CRAN last user_repos <- repos[names2(repos) != "CRAN"] meta <- c(lapply(user_repos, repo_urls), list(CRAN_urls())) urls <- unlist(lapply(meta, function(pkgs) pkgs$URL[match(package, pkgs[["Package"]])])) # Take first non-NA (if any) url <- urls[!is.na(urls)] if (all(is.na(urls))) { url <- NA_character_ } else { url <- urls[!is.na(urls)][[1]] } } parse_urls(url) } parse_urls <- function(x) { urls <- trimws(strsplit(trimws(x), "[,\\s]+", perl = TRUE)[[1]]) urls <- urls[grepl("^http", urls)] sub_special_cases(urls) } # Both memoised in .onLoad repo_urls <- function(repo) { # Only works if the repo exposes the URL field in PACKAGES, and most don't as.data.frame( utils::available.packages(repos = repo, fields = "URL"), stringsAsFactors = FALSE ) } CRAN_urls <- function() { # Substantially faster to use RStudio mirror: in my testing this reduced # download time from ~2s to 0.6s withr::local_envvar(R_CRAN_WEB = "https://cran.rstudio.com") tools::CRAN_package_db() } # All rOpenSci repositories have a known pkgdown URL. # Todo: could generalise this concept for other orgs. sub_special_cases <- function(urls){ sub("^https?://github.com/ropensci/(\\w+).*$", "https://docs.ropensci.org/\\1", urls) } downlit/R/link.R0000644000175000017500000002021014136771000013337 0ustar nileshnilesh#' Automatically link inline code #' #' @param text String of code to highlight and link. #' @return #' If `text` is linkable, an HTML link for `autolink()`, and or just #' the URL for `autolink_url()`. Both return `NA` if the text is not #' linkable. 
#' @inheritSection highlight Options #' @export #' @examples #' autolink("stats::median()") #' autolink("vignette('grid', package = 'grid')") #' #' autolink_url("stats::median()") autolink <- function(text) { href <- autolink_url(text) if (identical(href, NA_character_)) { return(NA_character_) } paste0("", escape_html(text), "") } #' @export #' @rdname autolink autolink_url <- function(text) { expr <- safe_parse(text) if (length(expr) == 0) { return(NA_character_) } href_expr(expr[[1]]) } autolink_curly <- function(text) { package_name <- extract_curly_package(text) if (is.na(package_name)) { return(NA_character_) } href <- href_package(package_name) if (is.na(href)) { return(NA_character_) } paste0("", package_name, "") } # Helper for testing href_expr_ <- function(expr, ...) { href_expr(substitute(expr), ...) } href_expr <- function(expr) { if (!is_call(expr)) { return(NA_character_) } fun <- expr[[1]] if (is_call(fun, "::", n = 2)) { pkg <- as.character(fun[[2]]) fun <- fun[[3]] } else { pkg <- NULL } if (!is_symbol(fun)) return(NA_character_) fun_name <- as.character(fun) n_args <- length(expr) - 1 if (fun_name %in% c("library", "require", "requireNamespace")) { if (n_args == 1 && is.null(names(expr))) { pkg <- as.character(expr[[2]]) topic <- href_package(pkg) if (is.na(topic)) { href_topic(fun_name) } else { topic } } else { href_topic(fun_name) } } else if (fun_name == "vignette") { if (length(expr) == 1) { return(href_topic(fun_name)) } expr <- call_standardise(expr) topic_ok <- is.character(expr$topic) package_ok <- is.character(expr$package) || is.null(expr$package) if (topic_ok && package_ok) { href_article(expr$topic, expr$package) } else { NA_character_ } } else if (fun_name == "?") { if (n_args == 1) { topic <- expr[[2]] if (is_call(topic, "::")) { # ?pkg::x href_topic(as.character(topic[[3]]), as.character(topic[[2]])) } else if (is_symbol(topic) || is_string(topic)) { # ?x href_topic(as.character(expr[[2]])) } else { NA_character_ } } else if 
(n_args == 2) { # package?x href_topic(paste0(expr[[3]], "-", expr[[2]])) } } else if (fun_name == "help") { expr <- call_standardise(expr) if (is_help_literal(expr$topic) && is_help_literal(expr$package)) { href_topic(as.character(expr$topic), as.character(expr$package)) } else if (is_help_literal(expr$topic) && is.null(expr$package)) { href_topic(as.character(expr$topic)) } else if (is.null(expr$topic) && is_help_literal(expr$package)) { href_package_ref(as.character(expr$package)) } else { NA_character_ } } else if (fun_name == "::") { href_topic(as.character(expr[[3]]), as.character(expr[[2]])) } else if (n_args == 0) { href_topic(fun_name, pkg) } else { NA_character_ } } is_help_literal <- function(x) is_string(x) || is_symbol(x) # Topics ------------------------------------------------------------------ #' Generate url for topic/article/package #' #' @param topic,article Topic/article name #' @param package Optional package name #' @keywords internal #' @export #' @return URL topic or article; `NA` if can't find one. 
#' @examples #' href_topic("t") #' href_topic("DOESN'T EXIST") #' href_topic("href_topic", "downlit") #' #' href_package("downlit") href_topic <- function(topic, package = NULL) { if (is_package_local(package)) { href_topic_local(topic) } else { href_topic_remote(topic, package) } } is_package_local <- function(package) { if (is.null(package)) { return(TRUE) } cur <- getOption("downlit.package") if (is.null(cur)) { return(FALSE) } package == cur } href_topic_local <- function(topic) { rdname <- find_rdname(NULL, topic) if (is.null(rdname)) { # Check attached packages loc <- find_rdname_attached(topic) if (is.null(loc)) { return(NA_character_) } else { return(href_topic_remote(topic, loc$package)) } } if (rdname == "reexports") { return(href_topic_reexported(topic, getOption("downlit.package"))) } cur_rdname <- getOption("downlit.rdname", "") if (rdname == cur_rdname) { return(NA_character_) } if (cur_rdname != "") { paste0(rdname, ".html") } else { paste0(getOption("downlit.topic_path"), rdname, ".html") } } href_topic_remote <- function(topic, package) { rdname <- find_rdname(package, topic) if (is.null(rdname)) { return(NA_character_) } if (rdname == "reexports") { return(href_topic_reexported(topic, package)) } paste0(href_package_ref(package), "/", rdname, ".html") } # If it's a re-exported function, we need to work a little harder to # find out its source so that we can link to it. href_topic_reexported <- function(topic, package) { ns <- ns_env(package) exports <- .getNamespaceInfo(ns, "exports") if (!env_has(exports, topic)) { NA_character_ } else { obj <- env_get(ns, topic, inherit = TRUE) package <- find_reexport_source(obj, ns, topic) href_topic_remote(topic, package) } } find_reexport_source <- function(obj, ns, topic) { if (is.primitive(obj)) { # primitive functions all live in base "base" } else if (is.function(obj)) { ## For functions, we can just take their environment. 
ns_env_name(get_env(obj)) } else { ## For other objects, we need to check the import env of the package, ## to see where 'topic' is coming from. The import env has redundant ## information. It seems that we just need to find a named list ## entry that contains `topic`. imp <- getNamespaceImports(ns) imp <- imp[names(imp) != ""] wpkgs <- vapply(imp, `%in%`, x = topic, FUN.VALUE = logical(1)) if (!any(wpkgs)) { return(NA_character_) } pkgs <- names(wpkgs)[wpkgs] # Take the last match, in case imports have name clashes. pkgs[[length(pkgs)]] } } # Articles ---------------------------------------------------------------- #' @export #' @rdname href_topic href_article <- function(article, package = NULL) { if (is_package_local(package)) { path <- find_article(NULL, article) if (!is.null(path)) { return(paste0(getOption("downlit.article_path"), path)) } } if (is.null(package)) { package <- find_vignette_package(article) if (is.null(package)) { return(NA_character_) } } path <- find_article(package, article) if (is.null(path)) { return(NA_character_) } base_url <- remote_package_article_url(package) if (is.null(base_url)) { paste0("https://cran.rstudio.com/web/packages/", package, "/vignettes/", path) } else { paste0(base_url, "/", path) } } # Try to figure out package name from attached packages find_vignette_package <- function(x) { for (pkg in getOption("downlit.attached")) { if (!is_installed(pkg)) { next } info <- tools::getVignetteInfo(pkg) if (x %in% info[, "Topic"]) { return(pkg) } } NULL } # Packages ---------------------------------------------------------------- #' @export #' @rdname href_topic href_package <- function(package) { urls <- package_urls(package) if (length(urls) == 0) { NA_character_ } else { urls[[1]] } } href_package_ref <- function(package) { reference_url <- remote_package_reference_url(package) if (!is.null(reference_url)) { reference_url } else { # Fall back to rdrr.io if (is_base_package(package)) { paste0("https://rdrr.io/r/", package) } 
else {
      paste0("https://rdrr.io/pkg/", package, "/man")
    }
  }
}

is_base_package <- function(x) {
  x %in% c(
    "base", "compiler", "datasets", "graphics", "grDevices", "grid",
    "methods", "parallel", "splines", "stats", "stats4", "tcltk",
    "tools", "utils"
  )
}
downlit/R/downlit-html.R0000644000175000017500000000605314136771000015035 0ustar nileshnilesh#' Syntax highlight and link an HTML page
#'
#' @description
#' * Code blocks, identified by `<pre>` tags with class `sourceCode r`
#'   or any `<pre>` tag inside of `<div class="downlit">`, are
#'   processed with [highlight()].
#'
#' * Inline code, identified by `<code>` tags that contain only text
#'   (and don't have a header tag (e.g. `<h1>`) or `<a>` as an ancestor)
#'   are processed with [autolink()].
#'
#' Use `downlit_html_path()` to process an `.html` file on disk;
#' use `downlit_html_node()` to process an in-memory `xml_node` as part of a
#' larger pipeline.
#'
#' @param in_path,out_path Input and output paths for HTML file
#' @inheritParams highlight
#' @param x An `xml2::xml_node`
#' @return `downlit_html_path()` invisibly returns `output_path`;
#'   `downlit_html_node()` modifies `x` in place and returns nothing.
#' @export
#' @examples
#' node <- xml2::read_xml("<p><code>base::t()</code></p>")
#' node
#'
#' # node is modified in place
#' downlit_html_node(node)
#' node
downlit_html_path <- function(in_path, out_path, classes = classes_pandoc()) {
  if (!is_installed("xml2")) {
    abort("xml2 package required for .html transformation")
  }

  html <- xml2::read_html(in_path, encoding = "UTF-8")
  downlit_html_node(html, classes = classes)

  xml2::write_html(html, out_path, format = FALSE)
  invisible(out_path)
}

#' @export
#' @rdname downlit_html_path
downlit_html_node <- function(x, classes = classes_pandoc()) {
  stopifnot(inherits(x, "xml_node"))

  xpath <- c(
    # Usual block generated by pandoc (after syntax highlighting)
    ".//pre[contains(@class, 'sourceCode r')]",
    # Special case that allows us to override the usual rules if needed
    ".//div[contains(@class, 'downlit')]//pre"
  )
  xpath_block <- paste(xpath, collapse = "|")

  tweak_children(x, xpath_block, highlight,
    pre_class = "downlit sourceCode r",
    classes = classes,
    replace = "node",
    code = TRUE
  )

  # Identify <code> containing only text (i.e. no children)
  # that are not descendants of an element where links are undesirable
  bad_ancestor <- c("h1", "h2", "h3", "h4", "h5", "a", "summary")
  bad_ancestor <- paste0("ancestor::", bad_ancestor, collapse = "|")
  xpath_inline <- paste0(".//code[count(*) = 0 and not(", bad_ancestor, ")]")

  # replace inline code "{packagename}" with linked text if possible
  tweak_children(x, xpath_inline, autolink_curly, replace = "node")
  # handle remaining inline code
  tweak_children(x, xpath_inline, autolink, replace = "contents")

  invisible()
}

tweak_children <- function(node, xpath, fun, ..., replace = c("node", "contents")) {
  replace <- arg_match(replace)
  nodes <- xml2::xml_find_all(node, xpath)
  text <- xml2::xml_text(nodes)
  replacement <- map_chr(text, fun, ...)
to_update <- !is.na(replacement) old <- nodes[to_update] if (replace == "contents") { old <- xml2::xml_contents(old) } new <- lapply(replacement[to_update], as_xml) xml2::xml_replace(old, new, .copy = FALSE) invisible() } as_xml <- function(x) { xml2::xml_contents(xml2::xml_contents(xml2::read_html(x)))[[1]] } downlit/R/topic-index.R0000644000175000017500000000310514042056540014632 0ustar nileshnilesh# Compute topic index ----------------------------------------------------- # The topic index is a character vector that maps aliases to Rd file names # (sans extension). Memoised for performance. topic_index <- function(package) { if (is.null(package)) { getOption("downlit.topic_index") } else if (devtools_loaded(package)) { # Use live docs for in-development packages topic_index_source(package) } else { topic_index_installed(package) } } topic_index_source <- function(package) { path <- file.path(find.package(package), "man") if (!file.exists(path)) { return(character()) } rd <- package_rd(path) aliases <- lapply(rd, extract_alias) names(aliases) <- gsub("\\.Rd$", "", names(rd)) unlist(invert_index(aliases)) } topic_index_installed <- function(package) { path <- system.file("help", "aliases.rds", package = package) if (path == "") return(character()) readRDS(path) } # A helper that can warn if the topic is not found find_rdname <- function(package, topic, warn_if_not_found = FALSE) { index <- topic_index(package) if (has_name(index, topic)) { index[[topic]] } else { if (warn_if_not_found) { warn(paste0("Failed to find topic `", topic, "`")) } NULL } } find_rdname_attached <- function(topic) { packages <- c( getOption("downlit.attached"), c("datasets", "utils", "grDevices", "graphics", "stats", "base") ) for (package in packages) { rdname <- find_rdname(package, topic) if (!is.null(rdname)) { return(list(rdname = rdname, package = package)) } } NULL } downlit/R/packages.R0000644000175000017500000000335714044304043014172 0ustar nileshnileshextract_package_attach <- 
function(expr) {
  if (is.expression(expr)) {
    packages <- lapply(expr, extract_package_attach)
    unlist(packages)
  } else if (is_call(expr)) {
    if (is_call(expr, c("library", "require"))) {
      expr <- call_standardise(expr)
      if (!is_true(expr$character.only)) {
        as.character(expr$package)
      } else {
        character()
      }
    } else {
      args <- as.list(expr[-1])
      unlist(lapply(args, extract_package_attach))
    }
  } else {
    character()
  }
}

# Helper for testing
extract_package_attach_ <- function(expr) {
  extract_package_attach(enexpr(expr))
}

register_attached_packages <- function(packages) {
  packages <- add_depends(packages)
  options("downlit.attached" = union(packages, getOption("downlit.attached")))
}

add_depends <- function(packages) {
  if ("tidyverse" %in% packages && is_installed("tidyverse")) {
    core <- getNamespace("tidyverse")$core
    packages <- union(packages, core)
  }

  # add packages attached by depends
  depends <- unlist(lapply(packages, package_depends))
  union(packages, depends)
}

package_depends <- function(package) {
  if (!is_installed(package)) {
    return(character())
  }

  if (!is.null(devtools_meta(package))) {
    # read the DESCRIPTION of the queried package directly
    path_desc <- system.file("DESCRIPTION", package = package)
    deps <- desc::desc_get_deps(path_desc)
    depends <- deps$package[deps$type == "Depends"]
    depends <- depends[depends != "R"]
    return(depends)
  }

  path_meta <- system.file("Meta", "package.rds", package = package)
  meta <- readRDS(path_meta)
  names(meta$Depends)
}

# from https://github.com/r-lib/pkgdown/blob/8e0838e273462cec420dfa20f240c684a33425d9/R/utils.r#L62
devtools_meta <- function(x) {
  ns <- .getNamespace(x)
  ns[[".__DEVTOOLS__"]]
}
downlit/R/highlight.R0000644000175000017500000002363314136771000014365 0ustar nileshnilesh#' Highlight and link a code block
#'
#' @description
#' This function:
#' * syntax highlights code
#' * links function calls to their documentation (where possible)
#' * in comments, translates ANSI escapes into HTML equivalents.
#'
#' # Options
#'
#' downlit provides a number of options to control the details of the linking.
#' They are particularly important if you want to generate "local" links.
#'
#' * `downlit.package`: name of the current package. Determines when
#'   `topic_index` and `article_index` are used.
#'
#' * `downlit.topic_index` and `downlit.article_index`: named character
#'   vector that maps from topic/article name to path.
#'
#' * `downlit.rdname`: name of current Rd file being documented (if any);
#'   used to avoid self-links.
#'
#' * `downlit.attached`: character vector of currently attached R packages.
#'
#' * `downlit.local_packages`: named character vector providing relative
#'   paths (value) to packages (name) that can be reached with relative links
#'   from the target HTML document.
#'
#' * `downlit.topic_path` and `downlit.article_path`: paths to reference
#'   topics and articles/vignettes relative to the "current" file.
#' @export
#' @param text String of code to highlight and link.
#' @param classes A mapping between token names and CSS class names.
#'   Bundled `classes_pandoc()` and `classes_chroma()` provide mappings
#'   that (roughly) match Pandoc and chroma (used by hugo) classes so you
#'   can use existing themes.
#' @param pre_class Class(es) to give output `<pre>`.
#' @param code If `TRUE`, wrap output in a `<code>` block.
#' @return A string containing syntax highlighted HTML or `NA` (if `text`
#'   isn't parseable).
#' @examples
#' cat(highlight("1 + 1"))
#' cat(highlight("base::t(1:3)"))
#'
#' # Unparseable R code returns NA
#' cat(highlight("base::t("))
highlight <- function(text, classes = classes_chroma(), pre_class = NULL, code = FALSE) {
  parsed <- parse_data(text)
  if (is.null(parsed)) {
    return(NA_character_)
  }

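  # Illustrative walk-through (values assumed, not computed here): for
  # text = "base::t(1:3)", getParseData() yields SYMBOL_PACKAGE ("base"),
  # NS_GET ("::"), SYMBOL_FUNCTION_CALL ("t"), ... With classes_chroma(),
  # token_class() maps both names to "nf"; token_href() resolves "t" via
  # href_topic("t", "base"); style_token() then emits something like
  #   <span class='nf'><a href='https://rdrr.io/r/base/t.html'>t</a></span>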
  # Figure out which packages are attached to the search path. This is a
  # hack because loading a package will affect code _before_ the library()
  # call. But it should lead to relatively few false positives and is simple.
  packages <- extract_package_attach(parsed$expr)
  register_attached_packages(packages)
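  # Illustrative: extract_package_attach_(library(MASS)) returns "MASS", so
  # register_attached_packages() adds it (plus its Depends) to
  # getOption("downlit.attached"), which later linking steps consult.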

  # Highlight, link, and escape
  out <- parsed$data
  out$class <- token_class(out$token, out$text, classes)
  out$href <- token_href(out$token, out$text)
  out$escaped <- token_escape(out$token, out$text)

  # Update input - basic idea from prettycode
  changed <- !is.na(out$href) | !is.na(out$class) | out$text != out$escaped
  changes <- out[changed, , drop = FALSE]

  loc <- line_col(parsed$text)
  start <- vctrs::vec_match(data.frame(line = changes$line1, col = changes$col1), loc)
  end <- vctrs::vec_match(data.frame(line = changes$line2, col = changes$col2), loc)

  new <- style_token(changes$escaped, changes$href, changes$class)
  out <- replace_in_place(parsed$text, start, end, replacement = new)
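  # replace_in_place() splices the styled tokens back into the source by
  # character position; a minimal hypothetical call:
  #   replace_in_place("a + b", start = 1, end = 1, replacement = "<b>a</b>")
  #   #> "<b>a</b> + b"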

  if (!is.null(pre_class)) {
    out <- paste0(
      "<pre class='", pre_class, "'>\n",
      if (code) paste0("<code>"),
      out,
      if (code) paste0("</code>"),
      "</pre>"
    )
  }
  Encoding(out) <- "UTF-8"
  out
}

style_token <- function(x, href = NA, class = NA) {
  x <- ifelse(is.na(href), x, paste0("<a href='", href, "'>", x, "</a>"))
  x <- ifelse(is.na(class), x, paste0("<span class='", class, "'>", x, "</span>"))
  x
}

# From prettycode:::replace_in_place
replace_in_place <- function(str, start, end, replacement) {
  stopifnot(
    length(str) == 1,
    length(start) == length(end),
    length(end) == length(replacement)
  )

  keep <- substring(str, c(1, end + 1), c(start - 1, nchar(str)))

  pieces <- character(length(replacement) * 2 + 1)
  even <- seq_along(replacement) * 2
  odd <- c(1, even + 1)
  pieces[even] <- replacement
  pieces[odd] <- keep

  paste0(pieces, collapse = "")
}

line_col <- function(x) {
  char <- strsplit(x, "")[[1]]
  nl <- char == "\n"

  line <- cumsum(c(TRUE, nl[-length(char)]))
  col <- sequence(rle(line)$lengths)

  data.frame(line, col)
}

parse_data <- function(text) {
  text <- standardise_text(text)
  stopifnot(is.character(text), length(text) == 1)

  expr <- safe_parse(text, standardise = FALSE)
  if (is.null(expr)) {
    return(NULL)
  }

  list(text = text, expr = expr, data = utils::getParseData(expr))
}

# Highlighting ------------------------------------------------------------

token_class <- function(token, text, classes) {
  token <- token_type(token, text)
  unname(classes[token])
}

# Collapse token types to a smaller set of categories that we care about
# for syntax highlighting
# https://github.com/wch/r-source/blob/trunk/src/main/gram.c#L511
token_type <- function(x, text) {
  special <- c(
    "FUNCTION", "FOR", "IN", "BREAK", "NEXT", "REPEAT", "WHILE", "IF", "ELSE"
  )
  rstudio_special <- c(
    "return", "switch", "try", "tryCatch", "stop", "warning", "require",
    "library", "attach", "detach", "source", "setMethod", "setGeneric",
    "setGroupGeneric", "setClass", "setRefClass", "R6Class", "UseMethod",
    "NextMethod"
  )
  x[x %in% special] <- "special"
  x[x == "SYMBOL_FUNCTION_CALL" & text %in% rstudio_special] <- "special"

  infix <- c(
    # algebra
    "'-'", "'+'", "'~'", "'*'", "'/'", "'^'",
    # comparison
    "LT", "GT", "EQ", "GE", "LE", "NE",
    # logical
    "'!'", "AND", "AND2", "OR", "OR2",
    # assignment / equals
    "LEFT_ASSIGN", "RIGHT_ASSIGN", "EQ_ASSIGN", "EQ_FORMALS",
"EQ_SUB", # miscellaneous "'$'", "'@'","'~'", "'?'", "':'", "SPECIAL" ) x[x %in% infix] <- "infix" parens <- c("LBB", "'['", "']'", "'('", "')'", "'{'", "'}'") x[x %in% parens] <- "parens" # Matches treatment of constants in RStudio constant <- c( "NA", "Inf", "NaN", "TRUE", "FALSE", "NA_integer_", "NA_real_", "NA_character_", "NA_complex_" ) x[x == "NUM_CONST" & text %in% constant] <- "constant" x[x == "SYMBOL" & text %in% c("T", "F")] <- "constant" x[x == "NULL_CONST"] <- "constant" x } # Pandoc styles are based on KDE default styles: # https://docs.kde.org/stable5/en/applications/katepart/highlight.html#kate-highlight-default-styles # But in HTML use two letter abbreviations: # https://github.com/jgm/skylighting/blob/a1d02a0db6260c73aaf04aae2e6e18b569caacdc/skylighting-core/src/Skylighting/Format/HTML.hs#L117-L147 # Summary at # https://docs.google.com/spreadsheets/d/1JhBtQSCtQ2eu2RepLTJONFdLEnhM3asUyMMLYE3tdYk/edit#gid=0 # # Default syntax highlighting def for R: # https://github.com/KDE/syntax-highlighting/blob/master/data/syntax/r.xml #' @export #' @rdname highlight classes_pandoc <- function() { c( "constant" = "cn", "NUM_CONST" = "fl", "STR_CONST" = "st", "special" = "kw", "parens" = "op", "infix" = "op", "SLOT" = "va", "SYMBOL" = "va", "SYMBOL_FORMALS" = "va", "NS_GET" = "fu", "NS_GET_INT" = "fu", "SYMBOL_FUNCTION_CALL" = "fu", "SYMBOL_PACKAGE" = "fu", "COMMENT" = "co" ) } # Derived from https://github.com/ropensci/roweb2/blob/master/themes/ropensci/static/css/pygments.css #' @export #' @rdname highlight classes_chroma <- function() { c( "constant" = "kc", "NUM_CONST" = "m", "STR_CONST" = "s", "special" = "kr", "parens" = "o", "infix" = "o", "SLOT" = "nv", "SYMBOL" = "nv", "SYMBOL_FORMALS" = "nv", "NS_GET" = "nf", "NS_GET_INT" = "nf", "SYMBOL_FUNCTION_CALL" = "nf", "SYMBOL_PACKAGE" = "nf", "COMMENT" = "c" ) } classes_show <- function(x, classes = classes_pandoc()) { text <- paste0(deparse(substitute(x)), collapse = "\n") out <- parse_data(text)$data 
out$class <- token_class(out$token, out$text, classes) out$class[is.na(out$class)] <- "" out <- out[out$terminal, c("token", "text", "class")] rownames(out) <- NULL out } # Linking ----------------------------------------------------------------- token_href <- function(token, text) { href <- rep(NA, length(token)) to_end <- length(token) - seq_along(token) + 1 # Highlight namespaced function calls. In the parsed tree, these are # SYMBOL_PACKAGE then NS_GET/NS_GET_INT then SYMBOL_FUNCTION_CALL/SYMBOL ns_pkg <- which(token %in% "SYMBOL_PACKAGE" & to_end > 2) ns_fun <- ns_pkg + 2L href[ns_fun] <- map2_chr(text[ns_fun], text[ns_pkg], href_topic) # Then highlight all remaining calls, using loaded packages registered # above. These maintained at a higher-level, because (e.g) in .Rmds you want # earlier library() statements to affect the highlighting of later blocks fun <- which(token %in% "SYMBOL_FUNCTION_CALL") fun <- setdiff(fun, ns_fun) fun <- fun[token[fun-1] != "'$'"] # Include custom infix operators fun <- c(fun, which(token %in% "SPECIAL")) # Highlight R6 instantiation r6_new_call <- which( text == "new" & token == "SYMBOL_FUNCTION_CALL" ) r6_new_call <- r6_new_call[token[r6_new_call - 1] == "'$'"] r6_new_call <- r6_new_call[token[r6_new_call - 3] == "SYMBOL"] fun <- c(fun, r6_new_call - 3) href[fun] <- map_chr(text[fun], href_topic_local) # Highlight packages lib_call <- which( token == "SYMBOL_FUNCTION_CALL" & text %in% c("library", "require") & to_end > 3 ) pkg <- lib_call + 3 # expr + '(' + STR_CONST href[pkg] <- map_chr(gsub("['\"]", "", text[pkg]), href_package) href } map_chr <- function(.x, .f, ...) { vapply(.x, .f, ..., FUN.VALUE = character(1), USE.NAMES = FALSE) } map2_chr <- function(.x, .y, .f, ...) 
{
  vapply(seq_along(.x), function(i) .f(.x[[i]], .y[[i]], ...), character(1))
}

# Escaping ----------------------------------------------------------------

token_escape <- function(token, text) {
  text <- escape_html(text)

  is_comment <- token == "COMMENT"
  text[is_comment] <- fansi::sgr_to_html(text[is_comment])

  text
}

escape_html <- function(x) {
  x <- gsub("&", "&amp;", x)
  x <- gsub("<", "&lt;", x)
  x <- gsub(">", "&gt;", x)
  x
}
downlit/R/test-helpers.R0000644000175000017500000000054614136776231015036 0ustar nileshnileshlocal_devtools_package <- function(path, ..., env = parent.frame()) {
  pkgload::load_all(path, ..., quiet = TRUE)
  defer(pkgload::unload(pkgload::pkg_name(path)), scope = env)
}

defer <- function(expr, scope = parent.frame()) {
  expr <- enquo(expr)
  call <- expr(on.exit(rlang::eval_tidy(!!expr), add = TRUE))
  eval_bare(call, scope)
  invisible()
}
downlit/R/downlit-package.R0000644000175000017500000000033314122441316015466 0ustar nileshnilesh#' @keywords internal
"_PACKAGE"

# The following block is used by usethis to automatically manage
# roxygen namespace tags. Modify with care!
## usethis namespace: start
#' @import rlang
## usethis namespace: end
NULL
downlit/R/utils.R0000644000175000017500000000417214136771000013553 0ustar nileshnileshis_infix <- function(x) {
  ops <- c(
    "::", ":::", "$", "@", "[", "[[", "^", "-", "+", ":", "*", "/",
    "<", ">", "<=", ">=", "==", "!=", "!", "&", "&&", "|", "||", "~",
    "->", "->>", "<-", "<<-", "=", "?"
) grepl("^%.*%$", x) || x %in% ops } is_prefix <- function(x) { if (is_infix(x)) { return(FALSE) } special <- c( "(", "{", "if", "for", "while", "repeat", "next", "break", "function" ) if (x %in% special) { return(FALSE) } TRUE } devtools_loaded <- function(x) { if (!x %in% loadedNamespaces()) { return(FALSE) } ns <- .getNamespace(x) env_has(ns, ".__DEVTOOLS__") } invert_index <- function(x) { stopifnot(is.list(x)) if (length(x) == 0) return(list()) key <- rep(names(x), lengths(x)) val <- unlist(x, use.names = FALSE) split(key, val) } standardise_text <- function(x) { x <- enc2utf8(x) x <- gsub("\t", " ", x, fixed = TRUE, useBytes = TRUE) x <- gsub("\r", "", x, fixed = TRUE, useBytes = TRUE) # \033 can't be represented in xml (and hence is ignored by xml2) # so we convert to \u2029 in order to survive a round trip x <- gsub("\u2029", "\033", x, fixed = TRUE, useBytes = TRUE) x } safe_parse <- function(text, standardise = TRUE) { if (standardise) { text <- standardise_text(text) } lines <- strsplit(text, "\n", fixed = TRUE, useBytes = TRUE)[[1]] srcfile <- srcfilecopy("test.r", lines) # https://github.com/gaborcsardi/rencfaq#how-to-parse-utf-8-text-into-utf-8-code Encoding(text) <- "unknown" con <- textConnection(text) on.exit(close(con), add = TRUE) tryCatch( parse(con, keep.source = TRUE, encoding = "UTF-8", srcfile = srcfile), error = function(e) NULL ) } extract_curly_package <- function(x) { # regex adapted from https://github.com/r-lib/usethis/blob/d5857737b4780c3c3d8fe6fb44ef70e81796ac8e/R/description.R#L134 if (! 
grepl("^\\{[a-zA-Z][a-zA-Z0-9.]+\\}$", x)) { return(NA) } # remove first curly brace x <- sub("\\{", "", x) # remove second curly brace and return x <- sub("\\}", "", x) x } show_xml <- function(x) { cat(as.character(x, options = c("format", "no_declaration"))) } downlit/R/downlit-md.R0000644000175000017500000001576614042074111014477 0ustar nileshnilesh#' Syntax highlight and link a md document #' #' @description #' `downlit_md_*` works by traversing the markdown AST generated by Pandoc. #' It applies [highlight()] to `CodeBlock`s and [autolink()] to inline `Code`. #' #' Use `downlit_md_path()` to transform a file on disk; use #' `downlit_md_string()` to transform a string containing markdown as part #' of a larger pipeline. #' #' Needs pandoc 1.19 or later. #' #' @export #' @param in_path,out_path Input and output paths for markdown file. #' @param x A string containing markdown. #' @param format Pandoc format; defaults to "gfm" if you have pandoc 2.0.0 or #' greater, otherwise "markdown_github". #' @return `downlit_md_path()` invisibly returns `output_path`; #' `downlit_md_string()` returns a string containing markdown. 
#' @examples #' if (rmarkdown::pandoc_available("1.19")) { #' downlit_md_string("`base::t()`") #' downlit_md_string("`base::t`") #' downlit_md_string("* `base::t`") #' #' # But don't highlight in headings #' downlit_md_string("## `base::t`") #' } downlit_md_path <- function(in_path, out_path, format = NULL) { check_packages() ast_path <- tempfile() on.exit(unlink(ast_path)) md2ast(in_path, ast_path, format = format) ast <- jsonlite::read_json(ast_path) ast$blocks <- transform_code(ast$blocks, ast_version(ast)) jsonlite::write_json(ast, ast_path, auto_unbox = TRUE, null = "null") ast2md(ast_path, out_path, format = format) } #' @export #' @rdname downlit_md_path downlit_md_string <- function(x, format = NULL) { check_packages() path <- tempfile() on.exit(unlink(path)) brio::write_lines(x, path) downlit_md_path(path, path, format = format) brio::read_file(path) } # Markdown <-> pandoc AST ------------------------------------------------- md2ast <- function(path, out_path, format = NULL) { format <- format %||% md_format() rmarkdown::pandoc_convert( input = normalizePath(path, mustWork = FALSE), output = normalizePath(out_path, mustWork = FALSE), from = format, to = "json" ) invisible(out_path) } ast2md <- function(path, out_path, format = NULL) { format <- format %||% md_format() options <- c( if (rmarkdown::pandoc_available("2.0")) "--eol=lf", if (rmarkdown::pandoc_version() < "2.11.2") "--atx-headers", # 1.19-2.11.2 "--wrap=none" # 1.16 ) rmarkdown::pandoc_convert( input = normalizePath(path, mustWork = FALSE), output = normalizePath(out_path, mustWork = FALSE), from = "json", to = format, options = options ) invisible(out_path) } ast_version <- function(ast) { string <- paste(unlist(ast$`pandoc-api-version`), collapse = ".") package_version(string) } md_format <- function() { if (rmarkdown::pandoc_available("2.0.0")) { "gfm" } else { "markdown_github" } } # Code transformation ----------------------------------------------------- # Data types at # 
https://hackage.haskell.org/package/pandoc-types-1.20/docs/Text-Pandoc-Definition.html transform_code <- function(x, version) { stopifnot(is.list(x)) # Blocks that are a list of blocks block_list <- c( # Block "Plain", "Para", "LineBlock", "BlockQuote", "BulletList", # Inline "Emph", "Strong", "Strikeout", "Superscript", "Subscript", "SmallCaps", "Note", "Underline" ) # Blocks that have a list of blocks as second child block_list2 <- c( "OrderedList", "Quoted", "Div", "Span", "Caption", "TableHead", "TableFoot", "Row" ) skip <- c( "Header", "CodeBlock", "RawBlock", "HorizontalRule", "Null", "Math", "RawInline", "Link", "Image", "Cite", "Str", "Space", "SoftBreak", "LineBreak" ) if (!is_named(x)) { lapply(x, transform_code, version = version) } else { if (x$t == "Code") { package_name <- extract_curly_package(x$c[[2]]) # packages à la {pkgname} if(!is.na(package_name)) { href <- href_package(package_name) if (!is.na(href)) { x <- list(t = "Str", c = package_name) x <- pandoc_link(pandoc_attr(), list(x), pandoc_target(href)) } # otherwise we do not touch x } else { # other cases href <- autolink_url(x$c[[2]]) if (!is.na(href)) { x <- pandoc_link(pandoc_attr(), list(x), pandoc_target(href)) } } } else if (x$t == "CodeBlock") { out <- highlight(x$c[[2]], pre_class = "chroma") if (!is.na(out)) { x <- pandoc_raw_block("html", out) } } else if (x$t %in% block_list) { # Plain [Inline] # Para [Inline] # LineBlock [[Inline]] # BlockQuote [Block] # BulletList [[Block]] # Emph [Inline] # Strong [Inline] # Strikeout [Inline] # Superscript [Inline] # Subscript [Inline] # SmallCaps [Inline] # Note [Block] # Underline [Inline] x$c <- lapply(x$c, transform_code, version = version) } else if (x$t %in% block_list2) { # OrderedList ListAttributes [[Block]] # Quoted QuoteType [Inline] # Div Attr [Block] # Span Attr [Inline] # TableHead Attr [Row] # TableFoot Attr [Row] # Caption (Maybe ShortCaption) [Block] x$c[[2]] <- lapply(x$c[[2]], transform_code, version = version) } else if (x$t 
%in% "Table") { if (version >= "1.21") { # Attr Caption [ColSpec] TableHead [TableBody] TableFoot x$c[c(2, 4, 5, 6)] <- lapply(x$c[c(2, 4, 5, 6)], transform_code, version = version) } else { # [Inline] [Alignment] [Double] [TableCell] [[TableCell]] x$c[c(1, 4, 5)] <- lapply(x$c[c(1, 4, 5)], transform_code, version = version) } } else if (x$t %in% "TableBody") { # Attr RowHeadColumns [Row] [Row] x$c[c(3, 4)] <- lapply(x$c[c(3, 4)], transform_code, version = version) } else if (x$t %in% "Cell") { # Attr Alignment RowSpan ColSpan [Block] x$c[[5]] <- lapply(x$c[[5]], transform_code, version = version) } else if (x$t %in% "DefinitionList") { # DefinitionList [([Inline], [[Block]])] x$c <- lapply(x$c, function(x) list( transform_code(x[[1]], version = version), transform_code(x[[2]], version = version) ) ) } else if (x$t %in% skip) { } else { inform(paste0("Unknown type: ", x$t)) } x } } # Pandoc AST constructors ------------------------------------------------- pandoc_node <- function(type, ...) 
{ list(t = type, c = list(...)) } pandoc_raw_block <- function(format, text) { # Format Text pandoc_node("RawBlock", format, text) } pandoc_link <- function(attr, contents, target) { # Attr [Inline] Target pandoc_node("Link", attr, contents, target) } pandoc_attr <- function(id = "", classes = list(), keyval = list()) { list(id, classes, keyval) } pandoc_target <- function(url, title = "") { list(url, title) } # Helpers ----------------------------------------------------------------- check_packages <- function() { if (!is_installed("rmarkdown") || !is_installed("jsonlite")) { abort("rmarkdown and jsonlite required for .md transformation") } } downlit/R/rd.R0000644000175000017500000000107013664033224013016 0ustar nileshnileshpackage_rd <- function(path) { rd <- dir(path, pattern = "\\.[Rr]d$", full.names = TRUE) names(rd) <- basename(rd) lapply(rd, rd_file, pkg_path = dirname(path)) } rd_file <- function(path, pkg_path = NULL) { if (getRversion() >= "3.4.0") { macros <- tools::loadPkgRdMacros(pkg_path) } else { macros <- tools::loadPkgRdMacros(pkg_path, TRUE) } tools::parse_Rd(path, macros = macros, encoding = "UTF-8") } extract_alias <- function(x, tag) { is_alias <- vapply(x, function(x) attr(x, "Rd_tag") == "\\alias", logical(1)) unlist(x[is_alias]) } downlit/R/article-index.R0000644000175000017500000000246213750523135015150 0ustar nileshnilesharticle_index <- function(package) { if (is.null(package)) { getOption("downlit.article_index") } else if (devtools_loaded(package)) { # Use live docs for in-development packages article_index_source(package) } else { article_index_remote(package) } } article_index_source <- function(package) { path <- file.path(find.package(package), "vignettes") if (!file.exists(path)) { return(character()) } vig_path <- dir(path, pattern = "\\.[rR]md$", recursive = TRUE) out_path <- gsub("\\.[rR]md$", ".html", vig_path) vig_name <- gsub("\\.[rR]md$", "", basename(vig_path)) set_names(out_path, vig_name) } article_index_remote <- 
function(package) {
  # Ideally will use published metadata because that includes all articles
  # not just vignettes
  metadata <- remote_metadata(package)
  if (!is.null(metadata) && !is.null(metadata$articles)) {
    return(metadata$articles)
  }

  # Otherwise, fall back to vignette index
  path <- system.file("Meta", "vignette.rds", package = package)
  if (path == "") {
    return(character())
  }

  meta <- readRDS(path)
  name <- tools::file_path_sans_ext(meta$File)
  set_names(meta$PDF, name)
}

find_article <- function(package, name) {
  index <- article_index(package)
  if (has_name(index, name)) {
    index[[name]]
  } else {
    NULL
  }
}
downlit/R/evaluate.R0000644000175000017500000001651614137021240014221 0ustar nileshnilesh#' Evaluate code and syntax highlight the results
#'
#' This function runs `code` and captures the output using
#' [evaluate::evaluate()]. It syntax highlights code with [highlight()], and
#' intermingles it with output.
#'
#' @param code Code to evaluate (as a string).
#' @param fig_save A function with arguments `plot` and `id` that is
#'   responsible for saving `plot` to a file (using `id` to disambiguate
#'   multiple plots in the same chunk). It should return a list with
#'   components `path`, `width`, and `height`.
#' @param env Environment in which to evaluate code; if not supplied,
#'   defaults to a child of the global environment.
#' @param output_handler Custom output handler for [evaluate::evaluate()].
#' @param highlight Optionally suppress highlighting. This is useful for tests.
#' @return A string containing HTML with a `dependencies` attribute
#'   giving any additional htmltools dependencies required to render the HTML.
#' @inheritParams highlight #' @export #' @examples #' cat(evaluate_and_highlight("1 + 2")) #' cat(evaluate_and_highlight("x <- 1:10\nmean(x)")) #' #' # ----------------------------------------------------------------- #' # evaluate_and_highlight() powers pkgdown's documentation formatting so #' # here I include a few examples to make sure everything looks good #' # ----------------------------------------------------------------- #' #' blue <- function(x) paste0("\033[34m", x, "\033[39m") #' f <- function(x) { #' cat("This is some output. My favourite colour is ", blue("blue"), ".\n", sep = "") #' message("This is a message. My favourite fruit is ", blue("blueberries")) #' warning("Now at stage ", blue("blue"), "!") #' } #' f() #' #' plot(1:10) #' #' if (requireNamespace("leaflet", quietly = TRUE)) { #' leaflet::addTiles(leaflet::leaflet()) #' } evaluate_and_highlight <- function(code, fig_save, classes = downlit::classes_pandoc(), env = NULL, output_handler = evaluate::new_output_handler(), highlight = TRUE) { env <- env %||% child_env(global_env()) expr <- evaluate::evaluate(code, child_env(env), new_device = TRUE, output_handler = output_handler) replay_html(expr, fig_save = fig_save, fig_id = unique_id(), classes = classes, highlight = highlight ) } test_evaluate <- function(code, ..., highlight = FALSE) { fig_save <- function(plot, id) { list(path = paste0(id, ".png"), width = 10, height = 10) } cat(evaluate_and_highlight( code, fig_save = fig_save, env = caller_env(), highlight = highlight, ... )) } replay_html <- function(x, ...) UseMethod("replay_html", x) #' @export replay_html.list <- function(x, ...) { # keep only high level plots x <- merge_low_plot(x) html <- lapply(x, replay_html, ...) dependencies <- unlist(lapply(html, attr, "dependencies"), recursive = FALSE) structure( paste0(html, collapse = ""), dependencies = dependencies ) } #' @export replay_html.NULL <- function(x, ...) "" #' @export replay_html.character <- function(x, ...) 
{
  label_output(escape_html(x), "r-out")
}

#' @export
replay_html.source <- function(x, ..., classes, highlight = FALSE) {
  if (highlight) {
    html <- highlight(x$src, classes = classes)
  }
  if (!highlight || is.na(html)) {
    html <- escape_html(x$src)
  }

  label_input(html, "r-in")
}

#' @export
replay_html.warning <- function(x, ...) {
  message <- paste0(span("Warning: ", class = "warning"), escape_html(conditionMessage(x)))
  label_output(message, "r-wrn")
}

#' @export
replay_html.message <- function(x, ...) {
  message <- escape_html(paste0(gsub("\n$", "", conditionMessage(x))))
  label_output(message, "r-msg")
}

#' @export
replay_html.error <- function(x, ...) {
  if (is.null(x$call)) {
    prefix <- "Error:"
  } else {
    prefix <- paste0("Error in ", escape_html(paste0(deparse(x$call), collapse = "")), ":")
  }
  message <- paste0(span(prefix, class = "error"), " ", escape_html(conditionMessage(x)))
  label_output(message, "r-err")
}

#' @export
replay_html.recordedplot <- function(x, fig_save, fig_id, ...) {
  fig <- fig_save(x, fig_id())
  # NOTE: the <img> markup was stripped from this archive copy; the tag
  # below is reconstructed from fig_save()'s documented return value
  # (path, width, height) and may not match the original attribute layout.
  img <- paste0(
    "<img src='", escape_html(fig$path), "'",
    " alt='' width='", fig$width, "' height='", fig$height, "' />"
  )
  paste0(span(img, class = "r-plt img"), "\n")
}

# htmltools ---------------------------------------------------------------

# These will only be encountered if output_handler has a custom value
# callback that returns HTML.

#' @export
replay_html.html <- function(x, ...)
{
  rendered <- htmltools::renderTags(x)
  structure(
    rendered$html,
    dependencies = rendered$dependencies,
    class = "downlit_html"
  )
}

#' @export
replay_html.shiny.tag <- replay_html.html
#' @export
replay_html.shiny.tag.function <- replay_html.html
#' @export
replay_html.shiny.tag.list <- replay_html.html
#' @export
replay_html.htmlwidget <- replay_html.html

# helpers -----------------------------------------------------------------

label_output <- function(x, class) {
  prompt <- span("#>", class = "r-pr")

  lines <- strsplit(x, "\n")[[1]]
  lines <- fansi::sgr_to_html(lines)
  lines <- paste0(prompt, " ", lines)
  lines <- span(lines, class = paste(class, "co"))
  paste0(lines, "\n", collapse = "")
}

label_input <- function(x, class) {
  lines <- strsplit(x, "\n")[[1]]
  lines <- span(lines, class = class)
  paste0(lines, "\n", collapse = "")
}

span <- function(..., class = NULL) {
  # NOTE: the HTML tag literals were stripped from this archive copy; the
  # "<span ...>" / "</span>" strings below are reconstructed.
  paste0(
    "<span", if (!is.null(class)) paste0(" class='", class, "'"), ">",
    ...,
    "</span>"
  )
}

unique_id <- function() {
  i <- 0

  function() {
    i <<- i + 1
    i
  }
}

# Knitr functions ------------------------------------------------------------
# The functions below come from package knitr (Yihui Xie) in file plot.R

# get MD5 digests of recorded plots so that merge_low_plot works
digest_plot = function(x, level = 1) {
  if (inherits(x, "otherRecordedplot")) return(x)
  if (!is.list(x) || level >= 3) return(structure(digest::digest(x), class = "plot_digest"))
  lapply(x, digest_plot, level = level + 1)
}

is_plot_output = function(x) {
  evaluate::is.recordedplot(x) || inherits(x, 'otherRecordedplot')
}

# merge low-level plotting changes
merge_low_plot = function(x, idx = vapply(x, is_plot_output, logical(1L))) {
  idx = which(idx); n = length(idx); m = NULL # store indices that will be removed
  if (n <= 1) return(x)

  # digest of recorded plots
  rp_dg <- lapply(x[idx], digest_plot)

  i1 = idx[1]; i2 = idx[2] # compare plots sequentially
  for (i in 1:(n - 1)) {
    # remove the previous plot and move its index to the next plot
    if (is_low_change(rp_dg[[i]], rp_dg[[i + 1]])) m = c(m, i1)
    i1 = idx[i + 1]
    i2 =
      idx[i + 2]
  }
  if (is.null(m)) x else x[-m]
}

#' Compare two recorded plots
#'
#' @param p1,p2 Plot results
#'
#' @return Logical value indicating whether `p2` is a low-level update of `p1`.
#' @export
is_low_change = function(p1, p2) {
  UseMethod("is_low_change")
}

#' @export
is_low_change.default = function(p1, p2) {
  p1 = p1[[1]]; p2 = p2[[1]] # real plot info is in [[1]]
  if ((n2 <- length(p2)) < (n1 <- length(p1))) return(FALSE) # length must increase
  identical(p1[1:n1], p2[1:n1])
}
downlit/LICENSE0000644000175000017500000000004513731234742013076 0ustar nileshnileshYEAR: 2020
COPYRIGHT HOLDER: RStudio
downlit/inst/0000755000175000017500000000000013731234742013047 5ustar nileshnileshdownlit/inst/WORDLIST0000644000175000017500000000014413731234742014240 0ustar nileshnileshAST
bookdown
chroma
dplyr
dplyr's
fansi
findable
hugo
hugodown
linkable
md
Pandoc
pkgdown
RMarkdown
downlit/NAMESPACE0000644000175000017500000000146414137022063013306 0ustar nileshnilesh# Generated by roxygen2: do not edit by hand

S3method(is_low_change,default)
S3method(replay_html,"NULL")
S3method(replay_html,character)
S3method(replay_html,error)
S3method(replay_html,html)
S3method(replay_html,htmlwidget)
S3method(replay_html,list)
S3method(replay_html,message)
S3method(replay_html,recordedplot)
S3method(replay_html,shiny.tag)
S3method(replay_html,shiny.tag.function)
S3method(replay_html,shiny.tag.list)
S3method(replay_html,source)
S3method(replay_html,warning)
export(autolink)
export(autolink_url)
export(classes_chroma)
export(classes_pandoc)
export(downlit_html_node)
export(downlit_html_path)
export(downlit_md_path)
export(downlit_md_string)
export(evaluate_and_highlight)
export(highlight)
export(href_article)
export(href_package)
export(href_topic)
export(is_low_change)
import(rlang)
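The prefix-comparison rule used by `is_low_change.default()` in `R/evaluate.R` above (a plot is a "low-level update" of its predecessor when the earlier display list is an exact prefix of the later one) can be exercised standalone. This is a minimal sketch: the digest lists below are hypothetical stand-ins for recorded-plot display lists, and the comparison function is copied out of the source so it runs without the package.

```r
# Standalone copy of the comparison rule from is_low_change.default():
# p2 is a low-level update of p1 iff p1's element sequence is a prefix of p2's.
is_low_change_default <- function(p1, p2) {
  p1 <- p1[[1]]; p2 <- p2[[1]]             # real plot info is in [[1]]
  if ((n2 <- length(p2)) < (n1 <- length(p1))) return(FALSE) # length must increase
  identical(p1[1:n1], p2[1:n1])
}

# Hypothetical display-list digests (stand-ins for digest_plot() output):
base_plot  <- list(list("new_page", "points"))
with_lines <- list(list("new_page", "points", "lines")) # same plot + lines()
new_plot   <- list(list("new_page", "rect"))            # a different plot

is_low_change_default(base_plot, with_lines)  # TRUE: only adds to base_plot
is_low_change_default(base_plot, new_plot)    # FALSE: diverges from base_plot
```

This is why `merge_low_plot()` can drop the earlier of two adjacent plots when the comparison returns `TRUE`: the later recording already contains everything the earlier one drew.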