DBItest/inst/doc/test.html

Testing DBI backends

Kirill Müller

2018-01-25

This document shows how to use the DBItest package when implementing a new DBI backend or when applying it to an existing backend. The DBItest package provides a large collection of automated tests.

Testing a new backend

The test cases in the DBItest package are structured very similarly to the sections in the “backend” vignette:

vignette("backend", package = "DBI")

Like the “backend” vignette, this vignette assumes that you are implementing the RKazam package that has a Kazam() function that creates a new DBIDriver instance for connecting to a “Kazam” database.

You can add the tests in the DBItest package incrementally, as you proceed with implementing the various parts of the DBI. The DBItest package builds upon the testthat package. To enable it, run the following in your package directory (after installing or updating devtools):

devtools::use_testthat()
devtools::use_test("DBItest")

This creates, among others, a file test-DBItest.R in the tests/testthat directory. Replace its entire contents with the following:

DBItest::make_context(Kazam(), NULL)
DBItest::test_getting_started()
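
If your database needs deviations from the default behavior assumed by the tests, you can describe them with the tweaks argument of make_context(). The following is only a sketch: it assumes the tweaks argument and the omit_blob_tests tweak apply to your backend; see ?tweaks for the tweaks that are actually available in your version of DBItest.

DBItest::make_context(
  Kazam(), NULL,
  tweaks = DBItest::tweaks(omit_blob_tests = TRUE)
)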

Now test your package with devtools::test(). If you followed at least the “Getting started” section of the DBI “backend” vignette, all tests should succeed.

By adding the corresponding test function to your tests/testthat/test-DBItest.R file before implementing a section, you get immediate feedback on which functionality of that section still needs to be implemented each time you run devtools::test(). Therefore, proceed by appending the following to tests/testthat/test-DBItest.R to include a test case for the forthcoming section:

DBItest::test_driver()

Again, all tests should succeed when you are done with the “Driver” section. Add the call to the next tester function, implement the following section until all tests succeed, and so forth.

In this scenario, you are usually interested only in the first error the test suite finds. The StopReporter of testthat is most helpful here: activate it by passing reporter = "stop" to devtools::test(). Alternatively, call the relevant DBItest::test_() function directly, as in the example below.
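
For example, the following stops the test run at the first failure and then re-runs only the tests for one section. The call to test_result() is just a placeholder; use the tester function that matches the section you are currently implementing.

devtools::test(reporter = "stop")
DBItest::test_result()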

The tests are documented with the corresponding functions: for instance, ?test_driver shows a coarse description of all tests for the “Driver” test case. Test failures include the name of the failing test; investigating the documentation or the source code of the DBItest package will usually lead you to the cause of the error.

Not all tests can be satisfied: for example, one test checks that logical variables survive a write-read roundtrip to the database, whereas another checks that logical variables are converted to integer in such a case. Tests can be skipped by passing a character vector of regular expressions that match the tests to skip, as in the following example:

DBItest::test_driver(skip = c(
  "data_type"           # Reason 1...
  "constructor.*",      # Reason 2...
  NULL
))
Note: the terminating NULL allows appending new entries at the end by copy-pasting an existing line, without having to worry about the trailing comma.

Some other reasons to skip tests are:

- your database does not support a feature
- you want to postpone or avoid the implementation of a feature
- the test takes too long to run

Testing an existing backend

For an existing backend, simply enabling all tests may be the quickest way to get started. Run the following in your package directory (after installing or updating devtools):

devtools::use_testthat()
devtools::use_test("DBItest")

This creates, among others, a file test-DBItest.R in the tests/testthat directory. Replace its entire contents with the following:

DBItest::make_context(Kazam(), NULL)
DBItest::test_all()

The notes about “Kazam” and skipping tests from the previous section apply here as well. The test_all() function simply calls all test cases.
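
In practice, the call to test_all() is usually combined with the skip patterns collected while implementing the backend. A possible sketch, reusing the example patterns from the previous section (replace them with the tests your backend actually needs to skip):

DBItest::make_context(Kazam(), NULL)
DBItest::test_all(skip = c(
  "data_type",          # Reason 1...
  "constructor.*",      # Reason 2...
  NULL
))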

DBItest/inst/doc/test.Rmd
(R Markdown source of the “Testing DBI backends” vignette shown above; its content is identical to the rendered version and is not repeated here.)

DBItest/tests/testthat.R

library(testthat)
library(DBItest)

test_check("DBItest")

DBItest/tests/testthat/test-tweaks.R

context("tweaks")

test_that("tweaks work as expected", {
  expect_true(names(formals(tweaks))[[1]] == "...")

  expect_warning(tweaks(`_oooops` = 42, `_darn` = -1), "_oooops, _darn")
  expect_warning(tweaks(), NA)
  expect_warning(tweaks(5), "named")
  expect_warning(tweaks(5, `_ooops` = 42), "named")
  expect_warning(tweaks(constructor_name = "constr"), NA)
})

DBItest/tests/testthat/test-consistency.R

context("consistency")

test_that("no unnamed specs", {
  tests <- spec_all[!vapply(spec_all, is.null, logical(1L))]

  vicinity <- NULL
  if (any(names(tests) == "")) {
    vicinity <- sort(unique(unlist(
      lapply(which(names(tests) == ""), "+", -1:1)
    )))
    vicinity <- vicinity[names(tests)[vicinity] != ""]
  }

  expect_null(vicinity)
})

test_that("no duplicate spec names", {
  all_names <- names(spec_all)
  dupe_names <- unique(all_names[duplicated(all_names)])
  expect_equal(dupe_names, rep("", length(dupe_names)))
})

test_that("all specs used", {
  env <- asNamespace("DBItest")
  defined_spec_names <- ls(env, pattern = "^spec_")
  defined_specs <- mget(defined_spec_names, env)
  defined_spec_names <- unlist(sapply(defined_specs, names), use.names = FALSE)

  new_names <- setdiff(defined_spec_names, names(spec_all))
  expect_equal(new_names, rep("", length(new_names)))
})

DBItest/tests/testthat/test-context.R

context("context")

test_that("default context is NULL", {
  expect_null(get_default_context())
})

DBItest/tests/testthat/test-lint.R

context("lint")

test_that("lintr is happy", {
  skip_on_cran()

  expect_false("package:DBI" %in% search())
  require(DBI)
  on.exit(detach(), add = TRUE)
  expect_true("package:DBI" %in% search())

  #lintr::expect_lint_free()

  detach()
  on.exit(NULL, add = FALSE)
  expect_false("package:DBI" %in% search())
})

DBItest/NAMESPACE

# Generated by roxygen2: do not edit by hand

S3method("$",DBItest_tweaks)
S3method(format,DBItest_tweaks)
S3method(print,DBItest_tweaks)
export(get_default_context)
export(make_context)
export(set_default_context)
export(test_all)
export(test_compliance)
export(test_connection)
export(test_driver)
export(test_getting_started)
export(test_meta)
export(test_result)
export(test_some)
export(test_sql)
export(test_stress)
export(test_transaction)
export(tweaks)
import(testthat)
importFrom(DBI,SQL)
importFrom(DBI,dbBegin)
importFrom(DBI,dbBind)
importFrom(DBI,dbBreak)
importFrom(DBI,dbCallProc)
importFrom(DBI,dbClearResult)
importFrom(DBI,dbColumnInfo)
importFrom(DBI,dbCommit)
importFrom(DBI,dbConnect)
importFrom(DBI,dbDataType)
importFrom(DBI,dbDisconnect)
importFrom(DBI,dbExecute)
importFrom(DBI,dbExistsTable)
importFrom(DBI,dbFetch)
importFrom(DBI,dbGetDBIVersion)
importFrom(DBI,dbGetInfo)
importFrom(DBI,dbGetQuery)
importFrom(DBI,dbGetRowCount)
importFrom(DBI,dbGetRowsAffected)
importFrom(DBI,dbGetStatement)
importFrom(DBI,dbHasCompleted)
importFrom(DBI,dbIsValid)
importFrom(DBI,dbListConnections)
importFrom(DBI,dbListFields)
importFrom(DBI,dbListTables)
importFrom(DBI,dbQuoteIdentifier)
importFrom(DBI,dbQuoteString)
importFrom(DBI,dbReadTable)
importFrom(DBI,dbRemoveTable)
importFrom(DBI,dbRollback)
importFrom(DBI,dbSendQuery)
importFrom(DBI,dbSendStatement)
importFrom(DBI,dbSetDataMappings)
importFrom(DBI,dbWithTransaction)
importFrom(DBI,dbWriteTable)
importFrom(desc,desc_get_deps)
importFrom(methods,extends)
importFrom(methods,findMethod)
importFrom(methods,getClass)
importFrom(methods,getClasses)
importFrom(methods,hasMethod)
importFrom(methods,is)
importFrom(stats,setNames)
importFrom(withr,with_output_sink)
importFrom(withr,with_temp_libpaths)

DBItest/NEWS.md

# DBItest 1.5-2 (2018-01-26)

- Fix test that fails with "noLD".
- Fix NOTEs on R-devel.

# DBItest 1.5-1 (2017-12-10)

- Remove `"cannot_forget_disconnect"` test that fails on R-devel (#150).

# DBItest 1.5 (2017-06-18)

Finalize specification. Most tests now come with a corresponding prose, only those where the behavior is not finally decided don't have a prose version yet (#88).

New tests
---------

- Test behavior of methods in presence of placeholders (#120).
- Test column name mismatch behavior for appending tables (#93).
- Test that `dbBind()` against factor works but raises a warning (#91).
- Test roundtrip of alternating empty and non-empty strings (#42).
- Test multiple columns of different types in one statement or table (#35).
- Test `field.types` argument to `dbWriteTable()` (#12).
- Added tests for invalid or closed connection argument to all methods that expect a connection as first argument (#117).
- Enabled test that tests a missing `dbDisconnect()`.
- Add test for unambiguous escaping of identifiers (rstats-db/RSQLite#123).
- Reenable tests for visibility (#89).
- Fix and specify 64-bit roundtrip test.
    - 64-bit integers only need to be coercible to `numeric` and `character` (#74).
- Added roundtrip test for time values (#14).
- Added tweaks for handling date, time, timestamp, ... (#53, #76).
- Test that `dbFetch()` on update-only query returns warning (#66).

Adapted tests
-------------

- `NULL` is a valid value for the `row.names` argument, same as `FALSE`.
- A column named `row_names` receives no special handling (#54).
- A warning (not an error anymore) is expected when calling `dbDisconnect()` on a closed or invalid connection.
- `row.names = FALSE` is now the default for methods that read or write tables.
- Add `NA` to beginning and end of columns in table roundtrip tests (#24).
- Stricter tests for confusion of named and unnamed SQL parameters and placeholders (#107).
- Also check names of all returned data frames.
- The return value for all calls to `dbGetQuery()`, `dbFetch()`, and `dbReadTable()` is now checked for consistency (all columns have the same length, length matches number of rows) (#126).
- Removed stress tests that start a new session.
- Allow `hms` (or other subclasses of `difftime`) to be returned as time class (#135, @jimhester).
- Test that dates are of type `numeric` (#99, @jimhester).
- Replace `POSIXlt` by `POSIXct` (#100, @jimhester).
- Use `"PST8PDT"` instead of `"PST"` as time zone (#110, @thrasibule).
- Added tests for support of `blob` objects (input and output), but backends are not required to return `blob` objects (#98).
- The `logical_return`, `date_typed` and `timestamp_typed` tweaks are respected by the bind tests.
- Fixed tests involving time comparison; now uses UTC timezone and compares against a `difftime`.
- Tests for roundtrip of character values now include tabs, in addition to many other special characters (#85).
- Make sure at least one table exists in the `dbListTables()` test.
- Fix roundtrip tests for raw columns: now expecting `NULL` and not `NA` entries for SQL NULL values.
- Fix `expect_equal_df()` for list columns.
- Testing that a warning is given if the user forgets to call `dbDisconnect()` or `dbClearResult()` (#103).
- Numeric roundtrip accepts conversion of `NaN` to `NA` (#79).

Internal
--------

- Fix R CMD check errors.
- Internal consistency checks (#114).
- Skip patterns that don't match any of the tests now raise a warning (#84).
- New `test_some()` to test individual tests (#136).
- Use desc instead of devtools (#40).
- All unexpected warnings are now reported as test failures (#113).
- `DBItest_tweaks` class gains a `$` method; accessing an undefined tweak now raises an error.
- The arguments of the `tweaks()` function now have default values that further describe their intended usage.
- New `with_closed_connection()`, `with_invalid_connection()`, `with_result()` and `with_remove_test_table()` helpers, and `expect_visible()`, `expect_invisible_true()`, and `expect_equal_df()` expectations for more concise tests.

# DBItest 1.4 (2016-12-02)

## DBI specification

- Use markdown in documentation.
- Description of parametrized queries and statements (#88).
- New hidden `DBIspec-wip` page for work-in-progress documentation.
- Get rid of "Format" and "Usage" sections, and aliases, in the specs.

## Tests

- Not testing for presence of `max.connections` element in `dbGetInfo(Driver)` (rstats-db/DBI#56).
- Test multi-row binding for queries and statements (#96).
- New `ellipsis` check that verifies that all implemented DBI methods contain `...` in their formals. This excludes `show()` and all methods defined in this or other packages.
- Refactored `bind_` tests to use the new `parameter_pattern` tweak (#95).
- Rough draft of transaction tests (#36).
- New `fetch_zero_rows` test, split from `fetch_premature_close`.
- The "compliance" test tests that the backend package exports exactly one subclass of each DBI virtual class.
- Document and enhance test for `dbDataType("DBIDriver", "ANY")` (#88).
- Minor corrections for "bind" tests.

## Internal

- Isolate stress tests from main test suite (#92).
- Refactor test specification in smaller modules, isolated from actual test execution (#81). This breaks the documentation of the tests, which will be substituted by a DBI specification in prose.
- Align description of binding with code.
- Refactor tests for `dbBind()`: the test is run by the `BindTester` class, and behavior is specified by members and by instances of the new `BindTesterExtra` class.
- The `skip` argument to the `test_()` functions is again evaluated with `perl = TRUE` to support negative lookaheads (#33).
- Use `dbSendStatement()` and `dbExecute()` where appropriate.
- Avoid empty subsections in Rd documentation to satisfy `R CMD check` (#81).

# DBItest 1.3 (2016-07-07)

Bug fixes
---------

- Fix `read_table` test when the backend actually returns the data in a different order.

New tests
---------

- Test `dbDataType()` on connections (#69, #75, @imanuelcostigan).
- Check returned strings for UTF-8 encoding (#72).
- Repeated `dbBind()` + `dbFetch()` on the same result set (#51).

Features
--------

- `tweaks()` gains an `...` as first argument to support future/deprecated tweaks (with a warning), and also to avoid unnamed arguments (#83).
- `testthat` now shows a more accurate location for the source of errors, failures, and skips (#78).
- Aggregate skipped tests, only one `skip()` call per test function.
- Indicate that some tests are optional in documentation (#15).

Internal
--------

- New `constructor_relax_args` tweak, currently not queried.
- The `ctx` argument is now explicit in the test functions.
- Change underscores to dashes in file names.
- Remove `testthat` compatibility hack.
- New `all_have_utf8_or_ascii_encoding()` which vectorizes `has_utf8_or_ascii_encoding()`.
- Test on AppVeyor (#73).
- Work around regression in R 3.3.0 (fix scheduled for R 3.3.1) which affected stress tests.

# DBItest 1.2 (2016-05-21)

- Infrastructure
    - Support names for contexts (@hoesler, #67).
    - The `skip` argument to the test functions is now treated as a Perl regular expression to allow negative lookahead. Use `skip = "(?!test_regex).*"` to choose a single test to run (#33).
    - Added encoding arguments to non-ASCII string constants (#60, @hoesler).
- Improve tests
    - `simultaneous_connections` test always closes all connections on exit (@hoesler, #68).
    - More generic compliance check (@hoesler, #61).
    - Update documentation to reflect test condition (@imanuelcostigan, #70).
- `testthat` dependency
    - Import all of `testthat` to avoid `R CMD check` warnings.
    - Compatibility with dev version of `testthat` (#62).
- Improve Travis builds
    - Use container-based builds on Travis.
    - Install `RPostgres` and `RMySQL` from `rstats-db`.
    - Install `DBI` and `testthat` from GitHub.

Version 1.1 (2016-02-12)
===

- New feature: tweaks
    - New argument `tweaks` to `make_context()` (#49).
    - New `tweaks()`, essentially constructs a named list of tweaks but with predefined and documented argument names.
    - `constructor_name`, respected by the `constructor.*` tests.
    - `strict_identifier`, if `TRUE` all identifiers must be syntactic names even if quoted. The quoting test is now split, and a part is ignored conditional to this tweak. The `roundtrip_quotes` tests also respect this tweak.
    - `omit_blob_tests` for DBMS that don't have a BLOB data type.
    - `current_needs_parens` -- some SQL dialects (e.g., BigQuery) require parentheses for the functions `current_date`, `current_time` and `current_timestamp`.
    - `union`, for specifying a nonstandard way of combining queries. All union queries now name each column in each subquery (required for `bigrquery`).
- New tests
    - `dbGetInfo(Result)` (rstats-db/DBI#55).
    - `dbListFields()` (#26).
    - New `package_name` test in `test_getting_started()`.
- Improved tests
    - Stress test now installs package in temporary library (before loading `DBI`) using `R CMD INSTALL` before loading DBI (rstats-db/RSQLite#128, #48).
    - Row count is now tested for equality but not identity, so that backends can return a numeric value > 2^31 at their discretion.
    - Call `dbRemoveTable()` instead of issuing `DROP` requests, the latter might be unsupported.
    - Use subqueries in queries that use `WHERE`.
    - Test that `dbClearResult()` on a closed result set raises a warning.
    - Expect a warning instead of an error for double disconnect (#50).
    - Move connection test that requires `dbFetch()` to `test_result()`.
    - Split `can_connect_and_disconnect` test.
    - Expect `DBI` to be in `Imports`, not in `Depends`.
- Removed tests
    - Remove test for `dbGetException()` (rstats-db/DBI#51).
- Bug fixes
    - Fix broken tests for quoting.
- Self-testing
    - Test `RPostgres`, `RMySQL`, `RSQLite` and `RKazam` as part of the Travis-CI tests (#52).
    - Travis CI now installs rstats-db/DBI, updated namespace imports (`dbiCheckCompliance()`, `dbListResults()`).
    - Use fork of `testthat`.
- Utilities
    - Return test results as named array of logical. Requires hadley/testthat#360, gracefully degrades with the CRAN version.
- Internal
    - Refactored the `get_info_()` tests to use a vector of names.
    - Use versioned dependency for DBI
    - Use unqualified calls to `dbBind()` again

Version 1.0 (2015-12-17)
===

- CRAN release
    - Eliminate errors on win-builder
    - Satisfy R CMD check
    - Use LGPL-2 license
    - Add RStudio as copyright holder
    - Move `devtools` package from "Imports" to "Suggests"

Version 0.3 (2015-11-15)
===

- Feature-complete, ready for review
- Tests from the proposal
    - Add missing methods to compliance check
    - Add simple read-only test (#27)
    - Add stress tests for repeated load/unload (with and without connecting) in new R session (#2),
    - Migrate all tests from existing backends (#28)
    - Refactor `data_` tests to use a worker function `test_select()`
    - Test tables with `NA` values above and below the non-`NA` value in `data_` tests
    - Test return values and error conditions for `dbBind()` and `dbClearResult()` (#31)
    - Test vectorization of `dbQuoteString()` and `dbQuoteIdentifier()` (#18)
    - Test that dates have `integer` as underlying data type (#9)
    - Roundtrip tests sort output table to be sure (#32)
    - Test `NA` to `NULL` conversion in `dbQuoteString()`, and false friends (#23)
    - Enhance test for `dbQuoteIdentifier()` (#30)
- Style
    - Avoid using `data.frame()` for date and time columns (#10)
    - Use `expect_identical()` instead of `expect_equal()` in many places (#13)
    - Catch all errors in `on.exit()` handlers via `expect_error()` (#20).
- Combine "meta" tests into new `test_meta()` (#37) - Documentation - New "test" vignette (#16) - Add package documentation (#38) - Same as 0.2-5 Version 0.2 (2015-11-11) === - Tests from the proposal - SQL - Metadata - DBI compliance (not testing read-only yet) - Migrate most of the tests from RMySQL - Test improvements - Test BLOB data type (#17) - Check actual availability of type returned by `dbDataType()` (#19) - Testing infrastructure - Disambiguate test names (#21) - Use regex matching for deciding skipped tests, skip regex must match the entire test name - Documentation - Document all tests in each test function using the new inline documentation feature of roxygen2 - Improve documentation for `test_all()`: Tests are listed in new "Tests" section - Add brief instructions to README - Move repository to rstats-db namespace - Same as 0.1-6 Version 0.1 (2015-10-11) === - First GitHub release - Builds successfully on Travis - Testing infrastructure - Test context - Skipped tests call `skip()` - Function `test_all()` that runs all tests - Tests from the proposal - Getting started - Driver - Connection - Results - Code formatting is checked with lintr - Same as 0.0-5 DBItest/R/0000755000176200001440000000000013232372773011762 5ustar liggesusersDBItest/R/spec-result.R0000644000176200001440000000102713232372770014350 0ustar liggesusers#' @template dbispec #' @format NULL spec_result <- c( spec_result_send_query, spec_result_fetch, spec_result_clear_result, spec_result_get_query, spec_result_send_statement, spec_result_execute, spec_result_create_table_with_data_type, spec_result_roundtrip ) # Helpers ----------------------------------------------------------------- union <- function(..., .order_by = NULL, .ctx) { query <- .ctx$tweaks$union(c(...)) if (!missing(.order_by)) { query <- paste(query, "ORDER BY", .order_by) } query } DBItest/R/spec-result-create-table-with-data-type.R0000644000176200001440000000220713232372770021536 0ustar liggesusers#' spec_result_create_table_with_data_type #' @usage NULL #' @format NULL #' @keywords NULL spec_result_create_table_with_data_type <- list( #' @section Specification: #' All data types returned by `dbDataType()` are usable in an SQL statement #' of the form data_type_create_table = function(ctx) { with_connection({ check_connection_data_type <- function(value) { with_remove_test_table({ #' `"CREATE TABLE test (a ...)"`. 
query <- paste0("CREATE TABLE test (a ", dbDataType(con, value), ")") eval(bquote(dbExecute(con, .(query)))) }) } expect_conn_has_data_type <- function(value) { eval(bquote( expect_error(check_connection_data_type(.(value)), NA))) } expect_conn_has_data_type(logical(1)) expect_conn_has_data_type(integer(1)) expect_conn_has_data_type(numeric(1)) expect_conn_has_data_type(character(1)) expect_conn_has_data_type(Sys.Date()) expect_conn_has_data_type(Sys.time()) if (!isTRUE(ctx$tweaks$omit_blob_tests)) { expect_conn_has_data_type(list(as.raw(1:10))) } }) }, NULL ) DBItest/R/test-connection.R0000644000176200001440000000070413156254271015217 0ustar liggesusers#' @name test_all #' @aliases NULL #' @section Tests: #' [test_connection()]: #' Test the "Connection" class NULL #' Test the "Connection" class #' #' @inheritParams test_all #' @include test-driver.R #' @family tests #' @importFrom withr with_temp_libpaths #' @importFrom methods is #' @export test_connection <- function(skip = NULL, ctx = get_default_context()) { test_suite <- "Connection" run_tests(ctx, spec_connection, skip, test_suite) } DBItest/R/spec-stress.R0000644000176200001440000000012213156254271014350 0ustar liggesusers#' @template dbispec #' @format NULL spec_stress <- c( spec_stress_connection ) DBItest/R/utf8.R0000644000176200001440000000101413232372770012764 0ustar liggesuserstext_cyrillic <- iconv(list(as.raw( c(0xd0, 0x9a, 0xd0, 0xb8, 0xd1, 0x80, 0xd0, 0xb8, 0xd0, 0xbb, 0xd0, 0xbb))), from = "UTF-8", to = "UTF-8") text_latin <- iconv(list(as.raw(c(0x4d, 0xc3, 0xbc, 0x6c, 0x6c, 0x65, 0x72))), from = "UTF-8", to = "UTF-8") text_chinese <- iconv(list(as.raw(c(0xe6, 0x88, 0x91, 0xe6, 0x98, 0xaf, 0xe8, 0xb0, 0x81))), from = "UTF-8", to = "UTF-8") text_ascii <- iconv("ASCII", to = "ASCII") texts <- c(text_cyrillic, text_latin, text_chinese, text_ascii) DBItest/R/utils.R0000644000176200001440000001046013232372770013243 0ustar liggesusers`%||%` <- function(a, b) if (is.null(a)) b else a get_pkg_path <- function(ctx) { pkg_name <- package_name(ctx) expect_is(pkg_name, "character") pkg_path <- find.package(pkg_name) pkg_path } utils::globalVariables("con") utils::globalVariables("con2") # Expects a variable "ctx" in the environment env, # evaluates the code inside local() after defining a variable "con" # (can be overridden by specifying con argument) # that points to a newly opened connection. Disconnects on exit. with_connection <- function(code, con = "con", env = parent.frame()) { code_sub <- substitute(code) con <- as.name(con) eval(bquote({ .(con) <- connect(ctx) on.exit(try_silent(dbDisconnect(.(con))), add = TRUE) local(.(code_sub)) } ), envir = env) } # Expects a variable "ctx" in the environment env, # evaluates the code inside local() after defining a variable "con" # (can be overridden by specifying con argument) # that points to a newly opened and then closed connection. Disconnects on exit. with_closed_connection <- function(code, con = "con", env = parent.frame()) { code_sub <- substitute(code) con <- as.name(con) eval(bquote({ .(con) <- connect(ctx) dbDisconnect(.(con)) local(.(code_sub)) } ), envir = env) } # Expects a variable "ctx" in the environment env, # evaluates the code inside local() after defining a variable "con" # (can be overridden by specifying con argument) # that points to a newly opened but invalidated connection. Disconnects on exit. 
with_invalid_connection <- function(code, con = "con", env = parent.frame()) { code_sub <- substitute(code) stopifnot(con != "..con") con <- as.name(con) eval(bquote({ ..con <- connect(ctx) on.exit(dbDisconnect(..con), add = TRUE) .(con) <- unserialize(serialize(..con, NULL)) local(.(code_sub)) } ), envir = env) } # Evaluates the code inside local() after defining a variable "res" # (can be overridden by specifying con argument) # that points to a result set created by query. Clears on exit. with_result <- function(query, code, res = "res", env = parent.frame()) { code_sub <- substitute(code) query_sub <- substitute(query) res <- as.name(res) eval(bquote({ .(res) <- .(query_sub) on.exit(dbClearResult(.(res)), add = TRUE) local(.(code_sub)) } ), envir = env) } # Evaluates the code inside local() after defining a variable "con" # (can be overridden by specifying con argument) # that points to a connection. Removes the table specified by name on exit, # if it exists. with_remove_test_table <- function(code, name = "test", con = "con", env = parent.frame()) { code_sub <- substitute(code) con <- as.name(con) eval(bquote({ on.exit( try_silent( dbExecute(.(con), paste0("DROP TABLE ", dbQuoteIdentifier(.(con), .(name)))) ), add = TRUE ) local(.(code_sub)) } ), envir = env) } # Evaluates the code inside local() after defining a variable "con" # (can be overridden by specifying con argument) # that points to a result set created by query. Clears on exit. with_rollback_on_error <- function(code, con = "con", env = parent.frame()) { code_sub <- substitute(code) con <- as.name(con) eval(bquote({ on.exit( try_silent( dbRollback(.(con)) ), add = TRUE ) local(.(code_sub)) on.exit(NULL, add = FALSE) } ), envir = env) } get_iris <- function(ctx) { datasets_iris <- datasets::iris if (isTRUE(ctx$tweaks$strict_identifier)) { names(datasets_iris) <- gsub(".", "_", names(datasets_iris), fixed = TRUE) } datasets_iris } unrowname <- function(x) { rownames(x) <- NULL x } random_table_name <- function(n = 10) { paste0(sample(letters, n, replace = TRUE), collapse = "") } compact <- function(x) { x[!vapply(x, is.null, logical(1L))] } expand_char <- function(...) { df <- expand.grid(..., stringsAsFactors = FALSE) do.call(paste0, df) } try_silent <- function(code) { tryCatch( code, error = function(e) NULL) } check_df <- function(df) { expect_is(df, "data.frame") if (length(df) >= 1L) { lengths <- vapply(df, length, integer(1L), USE.NAMES = FALSE) expect_equal(diff(lengths), rep(0L, length(lengths) - 1L)) expect_equal(nrow(df), lengths[[1]]) } df_names <- names(df) expect_true(all(df_names != "")) expect_false(anyNA(df_names)) df } DBItest/R/spec-meta-get-rows-affected.R0000644000176200001440000000416413232372770017271 0ustar liggesusers#' spec_meta_get_rows_affected #' @usage NULL #' @format NULL #' @keywords NULL spec_meta_get_rows_affected <- list( get_rows_affected_formals = function(ctx) { # expect_equal(names(formals(dbGetRowsAffected)), c("res", "...")) }, #' @return #' `dbGetRowsAffected()` returns a scalar number (integer or numeric), #' the number of rows affected by a data manipulation statement rows_affected_statement = function(ctx) { with_connection({ with_remove_test_table({ dbWriteTable(con, "test", data.frame(a = 1:10)) query <- paste0( "DELETE FROM ", dbQuoteIdentifier(con, "test"), " ", "WHERE a < 6" ) with_result( #' issued with [dbSendStatement()]. 
dbSendStatement(con, query), { rc <- dbGetRowsAffected(res) #' The value is available directly after the call expect_equal(rc, 5L) expect_warning(check_df(dbFetch(res))) rc <- dbGetRowsAffected(res) #' and does not change after calling [dbFetch()]. expect_equal(rc, 5L) } ) }) }) }, rows_affected_query = function(ctx) { with_connection({ query <- "SELECT 1 as a" with_result( #' For queries issued with [dbSendQuery()], dbSendQuery(con, query), { rc <- dbGetRowsAffected(res) #' zero is returned before expect_equal(rc, 0L) check_df(dbFetch(res)) rc <- dbGetRowsAffected(res) #' and after the call to `dbFetch()`. expect_equal(rc, 0L) } ) }) }, get_rows_affected_error = function(ctx) { with_connection({ query <- paste0( "CREATE TABLE ", dbQuoteIdentifier(con, "test"), " (a integer)" ) with_remove_test_table({ res <- dbSendStatement(con, query) dbClearResult(res) #' Attempting to get the rows affected for a result set cleared with #' [dbClearResult()] gives an error. expect_error(dbGetRowsAffected(res)) }) }) }, NULL ) DBItest/R/test-transaction.R0000644000176200001440000000060313156254271015403 0ustar liggesusers#' @name test_all #' @aliases NULL #' @section Tests: #' [test_transaction()]: #' Test transaction functions NULL #' Test transaction functions #' #' @inheritParams test_all #' @include test-meta.R #' @family tests #' @export test_transaction <- function(skip = NULL, ctx = get_default_context()) { test_suite <- "Transactions" run_tests(ctx, spec_transaction, skip, test_suite) } DBItest/R/test-getting-started.R0000644000176200001440000000102513156254271016162 0ustar liggesusers#' @name test_all #' @aliases NULL #' @section Tests: #' [test_getting_started()]: #' Getting started with testing NULL #' Getting started with testing #' #' Tests very basic features of a DBI driver package, to support testing #' and test-first development right from the start. #' #' @inheritParams test_all #' @include test-all.R #' @family tests #' @export test_getting_started <- function(skip = NULL, ctx = get_default_context()) { test_suite <- "Getting started" run_tests(ctx, spec_getting_started, skip, test_suite) } DBItest/R/spec-connection-get-info.R0000644000176200001440000000121313156254271016674 0ustar liggesusers#' @template dbispec-sub-wip #' @format NULL #' @section Connection: #' \subsection{`dbGetInfo("DBIConnection")` (deprecated)}{ spec_connection_get_info <- list( #' Return value of dbGetInfo has necessary elements get_info_connection = function(ctx) { with_connection({ info <- dbGetInfo(con) expect_is(info, "list") info_names <- names(info) necessary_names <- c("db.version", "dbname", "username", "host", "port") for (name in necessary_names) { eval(bquote( expect_true(.(name) %in% info_names))) } expect_false("password" %in% info_names) }) }, #' } NULL ) DBItest/R/spec-sql-quote-identifier.R0000644000176200001440000001355013232372770017110 0ustar liggesusers#' spec_sql_quote_identifier #' @usage NULL #' @format NULL #' @keywords NULL spec_sql_quote_identifier <- list( quote_identifier_formals = function(ctx) { # expect_equal(names(formals(dbQuoteIdentifier)), c("conn", "x", "...")) }, #' @return quote_identifier_return = function(ctx) { with_connection({ #' `dbQuoteIdentifier()` returns an object that can be coerced to [character], simple_out <- dbQuoteIdentifier(con, "simple") expect_error(as.character(simple_out), NA) expect_is(as.character(simple_out), "character") }) }, quote_identifier_vectorized = function(ctx) { with_connection({ #' of the same length as the input. 
simple <- "simple" simple_out <- dbQuoteIdentifier(con, simple) expect_equal(length(simple_out), 1L) letters_out <- dbQuoteIdentifier(con, letters) expect_equal(length(letters_out), length(letters)) #' For an empty character vector this function returns a length-0 object. empty <- character() empty_out <- dbQuoteIdentifier(con, empty) expect_equal(length(empty_out), 0L) #' An error is raised if the input contains `NA`, expect_error(dbQuoteIdentifier(con, NA)) expect_error(dbQuoteIdentifier(con, NA_character_)) expect_error(dbQuoteIdentifier(con, c("a", NA_character_))) #' but not for an empty string. expect_error(dbQuoteIdentifier(con, ""), NA) #' #' When passing the returned object again to `dbQuoteIdentifier()` #' as `x` #' argument, it is returned unchanged. expect_identical(dbQuoteIdentifier(con, simple_out), simple_out) expect_identical(dbQuoteIdentifier(con, letters_out), letters_out) expect_identical(dbQuoteIdentifier(con, empty_out), empty_out) #' Passing objects of class [SQL] should also return them unchanged. expect_identical(dbQuoteIdentifier(con, SQL(simple)), SQL(simple)) expect_identical(dbQuoteIdentifier(con, SQL(letters)), SQL(letters)) expect_identical(dbQuoteIdentifier(con, SQL(empty)), SQL(empty)) #' (For backends it may be most convenient to return [SQL] objects #' to achieve this behavior, but this is not required.) }) }, #' @section Specification: #' Calling [dbGetQuery()] for a query of the format `SELECT 1 AS ...` #' returns a data frame with the identifier, unquoted, as column name. quote_identifier = function(ctx) { with_connection({ #' Quoted identifiers can be used as table and column names in SQL queries, simple <- dbQuoteIdentifier(con, "simple") #' in particular in queries like `SELECT 1 AS ...` query <- paste0("SELECT 1 AS", simple) rows <- check_df(dbGetQuery(con, query)) expect_identical(names(rows), "simple") expect_identical(unlist(unname(rows)), 1L) #' and `SELECT * FROM (SELECT 1) ...`. query <- paste0("SELECT * FROM (SELECT 1) ", simple) rows <- check_df(dbGetQuery(con, query)) expect_identical(unlist(unname(rows)), 1L) }) }, #' The method must use a quoting mechanism that is unambiguously different #' from the quoting mechanism used for strings, so that a query like quote_identifier_string = function(ctx) { with_connection({ #' `SELECT ... FROM (SELECT 1 AS ...)` query <- paste0( "SELECT ", dbQuoteIdentifier(con, "b"), " FROM (", "SELECT 1 AS ", dbQuoteIdentifier(con, "a"), ")" ) #' throws an error if the column names do not match. 
eval(bquote(expect_error(dbGetQuery(con, .(query))))) }) }, quote_identifier_special = function(ctx) { with_connection({ #' #' The method can quote column names that #' contain special characters such as a space, with_space_in <- "with space" with_space <- dbQuoteIdentifier(con, with_space_in) #' a dot, with_dot_in <- "with.dot" with_dot <- dbQuoteIdentifier(con, with_dot_in) #' a comma, with_comma_in <- "with,comma" with_comma <- dbQuoteIdentifier(con, with_comma_in) #' or quotes used to mark strings with_quote_in <- as.character(dbQuoteString(con, "a")) with_quote <- dbQuoteIdentifier(con, with_quote_in) #' or identifiers, empty_in <- "" empty <- dbQuoteIdentifier(con, empty_in) quoted_empty <- dbQuoteIdentifier(con, as.character(empty)) quoted_with_space <- dbQuoteIdentifier(con, as.character(with_space)) quoted_with_dot <- dbQuoteIdentifier(con, as.character(with_dot)) quoted_with_comma <- dbQuoteIdentifier(con, as.character(with_comma)) quoted_with_quote <- dbQuoteIdentifier(con, as.character(with_quote)) #' if the database supports this. if (isTRUE(ctx$tweaks$strict_identifier)) { skip("tweak: strict_identifier") } #' In any case, checking the validity of the identifier #' should be performed only when executing a query, #' and not by `dbQuoteIdentifier()`. query <- paste0("SELECT ", "2 as", with_space, ",", "3 as", with_dot, ",", "4 as", with_comma, ",", "5 as", with_quote, ",", "6 as", quoted_empty, ",", "7 as", quoted_with_space, ",", "8 as", quoted_with_dot, ",", "9 as", quoted_with_comma, ",", "10 as", quoted_with_quote) rows <- check_df(dbGetQuery(con, query)) expect_identical(names(rows), c(with_space_in, with_dot_in, with_comma_in, with_quote_in, as.character(empty), as.character(with_space), as.character(with_dot), as.character(with_comma), as.character(with_quote))) expect_identical(unlist(unname(rows)), 2:10) }) }, NULL ) DBItest/R/spec-transaction-with-transaction.R0000644000176200001440000000721613232372770020661 0ustar liggesusers#' spec_transaction_with_transaction #' @usage NULL #' @format NULL #' @keywords NULL spec_transaction_with_transaction <- list( with_transaction_formals = function(ctx) { # expect_equal(names(formals(dbWithTransaction)), c("conn", "code", "...")) }, #' @return #' `dbWithTransaction()` returns the value of the executed code. with_transaction_return_value = function(ctx) { name <- random_table_name() with_connection({ expect_identical(dbWithTransaction(con, name), name) }) }, #' Failure to initiate the transaction #' (e.g., if the connection is closed with_transaction_error_closed = function(ctx) { with_closed_connection({ expect_error(dbWithTransaction(con, NULL)) }) }, #' or invalid with_transaction_error_invalid = function(ctx) { with_invalid_connection({ expect_error(dbWithTransaction(con, NULL)) }) }, #' of if [dbBegin()] has been called already) with_transaction_error_nested = function(ctx) { with_connection({ dbBegin(con) #' gives an error. expect_error(dbWithTransaction(con, NULL)) dbRollback(con) }) }, #' @section Specification: #' `dbWithTransaction()` initiates a transaction with `dbBegin()`, executes #' the code given in the `code` argument, and commits the transaction with #' [dbCommit()]. 
with_transaction_success = function(ctx) { with_connection({ with_remove_test_table({ expect_false(dbExistsTable(con, "test")) dbWithTransaction( con, { dbExecute(con, paste0("CREATE TABLE test (a ", dbDataType(con, 0L), ")")) dbExecute(con, paste0("INSERT INTO test (a) VALUES (1)")) expect_equal(check_df(dbReadTable(con, "test")), data.frame(a = 1)) } ) expect_equal(check_df(dbReadTable(con, "test")), data.frame(a = 1)) }) }) }, #' If the code raises an error, the transaction is instead aborted with #' [dbRollback()], and the error is propagated. with_transaction_failure = function(ctx) { name <- random_table_name() with_connection({ with_remove_test_table({ expect_false(dbExistsTable(con, "test")) expect_error( dbWithTransaction( con, { dbExecute(con, paste0("CREATE TABLE test (a ", dbDataType(con, 0L), ")")) dbExecute(con, paste0("INSERT INTO test (a) VALUES (1)")) stop(name) } ), name, fixed = TRUE ) expect_false(dbExistsTable(con, "test")) }) }) }, #' If the code calls `dbBreak()`, execution of the code stops and the #' transaction is silently aborted. with_transaction_break = function(ctx) { name <- random_table_name() with_connection({ with_remove_test_table({ expect_false(dbExistsTable(con, "test")) expect_error( dbWithTransaction( con, { dbExecute(con, paste0("CREATE TABLE test (a ", dbDataType(con, 0L), ")")) dbExecute(con, paste0("INSERT INTO test (a) VALUES (1)")) dbBreak() } ), NA ) expect_false(dbExistsTable(con, "test")) }) }) }, #' All side effects caused by the code with_transaction_side_effects = function(ctx) { with_connection({ expect_false(exists("a", inherits = FALSE)) #' (such as the creation of new variables) dbWithTransaction(con, a <- 42) #' propagate to the calling environment. expect_identical(get0("a", inherits = FALSE), 42) }) }, NULL ) DBItest/R/spec-sql-write-table.R0000644000176200001440000005667013232372770016064 0ustar liggesusers#' spec_sql_write_table #' @usage NULL #' @format NULL #' @keywords NULL spec_sql_write_table <- list( write_table_formals = function(ctx) { # expect_equal(names(formals(dbWriteTable)), c("conn", "name", "value", "...")) }, #' @return #' `dbWriteTable()` returns `TRUE`, invisibly. write_table_return = function(ctx) { with_connection({ with_remove_test_table({ expect_invisible_true(dbWriteTable(con, "test", data.frame(a = 1L))) }) }) }, #' If the table exists, and both `append` and `overwrite` arguments are unset, write_table_overwrite = function(ctx) { with_connection({ with_remove_test_table({ test_in <- data.frame(a = 1L) dbWriteTable(con, "test", test_in) expect_error(dbWriteTable(con, "test", data.frame(a = 2L))) test_out <- check_df(dbReadTable(con, "test")) expect_equal_df(test_out, test_in) }) }) }, #' or `append = TRUE` and the data frame with the new data has different #' column names, #' an error is raised; the remote table remains unchanged. write_table_append_incompatible = function(ctx) { with_connection({ with_remove_test_table({ test_in <- data.frame(a = 1L) dbWriteTable(con, "test", test_in) expect_error(dbWriteTable(con, "test", data.frame(b = 2L), append = TRUE)) test_out <- check_df(dbReadTable(con, "test")) expect_equal_df(test_out, test_in) }) }) }, #' #' An error is raised when calling this method for a closed write_table_closed_connection = function(ctx) { with_closed_connection({ expect_error(dbWriteTable(con, "test", data.frame(a = 1))) }) }, #' or invalid connection. 
write_table_invalid_connection = function(ctx) { with_invalid_connection({ expect_error(dbListTables(con, "test", data.frame(a = 1))) }) }, #' An error is also raised write_table_error = function(ctx) { with_connection({ test_in <- data.frame(a = 1L) with_remove_test_table({ #' if `name` cannot be processed with [dbQuoteIdentifier()] expect_error(dbWriteTable(con, NA, test_in)) #' or if this results in a non-scalar. expect_error(dbWriteTable(con, c("test", "test"), test_in)) #' Invalid values for the additional arguments `row.names`, #' `overwrite`, `append`, `field.types`, and `temporary` #' (non-scalars, expect_error(dbWriteTable(con, "test", test_in, row.names = letters)) expect_error(dbWriteTable(con, "test", test_in, overwrite = c(TRUE, FALSE))) expect_error(dbWriteTable(con, "test", test_in, append = c(TRUE, FALSE))) expect_error(dbWriteTable(con, "test", test_in, temporary = c(TRUE, FALSE))) #' unsupported data types, expect_error(dbWriteTable(con, "test", test_in, row.names = list(1L))) expect_error(dbWriteTable(con, "test", test_in, overwrite = 1L)) expect_error(dbWriteTable(con, "test", test_in, append = 1L)) expect_error(dbWriteTable(con, "test", test_in, field.types = 1L)) expect_error(dbWriteTable(con, "test", test_in, temporary = 1L)) #' `NA`, expect_error(dbWriteTable(con, "test", test_in, overwrite = NA)) expect_error(dbWriteTable(con, "test", test_in, append = NA)) expect_error(dbWriteTable(con, "test", test_in, field.types = NA)) expect_error(dbWriteTable(con, "test", test_in, temporary = NA)) #' incompatible values, expect_error(dbWriteTable(con, "test", test_in, field.types = letters)) expect_error(dbWriteTable(con, "test", test_in, field.types = c(b = "INTEGER"))) expect_error(dbWriteTable(con, "test", test_in, overwrite = TRUE, append = TRUE)) expect_error(dbWriteTable(con, "test", test_in, append = TRUE, field.types = c(a = "INTEGER"))) #' duplicate expect_error(dbWriteTable(con, "test", test_in, field.types = c(a = "INTEGER", a = "INTEGER"))) #' or missing names, expect_error(dbWriteTable(con, "test", test_in, field.types = c("INTEGER"))) }) with_remove_test_table({ dbWriteTable(con, "test", test_in) #' incompatible columns) expect_error(dbWriteTable(con, "test", data.frame(b = 2L, c = 3L), append = TRUE)) }) #' also raise an error. }) }, #' @section Additional arguments: #' The following arguments are not part of the `dbWriteTable()` generic #' (to improve compatibility across backends) #' but are part of the DBI specification: #' - `row.names` (default: `NA`) #' - `overwrite` (default: `FALSE`) #' - `append` (default: `FALSE`) #' - `field.types` (default: `NULL`) #' - `temporary` (default: `FALSE`) #' #' They must be provided as named arguments. #' See the "Specification" and "Value" sections for details on their usage. 
#' @section Specification: #' The `name` argument is processed as follows, write_table_name = function(ctx) { with_connection({ #' to support databases that allow non-syntactic names for their objects: if (isTRUE(ctx$tweaks$strict_identifier)) { table_names <- "a" } else { table_names <- c("a", "with spaces", "with,comma") } for (table_name in table_names) { test_in <- data.frame(a = 1) with_remove_test_table(name = dbQuoteIdentifier(con, table_name), { #' - If an unquoted table name as string: `dbWriteTable()` will do the quoting, dbWriteTable(con, table_name, test_in) test_out <- check_df(dbReadTable(con, dbQuoteIdentifier(con, table_name))) expect_equal_df(test_out, test_in) #' perhaps by calling `dbQuoteIdentifier(conn, x = name)` }) with_remove_test_table(name = dbQuoteIdentifier(con, table_name), { #' - If the result of a call to [dbQuoteIdentifier()]: no more quoting is done dbWriteTable(con, dbQuoteIdentifier(con, table_name), test_in) test_out <- check_df(dbReadTable(con, table_name)) expect_equal_df(test_out, test_in) }) } }) }, #' #' If the `overwrite` argument is `TRUE`, an existing table of the same name #' will be overwritten. overwrite_table = function(ctx) { with_connection({ with_remove_test_table(name = "iris", { iris <- get_iris(ctx) dbWriteTable(con, "iris", iris) expect_error(dbWriteTable(con, "iris", iris[1:10,], overwrite = TRUE), NA) iris_out <- check_df(dbReadTable(con, "iris")) expect_equal_df(iris_out, iris[1:10, ]) }) }) }, #' This argument doesn't change behavior if the table does not exist yet. overwrite_table_missing = function(ctx) { with_connection({ with_remove_test_table(name = "iris", { iris_in <- get_iris(ctx) expect_error(dbWriteTable(con, "iris", iris[1:10,], overwrite = TRUE), NA) iris_out <- check_df(dbReadTable(con, "iris")) expect_equal_df(iris_out, iris_in[1:10, ]) }) }) }, #' #' If the `append` argument is `TRUE`, the rows in an existing table are #' preserved, and the new data are appended. append_table = function(ctx) { with_connection({ with_remove_test_table(name = "iris", { iris <- get_iris(ctx) dbWriteTable(con, "iris", iris) expect_error(dbWriteTable(con, "iris", iris[1:10,], append = TRUE), NA) iris_out <- check_df(dbReadTable(con, "iris")) expect_equal_df(iris_out, rbind(iris, iris[1:10,])) }) }) }, #' If the table doesn't exist yet, it is created. append_table_new = function(ctx) { with_connection({ with_remove_test_table(name = "iris", { iris <- get_iris(ctx) expect_error(dbWriteTable(con, "iris", iris[1:10,], append = TRUE), NA) iris_out <- check_df(dbReadTable(con, "iris")) expect_equal_df(iris_out, iris[1:10,]) }) }) }, #' #' If the `temporary` argument is `TRUE`, the table is not available in a #' second connection and is gone after reconnecting. temporary_table = function(ctx) { #' Not all backends support this argument. 
if (!isTRUE(ctx$tweaks$temporary_tables)) { skip("tweak: temporary_tables") } with_connection({ with_remove_test_table(name = "iris", { iris <- get_iris(ctx)[1:30, ] dbWriteTable(con, "iris", iris, temporary = TRUE) iris_out <- check_df(dbReadTable(con, "iris")) expect_equal_df(iris_out, iris) with_connection( expect_error(dbReadTable(con2, "iris")), con = "con2") }) }) with_connection({ expect_error(dbReadTable(con, "iris")) }) }, #' A regular, non-temporary table is visible in a second connection table_visible_in_other_connection = function(ctx) { iris <- get_iris(ctx)[1:30,] with_connection({ dbWriteTable(con, "iris", iris) iris_out <- check_df(dbReadTable(con, "iris")) expect_equal_df(iris_out, iris) with_connection( expect_equal_df(dbReadTable(con2, "iris"), iris), con = "con2") }) #' and after reconnecting to the database. with_connection({ with_remove_test_table(name = "iris", { expect_equal_df(check_df(dbReadTable(con, "iris")), iris) }) }) }, #' #' SQL keywords can be used freely in table names, column names, and data. roundtrip_keywords = function(ctx) { with_connection({ tbl_in <- data.frame( SELECT = "UNIQUE", FROM = "JOIN", WHERE = "ORDER", stringsAsFactors = FALSE ) test_table_roundtrip(con, tbl_in, name = "EXISTS") }) }, #' Quotes, commas, and spaces can also be used in the data, #' and, if the database supports non-syntactic identifiers, #' also for table names and column names. roundtrip_quotes = function(ctx) { with_connection({ if (!isTRUE(ctx$tweaks$strict_identifier)) { table_names <- c( as.character(dbQuoteIdentifier(con, "")), as.character(dbQuoteString(con, "")), "with space", ",") } else { table_names <- "a" } for (table_name in table_names) { tbl_in <- data.frame( a = as.character(dbQuoteString(con, "")), b = as.character(dbQuoteIdentifier(con, "")), c = "with space", d = ",", stringsAsFactors = FALSE ) if (!isTRUE(ctx$tweaks$strict_identifier)) { names(tbl_in) <- c( as.character(dbQuoteIdentifier(con, "")), as.character(dbQuoteString(con, "")), "with space", ",") } test_table_roundtrip(con, tbl_in) } }) }, #' #' The following data types must be supported at least, #' and be read identically with [dbReadTable()]: #' - integer roundtrip_integer = function(ctx) { with_connection({ tbl_in <- data.frame(a = c(1:5)) test_table_roundtrip(con, tbl_in) }) }, #' - numeric roundtrip_numeric = function(ctx) { with_connection({ tbl_in <- data.frame(a = c(seq(1, 3, by = 0.5))) test_table_roundtrip(con, tbl_in) }) }, #' (also with `Inf` and `NaN` values, roundtrip_numeric_special = function(ctx) { with_connection({ tbl_in <- data.frame(a = c(seq(1, 3, by = 0.5), -Inf, Inf, NaN)) tbl_exp <- tbl_in #' the latter are translated to `NA`) tbl_exp$a[is.nan(tbl_exp$a)] <- NA_real_ test_table_roundtrip(con, tbl_in, tbl_exp) }) }, #' - logical roundtrip_logical = function(ctx) { with_connection({ tbl_in <- data.frame(a = c(TRUE, FALSE, NA)) tbl_exp <- tbl_in tbl_exp$a <- ctx$tweaks$logical_return(tbl_exp$a) test_table_roundtrip(con, tbl_in, tbl_exp) }) }, #' - `NA` as NULL roundtrip_null = function(ctx) { with_connection({ tbl_in <- data.frame(a = NA) test_table_roundtrip( con, tbl_in, transform = function(tbl_out) { tbl_out$a <- as.logical(tbl_out$a) # Plain NA is of type logical tbl_out } ) }) }, #' - 64-bit values (using `"bigint"` as field type); roundtrip_64_bit_numeric = function(ctx) { with_connection({ tbl_in <- data.frame(a = c(-1e14, 1e15)) test_table_roundtrip( con, tbl_in, transform = function(tbl_out) { #' the result can be converted to a numeric, which may lose precision, 
tbl_out$a <- as.numeric(tbl_out$a) tbl_out }, field.types = c(a = "BIGINT") ) }) }, roundtrip_64_bit_character = function(ctx) { with_connection({ tbl_in <- data.frame(a = c(-1e14, 1e15)) tbl_exp <- tbl_in tbl_exp$a <- format(tbl_exp$a, scientific = FALSE) test_table_roundtrip( con, tbl_in, tbl_exp, transform = function(tbl_out) { # ' or to character, which gives the full decimal representation as a # ' character vector tbl_out$a <- as.character(tbl_out$a) tbl_out }, field.types = c(a = "BIGINT") ) }) }, #' - character (in both UTF-8 roundtrip_character = function(ctx) { with_connection({ tbl_in <- data.frame( a = c(texts), stringsAsFactors = FALSE ) test_table_roundtrip(con, tbl_in) }) }, #' and native encodings), roundtrip_character_native = function(ctx) { with_connection({ tbl_in <- data.frame( a = c(enc2native(texts)), stringsAsFactors = FALSE ) test_table_roundtrip(con, tbl_in) }) }, #' supporting empty strings roundtrip_character_empty = function(ctx) { with_connection({ tbl_in <- data.frame( a = c("", "a"), stringsAsFactors = FALSE ) test_table_roundtrip(con, tbl_in) }) with_connection({ tbl_in <- data.frame( a = c("a", ""), stringsAsFactors = FALSE ) test_table_roundtrip(con, tbl_in) }) }, #' - factor (returned as character) roundtrip_factor = function(ctx) { with_connection({ tbl_in <- data.frame( a = factor(c(texts)) ) tbl_exp <- tbl_in tbl_exp$a <- as.character(tbl_exp$a) test_table_roundtrip(con, tbl_in, tbl_exp) }) }, #' - list of raw roundtrip_raw = function(ctx) { #' (if supported by the database) if (isTRUE(ctx$tweaks$omit_blob_tests)) { skip("tweak: omit_blob_tests") } with_connection({ tbl_in <- data.frame(id = 1L, a = I(list(as.raw(1:10)))) tbl_exp <- tbl_in tbl_exp$a <- blob::as.blob(unclass(tbl_in$a)) test_table_roundtrip( con, tbl_in, tbl_exp, transform = function(tbl_out) { tbl_out$a <- blob::as.blob(tbl_out$a) tbl_out } ) }) }, #' - objects of type [blob::blob] roundtrip_blob = function(ctx) { #' (if supported by the database) if (isTRUE(ctx$tweaks$omit_blob_tests)) { skip("tweak: omit_blob_tests") } with_connection({ tbl_in <- data.frame(id = 1L, a = blob::blob(as.raw(1:10))) test_table_roundtrip( con, tbl_in, transform = function(tbl_out) { tbl_out$a <- blob::as.blob(tbl_out$a) tbl_out } ) }) }, #' - date roundtrip_date = function(ctx) { #' (if supported by the database; if (!isTRUE(ctx$tweaks$date_typed)) { skip("tweak: !date_typed") } with_connection({ #' returned as `Date`) tbl_in <- data.frame(a = as_numeric_date(c(Sys.Date() + 1:5))) test_table_roundtrip( con, tbl_in, transform = function(tbl_out) { expect_is(unclass(tbl_out$a), "numeric") tbl_out } ) }) }, #' - time roundtrip_time = function(ctx) { #' (if supported by the database; if (!isTRUE(ctx$tweaks$time_typed)) { skip("tweak: !time_typed") } with_connection({ now <- Sys.time() tbl_in <- data.frame(a = c(now + 1:5) - now) tbl_exp <- tbl_in tbl_exp$a <- hms::as.hms(tbl_exp$a) test_table_roundtrip( con, tbl_in, tbl_exp, transform = function(tbl_out) { #' returned as objects that inherit from `difftime`) expect_is(tbl_out$a, "difftime") tbl_out$a <- hms::as.hms(tbl_out$a) tbl_out } ) }) }, #' - timestamp roundtrip_timestamp = function(ctx) { #' (if supported by the database; if (!isTRUE(ctx$tweaks$timestamp_typed)) { skip("tweak: !timestamp_typed") } with_connection({ #' returned as `POSIXct` #' with time zone support) tbl_in <- data.frame(id = 1:5) tbl_in$a <- round(Sys.time()) + c(1, 60, 3600, 86400, NA) tbl_in$b <- as.POSIXct(tbl_in$a, tz = "GMT") tbl_in$c <- as.POSIXct(tbl_in$a, tz = "PST8PDT") 
tbl_in$d <- as.POSIXct(tbl_in$a, tz = "UTC") test_table_roundtrip(con, tbl_in) }) }, #' #' Mixing column types in the same table is supported. roundtrip_mixed = function(ctx) { with_connection({ data <- list("a", 1L, 1.5) data <- lapply(data, c, NA) expanded <- expand.grid(a = data, b = data, c = data) tbl_in_list <- lapply( seq_len(nrow(expanded)), function(i) { as.data.frame(lapply(expanded[i, ], unlist, recursive = FALSE)) } ) lapply(tbl_in_list, test_table_roundtrip, con = con) }) }, #' #' The `field.types` argument must be a named character vector with at most #' one entry for each column. #' It indicates the SQL data type to be used for a new column. roundtrip_field_types = function(ctx) { with_connection({ tbl_in <- data.frame(a = numeric()) tbl_exp <- data.frame(a = integer()) test_table_roundtrip( con, tbl_in, tbl_exp, field.types = c(a = "INTEGER") ) }) }, #' #' The interpretation of [rownames] depends on the `row.names` argument, #' see [sqlRownamesToColumn()] for details: write_table_row_names_false = function(ctx) { #' - If `FALSE` or `NULL`, row names are ignored. for (row.names in list(FALSE, NULL)) { with_connection({ with_remove_test_table(name = "mtcars", { mtcars_in <- datasets::mtcars dbWriteTable(con, "mtcars", mtcars_in, row.names = row.names) mtcars_out <- check_df(dbReadTable(con, "mtcars", row.names = FALSE)) expect_false("row_names" %in% names(mtcars_out)) expect_equal_df(mtcars_out, unrowname(mtcars_in)) }) }) } }, write_table_row_names_true_exists = function(ctx) { #' - If `TRUE`, row names are converted to a column named "row_names", row.names <- TRUE with_connection({ with_remove_test_table(name = "mtcars", { mtcars_in <- datasets::mtcars dbWriteTable(con, "mtcars", mtcars_in, row.names = row.names) mtcars_out <- check_df(dbReadTable(con, "mtcars", row.names = FALSE)) expect_true("row_names" %in% names(mtcars_out)) expect_true(all(rownames(mtcars_in) %in% mtcars_out$row_names)) expect_true(all(mtcars_out$row_names %in% rownames(mtcars_in))) expect_equal_df(mtcars_out[names(mtcars_out) != "row_names"], unrowname(mtcars_in)) }) }) }, write_table_row_names_true_missing = function(ctx) { #' even if the input data frame only has natural row names from 1 to `nrow(...)`. row.names <- TRUE with_connection({ with_remove_test_table(name = "iris", { iris_in <- get_iris(ctx) dbWriteTable(con, "iris", iris_in, row.names = row.names) iris_out <- check_df(dbReadTable(con, "iris", row.names = FALSE)) expect_true("row_names" %in% names(iris_out)) expect_true(all(rownames(iris_in) %in% iris_out$row_names)) expect_true(all(iris_out$row_names %in% rownames(iris_in))) expect_equal_df(iris_out[names(iris_out) != "row_names"], iris_in) }) }) }, write_table_row_names_na_exists = function(ctx) { #' - If `NA`, a column named "row_names" is created if the data has custom row names, row.names <- NA with_connection({ with_remove_test_table(name = "mtcars", { mtcars_in <- datasets::mtcars dbWriteTable(con, "mtcars", mtcars_in, row.names = row.names) mtcars_out <- check_df(dbReadTable(con, "mtcars", row.names = FALSE)) expect_true("row_names" %in% names(mtcars_out)) expect_true(all(rownames(mtcars_in) %in% mtcars_out$row_names)) expect_true(all(mtcars_out$row_names %in% rownames(mtcars_in))) expect_equal_df(mtcars_out[names(mtcars_out) != "row_names"], unrowname(mtcars_in)) }) }) }, write_table_row_names_na_missing = function(ctx) { #' no extra column is created in the case of natural row names. 
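# Aside (not part of the test suite): a minimal sketch of how the `row.names`
# argument to dbWriteTable() behaves, matching the rules documented above.
# RSQLite stands in for an arbitrary backend; the table names are arbitrary.
con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")

DBI::dbWriteTable(con, "mtcars_rn", datasets::mtcars, row.names = TRUE)
"row_names" %in% DBI::dbListFields(con, "mtcars_rn")     # TRUE: column created

DBI::dbWriteTable(con, "mtcars_plain", datasets::mtcars, row.names = FALSE)
"row_names" %in% DBI::dbListFields(con, "mtcars_plain")  # FALSE: row names dropped

# With row.names = NA, the column is only created because mtcars has custom
# (non-natural) row names; a plain 1..n sequence would be ignored.
DBI::dbWriteTable(con, "mtcars_na", datasets::mtcars, row.names = NA)
"row_names" %in% DBI::dbListFields(con, "mtcars_na")     # TRUE for mtcars

DBI::dbDisconnect(con)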
row.names <- NA with_connection({ with_remove_test_table(name = "iris", { iris_in <- get_iris(ctx) dbWriteTable(con, "iris", iris_in, row.names = row.names) iris_out <- check_df(dbReadTable(con, "iris", row.names = FALSE)) expect_equal_df(iris_out, iris_in) }) }) }, write_table_row_names_string_exists = function(ctx) { row.names <- "make_model" #' - If a string, this specifies the name of the column in the remote table #' that contains the row names, with_connection({ with_remove_test_table(name = "mtcars", { mtcars_in <- datasets::mtcars dbWriteTable(con, "mtcars", mtcars_in, row.names = row.names) mtcars_out <- check_df(dbReadTable(con, "mtcars", row.names = FALSE)) expect_true("make_model" %in% names(mtcars_out)) expect_true(all(mtcars_out$make_model %in% rownames(mtcars_in))) expect_true(all(rownames(mtcars_in) %in% mtcars_out$make_model)) expect_equal_df(mtcars_out[names(mtcars_out) != "make_model"], unrowname(mtcars_in)) }) }) }, write_table_row_names_string_missing = function(ctx) { row.names <- "seq" #' even if the input data frame only has natural row names. with_connection({ with_remove_test_table(name = "iris", { iris_in <- get_iris(ctx) dbWriteTable(con, "iris", iris_in, row.names = row.names) iris_out <- check_df(dbReadTable(con, "iris", row.names = FALSE)) expect_true("seq" %in% names(iris_out)) expect_true(all(iris_out$seq %in% rownames(iris_in))) expect_true(all(rownames(iris_in) %in% iris_out$seq)) expect_equal_df(iris_out[names(iris_out) != "seq"], iris_in) }) }) }, NULL ) test_table_roundtrip <- function(...) { test_table_roundtrip_one(..., .add_na = "none") test_table_roundtrip_one(..., .add_na = "above") test_table_roundtrip_one(..., .add_na = "below") } test_table_roundtrip_one <- function(con, tbl_in, tbl_expected = tbl_in, transform = identity, name = "test", field.types = NULL, .add_na = "none") { force(tbl_expected) if (.add_na == "above") { tbl_in <- add_na_above(tbl_in) tbl_expected <- add_na_above(tbl_expected) } else if (.add_na == "below") { tbl_in <- add_na_below(tbl_in) tbl_expected <- add_na_below(tbl_expected) } with_remove_test_table(name = dbQuoteIdentifier(con, name), { dbWriteTable(con, name, tbl_in, field.types = field.types) tbl_out <- check_df(dbReadTable(con, name, check.names = FALSE)) tbl_out <- transform(tbl_out) expect_equal_df(tbl_out, tbl_expected) }) } add_na_above <- function(tbl) { tbl <- rbind(tbl, tbl[nrow(tbl) + 1L, , drop = FALSE]) unrowname(tbl) } add_na_below <- function(tbl) { tbl <- rbind(tbl[nrow(tbl) + 1L, , drop = FALSE], tbl) unrowname(tbl) } DBItest/R/spec-meta-bind.R0000644000176200001440000002452213232372770014677 0ustar liggesusers#' spec_meta_bind #' @usage NULL #' @format NULL #' @keywords NULL spec_meta_bind <- list( bind_formals = function(ctx) { # expect_equal(names(formals(dbBind)), c("res", "params", "...")) }, #' @return bind_return_value = function(ctx) { extra <- new_bind_tester_extra( check_return_value = function(bind_res, res) { #' `dbBind()` returns the result set, expect_identical(res, bind_res$value) #' invisibly, expect_false(bind_res$visible) } ) with_connection({ #' for queries issued by [dbSendQuery()] test_select_bind(con, ctx$tweaks$placeholder_pattern, 1L, extra = extra) }) with_connection({ #' and also for data manipulation statements issued by #' [dbSendStatement()]. 
test_select_bind(con, ctx$tweaks$placeholder_pattern, 1L, extra = extra, query = FALSE) }) }, bind_empty = function(ctx) { with_connection({ with_result( #' Calling `dbBind()` for a query without parameters dbSendQuery(con, "SELECT 1"), #' raises an error. expect_error(dbBind(res, list())) ) }) }, bind_too_many = function(ctx) { extra <- new_bind_tester_extra( patch_bind_values = function(bind_values) { #' Binding too many c(bind_values, bind_values[[1L]]) } ) with_connection({ expect_error( test_select_bind(con, ctx$tweaks$placeholder_pattern, 1L, extra = extra) ) }) }, bind_not_enough = function(ctx) { extra <- new_bind_tester_extra( patch_bind_values = function(bind_values) { #' or not enough values, bind_values[-1L] } ) with_connection({ expect_error( test_select_bind(con, ctx$tweaks$placeholder_pattern, 1L, extra = extra) ) }) }, bind_wrong_name = function(ctx) { extra <- new_bind_tester_extra( patch_bind_values = function(bind_values) { #' or parameters with wrong names stats::setNames(bind_values, paste0("bogus", names(bind_values))) }, requires_names = function() TRUE ) with_connection({ expect_error( test_select_bind(con, ctx$tweaks$placeholder_pattern, 1L, extra = extra) ) }) }, bind_multi_row_unequal_length = function(ctx) { extra <- new_bind_tester_extra( patch_bind_values = function(bind_values) { #' or unequal length, bind_values[[2]] <- bind_values[[2]][-1] bind_values } ) with_connection({ #' also raises an error. expect_error( test_select_bind( con, ctx$tweaks$placeholder_pattern, list(1:3, 2:4), extra = extra, query = FALSE ) ) }) }, #' If the placeholders in the query are named, bind_named_param_unnamed_placeholders = function(ctx) { extra <- new_bind_tester_extra( patch_bind_values = function(bind_values) { #' all parameter values must have names stats::setNames(bind_values, NULL) }, requires_names = function() TRUE ) with_connection({ expect_error( test_select_bind(con, ctx$tweaks$placeholder_pattern, 1L, extra = extra) ) }) }, bind_named_param_empty_placeholders = function(ctx) { extra <- new_bind_tester_extra( patch_bind_values = function(bind_values) { #' (which must not be empty names(bind_values)[[1]] <- "" }, requires_names = function() TRUE ) with_connection({ expect_error( test_select_bind(con, ctx$tweaks$placeholder_pattern, list(1L, 2L), extra = extra) ) }) }, bind_named_param_na_placeholders = function(ctx) { extra <- new_bind_tester_extra( patch_bind_values = function(bind_values) { #' or `NA`), names(bind_values)[[1]] <- NA }, requires_names = function() TRUE ) with_connection({ expect_error( test_select_bind(con, ctx$tweaks$placeholder_pattern, list(1L, 2L), extra = extra) ) }) }, #' and vice versa, bind_unnamed_param_named_placeholders = function(ctx) { extra <- new_bind_tester_extra( patch_bind_values = function(bind_values) { stats::setNames(bind_values, letters[seq_along(bind_values)]) }, requires_names = function() FALSE ) with_connection({ #' otherwise an error is raised. expect_error( test_select_bind(con, ctx$tweaks$placeholder_pattern, 1L, extra = extra) ) }) }, #' The behavior for mixing placeholders of different types #' (in particular mixing positional and named placeholders) #' is not specified. #' bind_premature_clear = function(ctx) { extra <- new_bind_tester_extra( #' Calling `dbBind()` on a result set already cleared by [dbClearResult()] is_premature_clear = function() TRUE ) with_connection({ #' also raises an error. 
expect_error( test_select_bind(con, ctx$tweaks$placeholder_pattern, 1L, extra = extra) ) }) }, #' @section Specification: #' The elements of the `params` argument do not need to be scalars, bind_multi_row = function(ctx) { with_connection({ #' vectors of arbitrary length test_select_bind(con, ctx$tweaks$placeholder_pattern, list(1:3)) }) }, bind_multi_row_zero_length = function(ctx) { with_connection({ #' (including length 0) test_select_bind(con, ctx$tweaks$placeholder_pattern, list(integer(), integer())) }) #' are supported. # This behavior is tested as part of run_bind_tester$fun #' For queries, calling `dbFetch()` binding such parameters returns #' concatenated results, equivalent to binding and fetching for each set #' of values and connecting via [rbind()]. }, bind_multi_row_statement = function(ctx) { with_connection({ # This behavior is tested as part of run_bind_tester$fun #' For data manipulation statements, `dbGetRowsAffected()` returns the #' total number of rows affected if binding non-scalar parameters. test_select_bind(con, ctx$tweaks$placeholder_pattern, list(1:3), query = FALSE) }) }, bind_repeated = function(ctx) { extra <- new_bind_tester_extra( #' `dbBind()` also accepts repeated calls on the same result set is_repeated = function() TRUE ) with_connection({ #' for both queries test_select_bind(con, ctx$tweaks$placeholder_pattern, 1L, extra = extra) }) with_connection({ #' and data manipulation statements, test_select_bind(con, ctx$tweaks$placeholder_pattern, 1L, extra = extra, query = FALSE) }) }, bind_repeated_untouched = function(ctx) { extra <- new_bind_tester_extra( #' even if no results are fetched between calls to `dbBind()`. is_repeated = function() TRUE, is_untouched = function() TRUE ) with_connection({ test_select_bind(con, ctx$tweaks$placeholder_pattern, 1L, extra = extra) }) with_connection({ test_select_bind(con, ctx$tweaks$placeholder_pattern, 1L, extra = extra, query = FALSE) }) }, #' #' At least the following data types are accepted: #' - [integer] bind_integer = function(ctx) { with_connection({ test_select_bind(con, ctx$tweaks$placeholder_pattern, 1L) }) }, #' - [numeric] bind_numeric = function(ctx) { with_connection({ test_select_bind(con, ctx$tweaks$placeholder_pattern, 1.5) }) }, #' - [logical] for Boolean values (some backends may return an integer) bind_logical = function(ctx) { with_connection({ test_select_bind( con, ctx$tweaks$placeholder_pattern, TRUE, type = NULL, transform_input = ctx$tweaks$logical_return, transform_output = ctx$tweaks$logical_return ) }) }, #' - [NA] bind_null = function(ctx) { with_connection({ test_select_bind( con, ctx$tweaks$placeholder_pattern, NA, transform_input = function(x) TRUE, transform_output = is.na) }) }, #' - [character] bind_character = function(ctx) { with_connection({ test_select_bind( con, ctx$tweaks$placeholder_pattern, texts ) }) }, #' - [factor] (bound as character, bind_factor = function(ctx) { with_connection({ #' with warning) expect_warning( test_select_bind( con, ctx$tweaks$placeholder_pattern, lapply(texts, factor) ) ) }) }, #' - [Date] bind_date = function(ctx) { if (!isTRUE(ctx$tweaks$date_typed)) { skip("tweak: !date_typed") } with_connection({ test_select_bind(con, ctx$tweaks$placeholder_pattern, Sys.Date()) }) }, #' - [POSIXct] timestamps bind_timestamp = function(ctx) { if (!isTRUE(ctx$tweaks$timestamp_typed)) { skip("tweak: !timestamp_typed") } with_connection({ data_in <- as.POSIXct(round(Sys.time())) test_select_bind( con, ctx$tweaks$placeholder_pattern, data_in, type = 
dbDataType(con, data_in), transform_input = identity, transform_output = identity, expect = expect_equal) }) }, #' - [POSIXlt] timestamps bind_timestamp_lt = function(ctx) { if (!isTRUE(ctx$tweaks$timestamp_typed)) { skip("tweak: !timestamp_typed") } with_connection({ data_in <- as.POSIXlt(round(Sys.time())) test_select_bind( con, ctx$tweaks$placeholder_pattern, data_in, type = dbDataType(con, data_in), transform_input = as.POSIXct, transform_output = as.POSIXct) }) }, #' - lists of [raw] for blobs (with `NULL` entries for SQL NULL values) bind_raw = function(ctx) { if (isTRUE(ctx$tweaks$omit_blob_tests)) { skip("tweak: omit_blob_tests") } with_connection({ test_select_bind( con, ctx$tweaks$placeholder_pattern, list(list(as.raw(1:10))), type = NULL, transform_input = blob::as.blob, transform_output = blob::as.blob) }) }, #' - objects of type [blob::blob] bind_blob = function(ctx) { if (isTRUE(ctx$tweaks$omit_blob_tests)) { skip("tweak: omit_blob_tests") } with_connection({ test_select_bind( con, ctx$tweaks$placeholder_pattern, list(blob::blob(as.raw(1:10))), type = NULL, transform_input = identity, transform_output = blob::as.blob) }) }, NULL ) DBItest/R/import-dbi.R0000644000176200001440000000133013213312665014141 0ustar liggesusers# The imports below were generated using the following call: # @import.gen::importFrom("DBI") #' @importFrom DBI dbBegin dbBind dbBreak dbCallProc dbClearResult dbColumnInfo #' @importFrom DBI dbCommit dbConnect dbDataType dbDisconnect #' @importFrom DBI dbExecute dbExistsTable dbFetch dbGetDBIVersion #' @importFrom DBI dbGetInfo dbGetQuery dbGetRowCount dbGetRowsAffected #' @importFrom DBI dbGetStatement dbHasCompleted dbIsValid #' @importFrom DBI dbListConnections dbListFields dbListTables #' @importFrom DBI dbQuoteIdentifier dbQuoteString dbReadTable dbRemoveTable #' @importFrom DBI dbRollback dbSendQuery dbSendStatement dbSetDataMappings #' @importFrom DBI dbWithTransaction dbWriteTable #' @importFrom DBI SQL NULL DBItest/R/spec-meta.R0000644000176200001440000000047013156254271013761 0ustar liggesusers#' @template dbispec #' @format NULL spec_meta <- c( spec_meta_bind, spec_meta_is_valid, spec_meta_has_completed, spec_meta_get_statement, spec_meta_column_info, spec_meta_get_row_count, spec_meta_get_rows_affected, spec_meta_get_info_result, # no 64-bit or time input data type yet NULL ) DBItest/R/spec-connection.R0000644000176200001440000000022313156254271015166 0ustar liggesusers#' @template dbispec #' @format NULL spec_connection <- c( spec_connection_disconnect, spec_connection_data_type, spec_connection_get_info ) DBItest/R/spec-result-send-statement.R0000644000176200001440000000642313232372770017306 0ustar liggesusers#' spec_result_send_statement #' @usage NULL #' @format NULL #' @keywords NULL spec_result_send_statement <- list( send_statement_formals = function(ctx) { # expect_equal(names(formals(dbSendStatement)), c("conn", "statement", "...")) }, #' @return #' `dbSendStatement()` returns send_statement_trivial = function(ctx) { with_connection({ with_remove_test_table({ res <- expect_visible(dbSendStatement(con, "CREATE TABLE test AS SELECT 1 AS a")) #' an S4 object that inherits from [DBIResult-class]. expect_s4_class(res, "DBIResult") #' The result set can be used with [dbGetRowsAffected()] to #' determine the number of rows affected by the query. expect_error(dbGetRowsAffected(res), NA) #' Once you have finished using a result, make sure to clear it #' with [dbClearResult()]. 
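# Sketch (not part of the tests) of the dbSendStatement() life cycle described
# above: send the statement, query the affected row count, then always clear
# the result. RSQLite is used only as an example backend; names are arbitrary.
con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
DBI::dbWriteTable(con, "demo", data.frame(a = 1:3))

res <- DBI::dbSendStatement(con, "UPDATE demo SET a = a + 1")
DBI::dbGetRowsAffected(res)   # 3
DBI::dbClearResult(res)       # required for every dbSendStatement() call

DBI::dbDisconnect(con)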
dbClearResult(res) }) }) }, #' An error is raised when issuing a statement over a closed send_statement_closed_connection = function(ctx) { with_closed_connection({ expect_error(dbSendStatement(con, "CREATE TABLE test AS SELECT 1 AS a")) }) }, #' or invalid connection, send_statement_invalid_connection = function(ctx) { with_invalid_connection({ expect_error(dbSendStatement(con, "CREATE TABLE test AS SELECT 1 AS a")) }) }, #' if the syntax of the statement is invalid, send_statement_syntax_error = function(ctx) { with_connection({ expect_error(dbSendStatement(con, "CREATE")) }) }, #' or if the statement is not a non-`NA` string. send_statement_non_string = function(ctx) { with_connection({ expect_error(dbSendStatement(con, character())) expect_error(dbSendStatement(con, letters)) expect_error(dbSendStatement(con, NA_character_)) }) }, #' @section Specification: send_statement_result_valid = function(ctx) { with_connection({ with_remove_test_table({ #' No warnings occur under normal conditions. expect_warning(res <- dbSendStatement(con, "CREATE TABLE test AS SELECT 1 AS a"), NA) #' When done, the DBIResult object must be cleared with a call to #' [dbClearResult()]. dbClearResult(res) }) }) }, send_statement_stale_warning = function(ctx) { #' Failure to clear the result set leads to a warning #' when the connection is closed. expect_warning( with_connection({ expect_warning(dbSendStatement(con, "SELECT 1"), NA) }) ) }, #' If the backend supports only one open result set per connection, send_statement_only_one_result_set = function(ctx) { with_connection({ with_remove_test_table({ res1 <- dbSendStatement(con, "CREATE TABLE test AS SELECT 1 AS a") with_remove_test_table(name = "test2", { #' issuing a second query invalidates an already open result set #' and raises a warning. expect_warning(res2 <- dbSendStatement(con, "CREATE TABLE test2 AS SELECT 1 AS a")) expect_false(dbIsValid(res1)) #' The newly opened result set is valid expect_true(dbIsValid(res2)) #' and must be cleared with `dbClearResult()`. dbClearResult(res2) }) }) }) }, NULL ) DBItest/R/spec-transaction.R0000644000176200001440000000022513156254271015356 0ustar liggesusers#' @template dbispec #' @format NULL spec_transaction <- c( spec_transaction_begin_commit_rollback, spec_transaction_with_transaction, NULL ) DBItest/R/spec-connection-disconnect.R0000644000176200001440000000202613213331065017307 0ustar liggesusers#' spec_connection_disconnect #' @usage NULL #' @format NULL #' @keywords NULL spec_connection_disconnect <- list( disconnect_formals = function(ctx) { # expect_equal(names(formals(dbDisconnect)), c("conn", "...")) }, #' @return can_disconnect = function(ctx) { con <- connect(ctx) #' `dbDisconnect()` returns `TRUE`, invisibly. expect_invisible_true(dbDisconnect(con)) }, #' @section Specification: #' A warning is issued on garbage collection when a connection has been #' released without calling `dbDisconnect()`, #' but this cannot be tested automatically. #' A warning is issued immediately when calling `dbDisconnect()` on an #' already disconnected disconnect_closed_connection = function(ctx) { with_closed_connection({ expect_warning(dbDisconnect(con)) }) }, #' or invalid connection. 
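# Quick sketch of the behaviour specified above, with RSQLite standing in for
# an arbitrary backend: a second dbDisconnect() on the same connection is
# answered with a warning, not an error.
con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
DBI::dbDisconnect(con)   # TRUE, invisibly
DBI::dbIsValid(con)      # FALSE
DBI::dbDisconnect(con)   # warning: the connection is already closed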
disconnect_invalid_connection = function(ctx) { with_invalid_connection({ expect_warning(dbDisconnect(con)) }) }, NULL ) DBItest/R/spec-meta-get-info-result.R0000644000176200001440000000120513232372770017000 0ustar liggesusers#' @template dbispec-sub-wip #' @format NULL #' @section Meta: #' \subsection{`dbGetInfo("DBIResult")` (deprecated)}{ spec_meta_get_info_result <- list( #' Return value of dbGetInfo has necessary elements get_info_result = function(ctx) { with_connection({ res <- dbSendQuery(con, "SELECT 1 as a") info <- dbGetInfo(res) expect_is(info, "list") info_names <- names(info) necessary_names <- c("statement", "row.count", "rows.affected", "has.completed") for (name in necessary_names) { eval(bquote( expect_true(.(name) %in% info_names))) } }) }, #' } NULL ) DBItest/R/test-driver.R0000644000176200001440000000063313156254271014354 0ustar liggesusers#' @name test_all #' @aliases NULL #' @section Tests: #' [test_driver()]: #' Test the "Driver" class NULL #' Test the "Driver" class #' #' @inheritParams test_all #' @include test-getting-started.R #' @family tests #' @importFrom withr with_temp_libpaths #' @export test_driver <- function(skip = NULL, ctx = get_default_context()) { test_suite <- "Driver" run_tests(ctx, spec_driver, skip, test_suite) } DBItest/R/spec-meta-get-statement.R0000644000176200001440000000274413232372770016546 0ustar liggesusers#' spec_meta_get_statement #' @usage NULL #' @format NULL #' @keywords NULL spec_meta_get_statement <- list( get_statement_formals = function(ctx) { # expect_equal(names(formals(dbGetStatement)), c("res", "...")) }, #' @return #' `dbGetStatement()` returns a string, the query used in get_statement_query = function(ctx) { with_connection({ query <- "SELECT 1 as a" with_result( #' either [dbSendQuery()] dbSendQuery(con, query), { s <- dbGetStatement(res) expect_is(s, "character") expect_identical(s, query) } ) }) }, get_statement_statement = function(ctx) { with_connection({ name <- random_table_name() with_connection({ with_remove_test_table(name = name, { query <- paste0("CREATE TABLE ", name, " (a integer)") with_result( #' or [dbSendStatement()]. dbSendQuery(con, query), { s <- dbGetStatement(res) expect_is(s, "character") expect_identical(s, query) } ) }) }) }) }, get_statement_error = function(ctx) { with_connection({ res <- dbSendQuery(con, "SELECT 1") dbClearResult(res) #' Attempting to query the statement for a result set cleared with #' [dbClearResult()] gives an error. expect_error(dbGetStatement(res)) }) }, NULL ) DBItest/R/spec-meta-get-row-count.R0000644000176200001440000000577513232372770016506 0ustar liggesusers#' spec_meta_get_row_count #' @usage NULL #' @format NULL #' @keywords NULL spec_meta_get_row_count <- list( get_row_count_formals = function(ctx) { # expect_equal(names(formals(dbGetRowCount)), c("res", "...")) }, #' @return #' `dbGetRowCount()` returns a scalar number (integer or numeric), #' the number of rows fetched so far. row_count_query = function(ctx) { with_connection({ query <- "SELECT 1 as a" with_result( #' After calling [dbSendQuery()], dbSendQuery(con, query), { rc <- dbGetRowCount(res) #' the row count is initially zero. expect_equal(rc, 0L) #' After a call to [dbFetch()] without limit, check_df(dbFetch(res)) rc <- dbGetRowCount(res) #' the row count matches the total number of rows returned. 
expect_equal(rc, 1L) } ) }) with_connection({ query <- union(.ctx = ctx, "SELECT 1 as a", "SELECT 2", "SELECT 3") with_result( dbSendQuery(con, query), { rc <- dbGetRowCount(res) expect_equal(rc, 0L) #' Fetching a limited number of rows check_df(dbFetch(res, 2L)) #' increases the number of rows by the number of rows returned, rc <- dbGetRowCount(res) expect_equal(rc, 2L) #' even if fetching past the end of the result set. check_df(dbFetch(res, 2L)) rc <- dbGetRowCount(res) expect_equal(rc, 3L) } ) }) with_connection({ #' For queries with an empty result set, query <- union( .ctx = ctx, "SELECT * FROM (SELECT 1 as a) a WHERE (0 = 1)" ) with_result( dbSendQuery(con, query), { rc <- dbGetRowCount(res) #' zero is returned expect_equal(rc, 0L) check_df(dbFetch(res)) rc <- dbGetRowCount(res) #' even after fetching. expect_equal(rc, 0L) } ) }) }, row_count_statement = function(ctx) { with_connection({ name <- random_table_name() with_remove_test_table(name = name, { query <- paste0("CREATE TABLE ", name, " (a integer)") with_result( #' For data manipulation statements issued with #' [dbSendStatement()], dbSendStatement(con, query), { rc <- dbGetRowCount(res) #' zero is returned before expect_equal(rc, 0L) expect_warning(check_df(dbFetch(res))) rc <- dbGetRowCount(res) #' and after calling `dbFetch()`. expect_equal(rc, 0L) } ) }) }) }, get_row_count_error = function(ctx) { with_connection({ res <- dbSendQuery(con, "SELECT 1") dbClearResult(res) #' Attempting to get the row count for a result set cleared with #' [dbClearResult()] gives an error. expect_error(dbGetRowCount(res)) }) }, NULL ) DBItest/R/spec-result-get-query.R0000644000176200001440000001327413232372770016277 0ustar liggesusers#' spec_result_get_query #' @usage NULL #' @format NULL #' @keywords NULL spec_result_get_query <- list( get_query_formals = function(ctx) { # expect_equal(names(formals(dbGetQuery)), c("conn", "statement", "...")) }, #' @return #' `dbGetQuery()` always returns a [data.frame] #' with as many rows as records were fetched and as many #' columns as fields in the result set, #' even if the result is a single value get_query_atomic = function(ctx) { with_connection({ query <- "SELECT 1 as a" rows <- check_df(dbGetQuery(con, query)) expect_identical(rows, data.frame(a=1L)) }) }, #' or has one get_query_one_row = function(ctx) { with_connection({ query <- "SELECT 1 as a, 2 as b, 3 as c" rows <- check_df(dbGetQuery(con, query)) expect_identical(rows, data.frame(a=1L, b=2L, c=3L)) }) }, #' or zero rows. get_query_zero_rows = function(ctx) { with_connection({ # Not all SQL dialects seem to support the query used here. query <- "SELECT * FROM (SELECT 1 as a, 2 as b, 3 as c) AS x WHERE (1 = 0)" rows <- check_df(dbGetQuery(con, query)) expect_identical(names(rows), letters[1:3]) expect_identical(dim(rows), c(0L, 3L)) }) }, #' An error is raised when issuing a query over a closed get_query_closed_connection = function(ctx) { with_closed_connection({ expect_error(dbGetQuery(con, "SELECT 1")) }) }, #' or invalid connection, get_query_invalid_connection = function(ctx) { with_invalid_connection({ expect_error(dbGetQuery(con, "SELECT 1")) }) }, #' if the syntax of the query is invalid, get_query_syntax_error = function(ctx) { with_connection({ expect_error(dbGetQuery(con, "SELECT")) }) }, #' or if the query is not a non-`NA` string. 
get_query_non_string = function(ctx) { with_connection({ expect_error(dbGetQuery(con, character())) expect_error(dbGetQuery(con, letters)) expect_error(dbGetQuery(con, NA_character_)) }) }, #' If the `n` argument is not an atomic whole number #' greater or equal to -1 or Inf, an error is raised, get_query_n_bad = function(ctx) { with_connection({ query <- "SELECT 1 as a" expect_error(dbGetQuery(con, query, -2)) expect_error(dbGetQuery(con, query, 1.5)) expect_error(dbGetQuery(con, query, integer())) expect_error(dbGetQuery(con, query, 1:3)) expect_error(dbGetQuery(con, query, NA_integer_)) }) }, #' but a subsequent call to `dbGetQuery()` with proper `n` argument succeeds. get_query_good_after_bad_n = function(ctx) { with_connection({ query <- "SELECT 1 as a" expect_error(dbGetQuery(con, query, NA_integer_)) rows <- check_df(dbGetQuery(con, query)) expect_identical(rows, data.frame(a = 1L)) }) }, #' @section Additional arguments: #' The following arguments are not part of the `dbGetQuery()` generic #' (to improve compatibility across backends) #' but are part of the DBI specification: #' - `n` (default: -1) #' - `params` (TBD) #' #' They must be provided as named arguments. #' See the "Specification" and "Value" sections for details on their usage. #' @section Specification: #' Fetching multi-row queries with one get_query_multi_row_single_column = function(ctx) { with_connection({ query <- union( .ctx = ctx, paste("SELECT", 1:3, "AS a"), .order_by = "a") rows <- check_df(dbGetQuery(con, query)) expect_identical(rows, data.frame(a = 1:3)) }) }, #' or more columns be default returns the entire result. get_query_multi_row_multi_column = function(ctx) { with_connection({ query <- union( .ctx = ctx, paste("SELECT", 1:5, "AS a,", 4:0, "AS b"), .order_by = "a") rows <- check_df(dbGetQuery(con, query)) expect_identical(rows, data.frame(a = 1:5, b = 4:0)) }) }, #' A value of [Inf] for the `n` argument is supported #' and also returns the full result. get_query_n_multi_row_inf = function(ctx) { with_connection({ query <- union( .ctx = ctx, paste("SELECT", 1:3, "AS a"), .order_by = "a") rows <- check_df(dbGetQuery(con, query, n = Inf)) expect_identical(rows, data.frame(a = 1:3)) }) }, #' If more rows than available are fetched, the result is returned in full #' without warning. get_query_n_more_rows = function(ctx) { with_connection({ query <- union( .ctx = ctx, paste("SELECT", 1:3, "AS a"), .order_by = "a") rows <- check_df(dbGetQuery(con, query, n = 5L)) expect_identical(rows, data.frame(a = 1:3)) }) }, #' If zero rows are fetched, the columns of the data frame are still fully #' typed. get_query_n_zero_rows = function(ctx) { with_connection({ query <- union( .ctx = ctx, paste("SELECT", 1:3, "AS a"), .order_by = "a") rows <- check_df(dbGetQuery(con, query, n = 0L)) expect_identical(rows, data.frame(a=integer())) }) }, #' Fetching fewer rows than available is permitted, #' no warning is issued. get_query_n_incomplete = function(ctx) { with_connection({ query <- union( .ctx = ctx, paste("SELECT", 1:3, "AS a"), .order_by = "a") rows <- check_df(dbGetQuery(con, query, n = 2L)) expect_identical(rows, data.frame(a = 1:2)) }) }, #' #' A column named `row_names` is treated like any other column. 
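# Sketch of the `n` argument to dbGetQuery() described above (not part of the
# generic, but part of the DBI specification). RSQLite is used purely for
# illustration; table and column names are arbitrary.
con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
DBI::dbWriteTable(con, "demo", data.frame(a = 1:5))

DBI::dbGetQuery(con, "SELECT a FROM demo ORDER BY a")             # all rows
DBI::dbGetQuery(con, "SELECT a FROM demo ORDER BY a", n = 2)      # first two rows
str(DBI::dbGetQuery(con, "SELECT a FROM demo ORDER BY a", n = 0)) # zero rows, fully typed

DBI::dbDisconnect(con)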
get_query_row_names = function(ctx) { with_connection({ query <- "SELECT 1 AS row_names" rows <- check_df(dbGetQuery(con, query)) expect_identical(rows, data.frame(row_names = 1L)) expect_identical(.row_names_info(rows), -1L) }) }, NULL ) DBItest/R/spec-meta-has-completed.R0000644000176200001440000000527613232372770016515 0ustar liggesusers#' spec_meta_has_completed #' @usage NULL #' @format NULL #' @keywords NULL spec_meta_has_completed <- list( has_completed_formals = function(ctx) { # expect_equal(names(formals(dbHasCompleted)), c("res", "...")) }, #' @return #' `dbHasCompleted()` returns a logical scalar. has_completed_query = function(ctx) { with_connection({ #' For a query initiated by [dbSendQuery()] with non-empty result set, with_result( dbSendQuery(con, "SELECT 1"), { #' `dbHasCompleted()` returns `FALSE` initially expect_false(expect_visible(dbHasCompleted(res))) #' and `TRUE` after calling [dbFetch()] without limit. check_df(dbFetch(res)) expect_true(expect_visible(dbHasCompleted(res))) } ) }) }, has_completed_statement = function(ctx) { with_connection({ name <- random_table_name() with_remove_test_table(name = name, { #' For a query initiated by [dbSendStatement()], with_result( dbSendQuery(con, paste0("CREATE TABLE ", name, " (a integer)")), { #' `dbHasCompleted()` always returns `TRUE`. expect_true(expect_visible(dbHasCompleted(res))) } ) }) }) }, has_completed_error = function(ctx) { with_connection({ res <- dbSendQuery(con, "SELECT 1") dbClearResult(res) #' Attempting to query completion status for a result set cleared with #' [dbClearResult()] gives an error. expect_error(dbHasCompleted(res)) }) }, #' @section Specification: has_completed_query_spec = function(ctx) { with_connection({ #' The completion status for a query is only guaranteed to be set to #' `FALSE` after attempting to fetch past the end of the entire result. #' Therefore, for a query with an empty result set, with_result( dbSendQuery(con, "SELECT * FROM (SELECT 1 as a) AS x WHERE (1 = 0)"), { #' the initial return value is unspecified, #' but the result value is `TRUE` after trying to fetch only one row. check_df(dbFetch(res, 1)) expect_true(expect_visible(dbHasCompleted(res))) } ) #' Similarly, for a query with a result set of length n, with_result( dbSendQuery(con, "SELECT 1"), { #' the return value is unspecified after fetching n rows, check_df(dbFetch(res, 1)) #' but the result value is `TRUE` after trying to fetch only one more #' row. check_df(dbFetch(res, 1)) expect_true(expect_visible(dbHasCompleted(res))) } ) }) }, NULL ) DBItest/R/spec-sql.R0000644000176200001440000000037413207026070013624 0ustar liggesusers#' @template dbispec #' @format NULL spec_sql <- c( spec_sql_quote_string, spec_sql_quote_identifier, spec_sql_read_table, spec_sql_write_table, spec_sql_list_tables, spec_sql_exists_table, spec_sql_remove_table, spec_sql_list_fields ) DBItest/R/spec-result-send-query.R0000644000176200001440000000532713232372770016451 0ustar liggesusers#' spec_result_send_query #' @usage NULL #' @format NULL #' @keywords NULL spec_result_send_query <- list( send_query_formals = function(ctx) { # expect_equal(names(formals(dbSendQuery)), c("conn", "statement", "...")) }, #' @return #' `dbSendQuery()` returns send_query_trivial = function(ctx) { with_connection({ res <- expect_visible(dbSendQuery(con, "SELECT 1")) #' an S4 object that inherits from [DBIResult-class]. expect_s4_class(res, "DBIResult") #' The result set can be used with [dbFetch()] to extract records. 
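# Minimal sketch of the dbSendQuery()/dbFetch()/dbClearResult() life cycle
# documented above, with RSQLite as an example backend.
con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
res <- DBI::dbSendQuery(con, "SELECT 1 AS a")
DBI::dbFetch(res)          # one-row data frame with column `a`
DBI::dbHasCompleted(res)   # TRUE after fetching without a limit
DBI::dbClearResult(res)    # always clear the result when done
DBI::dbDisconnect(con)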
expect_equal(check_df(dbFetch(res))[[1]], 1) #' Once you have finished using a result, make sure to clear it #' with [dbClearResult()]. dbClearResult(res) }) }, #' An error is raised when issuing a query over a closed send_query_closed_connection = function(ctx) { with_closed_connection({ expect_error(dbSendQuery(con, "SELECT 1")) }) }, #' or invalid connection, send_query_invalid_connection = function(ctx) { with_invalid_connection({ expect_error(dbSendQuery(con, "SELECT 1")) }) }, #' if the syntax of the query is invalid, send_query_syntax_error = function(ctx) { with_connection({ expect_error(dbSendQuery(con, "SELECT")) }) }, #' or if the query is not a non-`NA` string. send_query_non_string = function(ctx) { with_connection({ expect_error(dbSendQuery(con, character())) expect_error(dbSendQuery(con, letters)) expect_error(dbSendQuery(con, NA_character_)) }) }, #' @section Specification: send_query_result_valid = function(ctx) { with_connection({ #' No warnings occur under normal conditions. expect_warning(res <- dbSendQuery(con, "SELECT 1"), NA) #' When done, the DBIResult object must be cleared with a call to #' [dbClearResult()]. dbClearResult(res) }) }, send_query_stale_warning = function(ctx) { #' Failure to clear the result set leads to a warning #' when the connection is closed. expect_warning( with_connection({ dbSendQuery(con, "SELECT 1") }) ) }, #' #' If the backend supports only one open result set per connection, send_query_only_one_result_set = function(ctx) { with_connection({ res1 <- dbSendQuery(con, "SELECT 1") #' issuing a second query invalidates an already open result set #' and raises a warning. expect_warning(res2 <- dbSendQuery(con, "SELECT 2")) expect_false(dbIsValid(res1)) #' The newly opened result set is valid expect_true(dbIsValid(res2)) #' and must be cleared with `dbClearResult()`. dbClearResult(res2) }) }, NULL ) DBItest/R/spec-meta-bind-runner.R0000644000176200001440000000712113232372770016202 0ustar liggesusersrun_bind_tester <- list() #' spec_meta_bind #' @name spec_meta_bind #' @usage NULL #' @format NULL #' @keywords NULL #' @section Specification: #' \pkg{DBI} clients execute parametrized statements as follows: #' run_bind_tester$fun <- function() { if ((extra_obj$requires_names() %in% TRUE) && is.null(names(placeholder))) { # test only valid for named placeholders return() } if ((extra_obj$requires_names() %in% FALSE) && !is.null(names(placeholder))) { # test only valid for unnamed placeholders return() } #' 1. Call [dbSendQuery()] or [dbSendStatement()] with a query or statement #' that contains placeholders, #' store the returned [DBIResult-class] object in a variable. #' Mixing placeholders (in particular, named and unnamed ones) is not #' recommended. if (is_query()) res <- send_query() else res <- send_statement() #' It is good practice to register a call to [dbClearResult()] via #' [on.exit()] right after calling `dbSendQuery()` or `dbSendStatement()` #' (see the last enumeration item). 
if (extra_obj$is_premature_clear()) dbClearResult(res) else on.exit(expect_error(dbClearResult(res), NA)) #' Until `dbBind()` has been called, the returned result set object has the #' following behavior: #' - [dbFetch()] raises an error (for `dbSendQuery()`) if (is_query()) expect_error(dbFetch(res)) #' - [dbGetRowCount()] returns zero (for `dbSendQuery()`) if (is_query()) expect_equal(dbGetRowCount(res), 0) #' - [dbGetRowsAffected()] returns an integer `NA` (for `dbSendStatement()`) if (!is_query()) expect_identical(dbGetRowsAffected(res), NA_integer_) #' - [dbIsValid()] returns `TRUE` expect_true(dbIsValid(res)) #' - [dbHasCompleted()] returns `FALSE` expect_false(dbHasCompleted(res)) #' 1. Construct a list with parameters #' that specify actual values for the placeholders. bind_values <- values #' The list must be named or unnamed, #' depending on the kind of placeholders used. #' Named values are matched to named parameters, unnamed values #' are matched by position in the list of parameters. if (!is.null(names(placeholder))) { names(bind_values) <- names(placeholder) } #' All elements in this list must have the same lengths and contain values #' supported by the backend; a [data.frame] is internally stored as such #' a list. #' The parameter list is passed to a call to `dbBind()` on the `DBIResult` #' object. bind(res, bind_values) # Safety net: returning early if dbBind() should have thrown an error but # didn't if (!identical(bind_values, extra_obj$patch_bind_values(bind_values))) return() if (extra_obj$is_premature_clear()) return() #' 1. Retrieve the data or the number of affected rows from the `DBIResult` object. retrieve <- function() { #' - For queries issued by `dbSendQuery()`, #' call [dbFetch()]. if (is_query()) { rows <- check_df(dbFetch(res)) compare(rows, values) } else { #' - For statements issued by `dbSendStatements()`, #' call [dbGetRowsAffected()]. #' (Execution begins immediately after the `dbBind()` call, #' the statement is processed entirely before the function returns.) rows_affected <- dbGetRowsAffected(res) compare_affected(rows_affected, values) } } if (!extra_obj$is_untouched()) retrieve() #' 1. Repeat 2. and 3. as necessary. if (extra_obj$is_repeated()) { bind(res, bind_values) retrieve() } #' 1. Close the result set via [dbClearResult()]. } DBItest/R/spec-meta-is-valid.R0000644000176200001440000000417013232372770015470 0ustar liggesusers#' spec_meta_is_valid #' @usage NULL #' @format NULL #' @keywords NULL spec_meta_is_valid <- list( is_valid_formals = function(ctx) { # expect_equal(names(formals(dbIsValid)), c("dbObj", "...")) }, #' @return #' `dbIsValid()` returns a logical scalar, #' `TRUE` if the object specified by `dbObj` is valid, #' `FALSE` otherwise. is_valid_connection = function(ctx) { con <- connect(ctx) #' A [DBIConnection-class] object is initially valid, expect_true(expect_visible(dbIsValid(con))) expect_error(dbDisconnect(con), NA) #' and becomes invalid after disconnecting with [dbDisconnect()]. expect_false(expect_visible(dbIsValid(con))) }, is_valid_result_query = function(ctx) { with_connection({ query <- "SELECT 1 as a" res <- dbSendQuery(con, query) #' A [DBIResult-class] object is valid after a call to [dbSendQuery()], expect_true(expect_visible(dbIsValid(res))) expect_error(dbFetch(res), NA) #' and stays valid even after all rows have been fetched; expect_true(expect_visible(dbIsValid(res))) dbClearResult(res) #' only clearing it with [dbClearResult()] invalidates it. 
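# Concrete sketch of the parametrized-query workflow spelled out in
# run_bind_tester above; it also shows the dbIsValid() life cycle exercised
# here. RSQLite is used purely as an illustration -- its placeholder is `?`,
# other backends use `$name`, `:name`, etc. (see the placeholder_pattern tweak).
con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
DBI::dbWriteTable(con, "demo", data.frame(a = 1:5))

res <- DBI::dbSendQuery(con, "SELECT a FROM demo WHERE a > ?")
# (in package code, register dbClearResult(res) via on.exit() right here)
DBI::dbBind(res, list(3L))   # returns `res`, invisibly
DBI::dbFetch(res)            # rows with a = 4, 5
DBI::dbBind(res, list(1L))   # repeated binding on the same result set
DBI::dbFetch(res)            # rows with a = 2, 3, 4, 5

DBI::dbIsValid(res)          # TRUE until the result is cleared
DBI::dbClearResult(res)
DBI::dbIsValid(res)          # FALSE
DBI::dbDisconnect(con)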
expect_false(dbIsValid(res)) }) }, is_valid_result_statement = function(ctx) { with_connection({ with_remove_test_table({ query <- paste0("CREATE TABLE test (a ", dbDataType(con, 1L), ")") res <- dbSendStatement(con, query) #' A [DBIResult-class] object is also valid after a call to [dbSendStatement()], expect_true(expect_visible(dbIsValid(res))) #' and stays valid after querying the number of rows affected; expect_error(dbGetRowsAffected(res), NA) expect_true(expect_visible(dbIsValid(res))) dbClearResult(res) #' only clearing it with [dbClearResult()] invalidates it. expect_false(dbIsValid(res)) }) }) }, #' If the connection to the database system is dropped (e.g., due to #' connectivity problems, server failure, etc.), `dbIsValid()` should return #' `FALSE`. This is not tested automatically. NULL ) DBItest/R/spec-driver-get-info.R0000644000176200001440000000102113156254271016025 0ustar liggesusers#' @template dbispec-sub-wip #' @format NULL #' @section Driver: #' \subsection{`dbGetInfo("DBIDriver")` (deprecated)}{ spec_driver_get_info <- list( #' Return value of dbGetInfo has necessary elements. get_info_driver = function(ctx) { info <- dbGetInfo(ctx$drv) expect_is(info, "list") info_names <- names(info) necessary_names <- c("driver.version", "client.version") for (name in necessary_names) { eval(bquote( expect_true(.(name) %in% info_names))) } }, #' } NULL ) DBItest/R/spec-result-clear-result.R0000644000176200001440000000355113232372770016754 0ustar liggesusers#' spec_result_clear_result #' @usage NULL #' @format NULL #' @keywords NULL spec_result_clear_result <- list( clear_result_formals = function(ctx) { # expect_equal(names(formals(dbClearResult)), c("res", "...")) }, #' @return #' `dbClearResult()` returns `TRUE`, invisibly, for result sets obtained from #' both `dbSendQuery()` clear_result_return_query = function(ctx) { with_connection({ res <- dbSendQuery(con, "SELECT 1") expect_invisible_true(dbClearResult(res)) }) }, #' and `dbSendStatement()`. clear_result_return_statement = function(ctx) { with_connection({ table_name <- random_table_name() with_remove_test_table(name = table_name, { res <- dbSendStatement(con, paste0("CREATE TABLE ", table_name , " AS SELECT 1")) expect_invisible_true(dbClearResult(res)) }) }) }, #' An attempt to close an already closed result set issues a warning cannot_clear_result_twice_query = function(ctx) { with_connection({ res <- dbSendQuery(con, "SELECT 1") dbClearResult(res) expect_warning(expect_invisible_true(dbClearResult(res))) }) }, #' in both cases. cannot_clear_result_twice_statement = function(ctx) { table_name <- random_table_name() with_connection({ with_remove_test_table( name = table_name, { res <- dbSendStatement(con, paste0("CREATE TABLE ", table_name , " AS SELECT 1")) dbClearResult(res) expect_warning(expect_invisible_true(dbClearResult(res))) }) }) }, #' @section Specification: #' `dbClearResult()` frees all resources associated with retrieving #' the result of a query or update operation. #' The DBI backend can expect a call to `dbClearResult()` for each #' [dbSendQuery()] or [dbSendStatement()] call. 
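# The contract described above -- exactly one dbClearResult() per
# dbSendQuery()/dbSendStatement() call -- is what makes a helper like the
# following safe; it is essentially what the default dbGetQuery() method does.
# `fetch_all` is a hypothetical name used only for this sketch.
fetch_all <- function(con, sql) {
  res <- DBI::dbSendQuery(con, sql)
  on.exit(DBI::dbClearResult(res))   # the result is cleared even on error
  DBI::dbFetch(res)
}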
NULL ) DBItest/R/spec-driver-data-type.R0000644000176200001440000000650513232372770016221 0ustar liggesusers#' spec_driver_data_type #' @usage NULL #' @format NULL #' @keywords NULL #' @inherit test_data_type spec_driver_data_type <- list( data_type_formals = function(ctx) { # expect_equal(names(formals(dbDataType)), c("dbObj", "obj", "...")) }, data_type_driver = function(ctx) { test_data_type(ctx, ctx$drv) }, NULL ) #' test_data_type #' @param ctx,dbObj Arguments to internal test function test_data_type <- function(ctx, dbObj) { #' @return #' `dbDataType()` returns the SQL type that corresponds to the `obj` argument check_data_type <- function(value) { eval(bquote({ #' as a non-empty expect_match(dbDataType(dbObj, .(value)), ".") #' character string. if (!is.data.frame(value)) { expect_equal(length(dbDataType(dbObj, .(value))), 1L) } else { #' For data frames, a character vector with one element per column #' is returned. expect_equal(length(dbDataType(dbObj, value)), .(ncol(value))) } expect_is(dbDataType(dbObj, .(value)), "character") expect_visible(dbDataType(dbObj, .(value))) })) } #' An error is raised for invalid values for the `obj` argument such as a #' `NULL` value. expect_error(dbDataType(dbObj, NULL)) #' @section Specification: #' The backend can override the [dbDataType()] generic #' for its driver class. #' #' This generic expects an arbitrary object as second argument. #' To query the values returned by the default implementation, #' run `example(dbDataType, package = "DBI")`. #' If the backend needs to override this generic, #' it must accept all basic R data types as its second argument, namely expect_has_data_type <- function(value) { eval(bquote( expect_error(check_data_type(.(value)), NA))) } expected_data_types <- list( #' [logical], logical(1), #' [integer], integer(1), #' [numeric], numeric(1), #' [character], character(1), #' dates (see [Dates]), Sys.Date(), #' date-time (see [DateTimeClasses]), Sys.time(), #' and [difftime]. Sys.time() - Sys.time(), #' If the database supports blobs, if (!isTRUE(ctx$tweaks$omit_blob_tests)) { #' this method also must accept lists of [raw] vectors, list(as.raw(1:10)) }, if (!isTRUE(ctx$tweaks$omit_blob_tests)) { #' and [blob::blob] objects. blob::blob(as.raw(1:10)) } ) lapply( compact(expected_data_types), expect_has_data_type ) expect_has_data_type(data.frame(a = 1, b = "2", stringsAsFactors = FALSE)) #' As-is objects (i.e., wrapped by [I()]) must be #' supported and return the same results as their unwrapped counterparts. lapply( compact(expected_data_types), function(value) { if (!is.null(value)) { eval(bquote( expect_error( expect_identical(dbDataType(dbObj, I(.(value))), dbDataType(dbObj, .(value))), NA))) } } ) #' The SQL data type for [factor] expect_identical(dbDataType(dbObj, letters), dbDataType(dbObj, factor(letters))) #' and [ordered] is the same as for character. expect_identical(dbDataType(dbObj, letters), dbDataType(dbObj, ordered(letters))) #' The behavior for other object types is not specified. 
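# Quick illustration of the dbDataType() behaviour specified above, using the
# RSQLite driver purely as an example; the exact SQL type names returned are
# backend-specific.
drv <- RSQLite::SQLite()
DBI::dbDataType(drv, 1L)          # an integer SQL type, e.g. "INTEGER"
DBI::dbDataType(drv, 1.5)         # a floating-point SQL type
DBI::dbDataType(drv, "a")         # a text SQL type
DBI::dbDataType(drv, data.frame(a = 1L, b = "x"))   # one type per column
identical(DBI::dbDataType(drv, letters),
          DBI::dbDataType(drv, factor(letters)))    # factor maps to character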
} DBItest/R/spec-driver.R0000644000176200001440000000026013156254271014323 0ustar liggesusers#' @template dbispec #' @format NULL spec_driver <- c( spec_driver_class, spec_driver_constructor, spec_driver_data_type, spec_driver_get_info, spec_driver_connect ) DBItest/R/spec-result-fetch.R0000644000176200001440000001574613232372770015454 0ustar liggesusers#' spec_result_fetch #' @usage NULL #' @format NULL #' @keywords NULL spec_result_fetch <- list( fetch_formals = function(ctx) { # expect_equal(names(formals(dbFetch)), c("res", "n", "...")) }, #' @return #' `dbFetch()` always returns a [data.frame] #' with as many rows as records were fetched and as many #' columns as fields in the result set, #' even if the result is a single value fetch_atomic = function(ctx) { with_connection({ query <- "SELECT 1 as a" with_result( dbSendQuery(con, query), { rows <- check_df(dbFetch(res)) expect_identical(rows, data.frame(a = 1L)) } ) }) }, #' or has one fetch_one_row = function(ctx) { with_connection({ query <- "SELECT 1 as a, 2 as b, 3 as c" with_result( dbSendQuery(con, query), { rows <- check_df(dbFetch(res)) expect_identical(rows, data.frame(a = 1L, b = 2L, c = 3L)) } ) }) }, #' or zero rows. fetch_zero_rows = function(ctx) { with_connection({ query <- "SELECT * FROM (SELECT 1 as a, 2 as b, 3 as c) AS x WHERE (1 = 0)" with_result( dbSendQuery(con, query), { rows <- check_df(dbFetch(res)) expect_identical(class(rows), "data.frame") } ) }) }, #' An attempt to fetch from a closed result set raises an error. fetch_closed = function(ctx) { with_connection({ query <- "SELECT 1" res <- dbSendQuery(con, query) dbClearResult(res) expect_error(dbFetch(res)) }) }, #' If the `n` argument is not an atomic whole number #' greater or equal to -1 or Inf, an error is raised, fetch_n_bad = function(ctx) { with_connection({ query <- "SELECT 1 as a" with_result( dbSendQuery(con, query), { expect_error(dbFetch(res, -2)) expect_error(dbFetch(res, 1.5)) expect_error(dbFetch(res, integer())) expect_error(dbFetch(res, 1:3)) expect_error(dbFetch(res, NA_integer_)) } ) }) }, #' but a subsequent call to `dbFetch()` with proper `n` argument succeeds. fetch_n_good_after_bad = function(ctx) { with_connection({ query <- "SELECT 1 as a" with_result( dbSendQuery(con, query), { expect_error(dbFetch(res, NA_integer_)) rows <- check_df(dbFetch(res)) expect_identical(rows, data.frame(a = 1L)) } ) }) }, #' Calling `dbFetch()` on a result set from a data manipulation query #' created by [dbSendStatement()] #' can be fetched and return an empty data frame, with a warning. fetch_no_return_value = function(ctx) { with_connection({ query <- "CREATE TABLE test (a integer)" with_remove_test_table({ with_result( dbSendStatement(con, query), { expect_warning(rows <- check_df(dbFetch(res))) expect_identical(rows, data.frame()) } ) }) }) }, #' @section Specification: #' Fetching multi-row queries with one fetch_multi_row_single_column = function(ctx) { with_connection({ query <- union( .ctx = ctx, paste("SELECT", 1:3, "AS a"), .order_by = "a") with_result( dbSendQuery(con, query), { rows <- check_df(dbFetch(res)) expect_identical(rows, data.frame(a = 1:3)) } ) }) }, #' or more columns be default returns the entire result. 
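# By default dbFetch() returns the complete result (as described above); the
# tests below also exercise progressive fetching. A common client-side idiom,
# sketched here with RSQLite as an example backend, pulls chunks until
# dbHasCompleted() reports the end of the result set.
con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
DBI::dbWriteTable(con, "demo", data.frame(a = 1:10))

res <- DBI::dbSendQuery(con, "SELECT a FROM demo ORDER BY a")
while (!DBI::dbHasCompleted(res)) {
  chunk <- DBI::dbFetch(res, n = 4)   # chunks of 4, 4, then 2 rows
  print(nrow(chunk))
}
DBI::dbClearResult(res)
DBI::dbDisconnect(con)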
fetch_multi_row_multi_column = function(ctx) { with_connection({ query <- union( .ctx = ctx, paste("SELECT", 1:5, "AS a,", 4:0, "AS b"), .order_by = "a") with_result( dbSendQuery(con, query), { rows <- check_df(dbFetch(res)) expect_identical(rows, data.frame(a = 1:5, b = 4:0)) } ) }) }, #' Multi-row queries can also be fetched progressively fetch_n_progressive = function(ctx) { with_connection({ query <- union( .ctx = ctx, paste("SELECT", 1:25, "AS a"), .order_by = "a") with_result( dbSendQuery(con, query), { #' by passing a whole number ([integer] rows <- check_df(dbFetch(res, 10L)) expect_identical(rows, data.frame(a = 1L:10L)) #' or [numeric]) rows <- check_df(dbFetch(res, 10)) expect_identical(rows, data.frame(a = 11L:20L)) #' as the `n` argument. rows <- check_df(dbFetch(res, n = 5)) expect_identical(rows, data.frame(a = 21L:25L)) } ) }) }, #' A value of [Inf] for the `n` argument is supported #' and also returns the full result. fetch_n_multi_row_inf = function(ctx) { with_connection({ query <- union( .ctx = ctx, paste("SELECT", 1:3, "AS a"), .order_by = "a") with_result( dbSendQuery(con, query), { rows <- check_df(dbFetch(res, n = Inf)) expect_identical(rows, data.frame(a = 1:3)) } ) }) }, #' If more rows than available are fetched, the result is returned in full #' without warning. fetch_n_more_rows = function(ctx) { with_connection({ query <- union( .ctx = ctx, paste("SELECT", 1:3, "AS a"), .order_by = "a") with_result( dbSendQuery(con, query), { rows <- check_df(dbFetch(res, 5L)) expect_identical(rows, data.frame(a = 1:3)) #' If fewer rows than requested are returned, further fetches will #' return a data frame with zero rows. rows <- check_df(dbFetch(res)) expect_identical(rows, data.frame(a = integer())) } ) }) }, #' If zero rows are fetched, the columns of the data frame are still fully #' typed. fetch_n_zero_rows = function(ctx) { with_connection({ query <- union( .ctx = ctx, paste("SELECT", 1:3, "AS a"), .order_by = "a") with_result( dbSendQuery(con, query), { rows <- check_df(dbFetch(res, 0L)) expect_identical(rows, data.frame(a = integer())) } ) }) }, #' Fetching fewer rows than available is permitted, #' no warning is issued when clearing the result set. fetch_n_premature_close = function(ctx) { with_connection({ query <- union( .ctx = ctx, paste("SELECT", 1:3, "AS a"), .order_by = "a") with_result( dbSendQuery(con, query), { rows <- check_df(dbFetch(res, 2L)) expect_identical(rows, data.frame(a = 1:2)) } ) }) }, #' #' A column named `row_names` is treated like any other column. fetch_row_names = function(ctx) { with_connection({ query <- "SELECT 1 AS row_names" with_result( dbSendQuery(con, query), { rows <- check_df(dbFetch(res)) expect_identical(rows, data.frame(row_names = 1L)) expect_identical(.row_names_info(rows), -1L) } ) }) }, NULL ) DBItest/R/spec-driver-connect.R0000644000176200001440000000231413156254271015754 0ustar liggesusers#' spec_driver_connect #' @usage NULL #' @format NULL #' @keywords NULL spec_driver_connect <- list( connect_formals = function(ctx) { # expect_equal(names(formals(dbConnect)), c("drv", "...")) }, #' @return can_connect = function(ctx) { con <- expect_visible(connect(ctx)) #' `dbConnect()` returns an S4 object that inherits from [DBIConnection-class]. expect_s4_class(con, "DBIConnection") dbDisconnect(con) #' This object is used to communicate with the database engine. 
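# Sketch of the connect/use/disconnect cycle described above, with RSQLite
# standing in for an arbitrary backend (`dbname = ":memory:"` is RSQLite's
# in-memory database; other backends take different connection arguments).
con <- DBI::dbConnect(RSQLite::SQLite(), dbname = ":memory:")
methods::is(con, "DBIConnection")   # TRUE: the backend's own connection class
DBI::dbGetQuery(con, "SELECT 1 AS a")
DBI::dbDisconnect(con)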
}, #' @section Specification: #' DBI recommends using the following argument names for authentication #' parameters, with `NULL` default: #' - `user` for the user name (default: current user) #' - `password` for the password #' - `host` for the host name (default: local connection) #' - `port` for the port number (default: local connection) #' - `dbname` for the name of the database on the host, or the database file #' name #' #' The defaults should provide reasonable behavior, in particular a #' local connection for `host = NULL`. For some DBMS (e.g., PostgreSQL), #' this is different to a TCP/IP connection to `localhost`. NULL ) DBItest/R/spec-sql-list-fields.R0000644000176200001440000000147213156254271016052 0ustar liggesusers#' @template dbispec-sub-wip #' @format NULL #' @section SQL: #' \subsection{`dbListFields("DBIConnection")`}{ spec_sql_list_fields <- list( #' Can list the fields for a table in the database. list_fields = function(ctx) { with_connection({ with_remove_test_table(name = "iris", { iris <- get_iris(ctx) dbWriteTable(con, "iris", iris) fields <- dbListFields(con, "iris") expect_identical(fields, names(iris)) }) }) }, #' #' A column named `row_names` is treated like any other column. list_fields_row_names = function(ctx) { with_connection({ with_remove_test_table({ dbWriteTable(con, "test", data.frame(a = 1L, row_names = 2L)) expect_identical(dbListFields(con, "test"), c("a", "row_names")) }) }) }, #' } NULL ) DBItest/R/run.R0000644000176200001440000000375413161005164012706 0ustar liggesusersrun_tests <- function(ctx, tests, skip, test_suite) { if (is.null(ctx)) { stop("Need to call make_context() to use the test_...() functions.", call. = FALSE) } if (!inherits(ctx, "DBItest_context")) { stop("ctx must be a DBItest_context object created by make_context().", call. = FALSE) } test_context <- paste0( "DBItest", if(!is.null(ctx$name)) paste0("[", ctx$name, "]"), ": ", test_suite) context(test_context) tests <- tests[!vapply(tests, is.null, logical(1L))] skipped <- get_skip_names(skip) skip_flag <- names(tests) %in% skipped ok <- vapply(seq_along(tests), function(test_idx) { test_name <- names(tests)[[test_idx]] if (skip_flag[[test_idx]]) FALSE else { test_fun <- patch_test_fun(tests[[test_name]], paste0(test_context, ": ", test_name)) test_fun(ctx) } }, logical(1L)) if (any(skip_flag)) { test_that(paste0(test_context, ": skipped tests"), { skip(paste0("by request: ", paste(names(tests)[skip_flag], collapse = ", "))) }) } ok } get_skip_names <- function(skip) { if (length(skip) == 0L) return(character()) names_all <- names(spec_all) names_all <- names_all[names_all != ""] skip_flags_all <- lapply(paste0("(?:^", skip, "$)"), grepl, names_all, perl = TRUE) skip_used <- vapply(skip_flags_all, any, logical(1L)) if (!all(skip_used)) { warning("Unused skip expressions: ", paste(skip[!skip_used], collapse = ", "), call. 
= FALSE) } skip_flag_all <- Reduce(`|`, skip_flags_all) skip_tests <- names_all[skip_flag_all] skip_tests } patch_test_fun <- function(test_fun, desc) { body_of_test_fun <- wrap_all_statements_with_expect_no_warning(body(test_fun)) eval(bquote( function(ctx) { test_that(.(desc), .(body_of_test_fun)) } )) } wrap_all_statements_with_expect_no_warning <- function(block) { stopifnot(identical(block[[1]], quote(`{`))) block[-1] <- lapply(block[-1], function(x) eval(bquote(quote(expect_warning(.(x), NA))))) block } DBItest/R/spec-getting-started.R0000644000176200001440000000154113232405526016134 0ustar liggesusers#' @template dbispec #' @format NULL #' @importFrom desc desc_get_deps #' @section Definition: spec_getting_started <- list( package_dependencies = function(ctx) { #' A DBI backend is an R package pkg_path <- get_pkg_path(ctx) pkg_deps_df <- desc_get_deps(pkg_path) pkg_imports <- pkg_deps_df[pkg_deps_df[["type"]] == "Imports", ][["package"]] #' which imports the \pkg{DBI} expect_true("DBI" %in% pkg_imports) #' and \pkg{methods} expect_true("methods" %in% pkg_imports) #' packages. }, package_name = function(ctx) { pkg_name <- package_name(ctx) #' For better or worse, the names of many existing backends start with #' \sQuote{R}, e.g., \pkg{RSQLite}, \pkg{RMySQL}, \pkg{RSQLServer}; it is up #' to the backend author to adopt this convention or not. expect_match(pkg_name, "^R") }, NULL ) DBItest/R/spec-compliance-methods.R0000644000176200001440000000527513232372770016616 0ustar liggesusers#' @template dbispec-sub #' @format NULL #' @section DBI classes and methods: spec_compliance_methods <- list( #' A backend defines three classes, compliance = function(ctx) { #' which are subclasses of expect_identical( names(key_methods), c( #' [DBIDriver-class], "Driver", #' [DBIConnection-class], "Connection", #' and [DBIResult-class]. "Result" ) ) pkg <- package_name(ctx) where <- asNamespace(pkg) sapply(names(key_methods), function(name) { dbi_class <- paste0("DBI", name) classes <- Filter(function(class) { extends(class, dbi_class) && getClass(class)@virtual == FALSE }, getClasses(where)) expect_equal(length(classes), 1) class <- classes[[1]] #' The backend provides implementation for all methods #' of these base classes #' that are defined but not implemented by DBI. mapply(function(method, args) { expect_has_class_method(method, class, args, where) }, names(key_methods[[name]]), key_methods[[name]]) }) }, #' All methods have an ellipsis `...` in their formals. ellipsis = function(ctx) { pkg <- package_name(ctx) where <- asNamespace(pkg) methods <- s4_methods(where, function(x) x == "DBI") Map(expect_ellipsis_in_formals, methods, names(methods)) }, NULL ) # Helpers ----------------------------------------------------------------- #' @importFrom methods hasMethod expect_has_class_method <- function(name, class, args, driver_package) { full_args <- c(class, args) eval(bquote( expect_true(hasMethod(.(name), .(full_args), driver_package)) )) } expect_ellipsis_in_formals <- function(method, name) { sym <- as.name(name) eval(bquote({ .(sym) <- method expect_true("..." 
%in% s4_real_argument_names(.(sym))) })) } key_methods <- list( Driver = list( "dbGetInfo" = NULL, "dbConnect" = NULL, "dbDataType" = NULL ), Connection = list( "dbDisconnect" = NULL, "dbGetInfo" = NULL, "dbSendQuery" = "character", "dbListFields" = "character", "dbListTables" = NULL, "dbReadTable" = "character", "dbWriteTable" = c("character", "data.frame"), "dbExistsTable" = "character", "dbRemoveTable" = "character", "dbBegin" = NULL, "dbCommit" = NULL, "dbRollback" = NULL, "dbIsValid" = NULL, "dbQuoteString" = "character", "dbQuoteIdentifier" = "character" ), Result = list( "dbIsValid" = NULL, "dbFetch" = NULL, "dbClearResult" = NULL, "dbColumnInfo" = NULL, "dbGetRowsAffected" = NULL, "dbGetRowCount" = NULL, "dbHasCompleted" = NULL, "dbGetStatement" = NULL, "dbBind" = NULL ) ) DBItest/R/spec-sql-read-table.R0000644000176200001440000002367113213312261015624 0ustar liggesusers#' spec_sql_read_table #' @usage NULL #' @format NULL #' @keywords NULL spec_sql_read_table <- list( read_table_formals = function(ctx) { # expect_equal(names(formals(dbReadTable)), c("conn", "name", "...")) }, #' @return #' `dbReadTable()` returns a data frame that contains the complete data #' from the remote table, effectively the result of calling [dbGetQuery()] #' with `SELECT * FROM `. read_table = function(ctx) { with_connection({ with_remove_test_table(name = "iris", { iris_in <- get_iris(ctx) dbWriteTable(con, "iris", iris_in) iris_out <- check_df(dbReadTable(con, "iris")) expect_equal_df(iris_out, iris_in) }) }) }, #' An error is raised if the table does not exist. read_table_missing = function(ctx) { with_connection({ with_remove_test_table({ expect_error(dbReadTable(con, "test")) }) }) }, #' An empty table is returned as a data frame with zero rows. read_table_empty = function(ctx) { with_connection({ with_remove_test_table(name = "iris", { iris_in <- get_iris(ctx)[integer(), ] dbWriteTable(con, "iris", iris_in) iris_out <- check_df(dbReadTable(con, "iris")) expect_equal(nrow(iris_out), 0L) expect_equal_df(iris_out, iris_in) }) }) }, #' #' The presence of [rownames] depends on the `row.names` argument, #' see [sqlColumnToRownames()] for details: read_table_row_names_false = function(ctx) { #' - If `FALSE` or `NULL`, the returned data frame doesn't have row names. for (row.names in list(FALSE, NULL)) { with_connection({ with_remove_test_table(name = "mtcars", { mtcars_in <- datasets::mtcars dbWriteTable(con, "mtcars", mtcars_in, row.names = TRUE) mtcars_out <- check_df(dbReadTable(con, "mtcars", row.names = row.names)) expect_true("row_names" %in% names(mtcars_out)) expect_true(all(mtcars_out$row_names %in% rownames(mtcars_in))) expect_true(all(rownames(mtcars_in) %in% mtcars_out$row_names)) expect_equal_df(mtcars_out[names(mtcars_out) != "row_names"], unrowname(mtcars_in)) }) }) } }, read_table_row_names_true_exists = function(ctx) { #' - If `TRUE`, a column named "row_names" is converted to row names, row.names <- TRUE with_connection({ with_remove_test_table(name = "mtcars", { mtcars_in <- datasets::mtcars dbWriteTable(con, "mtcars", mtcars_in, row.names = NA) mtcars_out <- check_df(dbReadTable(con, "mtcars", row.names = row.names)) expect_equal_df(mtcars_out, mtcars_in) }) }) }, read_table_row_names_true_missing = function(ctx) { #' an error is raised if no such column exists. 
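# Illustrative sketch (commented out, not part of the test): the user-facing
# behaviour of the `row.names` argument described above, assuming the RSQLite
# backend is installed. Any conforming backend should behave the same way.
#
#   con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
#   DBI::dbWriteTable(con, "mtcars", datasets::mtcars, row.names = TRUE)
#   "row_names" %in% names(DBI::dbReadTable(con, "mtcars", row.names = FALSE))  # TRUE
#   head(rownames(DBI::dbReadTable(con, "mtcars", row.names = TRUE)))  # original row names
#   DBI::dbWriteTable(con, "iris", datasets::iris, row.names = FALSE)
#   try(DBI::dbReadTable(con, "iris", row.names = TRUE))  # error: no "row_names" column
#   DBI::dbDisconnect(con)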
row.names <- TRUE with_connection({ with_remove_test_table(name = "iris", { iris_in <- get_iris(ctx) dbWriteTable(con, "iris", iris_in, row.names = NA) expect_error(dbReadTable(con, "iris", row.names = row.names)) }) }) }, read_table_row_names_na_exists = function(ctx) { #' - If `NA`, a column named "row_names" is converted to row names if it exists, row.names <- NA with_connection({ with_remove_test_table(name = "mtcars", { mtcars_in <- datasets::mtcars dbWriteTable(con, "mtcars", mtcars_in, row.names = TRUE) mtcars_out <- check_df(dbReadTable(con, "mtcars", row.names = row.names)) expect_equal_df(mtcars_out, mtcars_in) }) }) }, read_table_row_names_na_missing = function(ctx) { #' otherwise no translation occurs. row.names <- NA with_connection({ with_remove_test_table(name = "iris", { iris_in <- get_iris(ctx) dbWriteTable(con, "iris", iris_in, row.names = FALSE) iris_out <- check_df(dbReadTable(con, "iris", row.names = row.names)) expect_equal_df(iris_out, iris_in) }) }) }, read_table_row_names_string_exists = function(ctx) { #' - If a string, this specifies the name of the column in the remote table #' that contains the row names, row.names <- "make_model" with_connection({ with_remove_test_table(name = "mtcars", { mtcars_in <- datasets::mtcars mtcars_in$make_model <- rownames(mtcars_in) mtcars_in <- unrowname(mtcars_in) dbWriteTable(con, "mtcars", mtcars_in, row.names = FALSE) mtcars_out <- check_df(dbReadTable(con, "mtcars", row.names = row.names)) expect_false("make_model" %in% names(mtcars_out)) expect_true(all(mtcars_in$make_model %in% rownames(mtcars_out))) expect_true(all(rownames(mtcars_out) %in% mtcars_in$make_model)) expect_equal_df(unrowname(mtcars_out), mtcars_in[names(mtcars_in) != "make_model"]) }) }) }, read_table_row_names_string_missing = function(ctx) { #' an error is raised if no such column exists. row.names <- "missing" with_connection({ with_remove_test_table(name = "iris", { iris_in <- get_iris(ctx) dbWriteTable(con, "iris", iris_in, row.names = FALSE) expect_error(dbReadTable(con, "iris", row.names = row.names)) }) }) }, #' read_table_row_names_default = function(ctx) { #' #' The default is `row.names = FALSE`. #' with_connection({ with_remove_test_table(name = "mtcars", { mtcars_in <- datasets::mtcars dbWriteTable(con, "mtcars", mtcars_in, row.names = TRUE) mtcars_out <- check_df(dbReadTable(con, "mtcars")) expect_true("row_names" %in% names(mtcars_out)) expect_true(all(mtcars_out$row_names %in% rownames(mtcars_in))) expect_true(all(rownames(mtcars_in) %in% mtcars_out$row_names)) expect_equal_df(mtcars_out[names(mtcars_out) != "row_names"], unrowname(mtcars_in)) }) }) }, read_table_check_names = function(ctx) { with_connection({ #' If the database supports identifiers with special characters, if (isTRUE(ctx$tweaks$strict_identifier)) { skip("tweak: strict_identifier") } #' the columns in the returned data frame are converted to valid R #' identifiers with_remove_test_table({ test_in <- data.frame(a = 1:3, b = 4:6) names(test_in) <- c("with spaces", "with,comma") dbWriteTable(con, "test", test_in) #' if the `check.names` argument is `TRUE`, test_out <- check_df(dbReadTable(con, "test", check.names = TRUE)) expect_identical(names(test_out), make.names(names(test_out), unique = TRUE)) expect_equal_df(test_out, setNames(test_in, names(test_out))) }) #' otherwise non-syntactic column names can be returned unquoted. 
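# Sketch of the `check.names` behaviour described above (commented out;
# assumes a backend such as RSQLite that allows column names with spaces):
#
#   DBI::dbWriteTable(con, "t", stats::setNames(data.frame(1, 2), c("a b", "c,d")))
#   names(DBI::dbReadTable(con, "t", check.names = TRUE))   # "a.b" "c.d"
#   names(DBI::dbReadTable(con, "t", check.names = FALSE))  # "a b" "c,d"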
with_remove_test_table({ test_in <- data.frame(a = 1:3, b = 4:6) names(test_in) <- c("with spaces", "with,comma") dbWriteTable(con, "test", test_in) test_out <- check_df(dbReadTable(con, "test", check.names = FALSE)) expect_equal_df(test_out, test_in) }) }) }, #' #' An error is raised when calling this method for a closed read_table_closed_connection = function(ctx) { with_connection({ with_remove_test_table({ dbWriteTable(con, "test", data.frame(a = 1)) with_closed_connection(con = "con2", { expect_error(dbReadTable(con2, "test")) }) }) }) }, #' or invalid connection. read_table_invalid_connection = function(ctx) { with_connection({ with_remove_test_table({ dbWriteTable(con, "test", data.frame(a = 1)) with_invalid_connection(con = "con2", { expect_error(dbReadTable(con2, "test")) }) }) }) }, #' An error is raised read_table_error = function(ctx) { with_connection({ with_remove_test_table({ dbWriteTable(con, "test", data.frame(a = 1L)) #' if `name` cannot be processed with [dbQuoteIdentifier()] expect_error(dbReadTable(con, NA)) #' or if this results in a non-scalar. expect_error(dbReadTable(con, c("test", "test"))) #' Unsupported values for `row.names` and `check.names` #' (non-scalars, expect_error(dbReadTable(con, "test", row.names = letters)) #' unsupported data types, expect_error(dbReadTable(con, "test", row.names = list(1L))) expect_error(dbReadTable(con, "test", check.names = 1L)) #' `NA` for `check.names`) expect_error(dbReadTable(con, "test", check.names = NA)) #' also raise an error. }) }) }, #' @section Additional arguments: #' The following arguments are not part of the `dbReadTable()` generic #' (to improve compatibility across backends) #' but are part of the DBI specification: #' - `row.names` #' - `check.names` #' #' They must be provided as named arguments. #' See the "Value" section for details on their usage. 
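# Usage note (sketch, assuming an open connection `con`): the extra arguments
# are matched by name only, e.g.
#
#   DBI::dbReadTable(con, "mtcars", row.names = NA, check.names = TRUE)
#
# Positional use such as dbReadTable(con, "mtcars", NA) is not covered by the
# specification.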
#' @section Specification: #' The `name` argument is processed as follows, read_table_name = function(ctx) { with_connection({ #' to support databases that allow non-syntactic names for their objects: if (isTRUE(ctx$tweaks$strict_identifier)) { table_names <- "a" } else { table_names <- c("a", "with spaces", "with,comma") } for (table_name in table_names) { with_remove_test_table(name = dbQuoteIdentifier(con, table_name), { test_in <- data.frame(a = 1L) dbWriteTable(con, table_name, test_in) #' - If an unquoted table name as string: `dbReadTable()` will do the #' quoting, test_out <- check_df(dbReadTable(con, table_name)) expect_equal_df(test_out, test_in) #' perhaps by calling `dbQuoteIdentifier(conn, x = name)` #' - If the result of a call to [dbQuoteIdentifier()]: no more quoting is done test_out <- check_df(dbReadTable(con, dbQuoteIdentifier(con, table_name))) expect_equal_df(test_out, test_in) }) } }) }, NULL ) DBItest/R/expectations.R0000644000176200001440000000402313232372770014607 0ustar liggesusersarglist_is_empty <- function() { function(x) { expect_true( is.null(formals(x)), "has empty argument list") } } all_args_have_default_values <- function() { function(x) { args <- formals(x) args <- args[names(args) != "..."] expect_true( all(vapply(args, as.character, character(1L)) != ""), "has arguments without default values") } } has_method <- function(method_name) { function(x) { my_class <- class(x) expect_true( length(findMethod(method_name, my_class)) > 0L, paste("object of class", my_class, "has no", method_name, "method")) } } expect_visible <- function(code) { ret <- withVisible(code) expect_true(ret$visible) ret$value } expect_invisible_true <- function(code) { ret <- withVisible(code) expect_true(ret$value) test_that("Visibility", { expect_false(ret$visible) }) invisible(ret$value) } expect_equal_df <- function(actual, expected) { factor_cols <- vapply(expected, is.factor, logical(1L)) expected[factor_cols] <- lapply(expected[factor_cols], as.character) asis_cols <- vapply(expected, inherits, "AsIs", FUN.VALUE = logical(1L)) expected[asis_cols] <- lapply(expected[asis_cols], unclass) list_cols <- vapply(expected, is.list, logical(1L)) if (!any(list_cols)) { order_actual <- order(actual) order_expected <- order(expected) } else { expect_false(all(list_cols)) expect_equal(anyDuplicated(actual[!list_cols]), 0) expect_equal(anyDuplicated(expected[!list_cols]), 0) order_actual <- order(actual[!list_cols]) order_expected <- order(expected[!list_cols]) } has_rownames_actual <- is.character(attr(actual, "row.names")) has_rownames_expected <- is.character(attr(expected, "row.names")) expect_equal(has_rownames_actual, has_rownames_expected) if (has_rownames_actual) { expect_equal(sort(row.names(actual)), sort(row.names(expected))) } actual <- unrowname(actual[order_actual, ]) expected <- unrowname(expected[order_expected, ]) expect_identical(actual, expected) } DBItest/R/spec-result-roundtrip.R0000644000176200001440000002754013232422026016372 0ustar liggesusers#' spec_result_roundtrip #' @usage NULL #' @format NULL #' @keywords NULL spec_result_roundtrip <- list( #' @section Specification: #' The column types of the returned data frame depend on the data returned: #' - [integer] for integer values between -2^31 and 2^31 - 1 data_integer = function(ctx) { with_connection({ test_select_with_null(.ctx = ctx, con, 1L, -100L) }) }, #' - [numeric] for numbers with a fractional component data_numeric = function(ctx) { with_connection({ test_select_with_null(.ctx = ctx, con, 1.5, -100.5) }) 
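# Concrete illustration of the integer/numeric mapping above (commented out;
# assumes an open connection `con` to a conforming backend):
#
#   df <- DBI::dbGetQuery(con, "SELECT 1 AS a, 1.5 AS b")
#   class(df$a)   # "integer" -- integral value within the 32-bit range
#   class(df$b)   # "numeric" -- value with a fractional component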
}, #' - [logical] for Boolean values (some backends may return an integer) data_logical = function(ctx) { with_connection({ int_values <- 1:0 values <- ctx$tweaks$logical_return(as.logical(int_values)) sql_names <- paste0("CAST(", int_values, " AS ", dbDataType(con, logical()), ")") test_select_with_null(.ctx = ctx, con, .dots = setNames(values, sql_names)) }) }, #' - [character] for text data_character = function(ctx) { with_connection({ values <- texts test_funs <- rep(list(has_utf8_or_ascii_encoding), length(values)) sql_names <- as.character(dbQuoteString(con, texts)) test_select_with_null(.ctx = ctx, con, .dots = setNames(values, sql_names)) test_select_with_null(.ctx = ctx, con, .dots = setNames(test_funs, sql_names)) }) }, #' - lists of [raw] for blobs (with `NULL` entries for SQL NULL values) data_raw = function(ctx) { if (isTRUE(ctx$tweaks$omit_blob_tests)) { skip("tweak: omit_blob_tests") } with_connection({ values <- list(is_raw_list) sql_names <- paste0("cast(1 as ", dbDataType(con, list(raw())), ")") test_select_with_null(.ctx = ctx, con, .dots = setNames(values, sql_names)) }) }, #' - coercible using [as.Date()] for dates data_date = function(ctx) { with_connection({ char_values <- paste0("2015-01-", sprintf("%.2d", 1:12)) values <- as_date_equals_to(as.Date(char_values)) sql_names <- ctx$tweaks$date_cast(char_values) test_select_with_null(.ctx = ctx, con, .dots = setNames(values, sql_names)) }) }, #' (also applies to the return value of the SQL function `current_date`) data_date_current = function(ctx) { with_connection({ test_select_with_null( .ctx = ctx, con, "current_date" ~ is_roughly_current_date) }) }, #' - coercible using [hms::as.hms()] for times data_time = function(ctx) { with_connection({ char_values <- c("00:00:00", "12:34:56") time_values <- as_hms_equals_to(hms::as.hms(char_values)) sql_names <- ctx$tweaks$time_cast(char_values) test_select_with_null(.ctx = ctx, con, .dots = setNames(time_values, sql_names)) }) }, #' (also applies to the return value of the SQL function `current_time`) data_time_current = function(ctx) { with_connection({ test_select_with_null( .ctx = ctx, con, "current_time" ~ coercible_to_time) }) }, #' - coercible using [as.POSIXct()] for timestamps data_timestamp = function(ctx) { with_connection({ char_values <- c("2015-10-11 00:00:00", "2015-10-11 12:34:56") time_values <- rep(list(coercible_to_timestamp), 2L) sql_names <- ctx$tweaks$time_cast(char_values) test_select_with_null(.ctx = ctx, con, .dots = setNames(time_values, sql_names)) }) }, #' (also applies to the return value of the SQL function `current_timestamp`) data_timestamp_current = function(ctx) { with_connection({ test_select_with_null( .ctx = ctx, con, "current_timestamp" ~ is_roughly_current_timestamp) }) }, #' - [NA] for SQL `NULL` values data_null = function(ctx) { with_connection({ check_result <- function(rows) { expect_true(is.na(rows$a)) } test_select(.ctx = ctx, con, "NULL" = is.na) }) }, #' #' If dates and timestamps are supported by the backend, the following R types are #' used: #' - [Date] for dates data_date_typed = function(ctx) { if (!isTRUE(ctx$tweaks$date_typed)) { skip("tweak: !date_typed") } with_connection({ char_values <- paste0("2015-01-", sprintf("%.2d", 1:12)) values <- lapply(char_values, as_numeric_date) sql_names <- ctx$tweaks$date_cast(char_values) test_select_with_null(.ctx = ctx, con, .dots = setNames(values, sql_names)) }) }, #' (also applies to the return value of the SQL function `current_date`) data_date_current_typed = function(ctx) { if 
(!isTRUE(ctx$tweaks$date_typed)) { skip("tweak: !date_typed") } with_connection({ test_select_with_null( .ctx = ctx, con, "current_date" ~ is_roughly_current_date_typed) }) }, #' - [POSIXct] for timestamps data_timestamp_typed = function(ctx) { if (!isTRUE(ctx$tweaks$timestamp_typed)) { skip("tweak: !timestamp_typed") } with_connection({ char_values <- c("2015-10-11 00:00:00", "2015-10-11 12:34:56") timestamp_values <- rep(list(is_timestamp), 2L) sql_names <- ctx$tweaks$timestamp_cast(char_values) test_select_with_null(.ctx = ctx, con, .dots = setNames(timestamp_values, sql_names)) }) }, #' (also applies to the return value of the SQL function `current_timestamp`) data_timestamp_current_typed = function(ctx) { if (!isTRUE(ctx$tweaks$timestamp_typed)) { skip("tweak: !timestamp_typed") } with_connection({ test_select_with_null( .ctx = ctx, con, "current_timestamp" ~ is_roughly_current_timestamp_typed) }) }, #' #' R has no built-in type with lossless support for the full range of 64-bit #' or larger integers. If 64-bit integers are returned from a query, #' the following rules apply: #' - Values are returned in a container with support for the full range of #' valid 64-bit values (such as the `integer64` class of the \pkg{bit64} #' package) #' - Coercion to numeric always returns a number that is as close as possible #' to the true value data_64_bit_numeric = function(ctx) { with_connection({ char_values <- c("10000000000", "-10000000000") test_values <- as_numeric_identical_to(as.numeric(char_values)) test_select_with_null(.ctx = ctx, con, .dots = setNames(test_values, char_values)) }) }, #' - Loss of precision when converting to numeric gives a warning data_64_bit_numeric_warning = function(ctx) { with_connection({ char_values <- c(" 1234567890123456789", "-1234567890123456789") num_values <- as.numeric(char_values) test_values <- as_numeric_equals_to(num_values) expect_warning( test_select_with_null(.ctx = ctx, con, .dots = setNames(test_values, char_values)) ) }) }, #' - Conversion to character always returns a lossless decimal representation #' of the data data_64_bit_lossless = function(ctx) { with_connection({ char_values <- c("1234567890123456789", "-1234567890123456789") test_values <- as_character_equals_to(char_values) test_select_with_null(.ctx = ctx, con, .dots = setNames(test_values, char_values)) }) }, NULL ) test_select_with_null <- function(...) 
{ test_select(..., .add_null = "none") test_select(..., .add_null = "above") test_select(..., .add_null = "below") } # NB: .table = TRUE will not work in bigrquery test_select <- function(con, ..., .dots = NULL, .add_null = "none", .table = FALSE, .ctx, .envir = parent.frame()) { values <- c(list(...), .dots) value_is_formula <- vapply(values, is.call, logical(1L)) names(values)[value_is_formula] <- lapply(values[value_is_formula], "[[", 2L) values[value_is_formula] <- lapply( values[value_is_formula], function(x) { eval(x[[3]], envir = .envir) } ) if (is.null(names(values))) { sql_values <- lapply(values, as.character) } else { sql_values <- names(values) } if (isTRUE(.ctx$tweaks$current_needs_parens)) { sql_values <- gsub("^(current_(?:date|time|timestamp))$", "\\1()", sql_values) } sql_names <- letters[seq_along(sql_values)] query <- paste("SELECT", paste(sql_values, "as", sql_names, collapse = ", ")) if (.add_null != "none") { query_null <- paste("SELECT", paste("NULL as", sql_names, collapse = ", ")) query <- c(query, query_null) if (.add_null == "above") { query <- rev(query) } query <- paste0(query, ", ", 1:2, " as id") query <- union(.ctx = .ctx, query) } if (.table) { with_remove_test_table({ query <- paste("CREATE TABLE test AS", query) dbExecute(con, query) rows <- check_df(dbReadTable(con, "test")) }) } else { rows <- check_df(dbGetQuery(con, query)) } if (.add_null != "none") { rows <- rows[order(rows$id), -(length(sql_names) + 1L), drop = FALSE] if (.add_null == "above") { rows <- rows[2:1, , drop = FALSE] } } expect_identical(names(rows), sql_names) for (i in seq_along(values)) { value_or_testfun <- values[[i]] if (is.function(value_or_testfun)) { eval(bquote(expect_true(value_or_testfun(rows[1L, .(i)])))) } else { eval(bquote(expect_identical(rows[1L, .(i)], .(value_or_testfun)))) } } if (.add_null != "none") { expect_equal(nrow(rows), 2L) if (is.list(rows[[1L]])) { expect_true(is.null(rows[2L, 1L][[1L]])) } else { expect_true(is.na(rows[2L, 1L])) } } else { expect_equal(nrow(rows), 1L) } } all_have_utf8_or_ascii_encoding <- function(x) { all(vapply(x, has_utf8_or_ascii_encoding, logical(1L))) } has_utf8_or_ascii_encoding <- function(x) { if (Encoding(x) == "UTF-8") TRUE else if (Encoding(x) == "unknown") { # Characters encoded as "unknown" must be ASCII only, and remain "unknown" # after attempting to assign an encoding. From ?Encoding : # > ASCII strings will never be marked with a declared encoding, since their # > representation is the same in all supported encodings. 
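# Base R illustration of the rule quoted above (independent of any backend):
#
#   x <- "abc"; Encoding(x) <- "UTF-8"; Encoding(x)   # still "unknown": ASCII only
#   y <- "\u00e9"; Encoding(y)                        # "UTF-8": marked encoding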
Encoding(x) <- "UTF-8" Encoding(x) == "unknown" } else FALSE } is_raw_list <- function(x) { is.list(x) && is.raw(x[[1L]]) } coercible_to_date <- function(x) { x_date <- try_silent(as.Date(x)) !is.null(x_date) && all(is.na(x) == is.na(x_date)) } as_date_equals_to <- function(x) { lapply(x, function(xx) { function(value) as.Date(value) == xx }) } is_roughly_current_date <- function(x) { coercible_to_date(x) && (abs(Sys.Date() - as.Date(x)) <= 1) } coercible_to_time <- function(x) { x_hms <- try_silent(hms::as.hms(x)) !is.null(x_hms) && all(is.na(x) == is.na(x_hms)) } as_hms_equals_to <- function(x) { lapply(x, function(xx) { function(value) hms::as.hms(value) == xx }) } coercible_to_timestamp <- function(x) { x_timestamp <- try_silent(as.POSIXct(x)) !is.null(x_timestamp) && all(is.na(x) == is.na(x_timestamp)) } as_timestamp_equals_to <- function(x) { lapply(x, function(xx) { function(value) as.POSIXct(value) == xx }) } as_numeric_identical_to <- function(x) { lapply(x, function(xx) { function(value) as.numeric(value) == xx }) } as_numeric_equals_to <- function(x) { lapply(x, function(xx) { function(value) isTRUE(all.equal(as.numeric(value), xx)) }) } as_character_equals_to <- function(x) { lapply(x, function(xx) { function(value) as.character(value) == xx }) } is_roughly_current_timestamp <- function(x) { coercible_to_timestamp(x) && (Sys.time() - as.POSIXct(x, tz = "UTC") <= hms::hms(2)) } is_date <- function(x) { inherits(x, "Date") } is_roughly_current_date_typed <- function(x) { is_date(x) && (abs(Sys.Date() - x) <= 1) } is_timestamp <- function(x) { inherits(x, "POSIXct") } is_roughly_current_timestamp_typed <- function(x) { is_timestamp(x) && (Sys.time() - x <= hms::hms(2)) } as_numeric_date <- function(d) { d <- as.Date(d) structure(as.numeric(unclass(d)), class = class(d)) } DBItest/R/test-compliance.R0000644000176200001440000000061413156254271015172 0ustar liggesusers#' @name test_all #' @aliases NULL #' @section Tests: #' [test_compliance()]: #' Test full compliance to DBI NULL #' Test full compliance to DBI #' #' @inheritParams test_all #' @include test-transaction.R #' @family tests #' @export test_compliance <- function(skip = NULL, ctx = get_default_context()) { test_suite <- "Full compliance" run_tests(ctx, spec_compliance, skip, test_suite) } DBItest/R/spec-.R0000644000176200001440000000434013156254271013112 0ustar liggesusers# reverse order # Script to create new spec files from subspec names read from clipboard: # xclip -out -se c | sed 's/,//' | for i in $(cat); do f=$(echo $i | sed 's/_/-/g;s/$/.R/'); echo "$i <- list(" > R/$f; echo ")" >> R/$f; echo "#' @include $f"; done | tac # # Example input: # test_xxx_1, # test_xxx_2, # # Output: Files R/test-xxx-1.R and R/test-xxx-2.R, and @include directives to stdout ##### All #' @include spec-all.R ##### Stress #' @include spec-stress.R #' @include spec-stress-connection.R ##### Aggregators #' @include spec-compliance.R #' @include spec-transaction.R #' @include spec-meta.R #' @include spec-sql.R #' @include spec-result.R #' @include spec-connection.R #' @include spec-driver.R ##### Later #' @include spec-meta-get-info-result.R #' @include spec-meta-column-info.R #' @include spec-sql-list-fields.R #' @include spec-connection-get-info.R #' @include spec-driver-get-info.R ##### Method specs #' @include spec-transaction-with-transaction.R #' @include spec-transaction-begin-commit-rollback.R #' @include spec-meta-get-rows-affected.R #' @include spec-meta-get-row-count.R #' @include spec-meta-get-statement.R #' @include 
spec-meta-has-completed.R #' @include spec-meta-is-valid.R #' @include spec-meta-bind-.R #' @include spec-meta-bind.R #' @include spec-meta-bind-tester-extra.R #' @include spec-meta-bind-runner.R #' @include spec-sql-remove-table.R #' @include spec-sql-exists-table.R #' @include spec-sql-list-tables.R #' @include spec-sql-write-table.R #' @include spec-sql-read-table.R #' @include spec-sql-quote-identifier.R #' @include spec-sql-quote-string.R #' @include spec-result-execute.R #' @include spec-result-send-statement.R #' @include spec-result-get-query.R #' @include spec-result-clear-result.R #' @include spec-result-roundtrip.R #' @include spec-result-fetch.R #' @include spec-result-send-query.R #' @include spec-connection-disconnect.R #' @include spec-driver-connect.R #' @include spec-result-create-table-with-data-type.R #' @include spec-connection-data-type.R #' @include spec-driver-data-type.R ##### Class specs #' @include spec-driver-class.R ##### Soft specs #' @include spec-driver-constructor.R #' @include spec-compliance-methods.R #' @include spec-getting-started.R #' @include spec.R NULL DBItest/R/spec-meta-column-info.R0000644000176200001440000000114513232372770016205 0ustar liggesusers#' @template dbispec-sub-wip #' @format NULL #' @section Meta: #' \subsection{`dbColumnInfo("DBIResult")`}{ spec_meta_column_info <- list( #' Column information is correct. column_info = function(ctx) { with_connection({ query <- "SELECT 1 as a, 1.5 as b, NULL" with_result( dbSendQuery(con, query), { ci <- dbColumnInfo(res) expect_is(ci, "data.frame") expect_identical(colnames(ci), c("name", "type")) expect_identical(ci$name[1:2], c("a", "b")) expect_is(ci$type, "character") } ) }) }, #' } NULL ) DBItest/R/spec-all.R0000644000176200001440000000025013156254271013577 0ustar liggesusersspec_all <- c( spec_getting_started, spec_driver, spec_connection, spec_result, spec_sql, spec_meta, spec_transaction, spec_compliance, spec_stress ) DBItest/R/test-stress.R0000644000176200001440000000056513156254271014410 0ustar liggesusers#' @name test_all #' @aliases NULL #' @section Tests: #' [test_stress()]: #' Stress tests (not tested with `test_all`) NULL #' Stress tests #' #' @inheritParams test_all #' @include test-compliance.R #' @family tests #' @export test_stress <- function(skip = NULL, ctx = get_default_context()) { test_suite <- "Stress" run_tests(ctx, spec_stress, skip, test_suite) } DBItest/R/import-testthat.R0000644000176200001440000000017313156254271015253 0ustar liggesusers#' @import testthat NULL #' @importFrom methods findMethod getClasses getClass extends #' @importFrom stats setNames NULL DBItest/R/spec-sql-list-tables.R0000644000176200001440000000506013232372770016053 0ustar liggesusers#' spec_sql_list_tables #' @usage NULL #' @format NULL #' @keywords NULL spec_sql_list_tables <- list( list_tables_formals = function(ctx) { # expect_equal(names(formals(dbListTables)), c("conn", "...")) }, #' @return #' `dbListTables()` list_tables = function(ctx) { with_connection({ with_remove_test_table(name = "iris", { tables <- dbListTables(con) #' returns a character vector expect_is(tables, "character") #' that enumerates all tables expect_false("iris" %in% tables) #' and views # TODO #' in the database. #' Tables added with [dbWriteTable()] iris <- get_iris(ctx) dbWriteTable(con, "iris", iris) #' are part of the list, tables <- dbListTables(con) expect_true("iris" %in% tables) }) with_remove_test_table({ #' including temporary tables if supported by the database. 
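# Sketch of the expected behaviour (commented out; assumes the RSQLite
# backend, which supports temporary tables):
#
#   con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
#   DBI::dbWriteTable(con, "persistent", data.frame(a = 1L))
#   DBI::dbWriteTable(con, "temp", data.frame(a = 1L), temporary = TRUE)
#   DBI::dbListTables(con)   # lists both "persistent" and "temp"
#   DBI::dbDisconnect(con)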
if (isTRUE(ctx$tweaks$temporary_tables)) { dbWriteTable(con, "test", data.frame(a = 1L), temporary = TRUE) tables <- dbListTables(con) expect_true("test" %in% tables) } }) #' As soon a table is removed from the database, #' it is also removed from the list of database tables. tables <- dbListTables(con) expect_false("iris" %in% tables) #' #' The returned names are suitable for quoting with `dbQuoteIdentifier()`. if (isTRUE(ctx$tweaks$strict_identifier)) { table_names <- "a" } else { table_names <- c("a", "with spaces", "with,comma") } for (table_name in table_names) { with_remove_test_table(name = dbQuoteIdentifier(con, table_name), { dbWriteTable(con, dbQuoteIdentifier(con, table_name), data.frame(a = 2L)) tables <- dbListTables(con) expect_true(table_name %in% tables) expect_true(dbQuoteIdentifier(con, table_name) %in% dbQuoteIdentifier(con, tables)) }) } }) }, #' An error is raised when calling this method for a closed list_tables_closed_connection = function(ctx) { with_closed_connection({ expect_error(dbListTables(con)) }) }, #' or invalid connection. list_tables_invalid_connection = function(ctx) { with_invalid_connection({ expect_error(dbListTables(con)) }) }, #' @section Additional arguments: #' TBD: `temporary = NA` #' #' This must be provided as named argument. #' See the "Specification" section for details on their usage. NULL ) DBItest/R/spec-sql-remove-table.R0000644000176200001440000001144013232372770016211 0ustar liggesusers#' spec_sql_remove_table #' @usage NULL #' @format NULL #' @keywords NULL spec_sql_remove_table <- list( remove_table_formals = function(ctx) { # expect_equal(names(formals(dbRemoveTable)), c("conn", "name", "...")) }, #' @return #' `dbRemoveTable()` returns `TRUE`, invisibly. remove_table_return = function(ctx) { with_connection({ with_remove_test_table(name = "iris", { iris <- get_iris(ctx) dbWriteTable(con, "iris", iris) expect_invisible_true(dbRemoveTable(con, "iris")) }) }) }, #' If the table does not exist, an error is raised. remove_table_missing = function(ctx) { with_connection({ with_remove_test_table({ expect_error(dbRemoveTable("test")) }) }) }, #' An attempt to remove a view with this function may result in an error. #' #' #' An error is raised when calling this method for a closed remove_table_closed_connection = function(ctx) { with_connection({ with_remove_test_table({ dbWriteTable(con, "test", data.frame(a = 1)) with_closed_connection(con = "con2", { expect_error(dbRemoveTable(con2, "test")) }) }) }) }, #' or invalid connection. remove_table_invalid_connection = function(ctx) { with_connection({ with_remove_test_table({ dbWriteTable(con, "test", data.frame(a = 1)) with_invalid_connection(con = "con2", { expect_error(dbRemoveTable(con2, "test")) }) }) }) }, #' An error is also raised remove_table_error = function(ctx) { with_connection({ with_remove_test_table({ dbWriteTable(con, "test", data.frame(a = 1L)) #' if `name` cannot be processed with [dbQuoteIdentifier()] expect_error(dbRemoveTable(con, NA)) #' or if this results in a non-scalar. expect_error(dbRemoveTable(con, c("test", "test"))) }) }) }, #' @section Specification: #' A table removed by `dbRemoveTable()` doesn't appear in the list of tables #' returned by [dbListTables()], #' and [dbExistsTable()] returns `FALSE`. 
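# For example (sketch, assuming an open connection `con` to any backend):
#
#   DBI::dbWriteTable(con, "test", data.frame(a = 1L))
#   DBI::dbRemoveTable(con, "test")       # returns TRUE, invisibly
#   "test" %in% DBI::dbListTables(con)    # FALSE
#   DBI::dbExistsTable(con, "test")       # FALSE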
remove_table_list = function(ctx) { with_connection({ with_remove_test_table({ dbWriteTable(con, "test", data.frame(a = 1L)) expect_true("test" %in% dbListTables(con)) expect_true(dbExistsTable(con, "test")) dbRemoveTable(con, "test") expect_false("test" %in% dbListTables(con)) expect_false(dbExistsTable(con, "test")) }) }) }, #' The removal propagates immediately to other connections to the same database. remove_table_other_con = function(ctx) { with_connection({ with_connection(con = "con2", { with_remove_test_table({ dbWriteTable(con, "test", data.frame(a = 1L)) expect_true("test" %in% dbListTables(con2)) expect_true(dbExistsTable(con2, "test")) dbRemoveTable(con, "test") expect_false("test" %in% dbListTables(con2)) expect_false(dbExistsTable(con2, "test")) }) }) }) }, #' This function can also be used to remove a temporary table. remove_table_temporary = function(ctx) { if (!isTRUE(ctx$tweaks$temporary_tables)) { skip("tweak: temporary_tables") } with_connection({ with_remove_test_table({ dbWriteTable(con, "test", data.frame(a = 1L), temporary = TRUE) expect_true("test" %in% dbListTables(con)) expect_true(dbExistsTable(con, "test")) dbRemoveTable(con, "test") expect_false("test" %in% dbListTables(con)) expect_false(dbExistsTable(con, "test")) }) }) }, #' #' The `name` argument is processed as follows, remove_table_name = function(ctx) { with_connection({ #' to support databases that allow non-syntactic names for their objects: if (isTRUE(ctx$tweaks$strict_identifier)) { table_names <- "a" } else { table_names <- c("a", "with spaces", "with,comma") } test_in <- data.frame(a = 1L) for (table_name in table_names) { with_remove_test_table(name = dbQuoteIdentifier(con, table_name), { #' - If an unquoted table name as string: `dbRemoveTable()` will do the #' quoting, dbWriteTable(con, table_name, test_in) expect_true(dbRemoveTable(con, table_name)) #' perhaps by calling `dbQuoteIdentifier(conn, x = name)` }) } for (table_name in table_names) { with_remove_test_table(name = dbQuoteIdentifier(con, table_name), { #' - If the result of a call to [dbQuoteIdentifier()]: no more quoting is done dbWriteTable(con, table_name, test_in) expect_true(dbRemoveTable(con, dbQuoteIdentifier(con, table_name))) }) } }) }, NULL ) DBItest/R/spec-transaction-begin-commit-rollback.R0000644000176200001440000001251313232372770021520 0ustar liggesusers#' spec_transaction_begin_commit_rollback #' @usage NULL #' @format NULL #' @keywords NULL spec_transaction_begin_commit_rollback <- list( begin_formals = function(ctx) { # expect_equal(names(formals(dbBegin)), c("conn", "...")) }, commit_formals = function(ctx) { # expect_equal(names(formals(dbCommit)), c("conn", "...")) }, rollback_formals = function(ctx) { # expect_equal(names(formals(dbRollback)), c("conn", "...")) }, #' @return #' `dbBegin()`, `dbCommit()` and `dbRollback()` return `TRUE`, invisibly. begin_commit_return_value = function(ctx) { with_connection({ expect_invisible_true(dbBegin(con)) with_rollback_on_error({ expect_invisible_true(dbCommit(con)) }) }) }, begin_rollback_return_value = function(ctx) { with_connection({ expect_invisible_true(dbBegin(con)) expect_invisible_true(dbRollback(con)) }) }, #' The implementations are expected to raise an error in case of failure, #' but this is not tested. 
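# Minimal usage sketch of the transaction generics (commented out; assumes
# the RSQLite backend):
#
#   con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
#   DBI::dbWriteTable(con, "test", data.frame(a = integer()))
#   DBI::dbBegin(con)
#   DBI::dbExecute(con, "INSERT INTO test (a) VALUES (1)")
#   DBI::dbCommit(con)                # the first row persists
#   DBI::dbBegin(con)
#   DBI::dbExecute(con, "INSERT INTO test (a) VALUES (2)")
#   DBI::dbRollback(con)              # the second row is discarded
#   DBI::dbReadTable(con, "test")$a   # 1
#   DBI::dbDisconnect(con)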
begin_commit_closed = function(ctx) { with_closed_connection({ #' In any way, all generics throw an error with a closed expect_error(dbBegin(con)) expect_error(dbCommit(con)) expect_error(dbRollback(con)) }) }, begin_commit_invalid = function(ctx) { with_invalid_connection({ #' or invalid connection. expect_error(dbBegin(con)) expect_error(dbCommit(con)) expect_error(dbRollback(con)) }) }, commit_without_begin = function(ctx) { #' In addition, a call to `dbCommit()` with_connection({ expect_error(dbCommit(con)) }) }, rollback_without_begin = function(ctx) { #' or `dbRollback()` with_connection({ #' without a prior call to `dbBegin()` raises an error. expect_error(dbRollback(con)) }) }, begin_begin = function(ctx) { #' Nested transactions are not supported by DBI, with_connection({ #' an attempt to call `dbBegin()` twice dbBegin(con) with_rollback_on_error({ #' yields an error. expect_error(dbBegin(con)) dbCommit(con) }) }) }, #' @section Specification: #' Actual support for transactions may vary between backends. begin_commit = function(ctx) { with_connection({ #' A transaction is initiated by a call to `dbBegin()` dbBegin(con) #' and committed by a call to `dbCommit()`. success <- FALSE expect_error({dbCommit(con); success <- TRUE}, NA) if (!success) dbRollback(con) }) }, #' Data written in a transaction must persist after the transaction is committed. begin_write_commit = function(ctx) { with_connection({ #' For example, a table that is missing when the transaction is started expect_false(dbExistsTable(con, "test")) dbBegin(con) with_rollback_on_error({ #' but is created dbExecute(con, paste0("CREATE TABLE test (a ", dbDataType(con, 0L), ")")) #' and populated during the transaction dbExecute(con, paste0("INSERT INTO test (a) VALUES (1)")) #' must exist and contain the data added there expect_equal(check_df(dbReadTable(con, "test")), data.frame(a = 1)) #' both during dbCommit(con) }) #' and after the transaction, expect_equal(check_df(dbReadTable(con, "test")), data.frame(a = 1)) }) with_connection({ with_remove_test_table({ #' and also in a new connection. expect_true(dbExistsTable(con, "test")) expect_equal(check_df(dbReadTable(con, "test")), data.frame(a = 1)) }) }) }, begin_rollback = function(ctx) { with_connection({ #' #' A transaction dbBegin(con) #' can also be aborted with `dbRollback()`. expect_error(dbRollback(con), NA) }) }, #' All data written in such a transaction must be removed after the #' transaction is rolled back. begin_write_rollback = function(ctx) { with_connection({ #' For example, a table that is missing when the transaction is started with_remove_test_table({ dbBegin(con) #' but is created during the transaction expect_error( dbExecute(con, paste0("CREATE TABLE test (a ", dbDataType(con, 0L), ")")), NA ) #' must not exist anymore after the rollback. dbRollback(con) expect_false(dbExistsTable(con, "test")) }) }) }, begin_write_disconnect = function(ctx) { #' #' Disconnection from a connection with an open transaction with_connection({ dbBegin(con) dbExecute(con, paste0("CREATE TABLE test (a ", dbDataType(con, 0L), ")")) }) with_connection({ #' effectively rolls back the transaction. #' All data written in such a transaction must be removed after the #' transaction is rolled back. with_remove_test_table({ expect_false(dbExistsTable(con, "test")) }) }) }, #' #' The behavior is not specified if other arguments are passed to these #' functions. In particular, \pkg{RSQLite} issues named transactions #' with support for nesting #' if the `name` argument is set. 
#' #' The transaction isolation level is not specified by DBI. NULL ) DBItest/R/tweaks.R0000644000176200001440000001131413232372770013400 0ustar liggesusers#' Tweaks for DBI tests #' #' TBD. #' @name tweaks #' @aliases NULL { # nolint tweak_names <- alist( #' @param ... `[any]`\cr #' Unknown tweaks are accepted, with a warning. The ellipsis #' also asserts that all arguments are named. "..." = , #' @param constructor_name `[character(1)]`\cr #' Name of the function that constructs the `Driver` object. "constructor_name" = NULL, #' @param constructor_relax_args `[logical(1)]`\cr #' If `TRUE`, allow a driver constructor with default values for all #' arguments; otherwise, require a constructor with empty argument list #' (default). "constructor_relax_args" = FALSE, #' @param strict_identifier `[logical(1)]`\cr #' Set to `TRUE` if the DBMS does not support arbitrarily-named #' identifiers even when quoting is used. "strict_identifier" = FALSE, #' @param omit_blob_tests `[logical(1)]`\cr #' Set to `TRUE` if the DBMS does not support a `BLOB` data #' type. "omit_blob_tests" = FALSE, #' @param current_needs_parens `[logical(1)]`\cr #' Set to `TRUE` if the SQL functions `current_date`, #' `current_time`, and `current_timestamp` require parentheses. "current_needs_parens" = FALSE, #' @param union `[function(character)]`\cr #' Function that combines several subqueries into one so that the #' resulting query returns the concatenated results of the subqueries "union" = function(x) paste(x, collapse = " UNION "), #' @param placeholder_pattern `[character]`\cr #' A pattern for placeholders used in [dbBind()], e.g., #' `"?"`, `"$1"`, or `":name"`. See #' [make_placeholder_fun()] for details. "placeholder_pattern" = NULL, #' @param logical_return `[function(logical)]`\cr #' A vectorized function that converts logical values to the data type #' returned by the DBI backend. "logical_return" = identity, #' @param date_cast `[function(character)]`\cr #' A vectorized function that creates an SQL expression for coercing a #' string to a date value. "date_cast" = function(x) paste0("date('", x, "')"), #' @param time_cast `[function(character)]`\cr #' A vectorized function that creates an SQL expression for coercing a #' string to a time value. "time_cast" = function(x) paste0("time('", x, "')"), #' @param timestamp_cast `[function(character)]`\cr #' A vectorized function that creates an SQL expression for coercing a #' string to a timestamp value. "timestamp_cast" = function(x) paste0("timestamp('", x, "')"), #' @param date_typed `[logical(1L)]`\cr #' Set to `FALSE` if the DBMS doesn't support a dedicated type for dates. "date_typed" = TRUE, #' @param time_typed `[logical(1L)]`\cr #' Set to `FALSE` if the DBMS doesn't support a dedicated type for times. "time_typed" = TRUE, #' @param timestamp_typed `[logical(1L)]`\cr #' Set to `FALSE` if the DBMS doesn't support a dedicated type for #' timestamps. "timestamp_typed" = TRUE, #' @param temporary_tables `[logical(1L)]`\cr #' Set to `FALSE` if the DBMS doesn't support temporary tables. "temporary_tables" = TRUE, # Dummy argument NULL ) } # A helper function that constructs the tweaks() function in a DRY fashion. make_tweaks <- function(envir = parent.frame()) { fmls <- tweak_names[-length(tweak_names)] tweak_quoted <- lapply(setNames(nm = names(fmls)), as.name) tweak_quoted <- c(tweak_quoted) list_call <- as.call(c(quote(list), tweak_quoted)) fun <- eval(bquote(function() { unknown <- list(...) 
if (length(unknown) > 0) { if (is.null(names(unknown)) || any(names(unknown) == "")) { warning("All tweaks must be named", call. = FALSE) } else { warning("Unknown tweaks: ", paste(names(unknown), collapse = ", "), call. = FALSE) } } ret <- .(list_call) ret <- ret[!vapply(ret, is.null, logical(1L))] structure(ret, class = "DBItest_tweaks") } , as.environment(list(list_call = list_call)))) formals(fun) <- fmls environment(fun) <- envir fun } #' @export #' @rdname tweaks tweaks <- make_tweaks() #' @export format.DBItest_tweaks <- function(x, ...) { if (length(x) == 0L) { return("DBItest tweaks: Empty") } c( "DBItest tweaks:", unlist(mapply( function(name, value) { paste0(" ", name, ": ", format(value)[[1]]) }, names(x), unclass(x))) ) } #' @export print.DBItest_tweaks <- function(x, ...) { cat(format(x), sep = "\n") } #' @export `$.DBItest_tweaks` <- function(x, tweak) { if (!(tweak %in% names(tweak_names))) { stop("Tweak not found: ", tweak, call. = FALSE) } NextMethod() } DBItest/R/spec.R0000644000176200001440000000034213156254271013033 0ustar liggesusers#' DBI specification #' #' Placeholder page. #' #' @format NULL #' @usage NULL #' @name DBIspec NULL #' DBI specification (work in progress) #' #' Placeholder page. #' #' @format NULL #' @usage NULL #' @name DBIspec-wip NULL DBItest/R/DBItest.R0000644000176200001440000000057013156254271013402 0ustar liggesusers#' @details #' The two most important functions are [make_context()] and #' [test_all()]. The former tells the package how to connect to your #' DBI backend, the latter executes all tests of the test suite. More #' fine-grained test functions (all with prefix `test_`) are available. #' #' See the package's vignette for more details. #' #' @author Kirill Müller "_PACKAGE" DBItest/R/test-result.R0000644000176200001440000000055613156254271014403 0ustar liggesusers#' @name test_all #' @aliases NULL #' @section Tests: #' [test_result()]: #' Test the "Result" class NULL #' Test the "Result" class #' #' @inheritParams test_all #' @include test-connection.R #' @family tests #' @export test_result <- function(skip = NULL, ctx = get_default_context()) { test_suite <- "Result" run_tests(ctx, spec_result, skip, test_suite) } DBItest/R/spec-stress-connection.R0000644000176200001440000000143413156254271016514 0ustar liggesusers#' @template dbispec-sub-wip #' @format NULL #' @importFrom withr with_output_sink #' @section Connection: #' \subsection{Stress tests}{ spec_stress_connection <- list( #' Open 50 simultaneous connections simultaneous_connections = function(ctx) { cons <- list() on.exit(try_silent(lapply(cons, dbDisconnect)), add = TRUE) for (i in seq_len(50L)) { cons <- c(cons, connect(ctx)) } inherit_from_connection <- vapply(cons, is, class2 = "DBIConnection", logical(1)) expect_true(all(inherit_from_connection)) }, #' Open and close 50 connections stress_connections = function(ctx) { for (i in seq_len(50L)) { con <- connect(ctx) expect_s4_class(con, "DBIConnection") expect_error(dbDisconnect(con), NA) } }, #' } NULL ) DBItest/R/spec-meta-bind-tester-extra.R0000644000176200001440000000074513232372770017325 0ustar liggesusersBindTesterExtra <- R6::R6Class( "BindTesterExtra", portable = TRUE, public = list( check_return_value = function(bind_res, res) invisible(NULL), patch_bind_values = identity, requires_names = function() NA, is_repeated = function() FALSE, is_premature_clear = function() FALSE, is_untouched = function() FALSE ) ) new_bind_tester_extra <- function(...) 
{ R6::R6Class( inherit = BindTesterExtra, portable = TRUE, public = list(...) ) } DBItest/R/spec-sql-quote-string.R0000644000176200001440000001134713232372770016276 0ustar liggesusers#' spec_sql_quote_string #' @usage NULL #' @format NULL #' @keywords NULL spec_sql_quote_string <- list( quote_string_formals = function(ctx) { # expect_equal(names(formals(dbQuoteString)), c("conn", "x", "...")) }, #' @return quote_string_return = function(ctx) { with_connection({ #' `dbQuoteString()` returns an object that can be coerced to [character], simple <- "simple" simple_out <- dbQuoteString(con, simple) expect_error(as.character(simple_out), NA) expect_is(as.character(simple_out), "character") expect_equal(length(simple_out), 1L) }) }, quote_string_vectorized = function(ctx) { with_connection({ #' of the same length as the input. letters_out <- dbQuoteString(con, letters) expect_equal(length(letters_out), length(letters)) #' For an empty character vector this function returns a length-0 object. empty_out <- dbQuoteString(con, character()) expect_equal(length(empty_out), 0L) }) }, quote_string_double = function(ctx) { with_connection({ simple <- "simple" simple_out <- dbQuoteString(con, simple) letters_out <- dbQuoteString(con, letters) empty <- character() empty_out <- dbQuoteString(con, character()) #' #' When passing the returned object again to `dbQuoteString()` #' as `x` #' argument, it is returned unchanged. expect_identical(dbQuoteString(con, simple_out), simple_out) expect_identical(dbQuoteString(con, letters_out), letters_out) expect_identical(dbQuoteString(con, empty_out), empty_out) #' Passing objects of class [SQL] should also return them unchanged. expect_identical(dbQuoteString(con, SQL(simple)), SQL(simple)) expect_identical(dbQuoteString(con, SQL(letters)), SQL(letters)) expect_identical(dbQuoteString(con, SQL(empty)), SQL(empty)) #' (For backends it may be most convenient to return [SQL] objects #' to achieve this behavior, but this is not required.) }) }, #' @section Specification: quote_string_roundtrip = function(ctx) { with_connection({ do_test_string <- function(x) { #' The returned expression can be used in a `SELECT ...` query, query <- paste0("SELECT ", paste(dbQuoteString(con, x), collapse = ", ")) #' and for any scalar character `x` the value of #' \code{dbGetQuery(paste0("SELECT ", dbQuoteString(x)))[[1]]} #' must be identical to `x`, x_out <- check_df(dbGetQuery(con, query)) expect_equal(nrow(x_out), 1L) expect_identical(unlist(unname(x_out)), x) } test_chars <- c( #' even if `x` contains "", #' spaces, " ", #' tabs, "\t", #' quotes (single "'", #' or double), '"', #' backticks, "`", #' or newlines "\n" ) #' (in any combination) # length(test_chars) ** 3 test_strings_0 <- expand_char(test_chars, "a", test_chars, "b", test_chars) #' or is itself the result of a `dbQuoteString()` call coerced back to #' character (even repeatedly). 
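# Sketch of the round-trip property stated above (assuming an open
# connection `con`; the spec tests below exercise it systematically):
#
#   x <- "it's a \"test\"\n"
#   sql <- DBI::dbQuoteString(con, x)
#   identical(DBI::dbGetQuery(con, paste0("SELECT ", sql))[[1]], x)   # TRUE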
test_strings_1 <- as.character(dbQuoteString(con, test_strings_0)) test_strings_2 <- as.character(dbQuoteString(con, test_strings_1)) test_strings <- c(test_strings_0, test_strings_1, test_strings_2) do_test_string(test_strings) }) }, quote_string_na = function(ctx) { with_connection({ null <- dbQuoteString(con, NA_character_) quoted_null <- dbQuoteString(con, as.character(null)) na <- dbQuoteString(con, "NA") quoted_na <- dbQuoteString(con, as.character(na)) query <- paste0("SELECT ", null, " as null_return,", na, "as na_return,", quoted_null, "as quoted_null,", quoted_na, "as quoted_na") #' If `x` is `NA`, the result must merely satisfy [is.na()]. rows <- check_df(dbGetQuery(con, query)) expect_true(is.na(rows$null_return)) #' The strings `"NA"` or `"NULL"` are not treated specially. expect_identical(rows$na_return, "NA") expect_identical(rows$quoted_null, as.character(null)) expect_identical(rows$quoted_na, as.character(na)) }) }, quote_string_na_is_null = function(ctx) { with_connection({ #' #' `NA` should be translated to an unquoted SQL `NULL`, null <- dbQuoteString(con, NA_character_) #' so that the query `SELECT * FROM (SELECT 1) a WHERE ... IS NULL` rows <- check_df(dbGetQuery(con, paste0("SELECT * FROM (SELECT 1) a WHERE ", null, " IS NULL"))) #' returns one row. expect_equal(nrow(rows), 1L) }) }, NULL ) DBItest/R/test-sql.R0000644000176200001440000000052013156254271013653 0ustar liggesusers#' @name test_all #' @aliases NULL #' @section Tests: #' [test_sql()]: #' Test SQL methods NULL #' Test SQL methods #' #' @inheritParams test_all #' @include test-result.R #' @family tests #' @export test_sql <- function(skip = NULL, ctx = get_default_context()) { test_suite <- "SQL" run_tests(ctx, spec_sql, skip, test_suite) } DBItest/R/test-meta.R0000644000176200001440000000054313156254271014007 0ustar liggesusers#' @name test_all #' @aliases NULL #' @section Tests: #' [test_meta()]: #' Test metadata functions NULL #' Test metadata functions #' #' @inheritParams test_all #' @include test-sql.R #' @family tests #' @export test_meta <- function(skip = NULL, ctx = get_default_context()) { test_suite <- "Metadata" run_tests(ctx, spec_meta, skip, test_suite) } DBItest/R/spec-driver-constructor.R0000644000176200001440000000251213156254271016710 0ustar liggesusers#' @template dbispec-sub #' @format NULL #' @section Construction of the DBIDriver object: spec_driver_constructor <- list( constructor = function(ctx) { pkg_name <- package_name(ctx) #' The backend must support creation of an instance of its [DBIDriver-class] #' subclass #' with a \dfn{constructor function}. #' By default, its name is the package name without the leading \sQuote{R} #' (if it exists), e.g., `SQLite` for the \pkg{RSQLite} package. default_constructor_name <- gsub("^R", "", pkg_name) #' However, backend authors may choose a different name. constructor_name <- ctx$tweaks$constructor_name %||% default_constructor_name #' The constructor must be exported, and pkg_env <- getNamespace(pkg_name) eval(bquote( expect_true(.(constructor_name) %in% getNamespaceExports(pkg_env)))) #' it must be a function eval(bquote( expect_true(exists(.(constructor_name), mode = "function", pkg_env)))) constructor <- get(constructor_name, mode = "function", pkg_env) #' that is callable without arguments. expect_that(constructor, all_args_have_default_values()) #' DBI recommends to define a constructor with an empty argument list. 
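# Illustrative sketch: for a hypothetical backend package "RMyBackend", the
# recommended constructor would be an exported, argument-less function such as
#
#   #' @export
#   MyBackend <- function() {
#     methods::new("MyBackendDriver")
#   }
#
# where "MyBackendDriver" is that package's DBIDriver subclass (all names
# here are hypothetical).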
if (!isTRUE(ctx$tweaks$constructor_relax_args)) { expect_that(constructor, arglist_is_empty()) } }, NULL ) DBItest/R/test-all.R0000644000176200001440000000241713232372770013633 0ustar liggesusers#' Run all tests #' #' `test_all()` calls all tests defined in this package (see the section #' "Tests" below). #' #' @section Tests: #' This function runs the following tests, except the stress tests: #' #' @param skip `[character()]`\cr A vector of regular expressions to match #' against test names; skip test if matching any. #' @param ctx `[DBItest_context]`\cr A test context as created by #' [make_context()]. #' #' @export test_all <- function(skip = NULL, ctx = get_default_context()) { test_getting_started(skip = skip, ctx = ctx) test_driver(skip = skip, ctx = ctx) test_connection(skip = skip, ctx = ctx) test_result(skip = skip, ctx = ctx) test_sql(skip = skip, ctx = ctx) test_meta(skip = skip, ctx = ctx) test_transaction(skip = skip, ctx = ctx) test_compliance(skip = skip, ctx = ctx) # stress tests are not tested by default (#92) } #' @rdname test_all #' @description `test_some()` allows testing one or more tests, it works by #' constructing the `skip` argument using negative lookaheads. #' @param test `[character]`\cr A character vector of regular expressions #' describing the tests to run. #' @export test_some <- function(test, ctx = get_default_context()) { test_all(skip = paste0("(?!", paste(test, collapse = "|"), ").*$"), ctx = ctx) } DBItest/R/spec-meta-bind-.R0000644000176200001440000001175413232372770014757 0ustar liggesusers# Helpers ----------------------------------------------------------------- test_select_bind <- function(con, placeholder_fun, ...) { if (is.character(placeholder_fun)) placeholder_fun <- lapply(placeholder_fun, make_placeholder_fun) else if (is.function(placeholder_fun)) placeholder_fun <- list(placeholder_fun) if (length(placeholder_fun) == 0) { skip("Use the placeholder_pattern tweak, or skip all 'bind_.*' tests") } lapply(placeholder_fun, test_select_bind_one, con = con, ...) } test_select_bind_one <- function(con, placeholder_fun, values, type = "character(10)", query = TRUE, transform_input = as.character, transform_output = function(x) trimws(x, "right"), expect = expect_identical, extra = "none") { bind_tester <- BindTester$new(con) bind_tester$placeholder <- placeholder_fun(length(values)) bind_tester$values <- values bind_tester$type <- type bind_tester$query <- query bind_tester$transform$input <- transform_input bind_tester$transform$output <- transform_output bind_tester$expect$fun <- expect bind_tester$extra_obj <- new_extra_imp(extra) bind_tester$run() } new_extra_imp <- function(extra) { if (is.environment(extra)) extra$new() else if (length(extra) == 0) new_extra_imp_one("none") else if (length(extra) == 1) new_extra_imp_one(extra) else { stop("need BindTesterExtraMulti") # BindTesterExtraMulti$new(lapply(extra, new_extra_imp_one)) } } new_extra_imp_one <- function(extra) { extra_imp <- switch( extra, none = BindTesterExtra, stop("Unknown extra: ", extra, call. 
= FALSE) ) extra_imp$new() } # BindTester -------------------------------------------------------------- BindTester <- R6::R6Class( "BindTester", portable = FALSE, public = list( initialize = function(con) { self$con <- con }, run = run_bind_tester$fun, con = NULL, placeholder = NULL, values = NULL, type = "character(10)", query = TRUE, transform = list(input = as.character, output = function(x) trimws(x, "right")), expect = list(fun = expect_identical), extra_obj = NULL ), private = list( is_query = function() { query }, send_query = function() { value_names <- letters[seq_along(values)] if (is.null(type)) { typed_placeholder <- placeholder } else { typed_placeholder <- paste0("cast(", placeholder, " as ", type, ")") } query <- paste0("SELECT ", paste0( typed_placeholder, " as ", value_names, collapse = ", ")) dbSendQuery(con, query) }, send_statement = function() { data <- data.frame(a = rep(1:5, 1:5)) data$b <- seq_along(data$a) table_name <- random_table_name() dbWriteTable(con, table_name, data, temporary = TRUE) value_names <- letters[seq_along(values)] statement <- paste0( "UPDATE ", dbQuoteIdentifier(con, table_name), "SET b = b + 1 WHERE ", paste(value_names, " = ", placeholder, collapse = " AND ")) dbSendStatement(con, statement) }, bind = function(res, bind_values) { bind_values <- extra_obj$patch_bind_values(bind_values) bind_res <- withVisible(dbBind(res, bind_values)) extra_obj$check_return_value(bind_res, res) invisible() }, compare = function(rows, values) { expect$fun(lapply(unname(rows), transform$output), lapply(unname(values), transform$input)) }, compare_affected = function(rows_affected, values) { expect_equal(rows_affected, sum(values[[1]])) } ) ) # make_placeholder_fun ---------------------------------------------------- #' Create a function that creates n placeholders #' #' For internal use by the `placeholder_format` tweak. #' #' @param pattern `[character(1)]`\cr Any character, optionally followed by `1` or `name`. Examples: `"?"`, `"$1"`, `":name"` #' #' @return `[function(n)]`\cr A function with one argument `n` that #' returns a vector of length `n` with placeholders of the specified format. #' Examples: `?, ?, ?, ...`, `$1, $2, $3, ...`, `:a, :b, :c` #' #' @keywords internal make_placeholder_fun <- function(pattern) { format_rx <- "^(.)(.*)$" character <- gsub(format_rx, "\\1", pattern) kind <- gsub(format_rx, "\\2", pattern) if (character == "") { stop("placeholder pattern must have at least one character", call. = FALSE) } if (kind == "") { eval(bquote( function(n) .(character) )) } else if (kind == "1") { eval(bquote( function(n) paste0(.(character), seq_len(n)) )) } else if (kind == "name") { eval(bquote( function(n) { l <- letters[seq_len(n)] stats::setNames(paste0(.(character), l), l) } )) } else { stop("Pattern must be any character, optionally followed by 1 or name. Examples: $1, :name", call. 
= FALSE) } } DBItest/R/s4.R0000644000176200001440000000231413156254271012430 0ustar liggesusers# http://stackoverflow.com/a/39880324/946850 s4_methods <- function(env, pkg_fun = NULL) { generics <- methods::getGenerics(env) if (is.null(pkg_fun)) { ok <- TRUE } else { ok <- pkg_fun(generics@package) } res <- Map( generics@.Data[ok], generics@package[ok], USE.NAMES = TRUE, f = function(name, package) { what <- methods::methodsPackageMetaName("T", paste(name, package, sep = ":")) table <- get(what, envir = env) mget(ls(table, all.names = TRUE), envir = table) }) unlist(res, recursive = FALSE) } s4_real_argument_names <- function(s4_method) { expect_is(s4_method, c("function", "MethodDefinition", "derivedDefaultMethod")) unwrapped <- s4_unwrap(s4_method) names(formals(unwrapped)) } s4_unwrap <- function(s4_method) { # Only unwrap if body is of the following form: # { # .local <- function(x, y, z, ...) { # ... # } # ... # } method_body <- body(s4_method) if (inherits(method_body, "{")) { local_def <- method_body[[2]] if (inherits(local_def, "<-") && local_def[[2]] == quote(.local)) { local_fun <- local_def[[3]] if (inherits(local_fun, "function")) return(local_fun) } } s4_method } DBItest/R/spec-compliance.R0000644000176200001440000000014013156254271015137 0ustar liggesusers#' @template dbispec #' @format NULL spec_compliance <- c( spec_compliance_methods, NULL ) DBItest/R/spec-driver-class.R0000644000176200001440000000017513156254271015433 0ustar liggesusersspec_driver_class <- list( inherits_from_driver = function(ctx) { expect_s4_class(ctx$drv, "DBIDriver") }, NULL ) DBItest/R/context.R0000644000176200001440000000355713156254271013600 0ustar liggesusers#' Test contexts #' #' Create a test context, set and query the default context. #' #' @param drv `[DBIDriver]`\cr An expression that constructs a DBI driver, #' like `SQLite()`. #' @param connect_args `[named list]`\cr Connection arguments (names and values). #' @param set_as_default `[logical(1)]`\cr Should the created context be #' set as default context? #' @param tweaks `[DBItest_tweaks]`\cr Tweaks as constructed by the #' [tweaks()] function. #' @param ctx `[DBItest_context]`\cr A test context. #' @param name `[character]`\cr An optional name of the context which will #' be used in test messages. #' @return `[DBItest_context]`\cr A test context, for #' `set_default_context` the previous default context (invisibly) or #' `NULL`. 
#' #' @rdname context #' @export make_context <- function(drv, connect_args, set_as_default = TRUE, tweaks = NULL, name = NULL) { drv_call <- substitute(drv) if (is.null(drv)) { stop("drv cannot be NULL.") } if (is.null(tweaks)) { tweaks <- tweaks() } ctx <- structure( list( drv = drv, drv_call = drv_call, connect_args = connect_args, tweaks = tweaks, name = name ), class = "DBItest_context" ) if (set_as_default) { set_default_context(ctx) } ctx } #' @rdname context #' @export set_default_context <- function(ctx) { old_ctx <- .ctx_env$default_context .ctx_env$default_context <- ctx invisible(old_ctx) } #' @rdname context #' @export get_default_context <- function() { .ctx_env$default_context } package_name <- function(ctx) { attr(class(ctx$drv), "package") } connect <- function(ctx) { connect_call <- as.call(c(list(quote(dbConnect), ctx$drv), ctx$connect_args)) connect_fun <- function() {} body(connect_fun) <- connect_call connect_fun() } .ctx_env <- new.env(parent = emptyenv()) set_default_context(NULL) DBItest/R/spec-result-execute.R0000644000176200001440000000356213232372770016016 0ustar liggesusers#' spec_result_execute #' @usage NULL #' @format NULL #' @keywords NULL spec_result_execute <- list( execute_formals = function(ctx) { # expect_equal(names(formals(dbExecute)), c("conn", "statement", "...")) }, #' @return #' `dbExecute()` always returns a execute_atomic = function(ctx) { with_connection({ with_remove_test_table({ query <- "CREATE TABLE test AS SELECT 1 AS a" ret <- dbExecute(con, query) #' scalar expect_equal(length(ret), 1) #' numeric expect_true(is.numeric(ret)) #' that specifies the number of rows affected #' by the statement. }) }) }, #' An error is raised when issuing a statement over a closed execute_closed_connection = function(ctx) { with_closed_connection({ expect_error(dbExecute(con, "CREATE TABLE test AS SELECT 1 AS a")) }) }, #' or invalid connection, execute_invalid_connection = function(ctx) { with_invalid_connection({ expect_error(dbExecute(con, "CREATE TABLE test AS SELECT 1 AS a")) }) }, #' if the syntax of the statement is invalid, execute_syntax_error = function(ctx) { with_connection({ expect_error(dbExecute(con, "CREATE")) }) }, #' or if the statement is not a non-`NA` string. execute_non_string = function(ctx) { with_connection({ expect_error(dbExecute(con, character())) expect_error(dbExecute(con, letters)) expect_error(dbExecute(con, NA_character_)) }) }, #' @section Additional arguments: #' The following argument is not part of the `dbExecute()` generic #' (to improve compatibility across backends) #' but is part of the DBI specification: #' - `params` (TBD) #' #' They must be provided as named arguments. #' See the "Specification" section for details on its usage. NULL ) DBItest/R/spec-sql-exists-table.R0000644000176200001440000000705613213312261016227 0ustar liggesusers#' spec_sql_exists_table #' @usage NULL #' @format NULL #' @keywords NULL spec_sql_exists_table <- list( exists_table_formals = function(ctx) { # expect_equal(names(formals(dbExistsTable)), c("conn", "name", "...")) }, #' @return #' `dbExistsTable()` returns a logical scalar, `TRUE` if the table or view #' specified by the `name` argument exists, `FALSE` otherwise. 
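# Illustrative sketch (not in the original sources); assuming `con` is an open
# connection, the behaviour tested below corresponds to:
#   dbWriteTable(con, "iris", iris);  dbExistsTable(con, "iris")  # TRUE
#   dbRemoveTable(con, "iris");       dbExistsTable(con, "iris")  # FALSE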
exists_table = function(ctx) { with_connection({ with_remove_test_table(name = "iris", { expect_false(expect_visible(dbExistsTable(con, "iris"))) iris <- get_iris(ctx) dbWriteTable(con, "iris", iris) expect_true(expect_visible(dbExistsTable(con, "iris"))) expect_false(expect_visible(dbExistsTable(con, "test"))) #' This includes temporary tables if supported by the database. if (isTRUE(ctx$tweaks$temporary_tables)) { dbWriteTable(con, "test", data.frame(a = 1L), temporary = TRUE) expect_true(expect_visible(dbExistsTable(con, "test"))) } }) expect_false(expect_visible(dbExistsTable(con, "iris"))) }) }, #' #' An error is raised when calling this method for a closed exists_table_closed_connection = function(ctx) { with_closed_connection({ expect_error(dbExistsTable(con, "test")) }) }, #' or invalid connection. exists_table_invalid_connection = function(ctx) { with_invalid_connection({ expect_error(dbExistsTable(con, "test")) }) }, #' An error is also raised exists_table_error = function(ctx) { with_connection({ with_remove_test_table({ dbWriteTable(con, "test", data.frame(a = 1L)) #' if `name` cannot be processed with [dbQuoteIdentifier()] expect_error(dbExistsTable(con, NA)) #' or if this results in a non-scalar. expect_error(dbExistsTable(con, c("test", "test"))) }) }) }, #' @section Additional arguments: #' TBD: `temporary = NA` #' #' This must be provided as named argument. #' See the "Specification" section for details on their usage. #' @section Specification: #' The `name` argument is processed as follows, exists_table_name = function(ctx) { with_connection({ #' to support databases that allow non-syntactic names for their objects: if (isTRUE(ctx$tweaks$strict_identifier)) { table_names <- "a" } else { table_names <- c("a", "with spaces", "with,comma") } for (table_name in table_names) { with_remove_test_table(name = table_name, { expect_false(dbExistsTable(con, table_name)) test_in <- data.frame(a = 1L) dbWriteTable(con, table_name, test_in) #' - If an unquoted table name as string: `dbExistsTable()` will do the #' quoting, expect_true(dbExistsTable(con, table_name)) #' perhaps by calling `dbQuoteIdentifier(conn, x = name)` #' - If the result of a call to [dbQuoteIdentifier()]: no more quoting is done expect_true(dbExistsTable(con, dbQuoteIdentifier(con, table_name))) }) } }) }, #' #' For all tables listed by [dbListTables()], `dbExistsTable()` returns `TRUE`. exists_table_list = function(ctx) { with_connection({ name <- random_table_name() with_remove_test_table( name = name, { dbWriteTable(con, name, data.frame(a = 1)) for (table_name in dbListTables(con)) { expect_true(dbExistsTable(con, table_name)) } } ) }) }, NULL ) DBItest/R/spec-connection-data-type.R0000644000176200001440000000022713156254271017060 0ustar liggesusersspec_connection_data_type <- list( data_type_connection = function(ctx) { with_connection({ test_data_type(ctx, con) }) }, NULL ) DBItest/vignettes/0000755000176200001440000000000013232432047013560 5ustar liggesusersDBItest/vignettes/test.Rmd0000644000176200001440000001026313071272237015211 0ustar liggesusers--- title: "Testing DBI backends" author: "Kirill Müller" date: "`r Sys.Date()`" output: rmarkdown::html_vignette vignette: > %\VignetteIndexEntry{Testing DBI backends} %\VignetteEngine{knitr::rmarkdown} \usepackage[utf8]{inputenc} --- This document shows how to use the `DBItest` package when implementing a new `DBI` backend or when applying it to an existing backend. The `DBItest` package provides a large collection of automated tests. 
## Testing a new backend The test cases in the `DBItest` package are structured very similarly to the sections in the "backend" vignette: ```r vignette("backend", package = "DBI") ``` Like the "backend" vignette, this vignette assumes that you are implementing the `RKazam` package that has a `Kazam()` function that creates a new `DBIDriver` instance for connecting to a "Kazam" database. You can add the tests in the `DBItest` package incrementally, as you proceed with implementing the various parts of the DBI. The `DBItest` package builds upon the `testthat` package. To enable it, run the following in your package directory (after installing or updating `devtools`): ```r devtools::use_testthat() devtools::use_test("DBItest") ``` This creates, among others, a file `test-DBItest.R` in the `tests/testthat` directory. Replace its entire contents by the following: ```r DBItest::make_context(Kazam(), NULL) DBItest::test_getting_started() ``` Now test your package with `devtools::test()`. If you followed at least the "Getting started" section of the `DBI` "backend" vignette, all tests should succeed. By adding the corresponding test function to your `tests/test-DBItest.R` file *before* implementing a section, you get immediate feedback which functionality of this section still needs to be implemented by running `devtools::test()` again. Therefore, proceed by appending the following to `tests/test-DBItest.R`, to include a test case for the forthcoming section: ```r DBItest::test_driver() ``` Again, all tests should succeed when you are done with the "Driver" section. Add the call to the next tester function, implement the following section until all tests succeed, and so forth. In this scenario, you are usually interested only in the first error the test suite finds. The `StopReporter` of `testthat` is most helpful here, activate it by passing `reporter = "stop"` to `devtools::test()`. Alternatively, call the relevant `DBItest::test_()` function directly. The tests are documented with the corresponding functions: For instance, `?test_driver` shows a coarse description of all tests for the "Driver" test case. Test failures will include the name of the test that is failing; in this case, investigating the documentation or the source code of the `DBItest` package will usually lead to the cause of the error. Not all tests can be satisfied: For example, there is one test that tests that `logical` variables survive a write-read roundtrip to the database, whereas another test tests that `logical` variables are converted to `integer` in such a case. Tests can be skipped by adding regular expressions for the tests to skip as character vector to the call, as in the following[^termnull]: ```r DBItest::test_driver(skip = c( "data_type" # Reason 1... "constructor.*", # Reason 2... NULL )) [^termnull]: The terminating `NULL` allows appending new lines to the end by copy-pasting an existing line, without having to take care of the terminating comma. ``` Some other reasons to skip tests are: - your database does not support a feature - you want to postpone or avoid the implementation of a feature - the test takes too long to run ## Testing an existing backend For an existing backends, simply enabling all tests may be the quickest way to get started. Run the following in your package directory (after installing or updating `devtools`): ```r devtools::use_testthat() devtools::use_test("DBItest") ``` This creates, among others, a file `test-DBItest.R` in the `tests/testthat` directory. 
Replace its entire contents by the following: ```r DBItest::make_context(Kazam(), NULL) DBItest::test_all() ``` The notes about "Kazam" and skipping tests from the previous section apply here as well. The `test_all()` function simply calls all test cases. DBItest/README.md0000644000176200001440000000301313232372770013032 0ustar liggesusers# DBItest [![Travis-CI Build Status](https://travis-ci.org/rstats-db/DBItest.svg?branch=master)](https://travis-ci.org/rstats-db/DBItest) [![AppVeyor Build Status](https://ci.appveyor.com/api/projects/status/github/rstats-db/DBItest?branch=master&svg=true)](https://ci.appveyor.com/project/rstats-db/DBItest) [![CRAN_Status_Badge](http://www.r-pkg.org/badges/version/DBItest)](https://cran.r-project.org/package=DBItest) This package provides a considerable set of test cases that you can easily incorporate into your DBI driver package. ## Usage Install from CRAN via ```r install.packages("DBItest") ``` or the development version using ```r devtools::install_github("rstats-db/DBItest") ``` In your driver package, add `DBItest` to the `Suggests:`. Then, enable the tests by running ```r devtools::use_testthat() devtools::use_test("DBItest") ``` from your package's directory. This enables testing using `testthat` (if necessary) and creates, among others, a file `test-DBItest.R` in the `tests/testthat` directory.
DBItest/MD50000644000176200001440000001616613232433022012064 0ustar liggesusersbd53e42892f7ebc7a1a511bd8369210e *DESCRIPTION 3d72fbf8895684538555f27ecd187ede *NAMESPACE 498b3fbfc4d2c0f93aab918bcfc590e3 *NEWS.md f956e8e1290d2316d720831804645f34 *R/DBItest.R 34b240efcc81cd529578dd0ae94d9d4a *R/context.R d5685a8b06a7bc18eb306206ade486e9 *R/expectations.R 2456d9af5a745bbc589664a9807cb618 *R/import-dbi.R d1e36fe1f7d910ebe6ded24c9a2052b3 *R/import-testthat.R 620d45fb2dbdd989f71962ac20548e54 *R/run.R 392269cd7ac8df3e6fbc56b0bea5538b *R/s4.R 1834d84891a386f20fed471804ba9af7 *R/spec-.R 0f31c304333be84f7316c3e4b97f3c0d *R/spec-all.R 06780ae0e6199d4a6f2bcf930bbb62a6 *R/spec-compliance-methods.R 972cfc91acad25db5b17773e92da485d *R/spec-compliance.R 0edfb3ab996e0e1fd47f208b02e81fad *R/spec-connection-data-type.R f3c673a7e9bf7373c394359a2bb467a2 *R/spec-connection-disconnect.R 983bcf8185cfa4639f6154d049bcff7a *R/spec-connection-get-info.R a7d5ca6edad9ad39cfce44daabdc4a54 *R/spec-connection.R bba2f83db6655029c093f677317a9a5f *R/spec-driver-class.R a5de28228f154ad01aa1fda9b1b2a45b *R/spec-driver-connect.R 514dc60d2d73d95c8ef04aa04b16737c *R/spec-driver-constructor.R 67e621f97e3edbe3deb6069874284363 *R/spec-driver-data-type.R 978caf381fa28ccf9cc86f6e6dd1829e *R/spec-driver-get-info.R c9021aed2d2038ce94985e8a32dec17c *R/spec-driver.R 012d53b27aa0904ed817349447756592 *R/spec-getting-started.R 36943fabb95767725ed0e8612932d8ac *R/spec-meta-bind-.R 10f95280bda787682afa81de099f75d0 *R/spec-meta-bind-runner.R edad73f8c65e82e0b0e525775e2b36fb *R/spec-meta-bind-tester-extra.R 1e8844d8153d746e28190c849e23604e *R/spec-meta-bind.R 6cef2e7c2dfebbbb157dcc4f8b6db2f9 *R/spec-meta-column-info.R 8baa8e35cca961dbf729f32e64772a8c *R/spec-meta-get-info-result.R 74c358255ccfdd7996f1e9590c97f0fe *R/spec-meta-get-row-count.R aa5770fc1932c29a4fa3dd7282f9914b *R/spec-meta-get-rows-affected.R 2de48f7148473480fcc00569e5dda94c *R/spec-meta-get-statement.R 940f2a46d267d5c1e5c92363c2c785ce *R/spec-meta-has-completed.R fe580adc31ab99c23239c7cf27cfade3 *R/spec-meta-is-valid.R 646b4c99f1282473f003703fc8945202 *R/spec-meta.R 22c4dbe38db7b7b1d986884c2f64456b *R/spec-result-clear-result.R 0d74571fddd43736be206871afc6d4cb *R/spec-result-create-table-with-data-type.R edbdfa32d432d88b2a6708b5775a4e09 *R/spec-result-execute.R 6d0d6ea9abfc359d5f68c85cf09b7a9b *R/spec-result-fetch.R 849ca1b0185d36d5d2d9ebb97067efbb *R/spec-result-get-query.R 78e91a69133766b000af15290f36ab2e *R/spec-result-roundtrip.R f6b3a5fc1b1140ccbda87f149135be1d *R/spec-result-send-query.R 032c1ebecb0d5fd83ca9949fde8ab0f7 *R/spec-result-send-statement.R 731f38402ebc536101493e18e9d6d4c5 *R/spec-result.R 16b21fb31401dfa924a562c8e47de532 *R/spec-sql-exists-table.R 5bb59f7831543044a99275394abdf7d2 *R/spec-sql-list-fields.R 8fd8c3ca87c3fbb5a7275dcde2d54591 *R/spec-sql-list-tables.R 26aa547943a60ca1234fb4d66d758134 *R/spec-sql-quote-identifier.R 1fca815ed3b90f7a02f75dc12995a7b2 *R/spec-sql-quote-string.R 6832c24d5ac16e1b495a75cf695021cb *R/spec-sql-read-table.R 3ad10e6c7693e3a1113be5eadb22b071 *R/spec-sql-remove-table.R d713fef9d76f1cef6842ddc6a5cdc7d2 *R/spec-sql-write-table.R 72c51ae33337d0c3de4cb0cf5b17289c *R/spec-sql.R 520f2727c9e6d8344d2014efa2be31b5 *R/spec-stress-connection.R a0df2a1b835bf1a6382da15019229c14 *R/spec-stress.R 7f7dc1a68ab9cacd3ef14f771c723356 *R/spec-transaction-begin-commit-rollback.R 198aa308e2344a1e767585c591610fab *R/spec-transaction-with-transaction.R 1a22ea9de2430d087a469690f421a33b *R/spec-transaction.R c5a5d9501278eb4d6cc2da3073c55bcd *R/spec.R 
5f5348b2d1f011bcb0aaf2363acdaa57 *R/test-all.R 0f3c8fb591881f71dc343a2bbd3cf898 *R/test-compliance.R 5306ca1f57b4cec62b057c38b9acce72 *R/test-connection.R 0395879bafa2c627d4c3f954baf1c3e5 *R/test-driver.R c206fc81b191714dd7e736e3348666e5 *R/test-getting-started.R a11b4257b6ee210f7840be0cc32a1fc8 *R/test-meta.R 91549a3a28e40ee333ab44724b5eb46c *R/test-result.R c23e4b8e51ad324d42d65f45e10eaacc *R/test-sql.R 4f098cdffca1949c4cbe9f96fc270c88 *R/test-stress.R 16036f6126b6d6184b02c9f2978eb75a *R/test-transaction.R 9e35d01ed22e883b5166095d232773c8 *R/tweaks.R 5bb7302537533dc370d8aa96bb6d1dfd *R/utf8.R 1468998e72e9e9f2aa9407b458efa685 *R/utils.R 2ac48354a76e3430802052eff9620bac *README.md 71cdad4e660e9abab84174d4d20b703f *build/vignette.rds 1994b1ff1f8a4d1ded48cd7a04a8d770 *inst/doc/test.Rmd c208798ec9a9f5945d6db842c5e9a646 *inst/doc/test.html f4991d67bed9b353e9d345df60f0644a *man/DBIspec-wip.Rd 02423d348a28bc746dbdcbad995a6133 *man/DBIspec.Rd 742938944bd94771b0908461884e4335 *man/DBItest-package.Rd db14a08cc7e54abf4f37c7925f6dcd22 *man/context.Rd d1ce288efebb76c3a37173ed08b424aa *man/make_placeholder_fun.Rd 780fd49920719103acd3ac2fd03a38de *man/spec_connection_disconnect.Rd ac236b13318f49da5ddb0a934848f4d1 *man/spec_driver_connect.Rd 90198413c2c473ea8c8588b916cf2117 *man/spec_driver_data_type.Rd 7e60c90549131ef6e5a191454c00e607 *man/spec_meta_bind.Rd ad7ccec1026426affbc1711e0ba4e530 *man/spec_meta_get_row_count.Rd 23cb6913cfe73a45170e2389684ec2e0 *man/spec_meta_get_rows_affected.Rd eac7a0e0375fa2253ed7e8730c35c496 *man/spec_meta_get_statement.Rd 31d4934206413dc389b1b2047c345532 *man/spec_meta_has_completed.Rd d889652b2cc25de411f0bd510add2139 *man/spec_meta_is_valid.Rd eed88eee1d1057b5abd0771690408060 *man/spec_result_clear_result.Rd 2ba0de7b65ae593739878145833394f2 *man/spec_result_create_table_with_data_type.Rd 874fcfa70384f03fb4c6252fc3e5e6af *man/spec_result_execute.Rd 04e289bdc54621cfb998dcc8c0a12704 *man/spec_result_fetch.Rd 60c4c439b0067834cb16c0b8485ff42b *man/spec_result_get_query.Rd 663bf753c23037884a427f488b6e6ed2 *man/spec_result_roundtrip.Rd d2f0bab545567ea1fc8b4afe0265d587 *man/spec_result_send_query.Rd d4528b1b3fb332b7b39fcbf135996c9b *man/spec_result_send_statement.Rd 1fd29617a4e9cd794aa027dbb38d0110 *man/spec_sql_exists_table.Rd 3080f65296f19a0cd19fd14caf96531d *man/spec_sql_list_tables.Rd d5881e833c7dcd04dfbc0effd2884988 *man/spec_sql_quote_identifier.Rd 485367a0a71df587658b8f5d6d82f8b5 *man/spec_sql_quote_string.Rd 27a0bba8fe0115024cd42554df8c468c *man/spec_sql_read_table.Rd 7f26eca7aaa65ffa83d28cf3448d7a57 *man/spec_sql_remove_table.Rd b6ecc86e4fbcfa49da41c28432ca99eb *man/spec_sql_write_table.Rd 5bb241217ca33f7b17c781a033357ca4 *man/spec_transaction_begin_commit_rollback.Rd 2d13edd8dda77b2a1a16800d56ba2f30 *man/spec_transaction_with_transaction.Rd ed4bf19f8656de19b97a630af4e162d5 *man/test_all.Rd 4bc0f9d09ca8eab9f2ff83b304001a05 *man/test_compliance.Rd bbe6d24725d121c50cdbc5e9c785ed7f *man/test_connection.Rd 6712e8eb176e63f271e253332d708f78 *man/test_data_type.Rd 63ce8e8a0cce917b24f47941e628d4d5 *man/test_driver.Rd 52a1adcb43614ab4575ff339f55269b9 *man/test_getting_started.Rd 568916d526f9bd636aa357378a11f44c *man/test_meta.Rd dfa395bce7c4be8e3912a3198a09446e *man/test_result.Rd 9f4e497e73fafd1041d505563306d217 *man/test_sql.Rd 51d757c4b4410d6305e909c7c0c2f50c *man/test_stress.Rd fb4fefc8dead49d3218bf51e8c81c3ea *man/test_transaction.Rd 54002550af0573ae2456320ffbba470a *man/tweaks.Rd e66cc0201e7914ca0a08d2401d1ac8a8 *tests/testthat.R e6e5e686b137cce397617a031718c1ff 
*tests/testthat/test-consistency.R 3675efbbcc4ee2129cfe7b52a10fd282 *tests/testthat/test-context.R 4c438214a5f4b238d0832ce8b8c9a0ba *tests/testthat/test-lint.R 8d740e8ffa890ea201a4be8119408ef1 *tests/testthat/test-tweaks.R 1994b1ff1f8a4d1ded48cd7a04a8d770 *vignettes/test.Rmd DBItest/build/0000755000176200001440000000000013232432047012647 5ustar liggesusersDBItest/build/vignette.rds0000644000176200001440000000031213232432047015202 0ustar liggesusersb```b`faf`b2 1# '(I-. MA g+8y*$%&g(A+ Cā I5/174v%gx"5lP5,n90{C2K7(1 棸(\^o @hrNb1GRKҊAC]EDBItest/DESCRIPTION0000644000176200001440000000524713232433022013260 0ustar liggesusersPackage: DBItest Title: Testing 'DBI' Back Ends Version: 1.5-2 Date: 2018-01-26 Authors@R: c( person(given = "Kirill", family = "Müller", role = c("aut", "cre"), email = "krlmlr+r@mailbox.org"), person("RStudio", role = "cph"), person("R Consortium", role = "fnd") ) Description: A helper that tests 'DBI' back ends for conformity to the interface. Depends: R (>= 3.0.0) Imports: blob, DBI (>= 0.4-9), desc, hms, methods, R6, testthat (>= 1.0.2), withr Suggests: knitr, lintr, rmarkdown License: LGPL (>= 2) LazyData: true Encoding: UTF-8 BugReports: https://github.com/rstats-db/DBItest/issues RoxygenNote: 6.0.1 VignetteBuilder: knitr Collate: 'DBItest.R' 'context.R' 'expectations.R' 'import-dbi.R' 'import-testthat.R' 'run.R' 's4.R' 'spec.R' 'spec-getting-started.R' 'spec-compliance-methods.R' 'spec-driver-constructor.R' 'spec-driver-class.R' 'spec-driver-data-type.R' 'spec-connection-data-type.R' 'spec-result-create-table-with-data-type.R' 'spec-driver-connect.R' 'spec-connection-disconnect.R' 'spec-result-send-query.R' 'spec-result-fetch.R' 'spec-result-roundtrip.R' 'spec-result-clear-result.R' 'spec-result-get-query.R' 'spec-result-send-statement.R' 'spec-result-execute.R' 'spec-sql-quote-string.R' 'spec-sql-quote-identifier.R' 'spec-sql-read-table.R' 'spec-sql-write-table.R' 'spec-sql-list-tables.R' 'spec-sql-exists-table.R' 'spec-sql-remove-table.R' 'spec-meta-bind-runner.R' 'spec-meta-bind-tester-extra.R' 'spec-meta-bind.R' 'spec-meta-bind-.R' 'spec-meta-is-valid.R' 'spec-meta-has-completed.R' 'spec-meta-get-statement.R' 'spec-meta-get-row-count.R' 'spec-meta-get-rows-affected.R' 'spec-transaction-begin-commit-rollback.R' 'spec-transaction-with-transaction.R' 'spec-driver-get-info.R' 'spec-connection-get-info.R' 'spec-sql-list-fields.R' 'spec-meta-column-info.R' 'spec-meta-get-info-result.R' 'spec-driver.R' 'spec-connection.R' 'spec-result.R' 'spec-sql.R' 'spec-meta.R' 'spec-transaction.R' 'spec-compliance.R' 'spec-stress-connection.R' 'spec-stress.R' 'spec-all.R' 'spec-.R' 'test-all.R' 'test-getting-started.R' 'test-driver.R' 'test-connection.R' 'test-result.R' 'test-sql.R' 'test-meta.R' 'test-transaction.R' 'test-compliance.R' 'test-stress.R' 'tweaks.R' 'utf8.R' 'utils.R' NeedsCompilation: no Packaged: 2018-01-25 19:46:48 UTC; muelleki Author: Kirill Müller [aut, cre], RStudio [cph], R Consortium [fnd] Maintainer: Kirill Müller Repository: CRAN Date/Publication: 2018-01-25 19:54:58 UTC DBItest/man/0000755000176200001440000000000013232372770012331 5ustar liggesusersDBItest/man/spec_connection_disconnect.Rd0000644000176200001440000000117513213331065020175 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec-connection-disconnect.R \docType{data} \name{spec_connection_disconnect} \alias{spec_connection_disconnect} \title{spec_connection_disconnect} \value{ \code{dbDisconnect()} returns \code{TRUE}, invisibly. 
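% Illustrative sketch (not in the original sources); a typical connection
% lifecycle, assuming `drv` is a DBI driver object:
%   con <- dbConnect(drv)
%   on.exit(dbDisconnect(con))   # always release the connection when done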
} \description{ spec_connection_disconnect } \section{Specification}{ A warning is issued on garbage collection when a connection has been released without calling \code{dbDisconnect()}, but this cannot be tested automatically. A warning is issued immediately when calling \code{dbDisconnect()} on an already disconnected or invalid connection. } DBItest/man/test_meta.Rd0000644000176200001440000000141413071442207014577 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/test-meta.R \name{test_meta} \alias{test_meta} \title{Test metadata functions} \usage{ test_meta(skip = NULL, ctx = get_default_context()) } \arguments{ \item{skip}{\code{[character()]}\cr A vector of regular expressions to match against test names; skip test if matching any.} \item{ctx}{\code{[DBItest_context]}\cr A test context as created by \code{\link[=make_context]{make_context()}}.} } \description{ Test metadata functions } \seealso{ Other tests: \code{\link{test_compliance}}, \code{\link{test_connection}}, \code{\link{test_driver}}, \code{\link{test_getting_started}}, \code{\link{test_result}}, \code{\link{test_sql}}, \code{\link{test_stress}}, \code{\link{test_transaction}} } DBItest/man/test_driver.Rd0000644000176200001440000000142413071442207015145 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/test-driver.R \name{test_driver} \alias{test_driver} \title{Test the "Driver" class} \usage{ test_driver(skip = NULL, ctx = get_default_context()) } \arguments{ \item{skip}{\code{[character()]}\cr A vector of regular expressions to match against test names; skip test if matching any.} \item{ctx}{\code{[DBItest_context]}\cr A test context as created by \code{\link[=make_context]{make_context()}}.} } \description{ Test the "Driver" class } \seealso{ Other tests: \code{\link{test_compliance}}, \code{\link{test_connection}}, \code{\link{test_getting_started}}, \code{\link{test_meta}}, \code{\link{test_result}}, \code{\link{test_sql}}, \code{\link{test_stress}}, \code{\link{test_transaction}} } DBItest/man/spec_meta_get_rows_affected.Rd0000644000176200001440000000153013071442207020303 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec-meta-get-rows-affected.R \docType{data} \name{spec_meta_get_rows_affected} \alias{spec_meta_get_rows_affected} \title{spec_meta_get_rows_affected} \value{ \code{dbGetRowsAffected()} returns a scalar number (integer or numeric), the number of rows affected by a data manipulation statement issued with \code{\link[=dbSendStatement]{dbSendStatement()}}. The value is available directly after the call and does not change after calling \code{\link[=dbFetch]{dbFetch()}}. For queries issued with \code{\link[=dbSendQuery]{dbSendQuery()}}, zero is returned before and after the call to \code{dbFetch()}. Attempting to get the rows affected for a result set cleared with \code{\link[=dbClearResult]{dbClearResult()}} gives an error. 
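% Illustrative sketch (not in the original sources), assuming `con` is an open
% connection and `tbl` an existing table:
%   res <- dbSendStatement(con, "DELETE FROM tbl WHERE x > 3")
%   dbGetRowsAffected(res)   # number of rows removed by the statement
%   dbClearResult(res)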
} \description{ spec_meta_get_rows_affected } DBItest/man/test_stress.Rd0000644000176200001440000000137413071442207015201 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/test-stress.R \name{test_stress} \alias{test_stress} \title{Stress tests} \usage{ test_stress(skip = NULL, ctx = get_default_context()) } \arguments{ \item{skip}{\code{[character()]}\cr A vector of regular expressions to match against test names; skip test if matching any.} \item{ctx}{\code{[DBItest_context]}\cr A test context as created by \code{\link[=make_context]{make_context()}}.} } \description{ Stress tests } \seealso{ Other tests: \code{\link{test_compliance}}, \code{\link{test_connection}}, \code{\link{test_driver}}, \code{\link{test_getting_started}}, \code{\link{test_meta}}, \code{\link{test_result}}, \code{\link{test_sql}}, \code{\link{test_transaction}} } DBItest/man/test_getting_started.Rd0000644000176200001440000000162113071442207017040 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/test-getting-started.R \name{test_getting_started} \alias{test_getting_started} \title{Getting started with testing} \usage{ test_getting_started(skip = NULL, ctx = get_default_context()) } \arguments{ \item{skip}{\code{[character()]}\cr A vector of regular expressions to match against test names; skip test if matching any.} \item{ctx}{\code{[DBItest_context]}\cr A test context as created by \code{\link[=make_context]{make_context()}}.} } \description{ Tests very basic features of a DBI driver package, to support testing and test-first development right from the start. } \seealso{ Other tests: \code{\link{test_compliance}}, \code{\link{test_connection}}, \code{\link{test_driver}}, \code{\link{test_meta}}, \code{\link{test_result}}, \code{\link{test_sql}}, \code{\link{test_stress}}, \code{\link{test_transaction}} } DBItest/man/test_result.Rd0000644000176200001440000000142213071442207015166 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/test-result.R \name{test_result} \alias{test_result} \title{Test the "Result" class} \usage{ test_result(skip = NULL, ctx = get_default_context()) } \arguments{ \item{skip}{\code{[character()]}\cr A vector of regular expressions to match against test names; skip test if matching any.} \item{ctx}{\code{[DBItest_context]}\cr A test context as created by \code{\link[=make_context]{make_context()}}.} } \description{ Test the "Result" class } \seealso{ Other tests: \code{\link{test_compliance}}, \code{\link{test_connection}}, \code{\link{test_driver}}, \code{\link{test_getting_started}}, \code{\link{test_meta}}, \code{\link{test_sql}}, \code{\link{test_stress}}, \code{\link{test_transaction}} } DBItest/man/spec_result_send_statement.Rd0000644000176200001440000000242113071442207020236 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec-result-send-statement.R \docType{data} \name{spec_result_send_statement} \alias{spec_result_send_statement} \title{spec_result_send_statement} \value{ \code{dbSendStatement()} returns an S4 object that inherits from \linkS4class{DBIResult}. The result set can be used with \code{\link[=dbGetRowsAffected]{dbGetRowsAffected()}} to determine the number of rows affected by the query. Once you have finished using a result, make sure to clear it with \code{\link[=dbClearResult]{dbClearResult()}}. 
An error is raised when issuing a statement over a closed or invalid connection, if the syntax of the statement is invalid, or if the statement is not a non-\code{NA} string. } \description{ spec_result_send_statement } \section{Specification}{ No warnings occur under normal conditions. When done, the DBIResult object must be cleared with a call to \code{\link[=dbClearResult]{dbClearResult()}}. Failure to clear the result set leads to a warning when the connection is closed. If the backend supports only one open result set per connection, issuing a second query invalidates an already open result set and raises a warning. The newly opened result set is valid and must be cleared with \code{dbClearResult()}. } DBItest/man/make_placeholder_fun.Rd0000644000176200001440000000134013071442207016737 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec-meta-bind-.R \name{make_placeholder_fun} \alias{make_placeholder_fun} \title{Create a function that creates n placeholders} \usage{ make_placeholder_fun(pattern) } \arguments{ \item{pattern}{\code{[character(1)]}\cr Any character, optionally followed by \code{1} or \code{name}. Examples: \code{"?"}, \code{"$1"}, \code{":name"}} } \value{ \code{[function(n)]}\cr A function with one argument \code{n} that returns a vector of length \code{n} with placeholders of the specified format. Examples: \code{?, ?, ?, ...}, \code{$1, $2, $3, ...}, \code{:a, :b, :c} } \description{ For internal use by the \code{placeholder_format} tweak. } \keyword{internal} DBItest/man/spec_sql_remove_table.Rd0000644000176200001440000000263013076746114017161 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec-sql-remove-table.R \docType{data} \name{spec_sql_remove_table} \alias{spec_sql_remove_table} \title{spec_sql_remove_table} \value{ \code{dbRemoveTable()} returns \code{TRUE}, invisibly. If the table does not exist, an error is raised. An attempt to remove a view with this function may result in an error. An error is raised when calling this method for a closed or invalid connection. An error is also raised if \code{name} cannot be processed with \code{\link[=dbQuoteIdentifier]{dbQuoteIdentifier()}} or if this results in a non-scalar. } \description{ spec_sql_remove_table } \section{Specification}{ A table removed by \code{dbRemoveTable()} doesn't appear in the list of tables returned by \code{\link[=dbListTables]{dbListTables()}}, and \code{\link[=dbExistsTable]{dbExistsTable()}} returns \code{FALSE}. The removal propagates immediately to other connections to the same database. This function can also be used to remove a temporary table. The \code{name} argument is processed as follows, to support databases that allow non-syntactic names for their objects: \itemize{ \item If an unquoted table name as string: \code{dbRemoveTable()} will do the quoting, perhaps by calling \code{dbQuoteIdentifier(conn, x = name)} \item If the result of a call to \code{\link[=dbQuoteIdentifier]{dbQuoteIdentifier()}}: no more quoting is done } } DBItest/man/spec_result_send_query.Rd0000644000176200001440000000226713071442207017407 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec-result-send-query.R \docType{data} \name{spec_result_send_query} \alias{spec_result_send_query} \title{spec_result_send_query} \value{ \code{dbSendQuery()} returns an S4 object that inherits from \linkS4class{DBIResult}. 
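% Illustrative sketch (not in the original sources) of the canonical pattern,
% assuming `con` is an open connection and `tbl` an existing table:
%   res <- dbSendQuery(con, "SELECT * FROM tbl")
%   df <- dbFetch(res)
%   dbClearResult(res)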
The result set can be used with \code{\link[=dbFetch]{dbFetch()}} to extract records. Once you have finished using a result, make sure to clear it with \code{\link[=dbClearResult]{dbClearResult()}}. An error is raised when issuing a query over a closed or invalid connection, if the syntax of the query is invalid, or if the query is not a non-\code{NA} string. } \description{ spec_result_send_query } \section{Specification}{ No warnings occur under normal conditions. When done, the DBIResult object must be cleared with a call to \code{\link[=dbClearResult]{dbClearResult()}}. Failure to clear the result set leads to a warning when the connection is closed. If the backend supports only one open result set per connection, issuing a second query invalidates an already open result set and raises a warning. The newly opened result set is valid and must be cleared with \code{dbClearResult()}. } DBItest/man/test_all.Rd0000644000176200001440000000343313076746114014435 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/test-all.R, R/test-getting-started.R, % R/test-driver.R, R/test-connection.R, R/test-result.R, R/test-sql.R, % R/test-meta.R, R/test-transaction.R, R/test-compliance.R, R/test-stress.R \name{test_all} \alias{test_all} \alias{test_some} \title{Run all tests} \usage{ test_all(skip = NULL, ctx = get_default_context()) test_some(test, ctx = get_default_context()) } \arguments{ \item{skip}{\code{[character()]}\cr A vector of regular expressions to match against test names; skip test if matching any.} \item{ctx}{\code{[DBItest_context]}\cr A test context as created by \code{\link[=make_context]{make_context()}}.} \item{test}{\code{[character]}\cr A character vector of regular expressions describing the tests to run.} } \description{ \code{test_all()} calls all tests defined in this package (see the section "Tests" below). \code{test_some()} allows testing one or more tests, it works by constructing the \code{skip} argument using negative lookaheads. 
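% Illustrative sketch (not in the original sources): for example,
%   test_some("spec_driver_connect")
% expands to test_all(skip = "(?!spec_driver_connect).*$"), so every test
% whose name does not match the requested pattern is skipped.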
} \section{Tests}{ This function runs the following tests, except the stress tests: \code{\link[=test_getting_started]{test_getting_started()}}: Getting started with testing \code{\link[=test_driver]{test_driver()}}: Test the "Driver" class \code{\link[=test_connection]{test_connection()}}: Test the "Connection" class \code{\link[=test_result]{test_result()}}: Test the "Result" class \code{\link[=test_sql]{test_sql()}}: Test SQL methods \code{\link[=test_meta]{test_meta()}}: Test metadata functions \code{\link[=test_transaction]{test_transaction()}}: Test transaction functions \code{\link[=test_compliance]{test_compliance()}}: Test full compliance to DBI \code{\link[=test_stress]{test_stress()}}: Stress tests (not tested with \code{test_all}) } DBItest/man/test_compliance.Rd0000644000176200001440000000145013071442207015763 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/test-compliance.R \name{test_compliance} \alias{test_compliance} \title{Test full compliance to DBI} \usage{ test_compliance(skip = NULL, ctx = get_default_context()) } \arguments{ \item{skip}{\code{[character()]}\cr A vector of regular expressions to match against test names; skip test if matching any.} \item{ctx}{\code{[DBItest_context]}\cr A test context as created by \code{\link[=make_context]{make_context()}}.} } \description{ Test full compliance to DBI } \seealso{ Other tests: \code{\link{test_connection}}, \code{\link{test_driver}}, \code{\link{test_getting_started}}, \code{\link{test_meta}}, \code{\link{test_result}}, \code{\link{test_sql}}, \code{\link{test_stress}}, \code{\link{test_transaction}} } DBItest/man/test_sql.Rd0000644000176200001440000000137313071442207014454 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/test-sql.R \name{test_sql} \alias{test_sql} \title{Test SQL methods} \usage{ test_sql(skip = NULL, ctx = get_default_context()) } \arguments{ \item{skip}{\code{[character()]}\cr A vector of regular expressions to match against test names; skip test if matching any.} \item{ctx}{\code{[DBItest_context]}\cr A test context as created by \code{\link[=make_context]{make_context()}}.} } \description{ Test SQL methods } \seealso{ Other tests: \code{\link{test_compliance}}, \code{\link{test_connection}}, \code{\link{test_driver}}, \code{\link{test_getting_started}}, \code{\link{test_meta}}, \code{\link{test_result}}, \code{\link{test_stress}}, \code{\link{test_transaction}} } DBItest/man/spec_meta_bind.Rd0000644000176200001440000001064513232372770015562 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec-meta-bind-runner.R, R/spec-meta-bind.R \docType{data} \name{spec_meta_bind} \alias{spec_meta_bind} \alias{spec_meta_bind} \title{spec_meta_bind} \value{ \code{dbBind()} returns the result set, invisibly, for queries issued by \code{\link[=dbSendQuery]{dbSendQuery()}} and also for data manipulation statements issued by \code{\link[=dbSendStatement]{dbSendStatement()}}. Calling \code{dbBind()} for a query without parameters raises an error. Binding too many or not enough values, or parameters with wrong names or unequal length, also raises an error. If the placeholders in the query are named, all parameter values must have names (which must not be empty or \code{NA}), and vice versa, otherwise an error is raised. The behavior for mixing placeholders of different types (in particular mixing positional and named placeholders) is not specified. 
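% Illustrative sketch (not in the original sources) of a parametrized query,
% assuming `con` is an open connection to a backend that uses `?` placeholders:
%   res <- dbSendQuery(con, "SELECT * FROM tbl WHERE x = ?")
%   dbBind(res, list(1L))
%   dbFetch(res)
%   dbClearResult(res)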
Calling \code{dbBind()} on a result set already cleared by \code{\link[=dbClearResult]{dbClearResult()}} also raises an error. } \description{ spec_meta_bind spec_meta_bind } \section{Specification}{ \pkg{DBI} clients execute parametrized statements as follows: \enumerate{ \item Call \code{\link[=dbSendQuery]{dbSendQuery()}} or \code{\link[=dbSendStatement]{dbSendStatement()}} with a query or statement that contains placeholders, store the returned \linkS4class{DBIResult} object in a variable. Mixing placeholders (in particular, named and unnamed ones) is not recommended. It is good practice to register a call to \code{\link[=dbClearResult]{dbClearResult()}} via \code{\link[=on.exit]{on.exit()}} right after calling \code{dbSendQuery()} or \code{dbSendStatement()} (see the last enumeration item). Until \code{dbBind()} has been called, the returned result set object has the following behavior: \itemize{ \item \code{\link[=dbFetch]{dbFetch()}} raises an error (for \code{dbSendQuery()}) \item \code{\link[=dbGetRowCount]{dbGetRowCount()}} returns zero (for \code{dbSendQuery()}) \item \code{\link[=dbGetRowsAffected]{dbGetRowsAffected()}} returns an integer \code{NA} (for \code{dbSendStatement()}) \item \code{\link[=dbIsValid]{dbIsValid()}} returns \code{TRUE} \item \code{\link[=dbHasCompleted]{dbHasCompleted()}} returns \code{FALSE} } \item Construct a list with parameters that specify actual values for the placeholders. The list must be named or unnamed, depending on the kind of placeholders used. Named values are matched to named parameters, unnamed values are matched by position in the list of parameters. All elements in this list must have the same lengths and contain values supported by the backend; a \link{data.frame} is internally stored as such a list. The parameter list is passed to a call to \code{dbBind()} on the \code{DBIResult} object. \item Retrieve the data or the number of affected rows from the \code{DBIResult} object. \itemize{ \item For queries issued by \code{dbSendQuery()}, call \code{\link[=dbFetch]{dbFetch()}}. \item For statements issued by \code{dbSendStatements()}, call \code{\link[=dbGetRowsAffected]{dbGetRowsAffected()}}. (Execution begins immediately after the \code{dbBind()} call, the statement is processed entirely before the function returns.) } \item Repeat 2. and 3. as necessary. \item Close the result set via \code{\link[=dbClearResult]{dbClearResult()}}. } The elements of the \code{params} argument do not need to be scalars, vectors of arbitrary length (including length 0) are supported. For queries, calling \code{dbFetch()} binding such parameters returns concatenated results, equivalent to binding and fetching for each set of values and connecting via \code{\link[=rbind]{rbind()}}. For data manipulation statements, \code{dbGetRowsAffected()} returns the total number of rows affected if binding non-scalar parameters. \code{dbBind()} also accepts repeated calls on the same result set for both queries and data manipulation statements, even if no results are fetched between calls to \code{dbBind()}. 
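% Illustrative sketch (not in the original sources) of repeated binding with
% non-scalar parameters, again assuming `?` placeholders:
%   res <- dbSendStatement(con, "INSERT INTO tbl (x) VALUES (?)")
%   dbBind(res, list(1:3))   # inserts three rows in one go
%   dbBind(res, list(4:5))   # bound again, no fetch needed in between
%   dbClearResult(res)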
At least the following data types are accepted: \itemize{ \item \link{integer} \item \link{numeric} \item \link{logical} for Boolean values (some backends may return an integer) \item \link{NA} \item \link{character} \item \link{factor} (bound as character, with warning) \item \link{Date} \item \link{POSIXct} timestamps \item \link{POSIXlt} timestamps \item lists of \link{raw} for blobs (with \code{NULL} entries for SQL NULL values) \item objects of type \link[blob:blob]{blob::blob} } } DBItest/man/spec_sql_quote_string.Rd0000644000176200001440000000274013071442207017231 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec-sql-quote-string.R \docType{data} \name{spec_sql_quote_string} \alias{spec_sql_quote_string} \title{spec_sql_quote_string} \value{ \code{dbQuoteString()} returns an object that can be coerced to \link{character}, of the same length as the input. For an empty character vector this function returns a length-0 object. When passing the returned object again to \code{dbQuoteString()} as \code{x} argument, it is returned unchanged. Passing objects of class \link{SQL} should also return them unchanged. (For backends it may be most convenient to return \link{SQL} objects to achieve this behavior, but this is not required.) } \description{ spec_sql_quote_string } \section{Specification}{ The returned expression can be used in a \code{SELECT ...} query, and for any scalar character \code{x} the value of \code{dbGetQuery(paste0("SELECT ", dbQuoteString(x)))[[1]]} must be identical to \code{x}, even if \code{x} contains spaces, tabs, quotes (single or double), backticks, or newlines (in any combination) or is itself the result of a \code{dbQuoteString()} call coerced back to character (even repeatedly). If \code{x} is \code{NA}, the result must merely satisfy \code{\link[=is.na]{is.na()}}. The strings \code{"NA"} or \code{"NULL"} are not treated specially. \code{NA} should be translated to an unquoted SQL \code{NULL}, so that the query \code{SELECT * FROM (SELECT 1) a WHERE ... IS NULL} returns one row. } DBItest/man/spec_meta_has_completed.Rd0000644000176200001440000000244513071442207017446 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec-meta-has-completed.R \docType{data} \name{spec_meta_has_completed} \alias{spec_meta_has_completed} \title{spec_meta_has_completed} \value{ \code{dbHasCompleted()} returns a logical scalar. For a query initiated by \code{\link[=dbSendQuery]{dbSendQuery()}} with non-empty result set, \code{dbHasCompleted()} returns \code{FALSE} initially and \code{TRUE} after calling \code{\link[=dbFetch]{dbFetch()}} without limit. For a query initiated by \code{\link[=dbSendStatement]{dbSendStatement()}}, \code{dbHasCompleted()} always returns \code{TRUE}. Attempting to query completion status for a result set cleared with \code{\link[=dbClearResult]{dbClearResult()}} gives an error. } \description{ spec_meta_has_completed } \section{Specification}{ The completion status for a query is only guaranteed to be set to \code{FALSE} after attempting to fetch past the end of the entire result. Therefore, for a query with an empty result set, the initial return value is unspecified, but the result value is \code{TRUE} after trying to fetch only one row. Similarly, for a query with a result set of length n, the return value is unspecified after fetching n rows, but the result value is \code{TRUE} after trying to fetch only one more row. 
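% Illustrative sketch (not in the original sources) of a chunked fetch guarded
% by dbHasCompleted(), assuming `con` is an open connection:
%   res <- dbSendQuery(con, "SELECT * FROM tbl")
%   while (!dbHasCompleted(res)) chunk <- dbFetch(res, n = 100)
%   dbClearResult(res)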
} DBItest/man/spec_meta_get_row_count.Rd0000644000176200001440000000207013071442207017507 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec-meta-get-row-count.R \docType{data} \name{spec_meta_get_row_count} \alias{spec_meta_get_row_count} \title{spec_meta_get_row_count} \value{ \code{dbGetRowCount()} returns a scalar number (integer or numeric), the number of rows fetched so far. After calling \code{\link[=dbSendQuery]{dbSendQuery()}}, the row count is initially zero. After a call to \code{\link[=dbFetch]{dbFetch()}} without limit, the row count matches the total number of rows returned. Fetching a limited number of rows increases the number of rows by the number of rows returned, even if fetching past the end of the result set. For queries with an empty result set, zero is returned even after fetching. For data manipulation statements issued with \code{\link[=dbSendStatement]{dbSendStatement()}}, zero is returned before and after calling \code{dbFetch()}. Attempting to get the row count for a result set cleared with \code{\link[=dbClearResult]{dbClearResult()}} gives an error. } \description{ spec_meta_get_row_count } DBItest/man/spec_result_fetch.Rd0000644000176200001440000000321113232372770016316 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec-result-fetch.R \docType{data} \name{spec_result_fetch} \alias{spec_result_fetch} \title{spec_result_fetch} \value{ \code{dbFetch()} always returns a \link{data.frame} with as many rows as records were fetched and as many columns as fields in the result set, even if the result is a single value or has one or zero rows. An attempt to fetch from a closed result set raises an error. If the \code{n} argument is not an atomic whole number greater or equal to -1 or Inf, an error is raised, but a subsequent call to \code{dbFetch()} with proper \code{n} argument succeeds. Calling \code{dbFetch()} on a result set from a data manipulation query created by \code{\link[=dbSendStatement]{dbSendStatement()}} can be fetched and return an empty data frame, with a warning. } \description{ spec_result_fetch } \section{Specification}{ Fetching multi-row queries with one or more columns be default returns the entire result. Multi-row queries can also be fetched progressively by passing a whole number (\link{integer} or \link{numeric}) as the \code{n} argument. A value of \link{Inf} for the \code{n} argument is supported and also returns the full result. If more rows than available are fetched, the result is returned in full without warning. If fewer rows than requested are returned, further fetches will return a data frame with zero rows. If zero rows are fetched, the columns of the data frame are still fully typed. Fetching fewer rows than available is permitted, no warning is issued when clearing the result set. A column named \code{row_names} is treated like any other column. } DBItest/man/spec_sql_quote_identifier.Rd0000644000176200001440000000334513103067422020045 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec-sql-quote-identifier.R \docType{data} \name{spec_sql_quote_identifier} \alias{spec_sql_quote_identifier} \title{spec_sql_quote_identifier} \value{ \code{dbQuoteIdentifier()} returns an object that can be coerced to \link{character}, of the same length as the input. For an empty character vector this function returns a length-0 object. 
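% Illustrative sketch (not in the original sources): for example,
%   dbQuoteIdentifier(con, "select")
% might yield "select" (ANSI double quotes) or `select` (backticks), depending
% on the backend, and the result can be pasted into a query such as
%   paste0("SELECT * FROM ", dbQuoteIdentifier(con, "my table"))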
An error is raised if the input contains \code{NA}, but not for an empty string. When passing the returned object again to \code{dbQuoteIdentifier()} as \code{x} argument, it is returned unchanged. Passing objects of class \link{SQL} should also return them unchanged. (For backends it may be most convenient to return \link{SQL} objects to achieve this behavior, but this is not required.) } \description{ spec_sql_quote_identifier } \section{Specification}{ Calling \code{\link[=dbGetQuery]{dbGetQuery()}} for a query of the format \code{SELECT 1 AS ...} returns a data frame with the identifier, unquoted, as column name. Quoted identifiers can be used as table and column names in SQL queries, in particular in queries like \code{SELECT 1 AS ...} and \code{SELECT * FROM (SELECT 1) ...}. The method must use a quoting mechanism that is unambiguously different from the quoting mechanism used for strings, so that a query like \code{SELECT ... FROM (SELECT 1 AS ...)} throws an error if the column names do not match. The method can quote column names that contain special characters such as a space, a dot, a comma, or quotes used to mark strings or identifiers, if the database supports this. In any case, checking the validity of the identifier should be performed only when executing a query, and not by \code{dbQuoteIdentifier()}. } DBItest/man/spec_sql_exists_table.Rd0000644000176200001440000000257313076746114017211 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec-sql-exists-table.R \docType{data} \name{spec_sql_exists_table} \alias{spec_sql_exists_table} \title{spec_sql_exists_table} \value{ \code{dbExistsTable()} returns a logical scalar, \code{TRUE} if the table or view specified by the \code{name} argument exists, \code{FALSE} otherwise. This includes temporary tables if supported by the database. An error is raised when calling this method for a closed or invalid connection. An error is also raised if \code{name} cannot be processed with \code{\link[=dbQuoteIdentifier]{dbQuoteIdentifier()}} or if this results in a non-scalar. } \description{ spec_sql_exists_table } \section{Additional arguments}{ TBD: \code{temporary = NA} This must be provided as named argument. See the "Specification" section for details on their usage. } \section{Specification}{ The \code{name} argument is processed as follows, to support databases that allow non-syntactic names for their objects: \itemize{ \item If an unquoted table name as string: \code{dbExistsTable()} will do the quoting, perhaps by calling \code{dbQuoteIdentifier(conn, x = name)} \item If the result of a call to \code{\link[=dbQuoteIdentifier]{dbQuoteIdentifier()}}: no more quoting is done } For all tables listed by \code{\link[=dbListTables]{dbListTables()}}, \code{dbExistsTable()} returns \code{TRUE}. 
} DBItest/man/tweaks.Rd0000644000176200001440000000606213232372770014122 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/tweaks.R \name{tweaks} \alias{tweaks} \title{Tweaks for DBI tests} \usage{ tweaks(..., constructor_name = NULL, constructor_relax_args = FALSE, strict_identifier = FALSE, omit_blob_tests = FALSE, current_needs_parens = FALSE, union = function(x) paste(x, collapse = " UNION "), placeholder_pattern = NULL, logical_return = identity, date_cast = function(x) paste0("date('", x, "')"), time_cast = function(x) paste0("time('", x, "')"), timestamp_cast = function(x) paste0("timestamp('", x, "')"), date_typed = TRUE, time_typed = TRUE, timestamp_typed = TRUE, temporary_tables = TRUE) } \arguments{ \item{...}{\code{[any]}\cr Unknown tweaks are accepted, with a warning. The ellipsis also asserts that all arguments are named.} \item{constructor_name}{\code{[character(1)]}\cr Name of the function that constructs the \code{Driver} object.} \item{constructor_relax_args}{\code{[logical(1)]}\cr If \code{TRUE}, allow a driver constructor with default values for all arguments; otherwise, require a constructor with empty argument list (default).} \item{strict_identifier}{\code{[logical(1)]}\cr Set to \code{TRUE} if the DBMS does not support arbitrarily-named identifiers even when quoting is used.} \item{omit_blob_tests}{\code{[logical(1)]}\cr Set to \code{TRUE} if the DBMS does not support a \code{BLOB} data type.} \item{current_needs_parens}{\code{[logical(1)]}\cr Set to \code{TRUE} if the SQL functions \code{current_date}, \code{current_time}, and \code{current_timestamp} require parentheses.} \item{union}{\code{[function(character)]}\cr Function that combines several subqueries into one so that the resulting query returns the concatenated results of the subqueries} \item{placeholder_pattern}{\code{[character]}\cr A pattern for placeholders used in \code{\link[=dbBind]{dbBind()}}, e.g., \code{"?"}, \code{"$1"}, or \code{":name"}. See \code{\link[=make_placeholder_fun]{make_placeholder_fun()}} for details.} \item{logical_return}{\code{[function(logical)]}\cr A vectorized function that converts logical values to the data type returned by the DBI backend.} \item{date_cast}{\code{[function(character)]}\cr A vectorized function that creates an SQL expression for coercing a string to a date value.} \item{time_cast}{\code{[function(character)]}\cr A vectorized function that creates an SQL expression for coercing a string to a time value.} \item{timestamp_cast}{\code{[function(character)]}\cr A vectorized function that creates an SQL expression for coercing a string to a timestamp value.} \item{date_typed}{\code{[logical(1L)]}\cr Set to \code{FALSE} if the DBMS doesn't support a dedicated type for dates.} \item{time_typed}{\code{[logical(1L)]}\cr Set to \code{FALSE} if the DBMS doesn't support a dedicated type for times.} \item{timestamp_typed}{\code{[logical(1L)]}\cr Set to \code{FALSE} if the DBMS doesn't support a dedicated type for timestamps.} \item{temporary_tables}{\code{[logical(1L)]}\cr Set to \code{FALSE} if the DBMS doesn't support temporary tables.} } \description{ TBD. 
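% Illustrative sketch (not in the original sources): a backend without BLOB
% support that uses "?" placeholders might declare
%   tweaks(omit_blob_tests = TRUE, placeholder_pattern = "?")
% and pass the result to the `tweaks` argument of make_context().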
} DBItest/man/test_connection.Rd0000644000176200001440000000145013071442207016010 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/test-connection.R \name{test_connection} \alias{test_connection} \title{Test the "Connection" class} \usage{ test_connection(skip = NULL, ctx = get_default_context()) } \arguments{ \item{skip}{\code{[character()]}\cr A vector of regular expressions to match against test names; skip test if matching any.} \item{ctx}{\code{[DBItest_context]}\cr A test context as created by \code{\link[=make_context]{make_context()}}.} } \description{ Test the "Connection" class } \seealso{ Other tests: \code{\link{test_compliance}}, \code{\link{test_driver}}, \code{\link{test_getting_started}}, \code{\link{test_meta}}, \code{\link{test_result}}, \code{\link{test_sql}}, \code{\link{test_stress}}, \code{\link{test_transaction}} } DBItest/man/spec_meta_get_statement.Rd0000644000176200001440000000105113071442207017472 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec-meta-get-statement.R \docType{data} \name{spec_meta_get_statement} \alias{spec_meta_get_statement} \title{spec_meta_get_statement} \value{ \code{dbGetStatement()} returns a string, the query used in either \code{\link[=dbSendQuery]{dbSendQuery()}} or \code{\link[=dbSendStatement]{dbSendStatement()}}. Attempting to query the statement for a result set cleared with \code{\link[=dbClearResult]{dbClearResult()}} gives an error. } \description{ spec_meta_get_statement } DBItest/man/DBIspec-wip.Rd0000644000176200001440000000225513104147640014664 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec.R, R/spec-driver-get-info.R, % R/spec-connection-get-info.R, R/spec-sql-list-fields.R, % R/spec-meta-column-info.R, R/spec-meta-get-info-result.R, % R/spec-stress-connection.R \docType{data} \name{DBIspec-wip} \alias{DBIspec-wip} \title{DBI specification (work in progress)} \description{ Placeholder page. } \section{Driver}{ \subsection{\code{dbGetInfo("DBIDriver")} (deprecated)}{ Return value of dbGetInfo has necessary elements. } } \section{Connection}{ \subsection{\code{dbGetInfo("DBIConnection")} (deprecated)}{ Return value of dbGetInfo has necessary elements } \subsection{Stress tests}{ Open 50 simultaneous connections Open and close 50 connections } } \section{SQL}{ \subsection{\code{dbListFields("DBIConnection")}}{ Can list the fields for a table in the database. A column named \code{row_names} is treated like any other column. } } \section{Meta}{ \subsection{\code{dbColumnInfo("DBIResult")}}{ Column information is correct. 
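% Illustrative sketch (not in the original sources), assuming `con` is an open
% connection:
%   res <- dbSendQuery(con, "SELECT 1 AS a")
%   dbColumnInfo(res)   # a data frame describing the result columns
%   dbClearResult(res)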
} \subsection{\code{dbGetInfo("DBIResult")} (deprecated)}{ Return value of dbGetInfo has necessary elements } } \keyword{datasets} \keyword{internal} DBItest/man/spec_result_roundtrip.Rd0000644000176200001440000000404013232372770017254 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec-result-roundtrip.R \docType{data} \name{spec_result_roundtrip} \alias{spec_result_roundtrip} \title{spec_result_roundtrip} \description{ spec_result_roundtrip } \section{Specification}{ The column types of the returned data frame depend on the data returned: \itemize{ \item \link{integer} for integer values between -2^31 and 2^31 - 1 \item \link{numeric} for numbers with a fractional component \item \link{logical} for Boolean values (some backends may return an integer) \item \link{character} for text \item lists of \link{raw} for blobs (with \code{NULL} entries for SQL NULL values) \item coercible using \code{\link[=as.Date]{as.Date()}} for dates (also applies to the return value of the SQL function \code{current_date}) \item coercible using \code{\link[hms:as.hms]{hms::as.hms()}} for times (also applies to the return value of the SQL function \code{current_time}) \item coercible using \code{\link[=as.POSIXct]{as.POSIXct()}} for timestamps (also applies to the return value of the SQL function \code{current_timestamp}) \item \link{NA} for SQL \code{NULL} values } If dates and timestamps are supported by the backend, the following R types are used: \itemize{ \item \link{Date} for dates (also applies to the return value of the SQL function \code{current_date}) \item \link{POSIXct} for timestamps (also applies to the return value of the SQL function \code{current_timestamp}) } R has no built-in type with lossless support for the full range of 64-bit or larger integers. If 64-bit integers are returned from a query, the following rules apply: \itemize{ \item Values are returned in a container with support for the full range of valid 64-bit values (such as the \code{integer64} class of the \pkg{bit64} package) \item Coercion to numeric always returns a number that is as close as possible to the true value \item Loss of precision when converting to numeric gives a warning \item Conversion to character always returns a lossless decimal representation of the data } } DBItest/man/spec_result_clear_result.Rd0000644000176200001440000000144313071442207017710 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec-result-clear-result.R \docType{data} \name{spec_result_clear_result} \alias{spec_result_clear_result} \title{spec_result_clear_result} \value{ \code{dbClearResult()} returns \code{TRUE}, invisibly, for result sets obtained from both \code{dbSendQuery()} and \code{dbSendStatement()}. An attempt to close an already closed result set issues a warning in both cases. } \description{ spec_result_clear_result } \section{Specification}{ \code{dbClearResult()} frees all resources associated with retrieving the result of a query or update operation. The DBI backend can expect a call to \code{dbClearResult()} for each \code{\link[=dbSendQuery]{dbSendQuery()}} or \code{\link[=dbSendStatement]{dbSendStatement()}} call. 
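The result life cycle that the roundtrip and \code{dbClearResult()} specifications above rely on can be sketched as follows, again assuming RSQLite for illustration.

con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
res <- DBI::dbSendQuery(con, "SELECT 1 AS n, 'a' AS s")
DBI::dbFetch(res)        # a fully typed data frame: integer and character columns
DBI::dbClearResult(res)  # TRUE, invisibly; clearing a second time would only warn
res2 <- DBI::dbSendStatement(con, "CREATE TABLE t (a integer)")
DBI::dbClearResult(res2) # each dbSendQuery()/dbSendStatement() call needs a matching clear
DBI::dbDisconnect(con)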
} DBItest/man/spec_transaction_begin_commit_rollback.Rd0000644000176200001440000000404313232372770022545 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/spec-transaction-begin-commit-rollback.R
\docType{data}
\name{spec_transaction_begin_commit_rollback}
\alias{spec_transaction_begin_commit_rollback}
\title{spec_transaction_begin_commit_rollback}
\value{
\code{dbBegin()}, \code{dbCommit()} and \code{dbRollback()} return \code{TRUE}, invisibly. The implementations are expected to raise an error in case of failure, but this is not tested. In any case, all generics throw an error with a closed or invalid connection. In addition, a call to \code{dbCommit()} or \code{dbRollback()} without a prior call to \code{dbBegin()} raises an error. Nested transactions are not supported by DBI; an attempt to call \code{dbBegin()} twice yields an error.
}
\description{
spec_transaction_begin_commit_rollback
}
\section{Specification}{
Actual support for transactions may vary between backends. A transaction is initiated by a call to \code{dbBegin()} and committed by a call to \code{dbCommit()}. Data written in a transaction must persist after the transaction is committed. For example, a table that is missing when the transaction is started but is created and populated during the transaction must exist and contain the data added to it both during and after the transaction, and also in a new connection.
A transaction can also be aborted with \code{dbRollback()}. All data written in such a transaction must be removed after the transaction is rolled back. For example, a table that is missing when the transaction is started but is created during the transaction must no longer exist after the rollback.
Disconnection from a connection with an open transaction effectively rolls back the transaction. All data written in such a transaction must be removed after the transaction is rolled back.
The behavior is not specified if other arguments are passed to these functions. In particular, \pkg{RSQLite} issues named transactions with support for nesting if the \code{name} argument is set.
The transaction isolation level is not specified by DBI.
} DBItest/man/spec_result_create_table_with_data_type.Rd0000644000176200001440000000075113071442207022724 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/spec-result-create-table-with-data-type.R
\docType{data}
\name{spec_result_create_table_with_data_type}
\alias{spec_result_create_table_with_data_type}
\title{spec_result_create_table_with_data_type}
\description{
spec_result_create_table_with_data_type
}
\section{Specification}{
All data types returned by \code{dbDataType()} are usable in an SQL statement of the form \code{"CREATE TABLE test (a ...)"}.
} DBItest/man/spec_driver_connect.Rd0000644000176200001440000000204613071442207016632 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/spec-driver-connect.R
\docType{data}
\name{spec_driver_connect}
\alias{spec_driver_connect}
\title{spec_driver_connect}
\value{
\code{dbConnect()} returns an S4 object that inherits from \linkS4class{DBIConnection}. This object is used to communicate with the database engine.
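A minimal sketch of the transaction behaviour specified earlier in this entry (data written before \code{dbRollback()} must disappear); an RSQLite connection is assumed for illustration.

con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
DBI::dbBegin(con)
DBI::dbWriteTable(con, "tx_demo", data.frame(a = 1:3))
DBI::dbExistsTable(con, "tx_demo")  # TRUE while the transaction is open
DBI::dbRollback(con)
DBI::dbExistsTable(con, "tx_demo")  # FALSE: the rollback removed the table
DBI::dbDisconnect(con)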
} \description{ spec_driver_connect } \section{Specification}{ DBI recommends using the following argument names for authentication parameters, with \code{NULL} default: \itemize{ \item \code{user} for the user name (default: current user) \item \code{password} for the password \item \code{host} for the host name (default: local connection) \item \code{port} for the port number (default: local connection) \item \code{dbname} for the name of the database on the host, or the database file name } The defaults should provide reasonable behavior, in particular a local connection for \code{host = NULL}. For some DBMS (e.g., PostgreSQL), this is different to a TCP/IP connection to \code{localhost}. } DBItest/man/spec_sql_write_table.Rd0000644000176200001440000001115413232372770017014 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec-sql-write-table.R \docType{data} \name{spec_sql_write_table} \alias{spec_sql_write_table} \title{spec_sql_write_table} \value{ \code{dbWriteTable()} returns \code{TRUE}, invisibly. If the table exists, and both \code{append} and \code{overwrite} arguments are unset, or \code{append = TRUE} and the data frame with the new data has different column names, an error is raised; the remote table remains unchanged. An error is raised when calling this method for a closed or invalid connection. An error is also raised if \code{name} cannot be processed with \code{\link[=dbQuoteIdentifier]{dbQuoteIdentifier()}} or if this results in a non-scalar. Invalid values for the additional arguments \code{row.names}, \code{overwrite}, \code{append}, \code{field.types}, and \code{temporary} (non-scalars, unsupported data types, \code{NA}, incompatible values, duplicate or missing names, incompatible columns) also raise an error. } \description{ spec_sql_write_table } \section{Additional arguments}{ The following arguments are not part of the \code{dbWriteTable()} generic (to improve compatibility across backends) but are part of the DBI specification: \itemize{ \item \code{row.names} (default: \code{NA}) \item \code{overwrite} (default: \code{FALSE}) \item \code{append} (default: \code{FALSE}) \item \code{field.types} (default: \code{NULL}) \item \code{temporary} (default: \code{FALSE}) } They must be provided as named arguments. See the "Specification" and "Value" sections for details on their usage. } \section{Specification}{ The \code{name} argument is processed as follows, to support databases that allow non-syntactic names for their objects: \itemize{ \item If an unquoted table name as string: \code{dbWriteTable()} will do the quoting, perhaps by calling \code{dbQuoteIdentifier(conn, x = name)} \item If the result of a call to \code{\link[=dbQuoteIdentifier]{dbQuoteIdentifier()}}: no more quoting is done } If the \code{overwrite} argument is \code{TRUE}, an existing table of the same name will be overwritten. This argument doesn't change behavior if the table does not exist yet. If the \code{append} argument is \code{TRUE}, the rows in an existing table are preserved, and the new data are appended. If the table doesn't exist yet, it is created. If the \code{temporary} argument is \code{TRUE}, the table is not available in a second connection and is gone after reconnecting. Not all backends support this argument. A regular, non-temporary table is visible in a second connection and after reconnecting to the database. SQL keywords can be used freely in table names, column names, and data. 
Quotes, commas, and spaces can also be used in the data, and, if the database supports non-syntactic identifiers, also for table names and column names. The following data types must be supported at least, and be read identically with \code{\link[=dbReadTable]{dbReadTable()}}: \itemize{ \item integer \item numeric (also with \code{Inf} and \code{NaN} values, the latter are translated to \code{NA}) \item logical \item \code{NA} as NULL \item 64-bit values (using \code{"bigint"} as field type); the result can be converted to a numeric, which may lose precision, \item character (in both UTF-8 and native encodings), supporting empty strings \item factor (returned as character) \item list of raw (if supported by the database) \item objects of type \link[blob:blob]{blob::blob} (if supported by the database) \item date (if supported by the database; returned as \code{Date}) \item time (if supported by the database; returned as objects that inherit from \code{difftime}) \item timestamp (if supported by the database; returned as \code{POSIXct} with time zone support) } Mixing column types in the same table is supported. The \code{field.types} argument must be a named character vector with at most one entry for each column. It indicates the SQL data type to be used for a new column. The interpretation of \link{rownames} depends on the \code{row.names} argument, see \code{\link[=sqlRownamesToColumn]{sqlRownamesToColumn()}} for details: \itemize{ \item If \code{FALSE} or \code{NULL}, row names are ignored. \item If \code{TRUE}, row names are converted to a column named "row_names", even if the input data frame only has natural row names from 1 to \code{nrow(...)}. \item If \code{NA}, a column named "row_names" is created if the data has custom row names, no extra column is created in the case of natural row names. \item If a string, this specifies the name of the column in the remote table that contains the row names, even if the input data frame only has natural row names. } } DBItest/man/test_data_type.Rd0000644000176200001440000000305413103067422015623 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec-driver-data-type.R \name{test_data_type} \alias{test_data_type} \title{test_data_type} \usage{ test_data_type(ctx, dbObj) } \arguments{ \item{ctx, dbObj}{Arguments to internal test function} } \value{ \code{dbDataType()} returns the SQL type that corresponds to the \code{obj} argument as a non-empty character string. For data frames, a character vector with one element per column is returned. An error is raised for invalid values for the \code{obj} argument such as a \code{NULL} value. } \description{ test_data_type } \section{Specification}{ The backend can override the \code{\link[=dbDataType]{dbDataType()}} generic for its driver class. This generic expects an arbitrary object as second argument. To query the values returned by the default implementation, run \code{example(dbDataType, package = "DBI")}. If the backend needs to override this generic, it must accept all basic R data types as its second argument, namely \link{logical}, \link{integer}, \link{numeric}, \link{character}, dates (see \link{Dates}), date-time (see \link{DateTimeClasses}), and \link{difftime}. If the database supports blobs, this method also must accept lists of \link{raw} vectors, and \link[blob:blob]{blob::blob} objects. As-is objects (i.e., wrapped by \code{\link[=I]{I()}}) must be supported and return the same results as their unwrapped counterparts. 
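Stepping back to the \code{dbWriteTable()} specification above, its additional arguments can be sketched as follows; RSQLite and the table names are assumptions for illustration.

con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
df <- data.frame(a = 1:3, b = letters[1:3], stringsAsFactors = FALSE)
DBI::dbWriteTable(con, "demo", df)                       # create the table
DBI::dbWriteTable(con, "demo", df, append = TRUE)        # now six rows
DBI::dbWriteTable(con, "demo", df, overwrite = TRUE)     # back to three rows
DBI::dbWriteTable(con, "scratch", df, temporary = TRUE)  # not visible to other connections
DBI::dbDisconnect(con)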
The SQL data type for \link{factor} and \link{ordered} is the same as for character. The behavior for other object types is not specified.
} DBItest/man/spec_result_get_query.Rd0000644000176200001440000000325513104147640017233 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/spec-result-get-query.R
\docType{data}
\name{spec_result_get_query}
\alias{spec_result_get_query}
\title{spec_result_get_query}
\value{
\code{dbGetQuery()} always returns a \link{data.frame} with as many rows as records were fetched and as many columns as fields in the result set, even if the result is a single value or has one or zero rows.
An error is raised when issuing a query over a closed or invalid connection, if the syntax of the query is invalid, or if the query is not a non-\code{NA} string.
If the \code{n} argument is not an atomic whole number greater than or equal to -1, or Inf, an error is raised, but a subsequent call to \code{dbGetQuery()} with a proper \code{n} argument succeeds.
}
\description{
spec_result_get_query
}
\section{Additional arguments}{
The following arguments are not part of the \code{dbGetQuery()} generic (to improve compatibility across backends) but are part of the DBI specification:
\itemize{
\item \code{n} (default: -1)
\item \code{params} (TBD)
}
They must be provided as named arguments. See the "Specification" and "Value" sections for details on their usage.
}
\section{Specification}{
Fetching multi-row queries with one or more columns by default returns the entire result. A value of \link{Inf} for the \code{n} argument is supported and also returns the full result. If more rows than available are fetched, the result is returned in full without warning. If zero rows are fetched, the columns of the data frame are still fully typed. Fetching fewer rows than available is permitted; no warning is issued. A column named \code{row_names} is treated like any other column.
} DBItest/man/context.Rd0000644000176200001440000000217013071442207014276 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/context.R
\name{make_context}
\alias{make_context}
\alias{set_default_context}
\alias{get_default_context}
\title{Test contexts}
\usage{
make_context(drv, connect_args, set_as_default = TRUE, tweaks = NULL,
  name = NULL)

set_default_context(ctx)

get_default_context()
}
\arguments{
\item{drv}{\code{[DBIDriver]}\cr An expression that constructs a DBI driver, like \code{SQLite()}.}
\item{connect_args}{\code{[named list]}\cr Connection arguments (names and values).}
\item{set_as_default}{\code{[logical(1)]}\cr Should the created context be set as the default context?}
\item{tweaks}{\code{[DBItest_tweaks]}\cr Tweaks as constructed by the \code{\link[=tweaks]{tweaks()}} function.}
\item{name}{\code{[character]}\cr An optional name for the context, used in test messages.}
\item{ctx}{\code{[DBItest_context]}\cr A test context.}
}
\value{
\code{[DBItest_context]}\cr A test context; for \code{set_default_context}, the previous default context (invisibly) or \code{NULL}.
}
\description{
Create a test context, set and query the default context.
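To illustrate the \code{dbGetQuery()} specification above, in particular the \code{n} argument, consider this sketch; RSQLite is assumed for illustration, and support for \code{n} may vary across backends.

con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
DBI::dbWriteTable(con, "demo", data.frame(a = 1:5))
DBI::dbGetQuery(con, "SELECT * FROM demo")           # all five rows
DBI::dbGetQuery(con, "SELECT * FROM demo", n = 2)    # only the first two rows
DBI::dbGetQuery(con, "SELECT * FROM demo", n = Inf)  # again the full result
DBI::dbDisconnect(con)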
} DBItest/man/DBIspec.Rd0000644000176200001440000000316013232405532014062 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec.R, R/spec-getting-started.R, % R/spec-compliance-methods.R, R/spec-driver-constructor.R, R/spec-driver.R, % R/spec-connection.R, R/spec-result.R, R/spec-sql.R, R/spec-meta.R, % R/spec-transaction.R, R/spec-compliance.R, R/spec-stress.R \docType{data} \name{DBIspec} \alias{DBIspec} \title{DBI specification} \description{ Placeholder page. } \section{Definition}{ A DBI backend is an R package which imports the \pkg{DBI} and \pkg{methods} packages. For better or worse, the names of many existing backends start with \sQuote{R}, e.g., \pkg{RSQLite}, \pkg{RMySQL}, \pkg{RSQLServer}; it is up to the backend author to adopt this convention or not. } \section{DBI classes and methods}{ A backend defines three classes, which are subclasses of \linkS4class{DBIDriver}, \linkS4class{DBIConnection}, and \linkS4class{DBIResult}. The backend provides implementation for all methods of these base classes that are defined but not implemented by DBI. All methods have an ellipsis \code{...} in their formals. } \section{Construction of the DBIDriver object}{ The backend must support creation of an instance of its \linkS4class{DBIDriver} subclass with a \dfn{constructor function}. By default, its name is the package name without the leading \sQuote{R} (if it exists), e.g., \code{SQLite} for the \pkg{RSQLite} package. However, backend authors may choose a different name. The constructor must be exported, and it must be a function that is callable without arguments. DBI recommends to define a constructor with an empty argument list. } \keyword{datasets} DBItest/man/spec_meta_is_valid.Rd0000644000176200001440000000227013232372770016433 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec-meta-is-valid.R \docType{data} \name{spec_meta_is_valid} \alias{spec_meta_is_valid} \title{spec_meta_is_valid} \value{ \code{dbIsValid()} returns a logical scalar, \code{TRUE} if the object specified by \code{dbObj} is valid, \code{FALSE} otherwise. A \linkS4class{DBIConnection} object is initially valid, and becomes invalid after disconnecting with \code{\link[=dbDisconnect]{dbDisconnect()}}. A \linkS4class{DBIResult} object is valid after a call to \code{\link[=dbSendQuery]{dbSendQuery()}}, and stays valid even after all rows have been fetched; only clearing it with \code{\link[=dbClearResult]{dbClearResult()}} invalidates it. A \linkS4class{DBIResult} object is also valid after a call to \code{\link[=dbSendStatement]{dbSendStatement()}}, and stays valid after querying the number of rows affected; only clearing it with \code{\link[=dbClearResult]{dbClearResult()}} invalidates it. If the connection to the database system is dropped (e.g., due to connectivity problems, server failure, etc.), \code{dbIsValid()} should return \code{FALSE}. This is not tested automatically. } \description{ spec_meta_is_valid } DBItest/man/spec_transaction_with_transaction.Rd0000644000176200001440000000215013076746114021620 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec-transaction-with-transaction.R \docType{data} \name{spec_transaction_with_transaction} \alias{spec_transaction_with_transaction} \title{spec_transaction_with_transaction} \value{ \code{dbWithTransaction()} returns the value of the executed code. 
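The following sketch illustrates \code{dbWithTransaction()} together with \code{dbBreak()}, whose semantics are spelled out in the Specification section of this entry; RSQLite is assumed for illustration.

con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
DBI::dbWriteTable(con, "accounts", data.frame(balance = 100))
DBI::dbWithTransaction(con, {
  DBI::dbExecute(con, "UPDATE accounts SET balance = balance - 150")
  balance <- DBI::dbGetQuery(con, "SELECT balance FROM accounts")$balance
  if (balance < 0) DBI::dbBreak()  # silently aborts and rolls back the transaction
})
DBI::dbGetQuery(con, "SELECT balance FROM accounts")$balance  # still 100
DBI::dbDisconnect(con)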
Failure to initiate the transaction (e.g., if the connection is closed or invalid, or if \code{\link[=dbBegin]{dbBegin()}} has already been called) gives an error.
}
\description{
spec_transaction_with_transaction
}
\section{Specification}{
\code{dbWithTransaction()} initiates a transaction with \code{dbBegin()}, executes the code given in the \code{code} argument, and commits the transaction with \code{\link[=dbCommit]{dbCommit()}}. If the code raises an error, the transaction is instead aborted with \code{\link[=dbRollback]{dbRollback()}}, and the error is propagated. If the code calls \code{dbBreak()}, execution of the code stops and the transaction is silently aborted. All side effects caused by the code (such as the creation of new variables) propagate to the calling environment.
} DBItest/man/spec_sql_list_tables.Rd0000644000176200001440000000162313076746114017023 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/spec-sql-list-tables.R
\docType{data}
\name{spec_sql_list_tables}
\alias{spec_sql_list_tables}
\title{spec_sql_list_tables}
\value{
\code{dbListTables()} returns a character vector that enumerates all tables and views in the database. Tables added with \code{\link[=dbWriteTable]{dbWriteTable()}} are part of the list, including temporary tables if supported by the database. As soon as a table is removed from the database, it is also removed from the list of database tables.
The returned names are suitable for quoting with \code{dbQuoteIdentifier()}.
An error is raised when calling this method for a closed or invalid connection.
}
\description{
spec_sql_list_tables
}
\section{Additional arguments}{
TBD: \code{temporary = NA}
This must be provided as a named argument. See the "Specification" section for details on its usage.
} DBItest/man/test_transaction.Rd0000644000176200001440000000144713071442207016204 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/test-transaction.R
\name{test_transaction}
\alias{test_transaction}
\title{Test transaction functions}
\usage{
test_transaction(skip = NULL, ctx = get_default_context())
}
\arguments{
\item{skip}{\code{[character()]}\cr A vector of regular expressions to match against test names; skip test if matching any.}
\item{ctx}{\code{[DBItest_context]}\cr A test context as created by \code{\link[=make_context]{make_context()}}.}
}
\description{
Test transaction functions
}
\seealso{
Other tests: \code{\link{test_compliance}}, \code{\link{test_connection}}, \code{\link{test_driver}}, \code{\link{test_getting_started}}, \code{\link{test_meta}}, \code{\link{test_result}}, \code{\link{test_sql}}, \code{\link{test_stress}}
} DBItest/man/spec_driver_data_type.Rd0000644000176200001440000000275513103067422017160 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/spec-driver-data-type.R
\docType{data}
\name{spec_driver_data_type}
\alias{spec_driver_data_type}
\title{spec_driver_data_type}
\value{
\code{dbDataType()} returns the SQL type that corresponds to the \code{obj} argument as a non-empty character string. For data frames, a character vector with one element per column is returned.
An error is raised for invalid values for the \code{obj} argument such as a \code{NULL} value.
}
\description{
spec_driver_data_type
}
\section{Specification}{
The backend can override the \code{\link[=dbDataType]{dbDataType()}} generic for its driver class. This generic expects an arbitrary object as second argument.
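The \code{dbListTables()} behaviour described above can be sketched as follows; RSQLite is assumed for illustration.

con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
DBI::dbListTables(con)                              # character(0) in a fresh database
DBI::dbWriteTable(con, "demo", data.frame(a = 1L))
DBI::dbListTables(con)                              # now includes "demo"
DBI::dbRemoveTable(con, "demo")
DBI::dbListTables(con)                              # "demo" is gone again
DBI::dbDisconnect(con)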
To query the values returned by the default implementation, run \code{example(dbDataType, package = "DBI")}. If the backend needs to override this generic, it must accept all basic R data types as its second argument, namely \link{logical}, \link{integer}, \link{numeric}, \link{character}, dates (see \link{Dates}), date-time (see \link{DateTimeClasses}), and \link{difftime}. If the database supports blobs, this method also must accept lists of \link{raw} vectors, and \link[blob:blob]{blob::blob} objects. As-is objects (i.e., wrapped by \code{\link[=I]{I()}}) must be supported and return the same results as their unwrapped counterparts. The SQL data type for \link{factor} and \link{ordered} is the same as for character. The behavior for other object types is not specified.
} DBItest/man/spec_sql_read_table.Rd0000644000176200001440000000512213121446671016572 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/spec-sql-read-table.R
\docType{data}
\name{spec_sql_read_table}
\alias{spec_sql_read_table}
\title{spec_sql_read_table}
\value{
\code{dbReadTable()} returns a data frame that contains the complete data from the remote table, effectively the result of calling \code{\link[=dbGetQuery]{dbGetQuery()}} with \code{SELECT * FROM <name>}.
An error is raised if the table does not exist. An empty table is returned as a data frame with zero rows.
The presence of \link{rownames} depends on the \code{row.names} argument; see \code{\link[=sqlColumnToRownames]{sqlColumnToRownames()}} for details:
\itemize{
\item If \code{FALSE} or \code{NULL}, the returned data frame doesn't have row names.
\item If \code{TRUE}, a column named "row_names" is converted to row names; an error is raised if no such column exists.
\item If \code{NA}, a column named "row_names" is converted to row names if it exists; otherwise, no translation occurs.
\item If a string, this specifies the name of the column in the remote table that contains the row names; an error is raised if no such column exists.
}
The default is \code{row.names = FALSE}.
If the database supports identifiers with special characters, the columns in the returned data frame are converted to valid R identifiers if the \code{check.names} argument is \code{TRUE}; otherwise, non-syntactic column names can be returned unquoted.
An error is raised when calling this method for a closed or invalid connection. An error is raised if \code{name} cannot be processed with \code{\link[=dbQuoteIdentifier]{dbQuoteIdentifier()}} or if this results in a non-scalar. Unsupported values for \code{row.names} and \code{check.names} (non-scalars, unsupported data types, \code{NA} for \code{check.names}) also raise an error.
}
\description{
spec_sql_read_table
}
\section{Additional arguments}{
The following arguments are not part of the \code{dbReadTable()} generic (to improve compatibility across backends) but are part of the DBI specification:
\itemize{
\item \code{row.names}
\item \code{check.names}
}
They must be provided as named arguments. See the "Value" section for details on their usage.
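Returning to the \code{dbDataType()} specification concluded above, a short sketch; RSQLite is assumed, and the exact type names returned are backend-specific.

drv <- RSQLite::SQLite()
DBI::dbDataType(drv, 1L)                           # e.g. "INTEGER"
DBI::dbDataType(drv, 1.5)                          # e.g. "REAL"
DBI::dbDataType(drv, "text")                       # e.g. "TEXT"
DBI::dbDataType(drv, data.frame(a = 1L, b = "x"))  # one type per column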
} \section{Specification}{ The \code{name} argument is processed as follows, to support databases that allow non-syntactic names for their objects: \itemize{ \item If an unquoted table name as string: \code{dbReadTable()} will do the quoting, perhaps by calling \code{dbQuoteIdentifier(conn, x = name)} \item If the result of a call to \code{\link[=dbQuoteIdentifier]{dbQuoteIdentifier()}}: no more quoting is done } } DBItest/man/DBItest-package.Rd0000644000176200001440000000144713232372770015515 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/DBItest.R \docType{package} \name{DBItest-package} \alias{DBItest} \alias{DBItest-package} \title{DBItest: Testing 'DBI' Back Ends} \description{ A helper that tests 'DBI' back ends for conformity to the interface. } \details{ The two most important functions are \code{\link[=make_context]{make_context()}} and \code{\link[=test_all]{test_all()}}. The former tells the package how to connect to your DBI backend, the latter executes all tests of the test suite. More fine-grained test functions (all with prefix \code{test_}) are available. See the package's vignette for more details. } \seealso{ Useful links: \itemize{ \item Report bugs at \url{https://github.com/rstats-db/DBItest/issues} } } \author{ Kirill Müller } DBItest/man/spec_result_execute.Rd0000644000176200001440000000153013071442207016663 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spec-result-execute.R \docType{data} \name{spec_result_execute} \alias{spec_result_execute} \title{spec_result_execute} \value{ \code{dbExecute()} always returns a scalar numeric that specifies the number of rows affected by the statement. An error is raised when issuing a statement over a closed or invalid connection, if the syntax of the statement is invalid, or if the statement is not a non-\code{NA} string. } \description{ spec_result_execute } \section{Additional arguments}{ The following argument is not part of the \code{dbExecute()} generic (to improve compatibility across backends) but is part of the DBI specification: \itemize{ \item \code{params} (TBD) } They must be provided as named arguments. See the "Specification" section for details on its usage. }
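Finally, a minimal sketch of the \code{dbExecute()} return value described above; RSQLite is assumed for illustration.

con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
DBI::dbExecute(con, "CREATE TABLE demo (a integer)")          # 0 rows affected
DBI::dbExecute(con, "INSERT INTO demo VALUES (1), (2), (3)")  # 3 rows affected
DBI::dbDisconnect(con)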