polycor/0000755000176200001440000000000014167354404011751 5ustar liggesuserspolycor/NAMESPACE0000644000176200001440000000100714167115470013164 0ustar liggesusers# last modified 2022-01-10 by J. Fox importFrom("stats", "complete.cases", "cor", "dnorm", "na.omit", "optim", "optimise", "pchisq", "pnorm", "qnorm", "sd", "coef") importFrom("parallel", "detectCores") importFrom("admisc", "tryCatchWEM") export(hetcor, polychor, polyserial) export(detectCores) S3method(hetcor, data.frame) S3method(hetcor, default) S3method(print, polycor) S3method(print, hetcor) S3method(as.matrix, hetcor) S3method(coef, polycor) S3method(vcov, polycor) S3method(summary, polycor) polycor/man/0000755000176200001440000000000013514744331012521 5ustar liggesuserspolycor/man/print.polycor.Rd0000644000176200001440000000356314153744747015654 0ustar liggesusers\name{print.polycor} \alias{print.polycor} \alias{summary.polycor} \alias{coef.polycor} \alias{vcov.polycor} \title{Standard Methods for polycor Objects} \description{ Some standard methods for objects of class \code{polycor}, produced by \code{\link{polychor}} and \code{\link{polyserial}}, including \code{\link{print}}, \code{\link{summary}}, \code{\link{coef}}, and \code{\link{vcov}}. The \code{summary} method simply invokes the \code{print} method. } \usage{ \method{print}{polycor}(x, digits = max(3, getOption("digits") - 3), ...) \method{summary}{polycor}(object, ...) \method{coef}{polycor}(object, correlation=TRUE, thresholds=TRUE, ...) \method{vcov}{polycor}(object, correlation=TRUE, thresholds=TRUE, ...) } \arguments{ \item{x, object}{an object of class \code{polycor}, as returned by \code{polychor} or \code{polyserial}.} \item{digits}{number of significant digits to be printed.} \item{correlation}{return the estimated correlation or sampling variance of the correlation.} \item{thresholds}{return the estimated thresholds or sampling variances/covariances of the thresholds.} \item{\dots}{pass arguments from \code{summary} to \code{print}; otherwise not used.} } \author{John Fox \email{jfox@mcmaster.ca}} \seealso{\code{\link{polychor}}, \code{\link{polyserial}}} \examples{ if(require(mvtnorm)){ set.seed(12345) data <- rmvnorm(1000, c(0, 0), matrix(c(1, .5, .5, 1), 2, 2)) x <- data[,1] y <- data[,2] cor(x, y) # sample correlation } if(require(mvtnorm)){ x <- cut(x, c(-Inf, .75, Inf)) y <- cut(y, c(-Inf, -1, .5, 1.5, Inf)) print(polychor(x, y, ML=TRUE, std.err=TRUE), digits=3) # polychoric correlation, ML estimate } if(require(mvtnorm)){ coef(polychor(x, y, ML=TRUE, std.err=TRUE)) } if(require(mvtnorm)){ vcov(polychor(x, y, ML=TRUE, std.err=TRUE)) } } \keyword{print} \keyword{methods} polycor/man/polyserial.Rd0000644000176200001440000001124614153722457015204 0ustar liggesusers\name{polyserial} \alias{polyserial} \title{Polyserial Correlation} \description{ Computes the polyserial correlation (and its standard error) between a quantitative variable and an ordinal variable, based on the assumption that the joint distribution of the quantitative variable and a latent continuous variable underlying the ordinal variable is bivariate normal. Either the maximum-likelihood estimator or a quicker ``two-step'' approximation is available. For the ML estimator the estimates of the thresholds and the covariance matrix of the estimates are also available. 
}
\usage{
polyserial(x, y, ML = FALSE, control = list(), std.err = FALSE,
  maxcor=.9999, bins=4, start, thresholds=FALSE)
}
\arguments{
  \item{x}{a numerical variable.}
  \item{y}{an ordered categorical variable; can be numeric, logical, a factor,
    an ordered factor, or a character variable, but if a factor, its levels
    should be in proper order, and the values of a character variable are
    ordered alphabetically.}
  \item{ML}{if \code{TRUE}, compute the maximum-likelihood estimate; if
    \code{FALSE}, the default, compute a quicker ``two-step'' approximation.}
  \item{control}{optional arguments to be passed to the \code{optim} function.}
  \item{std.err}{if \code{TRUE}, return the estimated variance of the
    correlation (for the two-step estimator) or the estimated covariance matrix
    of the correlation and thresholds (for the ML estimator); the default is
    \code{FALSE}.}
  \item{maxcor}{maximum absolute correlation (to insure numerical stability).}
  \item{bins}{the number of bins into which to dissect \code{x} for a test of
    bivariate normality; the default is 4.}
  \item{start}{optional start value(s): if a single number, start value for the
    correlation; if a list with the elements \code{rho} and \code{thresholds},
    start values for these parameters; start values are supplied automatically
    if omitted, and are only relevant when the ML estimator or standard errors
    are selected.}
  \item{thresholds}{if \code{TRUE} (the default is \code{FALSE}) return
    estimated thresholds along with the estimated correlation even if standard
    errors aren't computed.}
}
\value{
  If \code{std.err} or \code{thresholds} is \code{TRUE}, returns an object of
  class \code{"polycor"} with the following components:
  \item{type}{set to \code{"polyserial"}.}
  \item{rho}{the polyserial correlation.}
  \item{cuts}{estimated thresholds for the ordinal variable (\code{y}), for the
    ML estimator.}
  \item{var}{the estimated variance of the correlation, or, for the ML
    estimator, the estimated covariance matrix of the correlation and
    thresholds.}
  \item{n}{the number of observations on which the correlation is based.}
  \item{chisq}{chi-square test for bivariate normality.}
  \item{df}{degrees of freedom for the test of bivariate normality.}
  \item{ML}{\code{TRUE} for the ML estimate, \code{FALSE} for the two-step estimate.}
  Otherwise, returns the polyserial correlation.
}
\details{
  The ML estimator is computed by maximizing the bivariate-normal likelihood
  with respect to the thresholds for \eqn{y}
  (\eqn{\tau^{y}_j, j = 1,\ldots, c - 1}{\tau^y[j], j = 1,\ldots, c - 1}) and
  the population correlation (\eqn{\rho}). The likelihood is maximized
  numerically using the \code{\link{optim}} function, and the covariance matrix
  of the estimated parameters is based on the numerical Hessian computed by
  \code{optim}.

  The two-step estimator is computed by first estimating the thresholds
  (\eqn{\tau^{y}_j, j = 1,\ldots, c - 1}{\tau^y[j], j = 1,\ldots, c - 1}) from
  the marginal distribution of \eqn{y}. If the standard error of
  \eqn{\hat{\rho}}{\rho hat} is requested, the one-dimensional likelihood for
  \eqn{\rho} is then maximized numerically using \code{\link{optim}}; the
  standard error so computed treats the thresholds as fixed. If the standard
  error isn't requested, \eqn{\hat{\rho}}{\rho hat} is computed directly.
}
\references{
  Drasgow, F. (1986)
  Polychoric and polyserial correlations.
  Pp. 68--74 in S. Kotz and N. Johnson, eds.,
  \emph{The Encyclopedia of Statistics, Volume 7.} Wiley.
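For concreteness, the two-step calculation described above can be reproduced from its ingredients: thresholds from the marginal distribution of \code{y}, then the rescaled Pearson correlation of \code{x} with the integer codes of \code{y}. The following sketch is illustrative only (not part of the package); it assumes the \pkg{mvtnorm} package is available for simulating data, and the variable names are arbitrary. Its result should agree with \code{polyserial()}'s two-step estimate.

# Illustrative sketch only: the two-step polyserial estimate computed by hand.
if (require(mvtnorm)) {
    set.seed(12345)
    d <- rmvnorm(1000, c(0, 0), matrix(c(1, .5, .5, 1), 2, 2))
    x <- d[, 1]
    y <- cut(d[, 2], c(-Inf, -1, .5, 1.5, Inf))
    n <- length(y)
    tab <- table(y)
    cuts <- qnorm(cumsum(tab)/n)[-length(tab)]  # thresholds from the marginal distribution of y
    codes <- as.numeric(y)                      # integer codes of the ordinal variable
    rho.2step <- sqrt((n - 1)/n)*sd(codes)*cor(x, codes)/sum(dnorm(cuts))
    c(by.hand = rho.2step, polyserial = polyserial(x, y))  # the two estimates should agree
}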
} \author{John Fox \email{jfox@mcmaster.ca}} \seealso{\code{\link{hetcor}}, \code{\link{polychor}}, \code{\link{print.polycor}}, \code{\link[stats]{optim}}} \examples{ if(require(mvtnorm)){ set.seed(12345) data <- rmvnorm(1000, c(0, 0), matrix(c(1, .5, .5, 1), 2, 2)) x <- data[,1] y <- data[,2] cor(x, y) # sample correlation } if(require(mvtnorm)){ y <- cut(y, c(-Inf, -1, .5, 1.5, Inf)) polyserial(x, y) # 2-step estimate } if(require(mvtnorm)){ polyserial(x, y, ML=TRUE, std.err=TRUE) # ML estimate } } \keyword{models} polycor/man/hetcor.Rd0000644000176200001440000001521714155244532014302 0ustar liggesusers\name{hetcor} \alias{hetcor} \alias{hetcor.default} \alias{hetcor.data.frame} \alias{print.hetcor} \alias{as.matrix.hetcor} \alias{detectCores} \title{Heterogeneous Correlation Matrix} \description{ \code{hetcor} computes a heterogenous correlation matrix, consisting of Pearson product-moment correlations between numeric variables, polyserial correlations between numeric and ordinal variables, and polychoric correlations between ordinal variables. The \code{\link[parallel]{detectCores}} function is imported from the \pkg{parallel} package and re-exported. } \usage{ hetcor(data, ..., ML = FALSE, std.err = TRUE, use=c("complete.obs", "pairwise.complete.obs"), bins=4, pd=TRUE, parallel=FALSE, ncores=detectCores(logical=FALSE), thresholds=FALSE) \method{hetcor}{data.frame}(data, ML = FALSE, std.err = TRUE, use = c("complete.obs", "pairwise.complete.obs"), bins=4, pd=TRUE, parallel=FALSE, ncores=detectCores(logical=FALSE), thresholds=FALSE, ...) \method{hetcor}{default}(data, ..., ML = FALSE, std.err = TRUE, use=c("complete.obs", "pairwise.complete.obs"), bins=4, pd=TRUE, parallel=FALSE, ncores=detectCores(logical=FALSE), thresholds=FALSE) \method{print}{hetcor}(x, digits = max(3, getOption("digits") - 3), ...) \method{as.matrix}{hetcor}(x, ...) detectCores(all.tests=FALSE, logical=TRUE) } \arguments{ \item{data}{a data frame consisting of factors, ordered factors, logical variables, character variables, and/or numeric variables, or the first of several variables.} \item{\dots}{variables and/or arguments to be passed down.} \item{ML}{if \code{TRUE}, compute maximum-likelihood estimates; if \code{FALSE}, compute quick two-step estimates.} \item{std.err}{if \code{TRUE}, compute standard errors.} \item{bins}{number of bins to use for continuous variables in testing bivariate normality; the default is 4.} \item{pd}{if \code{TRUE} and if the correlation matrix is not positive-definite, an attempt will be made to adjust it to a positive-definite matrix, using the \code{\link[Matrix]{nearPD}} function in the \code{Matrix} package. 
Note that default arguments to \code{nearPD} are used (except \code{corr=TRUE}); for more control call \code{nearPD} directly.} \item{parallel}{if \code{TRUE} (the default is \code{FALSE}), perform parallel computations on a computer with multiple CPUs/cores.} \item{ncores}{the number of cores to use for parallel computations; the default is the number of physical cores detected.} \item{use}{if \code{"complete.obs"}, remove observations with any missing data; if \code{"pairwise.complete.obs"}, compute each correlation using all observations with valid data for that pair of variables.} \item{thresholds}{if \code{TRUE} (the default is \code{FALSE}), include the estimated thresholds for polyserial and polychoric correlation in the returned object.} \item{x}{an object of class \code{"hetcor"} to be printed, or from which to extract the correlation matrix.} \item{digits}{number of significant digits.} \item{all.tests}{logical, apply all known tests; default is \code{FALSE}.} \item{logical}{if \code{TRUE}, detect logical CPUs/cores; if \code{FALSE}, detect physical CPUs/cores.} } \value{ \code{hetcor} returns an object of class \code{"hetcor"} with the following components: \item{correlations}{the correlation matrix.} \item{type}{the type of each correlation: \code{"Pearson"}, \code{"Polychoric"}, or \code{"Polyserial"}.} \item{std.errors}{the standard errors of the correlations, if requested.} \item{n}{the number (or numbers) of observations on which the correlations are based.} \item{tests}{p-values for tests of bivariate normality for each pair of variables.} \item{NA.method}{the method by which any missing data were handled: \code{"complete.obs"} or \code{"pairwise.complete.obs"}.} \item{ML}{\code{TRUE} for ML estimates, \code{FALSE} for two-step estimates.} \item{thresholds}{optionally, according to the \code{thresholds} argument, a matrix of mode list with a list of thresholds for each polyserial and polychoric correlation in the elements below the diagonal and the type of each correlation (Pearson, polyserial, or polychoric) above the diagonal.} } \author{John Fox \email{jfox@mcmaster.ca}} \note{Although the function reports standard errors for product-moment correlations, transformations (the most well known is Fisher's \emph{z}-transformation) are available that make the approach to asymptotic normality much more rapid.} \section{Warning}{ Be careful with character variables (as opposed to factors), the values of which are ordered alphabetically. Thus, e.g., the values \code{"disagree"}, \code{"neutral"}, \code{"agree"} are ordered \code{"agree"}, \code{"disagree"}, \code{"neutral"}. } \references{ Drasgow, F. (1986) Polychoric and polyserial correlations. Pp. 68-74 in S. Kotz and N. Johnson, eds., \emph{The Encyclopedia of Statistics, Volume 7.} Wiley. Olsson, U. (1979) Maximum likelihood estimation of the polychoric correlation coefficient. \emph{Psychometrika} \bold{44}, 443-460. Rodriguez, R.N. (1982) Correlation. Pp. 193-204 in S. Kotz and N. Johnson, eds., \emph{The Encyclopedia of Statistics, Volume 2.} Wiley. Ghosh, B.K. (1966) Asymptotic expansion for the moments of the distribution of correlation coefficient. \emph{Biometrika} \bold{53}, 258-262. Olkin, I., and Pratt, J.W. (1958) Unbiased estimation of certain correlation coefficients. \emph{Annals of Mathematical Statistics} \bold{29}, 201-211. 
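As a concrete illustration of the note above about Fisher's \emph{z}, here is a minimal sketch (not part of the package; it assumes the \pkg{mvtnorm} package is available and the default \code{use="complete.obs"}, so that the \code{n} component is a single number) that forms an approximate confidence interval for a Pearson correlation taken from a \code{"hetcor"} object on the \emph{z} scale.

# Sketch only: approximate 95% CI for a Pearson correlation from hetcor(),
# formed on Fisher's z scale rather than directly from the reported standard error.
if (require(mvtnorm)) {
    set.seed(12345)
    d <- as.data.frame(rmvnorm(200, c(0, 0), matrix(c(1, .6, .6, 1), 2, 2)))
    h <- hetcor(d)                  # two numeric variables, so a Pearson correlation
    r <- h$correlations[1, 2]
    n <- h$n                        # a single n under complete-case analysis
    z <- atanh(r)                   # Fisher's z transformation of r
    ci <- tanh(z + c(-1, 1)*qnorm(.975)/sqrt(n - 3))
    c(r = r, lower = ci[1], upper = ci[2])
}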
} \seealso{\code{\link{polychor}}, \code{\link{polyserial}}, \code{\link[Matrix]{nearPD}}, \code{\link[parallel]{detectCores}}} \examples{ if(require(mvtnorm)){ set.seed(12345) R <- matrix(0, 4, 4) R[upper.tri(R)] <- runif(6) diag(R) <- 1 R <- cov2cor(t(R) \%*\% R) round(R, 4) # population correlations data <- rmvnorm(1000, rep(0, 4), R) round(cor(data), 4) # sample correlations } if(require(mvtnorm)){ x1 <- data[,1] x2 <- data[,2] y1 <- cut(data[,3], c(-Inf, .75, Inf)) y2 <- cut(data[,4], c(-Inf, -1, .5, 1.5, Inf)) data <- data.frame(x1, x2, y1, y2) hetcor(data) # Pearson, polychoric, and polyserial correlations, 2-step est. } if(require(mvtnorm)){ hetcor(x1, x2, y1, y2, ML=TRUE) # Pearson, polychoric, polyserial correlations, ML est. } \dontrun{ hc <- hetcor(data, ML=TRUE) # parallel computation: hc.m <- hetcor(data, ML=TRUE, parallel=TRUE, ncores=min(2, detectCores())) hc.m all.equal(hc, hc.m) # error handling: data$y1[data$y2 == "(0.5,1.5]"] <- NA hetcor(data) } } \keyword{models} polycor/man/polychor.Rd0000644000176200001440000001204214167153322014645 0ustar liggesusers\name{polychor} \alias{polychor} \title{Polychoric Correlation} \description{ Computes the polychoric correlation (and its standard error) between two ordinal variables or from their contingency table, under the assumption that the ordinal variables dissect continuous latent variables that are bivariate normal. Either the maximum-likelihood estimator or a (possibly much) quicker ``two-step'' approximation is available. For the ML estimator, the estimates of the thresholds and the covariance matrix of the estimates are also available. } \usage{ polychor(x, y, ML = FALSE, control = list(), std.err = FALSE, maxcor=.9999, start, thresholds=FALSE) } \arguments{ \item{x}{a contingency table of counts or an ordered categorical variable; the latter can be numeric, logical, a factor, an ordered factor, or a character variable, but if a factor, its levels should be in proper order, and the values of a character variable are ordered alphabetically.} \item{y}{if \code{x} is a variable, a second ordered categorical variable.} \item{ML}{if \code{TRUE}, compute the maximum-likelihood estimate; if \code{FALSE}, the default, compute a quicker ``two-step'' approximation.} \item{control}{optional arguments to be passed to the \code{optim} function.} \item{std.err}{if \code{TRUE}, return the estimated variance of the correlation (for the two-step estimator) or the estimated covariance matrix (for the ML estimator) of the correlation and thresholds; the default is \code{FALSE}.} \item{maxcor}{maximum absolute correlation (to insure numerical stability).} \item{start}{optional start value(s): if a single number, start value for the correlation; if a list with the elements \code{rho}, \code{row.thresholds}, and \code{column.thresholds}, start values for these parameters; start values are supplied automatically if omitted, and are only relevant when the ML estimator or standard errors are selected.} \item{thresholds}{if \code{TRUE} (the default is \code{FALSE}) return estimated thresholds along with the estimated correlation even if standard errors aren't computed.} } \value{ If \code{std.err} or \code{thresholds} is \code{TRUE}, returns an object of class \code{"polycor"} with the following components: \item{type}{set to \code{"polychoric"}.} \item{rho}{the polychoric correlation.} \item{row.cuts}{estimated thresholds for the row variable (\code{x}), for the ML estimate.} \item{col.cuts}{estimated thresholds for the column variable 
(\code{y}), for the ML estimate.}
  \item{var}{the estimated variance of the correlation, or, for the ML
    estimate, the estimated covariance matrix of the correlation and
    thresholds.}
  \item{n}{the number of observations on which the correlation is based.}
  \item{chisq}{chi-square test for bivariate normality.}
  \item{df}{degrees of freedom for the test of bivariate normality.}
  \item{ML}{\code{TRUE} for the ML estimate, \code{FALSE} for the two-step estimate.}
  Otherwise, returns the polychoric correlation.
}
\details{
  The ML estimator is computed by maximizing the bivariate-normal likelihood
  with respect to the thresholds for the two variables
  (\eqn{\tau^{x}_i, i = 1,\ldots, r - 1}{\tau^x[i], i = 1,\ldots, r - 1};
  \eqn{\tau^{y}_j, j = 1,\ldots, c - 1}{\tau^y[j], j = 1,\ldots, c - 1}) and the
  population correlation (\eqn{\rho}). Here, \eqn{r} and \eqn{c} are
  respectively the number of levels of \eqn{x} and \eqn{y}. The likelihood is
  maximized numerically using the \code{\link{optim}} function, and the
  covariance matrix of the estimated parameters is based on the numerical
  Hessian computed by \code{optim}.

  The two-step estimator is computed by first estimating the thresholds
  (\eqn{\tau^{x}_i, i = 1,\ldots, r - 1}{\tau^x[i], i = 1,\ldots, r - 1} and
  \eqn{\tau^{y}_j, j = 1,\ldots, c - 1}{\tau^y[j], j = 1,\ldots, c - 1})
  separately from the marginal distribution of each variable. Then the
  one-dimensional likelihood for \eqn{\rho} is maximized numerically, using
  \code{\link{optim}} if standard errors are requested, or
  \code{\link{optimise}} if they are not. The standard error computed treats
  the thresholds as fixed.
}
\references{
  Drasgow, F. (1986)
  Polychoric and polyserial correlations.
  Pp. 68--74 in S. Kotz and N. Johnson, eds.,
  \emph{The Encyclopedia of Statistics, Volume 7.} Wiley.

  Olsson, U. (1979)
  Maximum likelihood estimation of the polychoric correlation coefficient.
  \emph{Psychometrika} \bold{44}, 443-460.
}
\author{John Fox \email{jfox@mcmaster.ca}}
\seealso{\code{\link{hetcor}}, \code{\link{polyserial}},
  \code{\link{print.polycor}}, \code{\link[stats]{optim}}}
\examples{
if(require(mvtnorm)){
    set.seed(12345)
    data <- rmvnorm(1000, c(0, 0), matrix(c(1, .5, .5, 1), 2, 2))
    x <- data[,1]
    y <- data[,2]
    cor(x, y)  # sample correlation
}
if(require(mvtnorm)){
    x <- cut(x, c(-Inf, .75, Inf))
    y <- cut(y, c(-Inf, -1, .5, 1.5, Inf))
    polychor(x, y)  # 2-step estimate
}
if(require(mvtnorm)){
    polychor(x, y, ML=TRUE, std.err=TRUE)  # ML estimate
}
}
\keyword{models}
polycor/DESCRIPTION
Package: polycor
Version: 0.8-1
Date: 2022-01-10
Title: Polychoric and Polyserial Correlations
Authors@R: c(person("John", "Fox", role = c("aut", "cre"), email = "jfox@mcmaster.ca"),
  person("Adrian", "Dusa", role = "ctb"))
Depends: R (>= 3.3.0)
Imports: stats, mvtnorm, Matrix, parallel, admisc (>= 0.22)
ByteCompile: yes
LazyLoad: yes
Description: Computes polychoric and polyserial correlations by quick "two-step"
  methods or ML, optionally with standard errors; tetrachoric and biserial
  correlations are special cases.
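As an illustration of the preceding sentence, the tetrachoric and biserial cases correspond to polychor() with two dichotomous variables and polyserial() with a dichotomous ordinal variable. The following is a sketch only (not part of the package), with simulated data, arbitrary cut points, and the assumption that the mvtnorm package is installed.

# Sketch only: tetrachoric and biserial correlations as special cases.
if (require(mvtnorm)) {
    set.seed(12345)
    d <- rmvnorm(1000, c(0, 0), matrix(c(1, .5, .5, 1), 2, 2))
    x2 <- cut(d[, 1], c(-Inf, 0, Inf))   # dichotomized x
    y2 <- cut(d[, 2], c(-Inf, .5, Inf))  # dichotomized y
    print(polychor(x2, y2))              # tetrachoric correlation (2 x 2 table)
    print(polyserial(d[, 1], y2))        # biserial correlation
}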
License: GPL (>= 2) URL: https://r-forge.r-project.org/projects/polycor/, https://CRAN.R-project.org/package=polycor NeedsCompilation: no Packaged: 2022-01-11 18:44:12 UTC; johnfox Author: John Fox [aut, cre], Adrian Dusa [ctb] Maintainer: John Fox Repository: CRAN Date/Publication: 2022-01-11 19:22:44 UTC polycor/NEWS0000644000176200001440000000317714167115502012452 0ustar liggesusersversion 0.8-1 o Further enhancements to warnings in hetcor() and error reporting in polyserial(); my thanks to Adrian Dusa for changes to his admisc::tryCatchWEM() facilitating these enhancements. o polychor() and polyserial() can return estimated thresholds even when std.err=FALSE, via new thresholds argument; hetcor() has also acquired a thresholds argument (suggestions of Florian Schuberth). o New methods for standard generics, including summary(), coef(), and vcov(). version 0.8-0 o hetcor() can optionally use parallel computations (suggestion of Marc Segond). o polychor() and polyserial() now allow user-specified start values (suggestion of someone whose name is lost to antiquity, with my apologies). o Cases where correlations can't be computed (e.g., because a factor has only one level in the data) are handled more gracefully in hetcor(). version 0.7-10 o Let hetcor() work with objects that inherit from "data.frame" (suggestion of Emil O. W. Kirkegaard). o Allow empty pairs when use="pairwise.complete.obs" for hetcor() (suggestion of Emil O. W. Kirkegaard). o Allow character variables in hetcor() but print message. version 0.7-9 o Further protection/warnings against inadmissible correlations. o Protect against out-of-order thresholds during optimization (suggestion of Evgeny Mirkes). o hetcor() doesn't die when an error is thrown computing a correlation, which is now set to NA. o hetcor() now uses Matrix::nearPD() to force a postitive-definite result rather than the deprecated sfsmisc::nearcor() (suggestion of Timothy Bates). o Improved clarity of documentation. o See the CHANGES file for prior versions. polycor/R/0000755000176200001440000000000014166376455012163 5ustar liggesuserspolycor/R/utilities.R0000644000176200001440000000232713514744332014312 0ustar liggesusers# last modified 29 Mar 07 by J. Fox binBvn <- function(rho, row.cuts, col.cuts, bins=4){ row.cuts <- if (missing(row.cuts)) c(-Inf, 1:(bins - 1)/bins, Inf) else c(-Inf, row.cuts, Inf) col.cuts <- if (missing(col.cuts)) c(-Inf, 1:(bins - 1)/bins, Inf) else c(-Inf, col.cuts, Inf) r <- length(row.cuts) - 1 c <- length(col.cuts) - 1 P <- matrix(0, r, c) R <- matrix(c(1, rho, rho, 1), 2, 2) for (i in 1:r){ for (j in 1:c){ P[i,j] <- mvtnorm::pmvnorm(lower=c(row.cuts[i], col.cuts[j]), upper=c(row.cuts[i+1], col.cuts[j+1]), corr=R) } } P } chisq <- function(x, y, rho, row.cuts, col.cuts, zerotol=1e-6, bins=4){ if (missing(row.cuts)) row.cuts <- qnorm(1:(bins - 1)/bins) if (missing(col.cuts)) col.cuts <- qnorm(1:(bins - 1)/bins) P <- binBvn(rho, row.cuts, col.cuts, bins=bins) if (!is.factor(x)) x <- cut(scale(x), c(-Inf, row.cuts, Inf)) if (!is.factor(y)) y <- cut(scale(y), c(-Inf, col.cuts, Inf)) tab <- table(x, y) n <- sum(tab) 2*sum(tab*log((tab + zerotol)/(P*n))) } as.matrix.hetcor <- function(x, ...) x$correlations polycor/R/polychor.R0000644000176200001440000001402014167115405014125 0ustar liggesusers# last modified 2022-01-10 by J. 
Fox polychor <- function (x, y, ML=FALSE, control=list(), std.err=FALSE, maxcor=.9999, start, thresholds=FALSE){ f <- function(pars) { if (length(pars) == 1){ rho <- pars if (abs(rho) > maxcor) rho <- sign(rho)*maxcor row.cuts <- rc col.cuts <- cc } else { rho <- pars[1] if (abs(rho) > maxcor) rho <- sign(rho)*maxcor row.cuts <- pars[2:r] col.cuts <- pars[(r+1):(r+c-1)] if (any(diff(row.cuts) < 0) || any(diff(col.cuts) < 0)) return(Inf) } P <- binBvn(rho, row.cuts, col.cuts) - sum(tab * log(P)) } tab <- if (missing(y)) x else table(x, y) zerorows <- apply(tab, 1, function(x) all(x == 0)) zerocols <- apply(tab, 2, function(x) all(x == 0)) zr <- sum(zerorows) if (0 < zr) warning(paste(zr, " row", suffix <- if(zr == 1) "" else "s", " with zero marginal", suffix," removed", sep="")) zc <- sum(zerocols) if (0 < zc) warning(paste(zc, " column", suffix <- if(zc == 1) "" else "s", " with zero marginal", suffix, " removed", sep="")) tab <- tab[!zerorows, ,drop=FALSE] tab <- tab[, !zerocols, drop=FALSE] r <- nrow(tab) c <- ncol(tab) if (r < 2) { warning("the table has fewer than 2 rows") return(NA) } if (c < 2) { warning("the table has fewer than 2 columns") return(NA) } n <- sum(tab) rc <- qnorm(cumsum(rowSums(tab))/n)[-r] cc <- qnorm(cumsum(colSums(tab))/n)[-c] if (!missing(start) && (ML || std.err)) { if (is.list(start)){ rho <- start$rho rc <- start$row.thresholds cc <- start$col.thresholds } else { rho <- start } if (!is.numeric(rho) || length(rho) != 1) stop("start value for rho must be a number") if (!is.numeric(rc) || length(rc) != r - 1) stop("start values for row thresholds must be ", r - 1, "numbers") if (!is.numeric(cc) || length(cc) != c - 1) stop("start values for column thresholds must be ", c - 1, "numbers") } if (ML) { result <- optim( c(if (missing(start)) optimise(f, interval=c(-1, 1))$minimum else rho, rc, cc), f, control=control, hessian=std.err ) if (result$par[1] > 1){ result$par[1] <- maxcor warning(paste("inadmissible correlation set to", maxcor)) } else if (result$par[1] < -1){ result$par[1] <- -maxcor warning(paste("inadmissible correlation set to -", maxcor, sep="")) } if (std.err) { chisq <- 2*(result$value + sum(tab * log((tab + 1e-6)/n))) df <- length(tab) - r - c result <- list(type="polychoric", rho=result$par[1], row.cuts=result$par[2:r], col.cuts=result$par[(r+1):(r+c-1)], var=solve(result$hessian), n=n, chisq=chisq, df=df, ML=TRUE) class(result) <- "polycor" return(result) } else if (thresholds){ result <- list(type="polychoric", rho=result$par[1], row.cuts=result$par[2:r], col.cuts=result$par[(r+1):(r+c-1)], var=NA, n=n, chisq=NA, df=NA, ML=TRUE) class(result) <- "polycor" return(result) } else return(as.vector(result$par[1])) } else if (std.err){ result <- optim(if (missing(start)) 0 else rho, f, control=control, hessian=TRUE, method="BFGS") if (result$par > 1){ result$par <- maxcor warning(paste("inadmissible correlation set to", maxcor)) } else if (result$par < -1){ result$par <- -maxcor warning(paste("inadmissible correlation set to -", maxcor, sep="")) } chisq <- 2*(result$value + sum(tab *log((tab + 1e-6)/n))) df <- length(tab) - r - c result <- list(type="polychoric", rho=result$par, row.cuts=rc, col.cuts=cc, var=1/result$hessian, n=n, chisq=chisq, df=df, ML=FALSE) class(result) <- "polycor" return(result) } else { rho <- optimise(f, interval=c(-maxcor, maxcor))$minimum if (thresholds){ result <- list(type="polychoric", rho=rho, row.cuts=rc, col.cuts=cc, var=NA, n=n, chisq=NA, df=NA, ML=FALSE) class(result) <- "polycor" return(result) } else { return(rho) 
} } } polycor/R/print.polycor.R0000644000176200001440000000553414153721403015116 0ustar liggesusers# last modified 2021-12-07 by J. Fox "print.polycor" <- function(x, digits = max(3, getOption("digits") - 3), ...){ if (x$type == "polychoric"){ if (!all(is.na(x$var))){ se <- sqrt(diag(x$var)) se.rho <- se[1] } else { se <- NA se.rho <- NA } est <- if (x$ML) "ML est." else "2-step est." if (!is.na(se.rho)){ cat("\nPolychoric Correlation, ", est, " = ", signif(x$rho, digits), " (", signif(se.rho, digits), ")", sep="") if (x$df > 0) cat("\nTest of bivariate normality: Chisquare = ", signif(x$chisq, digits), ", df = ", x$df, ", p = ", signif(pchisq(x$chisq, x$df, lower.tail=FALSE), digits), "\n", sep="") else cat("\n") } else { cat("\nPolychoric Correlation, ", est, " = ", signif(x$rho, digits), "\n\n") } r <- length(x$row.cuts) c <- length(x$col.cuts) if (r == 0) return(invisible(x)) if (!all(is.na(se))){ row.cuts.se <- se[2:(r+1)] col.cuts.se <- se[(r+2):(r+c+1)] } else { row.cuts.se <- rep(NA, r) col.cuts.se <- rep(NA, c) } rowThresh <- signif(cbind(x$row.cuts, row.cuts.se), digits) if (r > 1) cat("\n Row Thresholds") else cat("\n Row Threshold") rownames(rowThresh) <- if (r > 1) 1:r else " " if (all(is.na(rowThresh[, 2]))) print(rowThresh[, 1, drop=FALSE]) else { colnames(rowThresh) <- c("Threshold", "Std.Err.") cat("\n") print(rowThresh) } colThresh <- signif(cbind(x$col.cuts, col.cuts.se), digits) if (c > 1) cat("\n\n Column Thresholds") else cat("\n\n Column Threshold") rownames(colThresh) <- if (c > 1) 1:c else " " if (all(is.na(colThresh[, 2]))) print(colThresh[, 1, drop=FALSE]) else { colnames(colThresh) <- c("Threshold", "Std.Err.") cat("\n") print(colThresh) } } else if (x$type == "polyserial"){ if (!all(is.na(x$var))) { se <- sqrt(diag(x$var)) se.rho <- se[1] } else { se <- NA se.rho <- NA } est <- if (x$ML) "ML est." else "2-step est." if (!all(is.na(se))){ cat("\nPolyserial Correlation, ", est, " = ", signif(x$rho, digits), " (", signif(se.rho, digits), ")", sep="") cat("\nTest of bivariate normality: Chisquare = ", signif(x$chisq, digits), ", df = ", x$df, ", p = ", signif(pchisq(x$chisq, x$df, lower.tail=FALSE), digits), "\n\n", sep="") } else { cat("\nPolyserial Correlation, ", est, " = ", signif(x$rho, digits), "\n\n") } if (length(se) > 1) cuts.se <- se[-1] else cuts.se <- rep(NA, length(x$cuts)) thresh <- signif(rbind(x$cuts, cuts.se), digits) colnames(thresh) <- 1:length(x$cuts) rownames(thresh) <- c("Threshold", "Std.Err.") if (all(is.na(thresh[2, ]))) thresh <- thresh[-2, , drop=FALSE] print(thresh) } else print(unclass(x)) invisible(x) } polycor/R/hetcor.R0000644000176200001440000000043114155243250013550 0ustar liggesusers# last modified 2021-12-11 by J. Fox hetcor <- function(data, ..., ML=FALSE, std.err=TRUE, use=c("complete.obs", "pairwise.complete.obs"), bins=4, pd=TRUE, parallel=FALSE, ncores=detectCores(logical=FALSE), thresholds=FALSE){ UseMethod("hetcor") } polycor/R/polyserial.R0000644000176200001440000000722114153736364014466 0ustar liggesusers# last modified 2021-12-07 by J. 
Fox polyserial <- function(x, y, ML=FALSE, control=list(), std.err=FALSE, maxcor=.9999, bins=4, start, thresholds=FALSE){ f <- function(pars){ rho <- pars[1] if (abs(rho) > maxcor) rho <- sign(rho)*maxcor cts <- if (length(pars) == 1) c(-Inf, cuts, Inf) else c(-Inf, pars[-1], Inf) if (any(diff(cts) < 0)) return(Inf) tau <- (matrix(cts, n, s+1, byrow=TRUE) - matrix(rho*z, n, s+1))/ sqrt(1 - rho^2) - sum(log(dnorm(z)*(pnorm(tau[cbind(indices, y+1)]) - pnorm(tau[cbind(indices, y)])))) } if (!is.numeric(x)) stop("x must be numeric") valid <- complete.cases(x, y) x <- x[valid] y <- y[valid] z <- scale(x) tab <- table(y) n <- sum(tab) s <- length(tab) if (s < 2) { warning("y has fewer than 2 levels") return(NA) } if (sum(0 != tab) < 2){ warning("y has fewer than 2 levels with data") return(NA) } nzeros <- sum(zeros <- 0 == tab) if (nzeros > 0){ warning("the following ", if (nzeros == 1) " level" else " levels", " of y", if (nzeros == 1) " has" else " have", " no cases: ", names(tab)[zeros]) } indices <- 1:n cuts <- qnorm(cumsum(tab)/n)[-s] y <- as.numeric(as.factor(y)) rho <- sqrt((n - 1)/n)*sd(y)*cor(x, y)/sum(dnorm(cuts)) if (!missing(start) && (ML || std.err)) { if (is.list(start)){ rho <- start$rho cuts <- start$thresholds } else { rho <- start } if (!is.numeric(rho) || length(rho) != 1) stop("start value for rho must be a number") if (!is.numeric(cuts) || length(cuts) != s - 1) stop("start values for thresholds must be ", s - 1, "numbers") } if (abs(rho) > maxcor) { warning("initial correlation inadmissible, ", rho, ", set to ", sign(rho)*maxcor) rho <- sign(rho)*maxcor } if (ML) { result <- optim(c(rho, cuts), f, control=control, hessian=std.err) if (result$par[1] > 1){ result$par[1] <- maxcor warning(paste("inadmissible correlation set to", maxcor)) } else if (result$par[1] < -1){ result$par[1] <- -maxcor warning(paste("inadmissible correlation set to -", maxcor, sep="")) } if (std.err){ chisq <- chisq(y, z, result$par[1], result$par[-1], bins=bins) df <- s*bins - s - 1 result <- list(type="polyserial", rho=result$par[1], cuts=result$par[-1], var=solve(result$hessian), n=n, chisq=chisq, df=df, ML=TRUE) class(result) <- "polycor" return(result) } else if (thresholds){ result <- list(type="polyserial", rho=result$par[1], cuts=result$par[-1], var=NA, n=n, chisq=NA, df=NA, ML=TRUE) class(result) <- "polycor" return(result) } else return(as.vector(result$par[1])) } else if (std.err){ result <- optim(rho, f, control=control, hessian=TRUE, method="BFGS") if (result$par > 1){ result$par <- maxcor warning(paste("inadmissible correlation set to", maxcor)) } else if (result$par < -1){ result$par <- -maxcor warning(paste("inadmissible correlation set to -", maxcor, sep="")) } chisq <- chisq(y, z, rho, cuts, bins=bins) df <- s*bins - s - 1 result <- list(type="polyserial", rho=result$par, cuts=cuts, var=1/result$hessian, n=n, chisq=chisq, df=df, ML=FALSE) class(result) <- "polycor" return(result) } else if (thresholds){ result <- list(type="polyserial", rho=rho, cuts=cuts, var=NA, n=n, chisq=NA, df=NA, ML=FALSE) class(result) <- "polycor" return(result) } else return(rho) } polycor/R/hetcor.default.R0000644000176200001440000000101714155243160015174 0ustar liggesusers# last modified 2021-12-11 by J. Fox hetcor.default <- function (data, ..., ML = FALSE, std.err = TRUE, use=c("complete.obs", "pairwise.complete.obs"), bins = 4, pd = TRUE, parallel=FALSE, ncores=detectCores(logical=FALSE), thresholds=FALSE) { use <- match.arg(use) dframe <- data.frame(data, ...) 
if (!missing(...)) names(dframe)[1] <- deparse(substitute(data)) hetcor(dframe, ML = ML, std.err = std.err, use=use, bins = bins, pd = pd, ncores=ncores, thresholds=thresholds) } polycor/R/standard-methods.R0000644000176200001440000000253614153742515015543 0ustar liggesuserscoef.polycor <- function(object, correlation=TRUE, thresholds=TRUE, ...){ result <- if (correlation) c(rho=object$rho) else numeric(0) if (thresholds){ if (object$type == "polychoric"){ row.cuts <- object$row.cuts if (!all(is.na(row.cuts))){ names(row.cuts) <- paste0("row.threshold.", seq_along(row.cuts)) result <- c(result, row.cuts) } col.cuts <- object$col.cuts if (!all(is.na(col.cuts))){ names(col.cuts) <- paste0("col.threshold.", seq_along(col.cuts)) result <- c(result, col.cuts) } } else { cuts <- object$cuts if (!all(is.na(cuts))){ names(cuts) <- paste0("threshold.", seq_along(cuts)) result <- c(result, cuts) } } } result } vcov.polycor <- function(object, correlation=TRUE, thresholds=TRUE, ...){ if (!correlation && !thresholds) return(NULL) vc <- object$var if (is.null(vc) || all(is.na(vc))) return(NA) if (length(vc) > 1){ rownames(vc) <- colnames(vc) <- names(coef(object)) } if (correlation && (!thresholds)) { if (length(vc) == 1) { return(vc) } else { return(vc[1, 1]) } } if ((!correlation) && thresholds){ if (length(vc) == 1) { return(NA) } else { return(vc[-1, -1]) } } return(vc) } summary.polycor <- function(object, ...){ print(object, ...) } polycor/R/print.hetcor.R0000644000176200001440000000227213514744332014715 0ustar liggesusers# last modified 12 Dec 04 by J. Fox "print.hetcor" <- function(x, digits = max(3, getOption("digits") - 3), ...){ R <- signif(x$correlations, digits=digits) R[upper.tri(R)] <- x$type[upper.tri(R)] R <- as.data.frame(R) if (x$ML) cat("\nMaximum-Likelihood Estimates\n") else cat("\nTwo-Step Estimates\n") cat("\nCorrelations/Type of Correlation:\n") print(R) if (!is.null(x$std.errors)){ SE <- signif(x$std.errors, digits) diag(SE) <- "" if (x$NA.method == "complete.obs"){ SE[upper.tri(SE)] <- "" cat("\nStandard Errors:\n") SE <- as.data.frame(SE) print(SE[,-ncol(SE)]) cat(paste("\nn =", x$n, "\n")) } else { SE[upper.tri(SE)] <- x$n[upper.tri(SE)] diag(SE) <- diag(x$n) SE <- as.data.frame(SE) cat("\nStandard Errors/Numbers of Observations:\n") print(SE) } if (!all(is.na(x$tests[lower.tri(x$tests)]))){ Test <- signif(x$tests, digits) Test[upper.tri(Test)] <- "" diag(Test) <- "" Test <- as.data.frame(Test) cat("\nP-values for Tests of Bivariate Normality:\n") print(Test[,-ncol(Test)]) } } invisible(x) } polycor/R/hetcor.data.frame.R0000644000176200001440000002167714167115433015574 0ustar liggesusers# last modified 2022-01-10 by J. 
Fox # the following function to be imported from admisc and then deleted here: # `tryCatchWEM` <- function(expr, capture = FALSE) { # toreturn <- list() # # output <- withVisible(withCallingHandlers( # tryCatch(expr, error = function(e) { # toreturn$error <<- e$message # NULL # }), # warning = function(w) { # toreturn$warning <<- c(toreturn$warning, w$message) # invokeRestart("muffleWarning") # }, # message = function(m) { # toreturn$message <<- paste(toreturn$message, m$message, sep = "") # invokeRestart("muffleMessage") # } # )) # # if (capture && output$visible && !is.null(output$value)) { # toreturn$output <- capture.output(output$value) # toreturn$value <- output$value # } # # if (length(toreturn) > 0) { # return(toreturn) # } # } hetcor.data.frame <- function(data, ML=FALSE, std.err=TRUE, use=c("complete.obs", "pairwise.complete.obs"), bins=4, pd=TRUE, parallel=FALSE, ncores=detectCores(logical=FALSE), thresholds=FALSE, ...){ se.r <- function(r, n){ rho <- r*(1 + (1 - r^2)/(2*(n - 3))) # approx. unbiased estimator v <- (((1 - rho^2)^2)/(n + 6))*(1 + (14 + 11*rho^2)/(2*(n + 6))) sqrt(v) } computeCor <- function(pair){ type <- "" se <- NA test <- NA i <- rows[pair] j <- cols[pair] x <- data[, i] y <- data[, j] n <- sum(complete.cases(x, y)) if (n == 0) { test <- r <- se <- NA warning("no cases for pair ", j, ", ", i) } if (inherits(x, c("numeric", "integer")) && inherits(y, c("numeric", "integer"))) { r <- cor(x, y, use="complete.obs") type <- "Pearson" if (std.err) { se <- se.r(r, n) test <- pchisq(chisq(x, y, r, bins=bins), bins^2 - 2, lower.tail=FALSE) } Thresholds <- list(NULL) } else if (inherits(x, c("factor", "logical", "character")) && inherits(y, c("factor", "logical", "character"))) { type <- "Polychoric" result <- tryCatchWEM(polychor(x, y, ML=ML, std.err=std.err, thresholds=thresholds), capture=TRUE) error <- !is.null(result$error) if (!is.null(result$warning)){ warning("polychoric correlation between variables ", vnames[j], " and ", vnames[i], if (length(result$warning) == 1) " produced a warning:\n" else " produced warnings:\n", paste(paste(" ", result$warning), collapse="\n")) } if (error){ msg <- result$error warning("could not compute polychoric correlation between variables ", vnames[j], " and ", vnames[i], "\n", " Error message: ", msg) result <- NA } if (std.err && !error){ result <- result$value if (!(length(result) == 1 && is.na(result))){ r <- result$rho se <- sqrt(result$var[1,1]) test <- if (result$df > 0) pchisq(result$chisq, result$df, lower.tail=FALSE) else NA } else { r <- if (is.list(result)) result$value else result test <- se <- NA } } else { r <- if (is.list(result)) result$value else result test <- se <- NA } Thresholds <- if (thresholds) { list(row.cuts=as.vector(result$row.cuts), col.cuts=as.vector(result$col.cuts)) } else { NULL } } else { if (inherits(x, c("factor", "logical", "character")) && inherits(y, c("numeric", "integer"))) result <- tryCatchWEM(polyserial(y, x, ML=ML, std.err=std.err, bins=bins, thresholds=thresholds), capture=TRUE) else if (inherits(x, c("numeric", "integer")) && inherits(y, c("factor", "logical", "character"))) result <- tryCatchWEM(polyserial(x, y, ML=ML, std.err=std.err, bins=bins), capture=TRUE) else { stop("columns must be numeric, factors, logical, or character.") } type <- "Polyserial" error <- !(is.null(result$error)) if (!is.null(result$warning)){ warning("polyserial correlation between variables ", vnames[j], " and ", vnames[i], if (length(result$warning) == 1) " produced a warning:\n" else " produced 
warnings:\n", paste(paste( " ", result$warning), collapse="\n")) } if (error){ msg <- result$error warning("could not compute polyserial correlation between variables ", vnames[j], " and ", vnames[i], "\n", " Error message: ", msg) result <- NA } if (std.err && !error){ result <- result$value if (!(length(result) == 1 && is.na(result))){ r <- result$rho se <- sqrt(result$var[1,1]) test <- pchisq(result$chisq, result$df, lower.tail=FALSE) } else { r <- if (is.list(result)) result$value else result test <- se <- NA } } else { r <- if (is.list(result)) result$value else result se <- test <- NA } Thresholds <- if (thresholds) { list(cuts=as.vector(result$cuts)) } else { NULL } } list(n=n, r=r, Type=type, SE=se, Test=test, Thresholds=Thresholds) } vnames <- names(data) if (any(sapply(data, function(x) inherits(x, "character")))){ message("data contain one or more character variables", "\nthe values of which are ordered alphabetically") } use <- match.arg(use) if (use == "complete.obs") { data <- na.omit(data) n <- nrow(data) } p <- length(data) if (p < 2) stop("fewer than 2 variables.") R <- matrix(1, p, p) Type <- matrix("", p, p) SE <- matrix(0, p, p) N <- matrix(0, p, p) Test <- matrix(0, p, p) if (thresholds){ Thresholds <- vector(p^2, mode="list") Thresholds <- matrix(Thresholds, p, p) } diag(N) <- if (use == "complete.obs") nrow(data) else sapply(data, function(x) sum(!is.na(x))) if (all(diag(N) == 0)) stop("no non-missing cases") npairs <- p*(p -1)/2 rows <- matrix(1:p, p, p) cols <- t(rows) rows <- rows[lower.tri(rows)] cols <- cols[lower.tri(cols)] result <- if (parallel && ncores > 1){ message("Note: using a cluster of ", ncores, " cores") cl <- parallel::makeCluster(ncores) on.exit(parallel::stopCluster(cl)) parallel::clusterApply(cl, 1:npairs, computeCor) } else { lapply(1:npairs, computeCor) } for (pair in 1:npairs){ i <- rows[pair] j <- cols[pair] res <- result[[pair]] N[i, j] <- N[j, i] <- res$n R[i, j] <- R[j, i] <- res$r Type[i, j] <- Type[j, i] <- res$Type SE[i, j] <- SE[j, i] <- res$SE Test[i, j] <- Test[j, i] <- res$Test if (thresholds) { Thresholds[[i, j]] <- res$Thresholds Thresholds[[j, i]] <- res$Type } } if (pd && !any(is.na(R)) && min(eigen(R, only.values=TRUE)$values) < 0){ cor <- Matrix::nearPD(R, corr=TRUE) if (!cor$converged) warning("attempt to make correlation matrix positive-definite failed") else warning("the correlation matrix has been adjusted to make it positive-definite") R <- as.matrix(cor$mat) } rownames(R) <- colnames(R) <- names(data) result <- list(correlations=R, type=Type, NA.method=use, ML=ML) if (thresholds) result$thresholds <- Thresholds if (std.err) { rownames(SE) <- colnames(SE) <- names(data) rownames(N) <- colnames(N) <- names(N) rownames(Test) <- colnames(Test) <- names(data) result$std.errors <- SE result$n <- if (use == "complete.obs") n else N result$tests <- Test } if (0 < (nNA <- sum(is.na(R[lower.tri(R)])))){ warning(nNA, if (nNA == 1) " correlation" else " correlations", " couldn't be computed and", if (nNA == 1) " is" else " are", " NA") } class(result) <- "hetcor" result } polycor/MD50000644000176200001440000000150514167354404012262 0ustar liggesusers7259355989a2387b08f502cad5633328 *DESCRIPTION 10c6f438cc07edc4009a3c623dbd7bd8 *NAMESPACE 0111162ffe000474e94ac648d1fd0fd3 *NEWS 04022503a39601aa5644e2fd1043897d *R/hetcor.R 34c0cefe4c8683f50bbcb40be0f6e0ed *R/hetcor.data.frame.R 705e792b0b449c353247f31d12baf898 *R/hetcor.default.R 478be5ff67389aa9c16521fbc2288778 *R/polychor.R 2ec09027097cdf060f10d3584d3310dd *R/polyserial.R 
3dd639b4ff9c10632f0c3ef8ffa0d043 *R/print.hetcor.R dea1f800a80109fe4fbb56db55d7dad0 *R/print.polycor.R 0562466704e2a4716720c76cb7e65b62 *R/standard-methods.R 9a57a3f34449a1270944719c42ca7079 *R/utilities.R 4bd45f5ccc3154f479a7db406681015d *inst/CHANGES 76011702b2d9b9c12da4262390b1aced *man/hetcor.Rd 53dfb8a35c6c9762305bb731ec29c74d *man/polychor.Rd 3d202c968065a8ab30b6e70047b5d76b *man/polyserial.Rd 80b29e260d3b07acf456b7f63a77ae2c *man/print.polycor.Rd polycor/inst/0000755000176200001440000000000013514744331012723 5ustar liggesuserspolycor/inst/CHANGES0000644000176200001440000000402713514744331013721 0ustar liggesusersversion 0.5: original version on CRAN version 0.6-0 o added approximate standard errors for quick 2-step estimates. o corrected bug constraining quick estimates in polychor() to 0,1 rather than -1,1. o constraints on correlations (default to -.9999,.9999) for numeric stability. o several small changes. version 0.7-0 o added tests of bivariate normality. o added as.matrix.hetcor(). o fixed bug that caused use="pairwise.complete.obs" in hetcor() to fail. o cleaned up code. version 0.7-1 o suppressed test of bivariate normality in 2 x 2 (tetrachoric) case (where there are 0 df for the test). o examples, which use random data, now set the random-number-generator seed. o small changes to printed output. version 0.7-2 o polychor() and polyserial() now report an error when a categorical variable has just one level (problem pointed out by David Barron). version 0.7-3 o as.matrix.hetcor() now has a second argument, ..., to conform to the as.matrix() generic in R 2.5.0. version 0.7-4 o hetcor() uses the nearcor() function in sfsmisc to insure that the correlation matrix returned is positive-definite. o small changes. version 0.7-5 o polychor() and polyserial() now return NA and report a warning when a factor is invariant (after problem report by Jose Quesada); in polychor(), empty factor levels are automatically removed. version 0.7-6 o polyserial() and polychor() now make sure that final correlations are in the range [-1, 1], reporting a warning when inadmissible values are coerced to -1 or 1 (correcting a problem reported by Marie Loh). version 0.7-7 o hetcor() avoids an incorrect name for the first variable when it's given a matrix as input, rather than a data frame or individual variables, as the documentation indicates (preventing a consequent renaming error reported by Iuri Gavronski). o Small fix to hetcor.Rd. version 0.7-8 o hetcor() now accepts logical variables, treated as two-level factors (suggested by Soren Vogel).
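To make two of the notes above concrete (hetcor() accepting logical variables, and the alphabetical ordering of character values), here is a small sketch that is not part of the package; it assumes the mvtnorm package is installed, and the variable names are arbitrary.

# Sketch only: a logical variable is treated as a two-level factor by hetcor().
# A character variable in its place would have its values ordered alphabetically,
# so for ordinal character data it is safer to convert to an ordered factor
# with explicit levels first.
if (require(mvtnorm)) {
    set.seed(12345)
    d <- rmvnorm(500, c(0, 0), matrix(c(1, .5, .5, 1), 2, 2))
    x <- d[, 1]
    y <- d[, 2] > 0              # logical variable
    hetcor(data.frame(x, y))     # polyserial correlation between x and the dichotomy y
}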