psyphy/0000755000175000017500000000000014164557072011753 5ustar nileshnileshpsyphy/MD50000644000175000017500000000313714164557072012267 0ustar nileshnilesh118fb282ba8249808acc0c4317341d07 *DESCRIPTION 487093aa6e86b410715e1ef5b0faf42f *NAMESPACE 602f1f05bdb63f79c2d82db9315445f3 *R/SS.RGBcalib.R 97e8c8eeae85eb16df17dca67beb31eb *R/SS.calib.R 59e6a8e6f72d0b0fdf996e23c2096cc9 *R/dprime.ABX.R 96db1996d4ce465c061b18dd825468fb *R/dprime.SD.R 2120d2e8568f8f76054768cdef07ffa6 *R/dprime.mAFC.R 92f825506cceb052b7ec8927d162ce37 *R/dprime.oddity.R dc83555725ff85df732494743a8ef8d4 *R/gl.links.R b12ab66f54495661e84779be4c60a765 *R/glm.WH.R 77a2c3a777c898ccf61a38e1ab974b4b *R/glm.lambda.R b82de69bd88971fc152067a7992c7c52 *R/mafc.R 4f3b773a8e17bc42ce5f871c24d90377 *R/probit.lambda.R 7b58640877ae1c8a7d0326c2be6fe8c4 *R/psyfun.2asym.R 38b9474d4ae86b3179c8596c1aeea7ba *data/RGB.rda 9485b30f7e79c4e34d622ac3ae856be4 *data/ecc2.rda 7913c66fbf5d79ac495082c12fdfde1c *inst/NEWS 23c488bcea5b93390dd47acc1503c74b *inst/TODO dcbe0efda17467122f18ddb1fe65d3df *man/RGB.Rd 418c9c08578e4124bf6af977ea13b531 *man/SS.RGBcalib.Rd ff0a3be8febb8c623b7d6ad722c47caf *man/dprime.ABX.Rd 981e8aa06fc99d13c971cfd2b2def014 *man/dprime.SD.Rd 2ddd8aa9557d56a0bcdf675d027e3288 *man/dprime.mAFC.Rd 62f9eac1e55c1cf7da1461897e4c33af *man/dprime.oddity.Rd d0883c35ad168e2540418bb6e55887d8 *man/ecc2.Rd eb0bce3a6d3a321c63ed06b535d4d68f *man/glm.WH.Rd 5ba09f5c80af16d743bf5b2a08dc39e9 *man/glm.lambda.Rd ab58682cbc0c798d8702728b271a7dfb *man/logit.2asym.Rd 8935290b0dca60084f5ee57541b35315 *man/mafc.Rd 5e9697c6e0e43ba5cda075b1c6e12714 *man/probit.lambda.Rd 5ebb475d2d9da1411068b348c0d2ec13 *man/psyfun.2asym.Rd 45a8cb6033907a481faaac6c51728c71 *man/psyphy-package.Rd 30c58d8fe6b0609585aeae8f3ef96fdf *man/summary.lambda.Rd psyphy/DESCRIPTION0000755000175000017500000000145414164557072013470 0ustar nileshnileshPackage: psyphy Type: Package Title: Functions for Analyzing Psychophysical Data in R Version: 0.2-3 Date: 2021-12-22 Author: 
Kenneth Knoblauch Maintainer: Ken Knoblauch Depends: R (>= 3.0), stats, graphics Suggests: MASS, nlme, lattice LazyData: yes Description: An assortment of functions that could be useful in analyzing data from psychophysical experiments. It includes functions for calculating d' from several different experimental designs, links for m-alternative forced-choice (mafc) data to be used with the binomial family in glm (and possibly other contexts) and self-Start functions for estimating gamma values for CRT screen calibrations. License: GPL NeedsCompilation: no Packaged: 2021-12-22 09:55:29 UTC; ken Repository: CRAN Date/Publication: 2022-01-03 11:30:02 UTC psyphy/man/0000755000175000017500000000000014160572276012525 5ustar nileshnileshpsyphy/man/dprime.oddity.Rd0000755000175000017500000000212011243523616015566 0ustar nileshnilesh\name{dprime.oddity} \alias{dprime.oddity} \title{d' for 3 Stimulus Oddity Paradigm } \description{ Calculate \eqn{d'} for a 3 stimulus (triangular) paradigm. Two of the stimuli are the same and the observer must designate the stimulus that is different. } \usage{ dprime.oddity(Pc.tri) } \arguments{ \item{Pc.tri}{numeric in (1/3, 1). The proportion of correct responses for an unbiased observer. } } \value{ Returns the value of \eqn{d'} } \references{ Frijters, G. S., Kooistra, A. and Vereijken, P. F. G. (1980) Tables of \eqn{d'} for the triangular method and the 3-AFC signal detection procedure. \emph{Perception & Psychophysics}, \bold{27}, 176--178. Macmillan, N. A. and Creelman, C. D. (1991) \emph{Detection Theory: A User's Guide} Cambridge University Press Green, D. M. and Swets, J. A. (1966) \emph{Signal Detection Theory and Psychophysics} Robert E.
Krieger Publishing Company } \author{Kenneth Knoblauch} \seealso{\code{\link{dprime.mAFC}}, \code{\link{dprime.SD}}, \code{\link{dprime.ABX}} } \examples{ dprime.oddity(0.8) } \keyword{univar} psyphy/man/logit.2asym.Rd0000644000175000017500000000634211035504402015151 0ustar nileshnilesh\name{logit.2asym} \alias{logit.2asym} \alias{probit.2asym} \alias{cauchit.2asym} \alias{cloglog.2asym} \alias{weib.2asym} \title{ Links for Binomial Family with Variable Upper/Lower Asymptotes} \description{ These functions provide links for the binomial family so that psychometric functions can be fit with \emph{both} the upper and lower asymptotes different from 1 and 0, respectively. } \usage{ logit.2asym(g, lam) probit.2asym(g, lam) cauchit.2asym(g, lam) cloglog.2asym(g, lam) weib.2asym( ... ) } \arguments{ \item{g}{ numeric in the range (0, 1), normally <= 0.5, which specifies the lower asymptote of the psychometric function. } \item{lam}{ numeric in the range (0, 1), specifying 1 - the upper asymptote of the psychometric function. } \item{...}{ used just to pass along the formals of \code{cloglog.2asym} as arguments to \code{weib.2asym}. } } \details{ These links are used to specify psychometric functions with the form \deqn{ P(x) = \gamma + (1 - \gamma - \lambda) p(x) } where \eqn{\gamma} is the lower asymptote and \eqn{\lambda} is \eqn{1 - } the upper asymptote, and \eqn{p(x)} is the base psychometric function, varying between 0 and 1. } \value{ Each link returns a list containing functions required for relating the response to the linear predictor in generalized linear models and the name of the link. \item{linkfun }{The link function} \item{linkinv}{The inverse link function} \item{mu.eta}{The derivative of the inverse link } \item{valideta}{The domain over which the linear predictor is valid} \item{link}{A name to be used for the link }} \references{ Klein S. A. (2001) Measuring, estimating, and understanding the psychometric function: a commentary.
\emph{Percept Psychophys.}, \bold{63(8)}, 1421--1455. Wichmann, F. A. and Hill, N. J. (2001) The psychometric function: I. Fitting, sampling, and goodness of fit. \emph{Percept Psychophys.}, \bold{63(8)}, 1293--1313. } \author{Kenneth Knoblauch} \seealso{\code{\link{glm}}, \code{\link{make.link}}, \code{\link{psyfun.2asym}}} \examples{ #A toy example, b <- 3 g <- 0.05 # simulated false alarm rate d <- 0.03 a <- 0.04 p <- c(a, b, g, d) num.tr <- 160 cnt <- 10^seq(-2, -1, length = 6) # contrast levels #simulated Weibull-Quick observer responses truep <- g + (1 - g - d) * pweibull(cnt, b, a) ny <- rbinom(length(cnt), num.tr, truep) nn <- num.tr - ny phat <- ny/(ny + nn) resp.mat <- matrix(c(ny, nn), ncol = 2) ddprob.glm <- psyfun.2asym(resp.mat ~ cnt, link = probit.2asym) ddlog.glm <- psyfun.2asym(resp.mat ~ cnt, link = logit.2asym) # Can fit a Weibull function, but use log contrast as variable ddweib.glm <- psyfun.2asym(resp.mat ~ log(cnt), link = weib.2asym) ddcau.glm <- psyfun.2asym(resp.mat ~ cnt, link = cauchit.2asym) plot(cnt, phat, log = "x", cex = 1.5, ylim = c(0, 1)) pcnt <- seq(0.01, 0.1, len = 100) lines(pcnt, predict(ddprob.glm, data.frame(cnt = pcnt), type = "response"), lwd = 5) lines(pcnt, predict(ddlog.glm, data.frame(cnt = pcnt), type = "response"), lwd = 2, lty = 2, col = "blue") lines(pcnt, predict(ddweib.glm, data.frame(cnt = pcnt), type = "response"), lwd = 3, col = "grey") lines(pcnt, predict(ddcau.glm, data.frame(cnt = pcnt), type = "response"), lwd = 3, col = "grey", lty = 2) } \keyword{ models } psyphy/man/probit.lambda.Rd0000755000175000017500000000567311035506012015527 0ustar nileshnilesh\name{probit.lambda} \alias{probit.lambda} \title{ mafc Probit Link for Binomial Family with Upper Asymptote < 1 } \description{ This function provides a link for the binomial family for fitting m-alternative forced-choice, with a probit link and with the upper asymptote permitted to be less than 1.} \usage{ probit.lambda(m = 2, lambda = 0)
} \arguments{ \item{m}{ is the integer number (>1) of choices (Default to 2AFC). } \item{lambda}{ number in [0, 1] indicating 1 minus the upper asymptotic value of the psychometric function. } } \details{ This function provides a link for fitting psychometric functions arising from an m-alternative forced-choice experiment using a probit link and allowing that the upper asymptote is less than 1. The psychometric function fit to the data is described by \deqn{P(x) = 1/m + (1 - 1/m - \lambda) \Phi(x)} where \eqn{m} is the number of alternatives, \eqn{1/m} is the lower asymptote, \eqn{1 - \lambda} is the upper asymptote and \eqn{\Phi} is the cumulative normal function. } \value{ The link returns a list containing functions required for relating the response to the linear predictor in generalized linear models and the name of the link. \item{linkfun }{The link function} \item{linkinv }{The inverse link function} \item{mu.eta }{The derivative of the inverse link function} \item{valideta }{The domain over which the linear predictor is valid} \item{link }{A name to be used for the link} } \references{Wichmann, F. A. and Hill, N. J. (2001) The psychometric function: I. Fitting, sampling, and goodness of fit. \emph{Percept Psychophys.}, \bold{63(8)}, 1293--1313.} \author{Ken Knoblauch} \note{Due to the difficulty of the task, subject error or incorrectly recorded data, psychophysical data may reveal less than perfect performance when stimulus differences are readily visible. When this occurs, letting the upper asymptote be less than 1 often results in a better fit to the data and a less-biased estimate of the steepness of the curve (see example below).
} \seealso{\code{\link{mafc}}, \code{\link{glm}}, \code{\link{glm.lambda}}, \code{\link{family}}, \code{\link{make.link}} } \examples{ b <- 3.5 g <- 1/3 d <- 0.025 a <- 0.04 p <- c(a, b, g, d) num.tr <- 160 cnt <- 10^seq(-2, -1, length = 6) # contrast levels #simulated Weibull-Quick observer responses truep <- g + (1 - g - d) * pweibull(cnt, b, a) ny <- rbinom(length(cnt), num.tr, truep) nn <- num.tr - ny phat <- ny/(ny + nn) resp.mat <- matrix(c(ny, nn), ncol = 2) ddprob.glm <- glm(resp.mat ~ cnt, family = binomial(mafc.probit(3))) ddprob.lam <- glm(resp.mat ~ cnt, family = binomial(probit.lambda(3, 0.025))) AIC(ddprob.glm, ddprob.lam) plot(cnt, phat, log = "x", cex = 1.5, ylim = c(0, 1)) pcnt <- seq(0.01, 0.1, len = 100) lines(pcnt, predict(ddprob.glm, data.frame(cnt = pcnt), type = "response"), lwd = 2) lines(pcnt, predict(ddprob.lam, data.frame(cnt = pcnt), type = "response"), lwd = 2, lty = 2) } \keyword{models} psyphy/man/psyphy-package.Rd0000755000175000017500000000217211243524063015734 0ustar nileshnilesh\name{psyphy-package} \alias{psyphy-package} \alias{psyphy} \docType{package} \title{ Functions for Analyzing Psychophysical Data } \description{ An assortment of functions that could be useful in analyzing data from psychophysical experiments. It includes functions for calculating d' from several different experimental designs, links for mafc to be used with the binomial family in glm (and possibly other contexts) and a self-Start function for estimating gamma values for CRT screen calibrations.
} \details{ \tabular{ll}{ Package: \tab psyphy\cr Type: \tab Package\cr Version: \tab 0.2-3\cr Date: \tab 2021-12-22\cr License: \tab GPL\cr } For the moment, the package contains several functions for calculating \eqn{d'} for a variety of psychophysical paradigms, some link functions for the binomial family in glm (and perhaps other functions) for fitting psychometric functions from mAFC experiments and a self-Start function for estimating the value of the exponent, gamma, based on the luminance calibration of the three channels of a CRT-like display. } \author{ Kenneth Knoblauch } \keyword{ package } psyphy/man/glm.WH.Rd0000755000175000017500000000641111035506451014103 0ustar nileshnilesh\name{glm.WH} \alias{glm.WH} \title{ mafc Probit Fit to Psychometric Function with Upper Asymptote Less than One} \description{ A probit fit of a psychometric function with upper asymptote less than 1 is obtained by cycling between a fit with \code{glm} using the \code{probit.lambda} link and \code{optimize} to estimate \code{lambda}, 1 - the upper asymptotic value, until the log likelihood changes by less than a pre-set tolerance. } \usage{ glm.WH(formula, data, NumAlt = 2, lambda.init = 0.01, interval = c(0, 0.05), trace = FALSE, tol = 1e-06, ...) } \arguments{ \item{formula}{ a symbolic description of the model to be fit. } \item{data}{ an optional data frame, list or environment (or object coercible by \code{\link{as.data.frame}}) containing the variables in the model. If not found in data, the variables are taken from the \code{environment(formula)}, typically the environment from which \code{glm.WH} was called.} \item{NumAlt}{ integer indicating the number of alternatives (> 1) in the mafc-task. (Default: 2). } \item{lambda.init}{ numeric, initial estimate of 1 - upper asymptote. } \item{interval}{ numeric vector giving interval endpoints within which to search for \code{lambda}. } \item{trace}{ logical, indicating whether or not to print out a trace of the iterative process.
} \item{tol}{ numeric, tolerance for ending iterations.} \item{\dots}{ further arguments passed to \code{glm}. } } \details{ The psychometric function fit to the data is described by \deqn{P(x) = 1/m + (1 - 1/m - \lambda) \Phi(x)} where \eqn{m} is the number of alternatives, \eqn{1/m} is the lower asymptote, \eqn{1 - \lambda} is the upper asymptote and \eqn{\Phi} is the cumulative normal function.} \value{ returns an object of class \sQuote{lambda} which inherits from classes \sQuote{glm} and \sQuote{lm}. It only differs from an object of class \sQuote{glm} in including an additional component, \code{lambda}, giving the estimated value of \code{lambda}. The degrees of freedom are reduced by 1 to take into account the estimation of \code{lambda}.} \references{Wichmann, F. A. and Hill, N. J. (2001) The psychometric function: I. Fitting, sampling, and goodness of fit. \emph{Percept Psychophys.}, \bold{63(8)}, 1293--1313. Yssaad-Fesselier, R. and Knoblauch, K. (2006) Modeling psychometric functions in R. \emph{ Behav Res Methods.}, \bold{38(1)}, 28--41. (for examples with \code{gnlr}).
} \author{ Ken Knoblauch } \seealso{ \code{\link{mafc}}, \code{\link{glm}}, \code{\link{glm.lambda}}, \code{\link{probit.lambda}}, \code{\link{family}} } \examples{ b <- 3.5 g <- 1/4 d <- 0.04 a <- 0.04 p <- c(a, b, g, d) num.tr <- 160 cnt <- 10^seq(-2, -1, length = 6) # contrast levels #simulated Weibull-Quick observer responses truep <- g + (1 - g - d) * pweibull(cnt, b, a) ny <- rbinom(length(cnt), num.tr, truep) nn <- num.tr - ny phat <- ny/(ny + nn) resp.mat <- matrix(c(ny, nn), ncol = 2) tst.glm <- glm(resp.mat ~ cnt, binomial(mafc.probit(1/g))) pcnt <- seq(0.005, 1, len = 1000) plot(cnt, phat, log = "x", ylim = c(0, 1), xlim = c(0.005, 1), cex = 1.75) lines(pcnt, predict(tst.glm, data.frame(cnt = pcnt), type = "response"), lwd = 2) tst.lam <- glm.WH(resp.mat ~ cnt, NumAlt = 1/g, trace = TRUE) lines(pcnt, predict(tst.lam, data.frame(cnt = pcnt), type = "response"), lty = 2, lwd = 2) } \keyword{ models } psyphy/man/dprime.ABX.Rd0000755000175000017500000000352510626253276014714 0ustar nileshnilesh\name{dprime.ABX} \alias{dprime.ABX} \title{ d' for ABX Paradigm } \description{ Calculate \eqn{d'} for the ABX paradigm, either assuming a differencing strategy or independent observations } \usage{ dprime.ABX(Hits, FA, zdiff, Pc.unb, method = "diff") } \arguments{ \item{Hits}{numeric in [0, 1] corresponding to Hit rate} \item{FA}{numeric in [0, 1] corresponding to False alarm rate} \item{zdiff}{numeric. Difference of z-scores for Hit and False Alarm rates } \item{Pc.unb}{numeric in [0, 1]. Proportion correct for an unbiased observer, \code{pnorm(zdiff)}} \item{method}{character.
Specifies the model to describe the observer's criterion for dividing up the decision space, must be either "diff" for a differencing strategy (the default) or "IO" for independent observations.} } \details{ Two different strategies have been described for how the observer partitions the decision space in the ABX paradigm, either based on Independent Observations of each stimulus or on a differencing strategy. The differencing strategy is the default. \eqn{d'} can be calculated either from the \code{Hits} and \code{FA} rates, from the difference of z-scores or from the probability correct of an unbiased observer. } \value{ Returns the value of \eqn{d'} } \references{ Macmillan, N. A. and Creelman, C. D. (1991) \emph{Detection Theory: A User's Guide} Cambridge University Press Green, D. M. and Swets, J. A. (1966) \emph{Signal Detection Theory and Psychophysics} Robert E. Krieger Publishing Company } \author{Kenneth Knoblauch} \seealso{\code{\link{dprime.mAFC}}, \code{\link{dprime.SD}}, \code{\link{dprime.oddity}}} \examples{ dprime.ABX(H = 0.75, F = 0.3) dprime.ABX(H = 0.75, F = 0.3, method = "IO") dprime.ABX(zdiff = qnorm(0.75) - qnorm(0.3)) dprime.ABX(Pc.unb = pnorm( (qnorm(0.75) - qnorm(0.3))/2 )) } \keyword{univar } psyphy/man/ecc2.Rd0000755000175000017500000000337611243523722013633 0ustar nileshnilesh\name{ecc2} \alias{ecc2} \docType{data} \title{4-afc Detection and Identification of Letters} \description{ Letter detection and identification at 2 degrees eccentricity in the visual field. On each trial, one of four letters (b, d, p, q) was presented in one of four positions (superior, inferior, left, right) in the visual field. In a given session, the letter height was fixed. Six contrast levels were tested in each session. The data indicate the proportion of correctly identified positions, referred to here as detection, and the proportion of correctly identified letters, conditional on correct identification.
} \usage{data(ecc2)} \format{ A data frame with 48 observations on the following 5 variables. \describe{ \item{\code{Contr}}{numeric. The contrast of the stimulus, defined as Weberian contrast.} \item{\code{task}}{a factor with levels \code{DET} \code{ID} indicating the two tasks, detection and identification.} \item{\code{Size}}{a numeric vector indicating the letter height} \item{\code{Correct}}{an integer vector indicating the number of correct responses (\code{DET} or \code{ID}).} \item{\code{Incorrect}}{an integer vector, indicating the number of incorrect responses.} } } \references{ Yssaad-Fesselier, R. and Knoblauch, K. (2006) Modeling psychometric functions in R. \emph{ Behav Res Methods.}, \bold{38(1)}, 28--41. } \examples{ data(ecc2) library(lattice) xyplot(Correct/(Correct + Incorrect) ~ Contr | Size * task, ecc2, type = "b", scale = list(x = list(log = TRUE), y = list(limits = c(0, 1.05))), xlab = "Contrast", ylab = "Proportion Correct Response", panel = function(x, y, ...) { panel.xyplot(x, y, ...) panel.abline(h = 0.25, lty = 2) }) } \keyword{datasets} psyphy/man/SS.RGBcalib.Rd0000755000175000017500000000753612267704402014755 0ustar nileshnilesh\name{SS.RGBcalib} \alias{SS.RGBcalib} \alias{SS.calib} \title{ Self-Start Functions for Fitting Luminance vs Grey Level Relation on CRT displays} \description{ This \code{selfStart} model evaluates the parameters for describing the luminance vs grey level relation of the R, G and B guns of a CRT-like display, fitting a single exponent, gamma, for each of the 3 guns. It has an initial attribute that will evaluate initial estimates of the parameters, \code{Blev}, \code{Br}, \code{Bg}, \code{Bb} and \code{gamm}. In the case of fitting data from a single gun or for a combination of guns, as in the sum of the three for calibrating the \emph{white}, the parameter \code{k} is used for the coefficient. Both functions include gradient and hessian attributes. 
} \usage{ SS.calib(Blev, k, gamm, GL) SS.RGBcalib(Blev, Br, Bg, Bb, gamm, Rgun, Ggun, Bgun) } \arguments{ \item{Blev}{numeric. The black level is the luminance at the 0 grey level } \item{k}{numeric, coefficient of one gun for fitting single gun} \item{Br}{numeric, coefficient of the R gun } \item{Bg}{numeric, coefficient of the G gun } \item{Bb}{numeric, coefficient of the B gun } \item{gamm}{numeric, the exponent, gamma, applied to the grey level } \item{GL}{numeric, is the grey level for the gun tested, covariate in model matrix in one gun case} \item{Rgun}{numeric, is a covariate in the model matrix that indicates the grey level for the R gun. See the example below.} \item{Ggun}{numeric, is a covariate in the model matrix that indicates the grey level for the G gun } \item{Bgun}{numeric, is a covariate in the model matrix that indicates the grey level for the B gun } } \details{ The model \deqn{ Lum(GL) = Blev + \beta_i * GL^\gamma } where i is in \{R, G, B\}, usually provides a reasonable description of the change in luminance of a display gun with grey level, \code{GL}. This \code{selfStart} function estimates \eqn{\gamma} and the other parameters using the \code{nls} function. It is assumed that grey level is normalized to the interval [0, 1]. This results in lower correlation between the linear coefficients of the guns, \eqn{\beta_i}, than if the actual bit-level is used, e.g., [0, 255], for an 8-bit graphics card (see the example). Also, with this normalization of the data, the coefficients, \eqn{\beta_i}, provide estimates of the maximum luminance for each gun. The need for the arguments \code{Rgun}, \code{Ggun} and \code{Bgun} is really a kludge in order to add gradient and hessian information to the model.
} \value{ returns a numeric vector giving the estimated luminance given the parameters passed as arguments and a gradient matrix and a hessian array as attributes.} \author{Kenneth Knoblauch } \seealso{\code{\link{nls}}} \examples{ data(RGB) #Fitting a single gun W.nls <- nls(Lum ~ SS.calib(Blev, k, gamm, GL), data = RGB, subset = (Gun == "W")) summary(W.nls) #curvature (parameter effect) is greater when GL is 0:255 Wc.nls <- nls(Lum ~ SS.calib(Blev, k, gamm, GL*255), data = RGB, subset = (Gun == "W")) MASS::rms.curv(W.nls) MASS::rms.curv(Wc.nls) pairs(profile(Wc.nls), absVal = FALSE) pairs(profile(W.nls), absVal = FALSE) #Fitting 3 guns with independent gamma's RGB0.nls <- nlme::nlsList(Lum ~ SS.calib(Blev, k, gamm, GL) | Gun, data = subset(RGB, Gun != "W")) summary(RGB0.nls) plot(nlme::intervals(RGB0.nls)) # Add covariates to data.frame for R, G and B grey levels gg <- model.matrix(~-1 + Gun/GL, RGB)[ , c(5:7)] RGB$Rgun <- gg[, 1] RGB$Ggun <- gg[, 2] RGB$Bgun <- gg[, 3] RGB.nls <- nls(Lum ~ SS.RGBcalib(Blev, Br, Bg, Bb, gamm, Rgun, Ggun, Bgun), data = RGB, subset = (Gun != "W") ) summary(RGB.nls) confint(RGB.nls) } \keyword{ models } \keyword{ nonlinear } psyphy/man/RGB.Rd0000755000175000017500000000174611243523551013430 0ustar nileshnilesh\name{RGB} \alias{RGB} \docType{data} \title{ Luminance Calibration Data from Video Projector} \description{ The data were obtained from the measurements of the luminance of the \code{R}, \code{G} and \code{B} channels individually, as well as the three together, W, for each of 21 grey levels, \code{GL} from a screen on which a video projector was displaying an image of a uniform field. Grey level has been normalized to the interval [0, 1], though originally it is specified as integers in [0, 255]. The measurements were obtained with a Photo Research 650 spectro-radiometer.
} \usage{data(RGB)} \format{ A data frame with 84 observations on the following 3 variables. \describe{ \item{\code{Lum}}{numeric vector of the measured luminance in candelas/meter\eqn{^2} } \item{\code{GL}}{The grey level normalized to the interval [0, 1]} \item{\code{Gun}}{factor with levels \code{R} \code{G} \code{B} \code{W}} } } \examples{ data(RGB) } \keyword{datasets} psyphy/man/mafc.Rd0000755000175000017500000001156313752207517013731 0ustar nileshnilesh\name{mafc} \alias{mafc} \alias{mafc.logit} \alias{mafc.probit} \alias{mafc.cloglog} \alias{mafc.weib} \alias{mafc.cauchit} \title{Links for Binomial Family for m-alternative Forced-choice } \description{ These provide links for the binomial family for fitting m-alternative forced-choice psychophysical functions. } \usage{ mafc.logit( .m = 2 ) mafc.probit( .m = 2 ) mafc.cloglog( .m = 2 ) mafc.weib( ... ) mafc.cauchit( .m = 2 ) } \arguments{ \item{.m}{is the integer number (>1) of choices (Default to 2AFC). For m = 1 (Yes/No paradigm), use one of the built-in links for the binomial family.} \item{...}{ just to pass along the formals of \code{mafc.cloglog}.}} \details{ These functions provide links for fitting psychometric functions arising from an m-alternative forced-choice experiment. The estimated coefficients of the linear predictor influence both the location and the slope of the psychometric function(s), but provide no means of estimating the upper asymptote, which is constrained to approach 1. If the upper asymptote must be estimated, it would be better to maximize the likelihood directly, either with a function like \code{optim} or \code{gnlr} from package \pkg{gnlm} (available at \url{https://www.commanster.eu/rcode.html}). Alternatively, the function \code{\link{probit.lambda}} can be used with a known upper asymptote, or \code{\link{glm.lambda}} or \code{\link{glm.WH}} to estimate one, with a probit link.
\code{mafc.weib} is just an alias for \code{mafc.cloglog}.} \value{ Each link returns a list containing functions required for relating the response to the linear predictor in generalized linear models and the name of the link. \item{linkfun }{The link function} \item{linkinv}{The inverse link function} \item{mu.eta}{The derivative of the inverse link } \item{valideta}{The domain over which the linear predictor is valid} \item{link}{A name to be used for the link } } \references{ Williams J, Ramaswamy D and Oulhaj A (2006) 10 Hz flicker improves recognition memory in older people \emph{BMC Neurosci.} 2006 5;7:21 \url{https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1434755/} (for an example developed prior to this one, but for m = 2). Klein S. A. (2001) Measuring, estimating, and understanding the psychometric function: a commentary. \emph{Percept Psychophys.}, \bold{63(8)}, 1421--1455. Wichmann, F. A. and Hill, N. J. (2001) The psychometric function: I.Fitting, sampling, and goodness of fit. \emph{Percept Psychophys.}, \bold{63(8)}, 1293--1313. Yssaad-Fesselier, R. and Knoblauch, K. (2006) Modeling psychometric functions in R. \emph{ Behav Res Methods.}, \bold{38(1)}, 28--41. (for examples with \code{gnlr}). 
} \author{Kenneth Knoblauch} \seealso{ \code{\link{family}}, \code{\link{make.link}}, \code{\link{glm}}, \code{\link{optim}}, \code{\link{probit.lambda}}, \code{\link{glm.lambda}}, \code{\link{glm.WH}}} \examples{ #A toy example, b <- 3.5 g <- 1/3 d <- 0.0 a <- 0.04 p <- c(a, b, g, d) num.tr <- 160 cnt <- 10^seq(-2, -1, length = 6) # contrast levels #simulated observer responses truep <- g + (1 - g - d) * pweibull(cnt, b, a) ny <- rbinom(length(cnt), num.tr, truep) nn <- num.tr - ny phat <- ny/(ny + nn) resp.mat <- matrix(c(ny, nn), ncol = 2) ddprob.glm <- glm(resp.mat ~ cnt, family = binomial(mafc.probit(3))) ddlog.glm <- glm(resp.mat ~ cnt, family = binomial(mafc.logit(3))) # Can fit a Weibull function, but use log contrast as variable ddweib.glm <- glm(resp.mat ~ log(cnt), family = binomial(mafc.cloglog(3))) ddcau.glm <- glm(resp.mat ~ log(cnt), family = binomial(mafc.cauchit(3))) plot(cnt, phat, log = "x", cex = 1.5, ylim = c(0, 1)) pcnt <- seq(0.01, 0.1, len = 100) lines(pcnt, predict(ddprob.glm, data.frame(cnt = pcnt), type = "response"), lwd = 2) lines(pcnt, predict(ddlog.glm, data.frame(cnt = pcnt), type = "response"), lwd = 2, lty = 2) lines(pcnt, predict(ddweib.glm, data.frame(cnt = pcnt), type = "response"), lwd = 3, col = "grey") lines(pcnt, predict(ddcau.glm, data.frame(cnt = pcnt), type = "response"), lwd = 3, col = "grey", lty = 2) # Weibull parameters \alpha and \beta cc <- coef(ddweib.glm) alph <- exp(-cc[1]/cc[2]) bet <- cc[2] #More interesting example with data from Yssaad-Fesselier and Knoblauch data(ecc2) ecc2.glm <- glm(cbind(Correct, Incorrect) ~ Contr * Size * task, family = binomial(mafc.probit(4)), data = ecc2) summary(ecc2.glm) ecc2$fit <- fitted(ecc2.glm) library(lattice) xyplot(Correct/(Correct + Incorrect) ~ Contr | Size * task, data = ecc2, subscripts = TRUE, ID = with(ecc2, Size + as.numeric(task)), scale = list(x = list(log = TRUE), y = list(limits = c(0, 1.05))), xlab = "Contrast", ylab = "Proportion Correct Response", aspect = 
"xy", panel = function(x, y, subscripts, ID, ...) { which = unique(ID[subscripts]) llines(x, ecc2$fit[which ==ID], col = "black", ...) panel.xyplot(x, y, pch = 16, ...) panel.abline(h = 0.25, lty = 2, ...) } ) } \keyword{ models } psyphy/man/dprime.mAFC.Rd0000755000175000017500000000312610626253356015044 0ustar nileshnilesh\name{dprime.mAFC} \alias{dprime.mAFC} \title{ d' for m-alternative Forced-choice } \description{ Calculate the value of \eqn{d'} for an m-alternative forced choice paradigm } \usage{ dprime.mAFC(Pc, m) } \arguments{ \item{Pc}{ The proportion of correct responses based on either the Hit rate or based on an unbiased observer} \item{m}{ The number of alternative choices, an integer > 1. } } \details{ The probability of a correct response in m-alternative forced-choice, assuming independence, is based on the product of the likelihoods of the signal alternative generating the strongest response and the m - 1 noise alternatives generating responses less than this value (Green and Dai, 1991). For a Yes-No paradigm, the sensitivity is calculated more simply as \deqn{d' = \code{qnorm}(H) - \code{qnorm}(F)} where \eqn{H} and \eqn{F} are the Hit and False Alarm rates, respectively. } \value{ Returns the value of \eqn{d'} } \references{ Green, D. M. and Dai, H. (1991) Probability of being correct with 1 of M orthogonal signals. \emph{Perception & Psychophysics}, \bold{49}, 100--101. Green, D. M. and Swets, J. A. (1966) \emph{Signal Detection Theory and Psychophysics} Robert E. Krieger Publishing Company } \author{ Kenneth Knoblauch } \note{ Currently is only valid for \eqn{d'} in the interval [-10, 10] which should be well outside the range of sensory differences that this paradigm is used to investigate. 
} \seealso{ See Also \code{\link{dprime.ABX}}, \code{\link{dprime.SD}}, \code{\link{dprime.oddity}}} \examples{ dprime.mAFC(0.8, 4) } \keyword{univar} psyphy/man/summary.lambda.Rd0000755000175000017500000000313411243524110015713 0ustar nileshnilesh\name{summary.lambda} \alias{summary.lambda} \alias{print.summary.lambda} \title{ Summary Method for Objects of Class `lambda'} \description{ Identical to \code{summary.glm} but with one line of additional output: the estimate of lambda from \code{glm.lambda}, obtained by profiling the deviance and estimating its minimum.} \usage{ \method{summary}{lambda}(object, ...) \method{print}{summary.lambda}(x, digits = max(3, getOption("digits") - 3), symbolic.cor = x$symbolic.cor, signif.stars = getOption("show.signif.stars"), ...) } \arguments{ \item{object}{ Fitted model object of class \dQuote{lambda} inheriting from \code{glm} and \code{lm}. Typically the output of \code{glm.lambda}.} \item{x}{ an object of class \dQuote{summary.lambda}, usually a result of a call to \code{summary.lambda}. } \item{digits}{ the number of significant digits to use when printing.} \item{symbolic.cor}{logical. If \code{TRUE}, print the correlations in a symbolic form (see \code{\link{symnum}}) rather than as numbers.} \item{signif.stars}{logical. If \code{TRUE}, \dQuote{significance stars} are printed for each coefficient.} \item{\dots}{further arguments passed to or from other methods. } } \details{ Provides a summary of the class \code{lambda} object generated by \code{glm.lambda}. } \value{ Returns the same structure as \code{\link{summary.glm}} with an added component, \code{lambda}. \eqn{1 - \lambda} is the estimated upper asymptote of the psychometric function. 
} \author{ Ken Knoblauch } \seealso{ \code{\link{probit.lambda}}, \code{\link{glm.lambda}} } \keyword{ methods } \keyword{ print } psyphy/man/psyfun.2asym.Rd0000644000175000017500000001045611100160626015367 0ustar nileshnilesh\name{psyfun.2asym} \alias{psyfun.2asym} \title{Fit Psychometric Functions and Upper and Lower Asymptotes} \description{ Fits psychometric functions allowing for variation of both upper and lower asymptotes. Uses a procedure that alternates between fitting the linear predictor with \code{glm} and estimating the asymptotes with \code{optim} until a minimum in -log likelihood is obtained within a tolerance. } \usage{ psyfun.2asym(formula, data, link = logit.2asym, init.g = 0.01, init.lam = 0.01, trace = FALSE, tol = 1e-06, mxNumAlt = 50, ...) } \arguments{ \item{formula}{a two sided formula specifying the response and the linear predictor } \item{data}{ a data frame within which the formula terms are interpreted } \item{link}{ a link function for the binomial family that allows specifying both upper and lower asymptotes } \item{init.g}{ numeric specifying the initial estimate for the lower asymptote } \item{init.lam}{ numeric specifying initial estimate for 1 - upper asymptote } \item{trace}{ logical indicating whether to show the trace of the minimization of -log likelihood } \item{tol}{ numeric indicating change in -log likelihood as a criterion for stopping iteration. } \item{mxNumAlt}{ integer indicating maximum number of alternations between \code{glm} and \code{optim} steps to perform if minimum not reached.} \item{\dots}{ additional arguments passed to \code{glm} } } \details{ The function is a wrapper for \code{glm} for fitting psychometric functions with the equation \deqn{ P(x) = \gamma + (1 - \gamma - \lambda) p(x) } where \eqn{\gamma} is the lower asymptote and \eqn{\lambda} is \eqn{1 - } the upper asymptote, and \eqn{p(x)} is the base psychometric function, varying between 0 and 1.
} \note{The \code{cloglog.2asym} link and its alias, \code{weib.2asym}, fail to converge on occasion. This can be detected by using the \code{trace} argument. One strategy in such cases is to modify the initial estimates. } \value{ list of class \sQuote{lambda} inheriting from classes \sQuote{glm} and \sQuote{lm} and containing the additional components \item{lambda }{numeric indicating 1 - upper asymptote} \item{gam }{numeric indicating lower asymptote} \item{SElambda }{numeric indicating the standard error estimate for lambda, based on the Hessian of the last iteration of \code{optim}. The optimization is done on the value transformed by the function \code{plogis} and the value is stored on this scale} \item{SEgam }{numeric indicating the standard error estimate for gam, estimated in the same fashion as \code{SElambda}} If a diagonal element of the Hessian is sufficiently close to 0, \code{NA} is returned. } \references{ Klein, S. A. (2001) Measuring, estimating, and understanding the psychometric function: a commentary. \emph{Percept Psychophys.}, \bold{63(8)}, 1421--1455. Wichmann, F. A. and Hill, N. J. (2001) The psychometric function: I. Fitting, sampling, and goodness of fit. \emph{Percept Psychophys.}, \bold{63(8)}, 1293--1313. 
} \author{ Kenneth Knoblauch } \seealso{ \code{\link{glm}}, \code{\link{optim}}, \code{\link{glm.lambda}}, \code{\link{mafc}} } \examples{ # A toy example set.seed(12161952) b <- 3 g <- 0.05 # simulated false alarm rate d <- 0.03 a <- 0.04 p <- c(a, b, g, d) num.tr <- 160 cnt <- 10^seq(-2, -1, length = 6) # contrast levels # simulated Weibull-Quick observer responses truep <- g + (1 - g - d) * pweibull(cnt, b, a) ny <- rbinom(length(cnt), num.tr, truep) nn <- num.tr - ny phat <- ny/(ny + nn) resp.mat <- matrix(c(ny, nn), ncol = 2) ddprob.glm <- psyfun.2asym(resp.mat ~ cnt, link = probit.2asym) ddlog.glm <- psyfun.2asym(resp.mat ~ cnt, link = logit.2asym) # Can fit a Weibull function, but use log contrast as the variable ddweib.glm <- psyfun.2asym(resp.mat ~ log(cnt), link = weib.2asym) ddcau.glm <- psyfun.2asym(resp.mat ~ cnt, link = cauchit.2asym) plot(cnt, phat, log = "x", cex = 1.5, ylim = c(0, 1)) pcnt <- seq(0.01, 0.1, len = 100) lines(pcnt, predict(ddprob.glm, data.frame(cnt = pcnt), type = "response"), lwd = 5) lines(pcnt, predict(ddlog.glm, data.frame(cnt = pcnt), type = "response"), lwd = 2, lty = 2, col = "blue") lines(pcnt, predict(ddweib.glm, data.frame(cnt = pcnt), type = "response"), lwd = 3, col = "grey") lines(pcnt, predict(ddcau.glm, data.frame(cnt = pcnt), type = "response"), lwd = 3, col = "grey", lty = 2) summary(ddprob.glm) } \keyword{ models } psyphy/man/dprime.SD.Rd0000755000175000017500000000405010626253556014603 0ustar nileshnilesh\name{dprime.SD} \alias{dprime.SD} \title{d' for Same-different Paradigm} \description{ Calculate \eqn{d'} for the same-different paradigm, assuming either a differencing strategy or independent observations } \usage{ dprime.SD(H, FA, zdiff, Pcmax, method = "diff") } \arguments{ \item{H}{numeric in [0, 1] corresponding to the Hit rate} \item{FA}{numeric in [0, 1] corresponding to the False alarm rate} \item{zdiff}{numeric. 
Difference of z-scores for the Hit and False Alarm rates (only valid for method "IO")} \item{Pcmax}{numeric in [0, 1]. Proportion correct for an unbiased observer, \code{pnorm(zdiff/2)} (only valid for method "IO"). } \item{method}{character. Specifies the model to describe the observer's criterion for dividing up the decision space, must be either "diff" for a differencing strategy (the default) or "IO" for independent observations. } } \details{Two different strategies have been described for how the observer partitions the decision space in the same-different paradigm. With Independent Observations, \eqn{d'} can be calculated either from the \code{H} and \code{FA} rates, from the difference of z-scores or from the probability correct of an unbiased observer. Only one of these three choices should be specified in the arguments. For the differencing strategy, only the first of these choices is valid. } \value{ Returns the value of \eqn{d'} } \references{Macmillan, N. A. and Creelman, C. D. (1991) \emph{Detection Theory: A User's Guide} Cambridge University Press Green, D. M. and Swets, J. A. (1966) \emph{Signal Detection Theory and Psychophysics} Robert E. Krieger Publishing Company } \author{Kenneth Knoblauch } \seealso{\code{\link{dprime.mAFC}}, \code{\link{dprime.ABX}}, \code{\link{dprime.oddity}} } \examples{ dprime.SD(H = 0.642, FA = 0.3) dprime.SD(H = 0.75, FA = 0.3, method = "IO") dprime.SD(zdiff = qnorm(0.75) - qnorm(0.3), method = "IO") dprime.SD(Pcmax = pnorm( (qnorm(0.75) - qnorm(0.3))/2 ), method = "IO") } \keyword{univar} psyphy/man/glm.lambda.Rd0000755000175000017500000000770411035506011015003 0ustar nileshnilesh\name{glm.lambda} \alias{glm.lambda} \title{ mafc Probit Fit to Psychometric Function Profiled on Upper Asymptote } \description{ A wrapper for \code{glm} in which the deviance for the model with binomial family and link \code{probit.lambda} is profiled as a function of \code{lambda}, the upper asymptote of the psychometric function. 
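% In outline (a sketch of the profiling idea, not the exact implementation):
%   for each candidate lambda:
%     fit glm(formula, binomial(probit.lambda(NumAlt, lambda)))
%   record each fit's deviance; the returned lambda minimizes a quadratic
%   fitted to the deviance profile.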
} \usage{ glm.lambda(formula, data, NumAlt = 2, lambda = seq(0, 0.1, len = 40), plot.it = FALSE, ...) } \arguments{ \item{formula}{a symbolic description of the model to be fit } \item{data}{ an optional data frame, list or environment (or object coercible by \code{\link{as.data.frame}} to a data frame) containing the variables in the model. If not found in \code{data}, the variables are taken from \code{environment(formula)}, typically the environment from which \code{glm} is called. } \item{NumAlt}{ the number of alternatives, \code{m}, in the mafc experiment from which the data arise } \item{lambda}{ a sequence of values to profile for the upper asymptote of the psychometric function } \item{plot.it}{ logical indicating whether to plot the profile of the deviances as a function of \code{lambda}} \item{\dots}{ further arguments passed to \code{glm}} } \details{ The psychometric function fit to the data is described by \deqn{P(x) = 1/m + (1 - 1/m - \lambda) \Phi(x)} where \eqn{m} is the number of alternatives, \eqn{1/m} is the lower asymptote, \eqn{1 - \lambda} is the upper asymptote and \eqn{\Phi} is the cumulative normal function. } \value{ returns an object of class \sQuote{lambda}, which inherits from classes \sQuote{glm} and \sQuote{lm}. It differs from an object of class \sQuote{glm} only in including two additional components: \code{lambda}, the estimated minimum of the profile, obtained by fitting a quadratic to the profile, and a data frame containing the profiled deviance values for each value of \code{lambda} tested. The degrees of freedom are reduced by 1 to take into account the estimation of \code{lambda}. } \references{ Wichmann, F. A. and Hill, N. J. (2001) The psychometric function: I. Fitting, sampling, and goodness of fit. \emph{Percept Psychophys.}, \bold{63(8)}, 1293--1313. Yssaad-Fesselier, R. and Knoblauch, K. (2006) Modeling psychometric functions in R. \emph{Behav Res Methods.}, \bold{38(1)}, 28--41. (for examples with \code{gnlr}). 
} \author{ Ken Knoblauch} \note{ If the minimum occurs outside the interval examined, an error might occur. In any case, re-running the function with a new range of \code{lambda} that includes the minimum should work. if the plotted profile indicates that the fitted quadratic does not describe well the profile at the minimum, refitting with a more restricted range of \code{lambda} is recommended. } \seealso{ \code{\link{mafc}}, \code{\link{glm}}, \code{\link{probit.lambda}}, \code{\link{family}}} \examples{ b <- 3.5 g <- 1/3 d <- 0.025 a <- 0.04 p <- c(a, b, g, d) num.tr <- 160 cnt <- 10^seq(-2, -1, length = 6) # contrast levels #simulated Weibull-Quick observer responses set.seed(12161952) truep <- g + (1 - g - d) * pweibull(cnt, b, a) ny <- rbinom(length(cnt), num.tr, truep) nn <- num.tr - ny phat <- ny/(ny + nn) resp.mat <- matrix(c(ny, nn), ncol = 2) ## First with upper asymptote at 1 dd.glm <- glm(resp.mat ~ cnt, family = binomial(mafc.probit(3))) summary(dd.glm) dd.lam <- glm.lambda(resp.mat ~ cnt, NumAlt = 3, lambda = seq(0, 0.03, len = 100), plot.it = TRUE) summary(dd.lam) ## can refine interval, but doesn't change result much dd.lam2 <- glm.lambda(resp.mat ~ cnt, NumAlt = 3, lambda = seq(dd.lam$lambda/sqrt(2), dd.lam$lambda*sqrt(2), len = 100), plot.it = TRUE) summary(dd.lam2) ## Compare fits w/ and w/out lambda anova(dd.glm, dd.lam2, test = "Chisq") plot(cnt, phat, log = "x", cex = 1.5, ylim = c(0, 1)) pcnt <- seq(0.01, 0.1, len = 100) lines(pcnt, predict(dd.glm, data.frame(cnt = pcnt), type = "response"), lwd = 2) lines(pcnt, predict(dd.lam, data.frame(cnt = pcnt), type = "response"), lwd = 2, lty = 2) } \keyword{models} psyphy/R/0000755000175000017500000000000014160572314012144 5ustar nileshnileshpsyphy/R/dprime.ABX.R0000755000175000017500000000226710551735704014176 0ustar nileshnilesh`dprime.ABX` <-function(Hits, FA, zdiff, Pc.unb, method = "diff") { Call <- match.call() if (pmatch("Hits", names(Call), 0) > 0) { if (pmatch("FA", names(Call), 0) > 0) { 
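        # Both Hits and FA supplied; a numeric sketch of this branch:
        # with Hits = 0.75 and FA = 0.3,
        # zdiff = qnorm(0.75) - qnorm(0.3), about 1.2, and
        # Pc.unb = pnorm(zdiff/2), about 0.73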
zdiff <- qnorm(Hits) - qnorm(FA) Pc.unb <- pnorm(zdiff/2) } else { zdiff <- qnorm(Hits) - qnorm(1-Hits) Pc.unb <- pnorm(zdiff/2) } } else { if (pmatch("zdiff", names(Call), 0) > 0) { Pc.unb <- pnorm(zdiff/2) } } if (Pc.unb < 0.5) stop("Only valid for Pc.unb > 0.5") root2 <- sqrt(2) if (method == "diff") { root6 <- sqrt(6) est.dp <- function(dp) { Pc.unb - pnorm(dp/root2)*pnorm(dp/root6) - pnorm(-dp/root2)*pnorm(-dp/root6) } dp.res <- uniroot(est.dp, interval=c(0,10)) dprime <- dp.res$root } else { if (method == "IO") { est.dp <- function(dp) { Pc.unb - pnorm(dp/root2)*pnorm(dp/2) - pnorm(-dp/root2)*pnorm(-dp/2) } dp.res <- uniroot(est.dp, interval=c(0,10)) dprime <- dp.res$root } else {stop("method not defined; must be either diff or IO") } } dprime } psyphy/R/dprime.mAFC.R0000755000175000017500000000101410551422760014312 0ustar nileshnilesh`dprime.mAFC` <- function(Pc, m) { # m an integer > 2, number of choices # Pc - probability correct choice (unbiased observer) m <- as.integer(m) if (m < 2) stop("m must be an integer greater than 1") if (!is.integer(m)) stop("m must be an integer") if (Pc <= 0 || Pc >= 1) stop ("Pc must be in (0,1)") est.dp <- function(dp){ pr <- function(x) dnorm(x-dp) * pnorm(x)^(m-1)#0-1) Pc - integrate(pr, lower = -Inf, upper = Inf)$value } dp.res <- uniroot(est.dp, interval = c(-10,10)) dp.res$root } psyphy/R/SS.calib.R0000755000175000017500000000176013751214132013670 0ustar nileshnilesh`SS.calib` <- structure(function (Blev, k, gamm, GL) { .expr1 <- GL^gamm .expr4 <- ifelse(GL == 0, 0, log(GL)) .expr5 <- .expr1 * .expr4 .value <- Blev + k * .expr1 .grad <- array(0, c(length(.value), 3), list(NULL, c("Blev", "k", "gamm"))) .hessian <- array(0, c(length(.value), 3, 3), list(NULL, c("Blev", "k", "gamm"), c("Blev", "k", "gamm"))) .grad[, "Blev"] <- 1 .grad[, "k"] <- .expr1 .hessian[, "k", "k"] <- 0 .hessian[, "k", "gamm"] <- .hessian[, "gamm", "k"] <- .expr5 .grad[, "gamm"] <- k * .expr5 .hessian[, "gamm", "gamm"] <- k * (.expr5 * .expr4) 
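    # The model is Lum = Blev + k * GL^gamm; e.g. (sketch, made-up values)
    # SS.calib(Blev = 0.2, k = 60, gamm = 2.4, GL = 1) evaluates to 60.2,
    # with the gradient and Hessian attached as attributes (next lines)
    # so that nls() can use analytic derivatives.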
attr(.value, "gradient") <- .grad attr(.value, "hessian") <- .hessian .value } , initial = function(mCall, data, LHS, ...) { xy <- sortedXyData(mCall[["GL"]], LHS, data) Blev <- min(xy[["y"]]) k <- max(xy[["y"]]) gamm <- 2.5 value <- c(Blev, k, gamm) names(value) <- mCall[c("Blev", "k", "gamm")] value } , pnames = c("Blev", "k", "gamm"), class = "selfStart") psyphy/R/glm.WH.R0000755000175000017500000000242210622674002013362 0ustar nileshnileshglm.WH <- function(formula, data, NumAlt = 2, lambda.init = 0.01, interval = c(0, 0.05), trace = FALSE, tol = 1e-6, ...) { p.l <- function(lambda = lambda, m = NumAlt) { probit.lambda(m, lambda) } if (missing(data)) data <- environment(formula) est.glm <- glm(formula, family = binomial(probit.lambda(NumAlt, lambda.init)), data = data, ...) ll <- function(lam, NumAlt, x) { rr <- x$model[[1]] -sum(rr[, 1] * log(p.l(lam, NumAlt)$linkinv(x$linear.predictors)) + rr[, 2] * log(1 - p.l(lam, NumAlt)$linkinv(x$linear.predictors))) } dlogcur <- dd <- -as.vector(logLik(est.glm)) new.glm <- est.glm n <- 0 while (dd > tol) { n <- n + 1 lam.c <- optimize(ll, interval = interval, NumAlt = NumAlt, x = new.glm)$minimum new.glm <- glm(formula, family = binomial(probit.lambda(NumAlt, lam.c)), data = data, ...) 
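        # (sketch of the alternation) each pass re-estimates lambda with
        # optimize() and refits the glm using probit.lambda(NumAlt, lam.c);
        # the loop exits once the change in -log likelihood falls below tol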
dd <- abs(-as.vector(logLik(new.glm)) - dlogcur) dlogcur <- -as.vector(logLik(new.glm)) if (trace) print(data.frame(n = n, logLik = dd, lambda = lam.c)) } new.glm$lambda <- lam.c cat("lambda = \t", lam.c, "\n") new.glm$df.residual <- new.glm$df.residual - 1 class(new.glm) <- c("lambda", "glm", "lm") new.glm } psyphy/R/SS.RGBcalib.R0000755000175000017500000000407613751214151014227 0ustar nileshnilesh`SS.RGBcalib` <-structure(function (Blev, Br, Bg, Bb, gamm, Rgun, Ggun, Bgun) { .expr1 <- Rgun^gamm .expr4 <- Ggun^gamm .expr7 <- Bgun^gamm .expr10 <- ifelse(Rgun == 0, 0, log(Rgun)) .expr11 <- .expr1 * .expr10 .expr12 <- ifelse(Ggun == 0, 0, log(Ggun)) .expr13 <- .expr4 * .expr12 .expr14 <- ifelse(Bgun == 0, 0, log(Bgun)) .expr15 <- .expr7 * .expr14 .value <- Blev + Br * .expr1 + Bg * .expr4 + Bb * .expr7 .grad <- array(0, c(length(.value), 5), list(NULL, c("Blev", "Br", "Bg", "Bb", "gamm"))) .hessian <- array(0, c(length(.value), 5, 5), list(NULL, c("Blev", "Br", "Bg", "Bb", "gamm"), c("Blev", "Br", "Bg", "Bb", "gamm"))) .grad[, "Blev"] <- 1 .grad[, "Br"] <- .expr1 .hessian[, "Br", "Br"] <- 0 .hessian[, "Br", "Bg"] <- .hessian[, "Bg", "Br"] <- 0 .hessian[, "Br", "Bb"] <- .hessian[, "Bb", "Br"] <- 0 .hessian[, "Br", "gamm"] <- .hessian[, "gamm", "Br"] <- .expr11 .grad[, "Bg"] <- .expr4 .hessian[, "Bg", "Bg"] <- 0 .hessian[, "Bg", "Bb"] <- .hessian[, "Bb", "Bg"] <- 0 .hessian[, "Bg", "gamm"] <- .hessian[, "gamm", "Bg"] <- .expr13 .grad[, "Bb"] <- .expr7 .hessian[, "Bb", "Bb"] <- 0 .hessian[, "Bb", "gamm"] <- .hessian[, "gamm", "Bb"] <- .expr15 .grad[, "gamm"] <- Br * .expr11 + Bg * .expr13 + Bb * .expr15 .hessian[, "gamm", "gamm"] <- Br * (.expr11 * .expr10) + Bg * (.expr13 * .expr12) + Bb * (.expr15 * .expr14) attr(.value, "gradient") <- .grad attr(.value, "hessian") <- .hessian .value } , initial = function(mCall, data, LHS, ...) 
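# (sketch) self-start initializer for nls: Blev from the minimum luminance,
# each gun coefficient from the maximum luminance with that gun on, and
# gamm started at a typical CRT value of 2.5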
{ Lum <- eval(asOneSidedFormula("Lum")[[2]], data) Rgun <- eval(asOneSidedFormula("Rgun")[[2]], data) Ggun <- eval(asOneSidedFormula("Ggun")[[2]], data) Bgun <- eval(asOneSidedFormula("Bgun")[[2]], data) Blev <- min(Lum) Br <- max(Lum[Rgun > 0]) Bg <- max(Lum[Ggun > 0]) Bb <- max(Lum[Bgun > 0]) gamm <- 2.5 value <- c(Blev, Br, Bg, Bb, gamm) names(value) <- mCall[c("Blev", "Br", "Bg", "Bb", "gamm")] value } , pnames = c("Blev", "Br", "Bg", "Bb", "gamm"), class = "selfStart") psyphy/R/dprime.oddity.R0000755000175000017500000000062210551422760015044 0ustar nileshnilesh`dprime.oddity` <- function(Pc.tri) { if (Pc.tri < 1/3) stop("Only valid for Pc.tri > 1/3") root3 <- sqrt(3) root2.3 <- sqrt(2)/root3 est.dp <- function(dp) { pr <- function(x) { 2 *(pnorm(-x * root3 + dp * root2.3) + pnorm(-x * root3 - dp * root2.3)) * dnorm(x) } Pc.tri - integrate(pr, lower=0, upper=Inf)$value } dp.res <- uniroot(est.dp, interval = c(0,10)) dp.res$root } psyphy/R/probit.lambda.R0000755000175000017500000000126110621530254015003 0ustar nileshnileshprobit.lambda <- function (m = 2, lambda = 0) { m <- as.integer(m) if (m < 2) stop("m must be an integer > 1") linkfun <- function(mu) { mu <- pmax(mu, 1/m + .Machine$double.eps) mu <- pmin(mu, 1 - lambda) qnorm((m * mu - 1)/(m * (1 - lambda) - 1)) } linkinv <- function(eta) { 1/m + ((m - 1)/m -lambda) * pnorm(eta) } mu.eta <- function(eta) ((m - 1)/m - lambda) * dnorm(eta) valideta <- function(eta) TRUE link <- paste("probit.lambda(", m, ",", lambda, ")", sep = "") structure(list(linkfun = linkfun, linkinv = linkinv, mu.eta = mu.eta, valideta = valideta, name = link), class = "link-glm") } psyphy/R/psyfun.2asym.R0000644000175000017500000000363011100160136014631 0ustar nileshnileshpsyfun.2asym <- function(formula, data, link = logit.2asym, init.g = 0.01, init.lam = 0.01, trace = FALSE, tol = 1e-6, mxNumAlt = 50, ...) 
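# Typical call (sketch; resp.mat a hypothetical two-column matrix of
# successes and failures, cnt a covariate):
#   psyfun.2asym(resp.mat ~ cnt, link = probit.2asym, trace = TRUE)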
{ p.l <- function(p) { link(p[1], p[2]) } if (missing(data)) data <- environment(formula) est.glm <- glm(formula, family = binomial(link(g = init.g, lam = init.lam)), data = data, ...) ll <- function(p, x) { p <- plogis(p) rr <- x$model[[1]] -sum(rr[, 1] * log(p.l(p)$linkinv(x$linear.predictors)) + rr[, 2] * log(1 - p.l(p)$linkinv(x$linear.predictors))) } dlogcur <- dd <- -as.vector(logLik(est.glm)) new.glm <- est.glm n <- 0 p <- c(init.g, init.lam) while (dd > tol) { n <- n + 1 p <- qlogis(p) p.opt <- optim(p, ll, x = new.glm, hessian = TRUE) p <- plogis(p.opt$par) new.glm <- glm(formula, family = binomial(link(g = p[1], lam = p[2])), data = data, ...) dd <- abs(-as.vector(logLik(new.glm)) - dlogcur)/dlogcur dlogcur <- -as.vector(logLik(new.glm)) if (trace) print(data.frame(n = n, logLik = dd, lambda = p[2], gamma = p[1])) if (n > mxNumAlt) { print("Number of iterations exceeded without finding best fit. \n") break} } p.svd <- svd(p.opt$hessian) SEp <- sqrt(diag(with(p.svd, v %*% diag(1/ifelse(zapsmall(d), d, Inf)) %*% t(u)))) # X <<- SEp SEp[zapsmall(SEp) == 0] <- NA new.glm$lambda <- p[2] new.glm$SElambda <- SEp[2] new.glm$gam <- p[1] new.glm$SEgam <- SEp[1] cat("lambda = \t", p[2], "\t", "gamma = ", p[1], "\n") cat("+/-SE(lambda) = \t(", plogis(qlogis(p[2]) + c(-SEp[2], SEp[2])), ")\n") cat("+/-SE(gamma) = \t(", plogis(qlogis(p[1]) + c(-SEp[1], SEp[1])), ")\n") new.glm$df.residual <- new.glm$df.residual - 2 new.glm$call[[3]][[2]][[1]] <- as.name(substitute(link)) class(new.glm) <- c("lambda", "glm", "lm") new.glm }psyphy/R/gl.links.R0000644000175000017500000000573312005244710014010 0ustar nileshnileshlogit.2asym <- function(g, lam) { if ((g < 0 ) || (g > 1)) stop("g must in (0, 1)") if ((lam < 0) || (lam > 1)) stop("lam outside (0, 1)") linkfun <- function(mu) { mu <- pmin(mu, 1 - (lam + .Machine$double.eps)) mu <- pmax(mu, g + .Machine$double.eps) qlogis((mu - g)/(1 - g - lam)) } linkinv <- function(eta) { g + (1 - g - lam) * binomial()$linkinv(eta) # 
.Call("logit_linkinv", eta, PACKAGE = "stats") } mu.eta <- function(eta) { (1 - g - lam) * binomial()$mu.eta(eta) # .Call("logit_mu_eta", eta, PACKAGE = "stats") } valideta <- function(eta) TRUE link <- paste("logit.2asym(", g, ", ", lam, ")", sep = "") structure(list(linkfun = linkfun, linkinv = linkinv, mu.eta = mu.eta, valideta = valideta, name = link), class = "link-glm") } probit.2asym <- function(g, lam) { if ((g < 0 ) || (g > 1)) stop("g must in (0, 1)") if ((lam < 0) || (lam > 1)) stop("lam outside (0, 1)") linkfun <- function(mu) { mu <- pmin(mu, 1 - (lam + .Machine$double.eps)) mu <- pmax(mu, g + .Machine$double.eps) qnorm((mu - g)/(1 - g - lam)) } linkinv <- function(eta) { g + (1 - g - lam) * pnorm(eta) } mu.eta <- function(eta) { (1 - g - lam) * dnorm(eta) } valideta <- function(eta) TRUE link <- paste("probit.2asym(", g, ", ", lam, ")", sep = "") structure(list(linkfun = linkfun, linkinv = linkinv, mu.eta = mu.eta, valideta = valideta, name = link), class = "link-glm") } cauchit.2asym <- function(g, lam) { if ((g < 0 ) || (g > 1)) stop("g must in (0, 1)") if ((lam < 0) || (lam > 1)) stop("lam outside (0, 1)") linkfun <- function(mu) { mu <- pmin(mu, 1 - (lam + .Machine$double.eps)) mu <- pmax(mu, g + .Machine$double.eps) qcauchy((mu - g)/(1 - g - lam)) } linkinv <- function(eta) { g + (1 - g - lam) * pcauchy(eta) } mu.eta <- function(eta) { (1 - g - lam) * dcauchy(eta) } valideta <- function(eta) TRUE link <- paste("cauchit.2asym(", g, ", ", lam, ")", sep = "") structure(list(linkfun = linkfun, linkinv = linkinv, mu.eta = mu.eta, valideta = valideta, name = link), class = "link-glm") } cloglog.2asym <- function(g, lam) { if ((g < 0 ) || (g > 1)) stop("g must in (0, 1)") if ((lam < 0) || (lam > 1)) stop("lam outside (0, 1)") linkfun <- function(mu) { mu <- pmax(pmin(mu, 1 - (lam + .Machine$double.eps)), g + .Machine$double.eps) log(-log((mu - g)/(1 - g - lam))) } linkinv <- function(eta) { tmp <- g + (1 - g - lam) * (-expm1(-exp(eta))) pmax(pmin(tmp, 1 
- (lam + .Machine$double.eps)), g + .Machine$double.eps) } mu.eta <- function(eta) { eta <- pmin(eta, 700) pmax((1 - g - lam) * exp(eta) * exp(-exp(eta)), .Machine$double.eps) } valideta <- function(eta) TRUE link <- paste("cloglog.2asym(", g, ", ", lam, ")", sep = "") structure(list(linkfun = linkfun, linkinv = linkinv, mu.eta = mu.eta, valideta = valideta, name = link), class = "link-glm") } weib.2asym <- function(...) cloglog.2asym(...) psyphy/R/dprime.SD.R0000755000175000017500000000175411602603625014064 0ustar nileshnilesh`dprime.SD` <- function (H, FA, zdiff, Pcmax, method = "diff") { if (method == "diff") { if (isTRUE(all.equal(H, FA))) return(0) root2 <- sqrt(2) k <- root2 * qnorm(FA/2) est.dp <- function(dp) { H - pnorm((k + dp)/root2) - pnorm((k - dp)/root2) } dp.res <- uniroot(est.dp, interval = c(0, 10)) dprime <- dp.res$root } else if (method == "IO") { Call <- match.call() if (pmatch("H", names(Call), 0) > 0) { if (pmatch("FA", names(Call), 0) > 0) { zdiff <- qnorm(H) - qnorm(FA) Pcmax <- pnorm(zdiff/2) } else { zdiff <- qnorm(H) - qnorm(1-H) Pcmax <- pnorm(zdiff/2) } } else { if (pmatch("zdiff", names(Call), 0) > 0) { Pcmax <- pnorm(zdiff/2) } } dprime <- sign(Pcmax - 0.5) * if ( Pcmax < 0.5 ) 2 * qnorm(0.5 * (1 + sqrt(2 * (1 - Pcmax) - 1))) else 2 * qnorm(0.5 * (1 + sqrt(2 * Pcmax - 1))) #2 * qnorm(0.5 * (1 + sqrt(2*Pcmax - 1))) } else {stop("method must be one of diff or IO") } if(dprime < 0) warning("FA > H giving d' < 0!") dprime } psyphy/R/glm.lambda.R0000755000175000017500000001062611600600676014274 0ustar nileshnileshglm.lambda <- function(formula, data, NumAlt = 2, lambda = seq(0, 0.1, len = 40), plot.it = FALSE, ...) { if (missing(data)) data <- environment(formula) lam.lst <- vector("list", length(lambda)) for (ix in seq(length(lambda))) { lam.lst[[ix]] <- glm(formula, family = binomial(probit.lambda(NumAlt, lambda[ix])), data = data, ...) 
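        # (sketch) one binomial fit per candidate lambda; the deviance of
        # each fit is extracted after the loop to form the profile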
} dev <- unlist(sapply(lam.lst, deviance)) n <- 5 mdev <- which.min(dev) if (mdev > n) { ss <- seq(-n, n) + mdev } else { ss <- seq(1, 2*n+1) } lam.quad <- lm(dev[ss] ~ lambda[ss] + I(lambda[ss]^2)) lmin <- -0.5 * coef(lam.quad)[2]/coef(lam.quad)[3] res <- glm(formula, family = binomial(probit.lambda(NumAlt, lmin)), data = data, ...) class(res) <- c("lambda", "glm", "lm") res$lambda <- as.vector(lmin) res$df.residual <- res$df.residual - 1 res$profile <- data.frame(lambda = lambda, deviance = dev) if (plot.it) { plot(lambda, dev, xlab = expression(lambda), cex = 0.2, ylab = "Deviance", cex.lab = 1.35, type = "p") if (length(lambda[ss]) == length(predict(lam.quad))) { lines(lambda[ss], predict(lam.quad), lwd = 2) # lines(lambda, predict(lam.quad), lwd = 2) abline(v = lmin, lty = 3) } } cat("\n", "lambda = ", lmin, "\n") res } summary.lambda <- function(object, ...) { ans <- summary.glm(object, ...) ans$lambda <- object$lambda if (!is.null(object$gam)) ans$gamma <- object$gam if(!is.null(object$SEgam)) ans$SEgam <- object$SEgam if(!is.null(object$SElambda)) ans$SElambda <- object$SElambda class(ans) <- c("summary.lambda") return(ans) } print.summary.lambda <- function(x, digits = max(3, getOption("digits") - 3), symbolic.cor = x$symbolic.cor, signif.stars = getOption("show.signif.stars"), ...) 
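# (sketch) follows the layout of print.summary.glm, then appends the
# lambda (and, if present, gamma) estimates with approximate SE intervals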
{ cat("\nCall:\n") cat(paste(deparse(x$call), sep = "\n", collapse = "\n"), "\n\n", sep = "") cat("Deviance Residuals: \n") if (x$df.residual > 5) { x$deviance.resid <- quantile(x$deviance.resid, na.rm = TRUE) names(x$deviance.resid) <- c("Min", "1Q", "Median", "3Q", "Max") } print.default(x$deviance.resid, digits = digits, na.print = "", print.gap = 2) if (length(x$aliased) == 0) { cat("\nNo Coefficients\n") } else { if (!is.null(df <- x$df) && (nsingular <- df[3] - df[1])) cat("\nCoefficients: (", nsingular, " not defined because of singularities)\n", sep = "") else cat("\nCoefficients:\n") coefs <- x$coefficients if (!is.null(aliased <- x$aliased) && any(aliased)) { cn <- names(aliased) coefs <- matrix(NA, length(aliased), 4, dimnames = list(cn, colnames(coefs))) coefs[!aliased, ] <- x$coefficients } printCoefmat(coefs, digits = digits, signif.stars = signif.stars, na.print = "NA", ...) } cat("\n(Dispersion parameter for ", x$family$family, " family taken to be ", format(x$dispersion), ")\n\n", apply(cbind(paste(format(c("Null", "Residual"), justify = "right"), "deviance:"), format(unlist(x[c("null.deviance", "deviance")]), digits = max(5, digits + 1)), " on", format(unlist(x[c("df.null", "df.residual")])), " degrees of freedom\n"), 1, paste, collapse = " "), sep = "") if (nchar(mess <- naprint(x$na.action))) cat(" (", mess, ")\n", sep = "") cat("AIC: ", format(x$aic, digits = max(4, digits + 1)), "\n\n", "Number of Fisher Scoring iterations: ", x$iter, "\n", sep = "") correl <- x$correlation if (!is.null(correl)) { p <- NCOL(correl) if (p > 1) { cat("\nCorrelation of Coefficients:\n") if (is.logical(symbolic.cor) && symbolic.cor) { print(symnum(correl, abbr.colnames = NULL)) } else { correl <- format(round(correl, 2), nsmall = 2, digits = digits) correl[!lower.tri(correl)] <- "" print(correl[-1, -p, drop = FALSE], quote = FALSE) } } } cat("\n") cat("lambda\t", round(x$lambda, digits), "\t") if (!is.null(x$gam)) cat("gamma\t", round(x$gam, digits), "\n") else 
cat("\n") if (!is.null(x$SEgam)) { cat("+/-SE(lambda) = \t(", plogis(qlogis(x$lambda) + c(-x$SElambda, x$SElambda)), ")\n") cat("+/-SE(gamma) = \t(", plogis(qlogis(x$gam) + c(-x$SEgam, x$SEgam)), ")\n") } invisible(x) } psyphy/R/mafc.R0000755000175000017500000000561012005244554013200 0ustar nileshnilesh`mafc.logit` <- function( .m = 2 ) { .m <- as.integer(.m) if (.m < 2) stop(".m must be an integer > 1") linkfun <- function(mu) { mu <- pmax(mu, 1/.m + .Machine$double.eps) qlogis((.m * mu - 1)/(.m - 1) ) } linkinv <- function(eta) { 1/.m + (.m - 1)/.m * binomial()$linkinv(eta) # .Call("logit_linkinv", eta, PACKAGE = "stats") } mu.eta <- function(eta) ((.m -1) / .m) * binomial()$mu.eta(eta) # .Call("logit_mu_eta", eta, PACKAGE = "stats") valideta <- function(eta) TRUE link <- paste("mafc.logit(", .m, ")", sep = "") structure(list(linkfun = linkfun, linkinv = linkinv, mu.eta = mu.eta, valideta = valideta, name = link), class = "link-glm") } `mafc.probit` <- function( .m = 2 ) { .m <- as.integer(.m) if (.m < 2) stop("m must be an integer > 1") linkfun <- function(mu) { mu <- pmax(mu, 1/.m + .Machine$double.eps) qnorm((.m * mu - 1)/(.m - 1) ) } linkinv <- function(eta) { 1/.m + (.m - 1)/.m * pnorm(eta) } mu.eta <- function(eta) ((.m -1) / .m) * dnorm(eta) valideta <- function(eta) TRUE link <- paste("mafc.probit(", .m, ")", sep = "") structure(list(linkfun = linkfun, linkinv = linkinv, mu.eta = mu.eta, valideta = valideta, name = link), class = "link-glm") } `mafc.cloglog` <- function( .m = 2 ) { .m <- as.integer(.m) if (.m < 2) stop(".m must be an integer > 1") linkfun <- function(mu) { mu <- pmax(pmin(mu, 1 - .Machine$double.eps), 1/.m + .Machine$double.eps) log(-log((.m - mu * .m)/(.m - 1))) } linkinv <- function(eta) { tmp <- 1/.m + ((.m - 1)/.m) * (-expm1(-exp(eta))) pmax(pmin(tmp, 1 - .Machine$double.eps), 1/.m + .Machine$double.eps) } mu.eta <- function(eta) { eta <- pmin(eta, 700) pmax(((.m - 1)/.m) * exp(eta) * exp(-exp(eta)), .Machine$double.eps) } valideta <- 
function(eta) TRUE link <- paste("mafc.cloglog(", .m, ")", sep = "") structure(list(linkfun = linkfun, linkinv = linkinv, mu.eta = mu.eta, valideta = valideta, name = link), class = "link-glm") } `mafc.weib` <- function(...) mafc.cloglog(...) `mafc.cauchit` <- function( .m = 2) { .m <- as.integer(.m) if (.m < 2) stop(".m must be an integer > 1") linkfun <- function(mu) { mu <- pmax(mu, 1/.m + .Machine$double.eps) qcauchy((.m * mu - 1)/(.m - 1)) } linkinv <- function(eta) { thresh <- -qcauchy(.Machine$double.eps) eta <- pmin(pmax(eta, -thresh), thresh) 1/.m + (.m - 1)/.m * pcauchy(eta) } mu.eta <- function(eta) { pmax(((.m - 1)/.m)* dcauchy(eta), .Machine$double.eps) } valideta <- function(eta) TRUE link <- paste("mafc.cauchy(", .m, ")", sep = "") structure(list(linkfun = linkfun, linkinv = linkinv, mu.eta = mu.eta, valideta = valideta, name = link), class = "link-glm") } psyphy/inst/0000755000175000017500000000000014160571001012710 5ustar nileshnileshpsyphy/inst/NEWS0000755000175000017500000000425214160571001013415 0ustar nileshnileshv0.0-6 - added an mafc link for the cloglog link, which permits fitting Weibull psychometric functions on a log stimulus axis. See example(mafc). v0.0-7 - added mafc.weib link that is just an alias for mafc.cloglog - added mafc.cauchit link v0.0-8 - added additional links that include parameters for both upper and lower asymptotes and a function, psyfun.2asym, that is a wrapper for glm and uses the new links to fit a psychometric function with variable lower and upper asymptotes. v0.0-9 - added standard error estimates based on the Hessian from last iteration of optim to the output of psyfun.2asym (suggested by Ralph Pirow). v0.1-0 - second graphic of mafc example set to aspect = "xy" for better rendition - modified column names of data set ecc2 from nyes, nno to Correct and Incorrect, better reflecting the meaning of these variables. v0.1-1 - fixed an unmatched parenthesis in summary.lambda.Rd - fixed a double dot '..' 
in the name of an Rd file that did not belong v0.1-2 - fixed some warnings for Rd files v0.1-3 - dprime.SD was failing for small values of H and FA when equal (thanks to Paul Willis) v0.1-4 - the argument name of the mafc links (mafc.logit, mafc.probit, etc.) has been changed from m to .m. This should permit these links to work with the glmer function in the lme4a, the development version of lme4. - dealt with some warnings concerning potential partial argument matching v0.1-5 - added possibility for dprime.SD to give negative values (with a warning) for the IO method (suggested by Sverre Stausland) - increased the range of possible d' values obtainable from dprime.SD to 10 (was 5) for method "diff". v0.1-6 - removed LazyLoad from Description v0.1-7 - substituted binomial()$linkinv and binomial()$mu.eta for uses of .Call in mafc and mafc.logit and logit.2asym link functions. v0.1-7 - eliminated some triple colons from examples v0.2-0 - fixed title case per CRAN standards v0.2-1 - added ... to initial functions of SS.calib and SS.RGBcalib to accomadate passing trace and control arguments for nls as required for R > 4.0.3. v0.2-3 - changed URL's to DOI's for JSS citations psyphy/inst/TODO0000755000175000017500000000016211035230175013404 0ustar nileshnilesh- fit Rosa's example with predicted instead of fitted - see if fits using cloglog.2asym can be made more stable. psyphy/data/0000755000175000017500000000000012547510242012653 5ustar nileshnileshpsyphy/data/RGB.rda0000755000175000017500000000144510551452552013766 0ustar nileshnileshOHTAg7 EfToɢ-2p7%.ZuY5тx\]y :(JiQޛ|Iyy7o~;i%:13@ƍc,YX3i C̝lŻ3<WCqh3ц4VM^Qga_A啕؂ۂs_XYJOa(PIwc9-2ZhqnSm݅N|`)Pk=:ܤpui՛MK &hUZ-l--\-<-u&š @P `p08 C!`0 Caaaaaaaaaa }l相7/E0[  psyphy/data/ecc2.rda0000755000175000017500000000132011133363366014161 0ustar nileshnileshՕOHTQƯoSR ,u{lrf?4M:ЛGYТL !6bᢢM`YDEh"ZDPֹ̹~]77MVۚ+0! 
L>2(Q_¡µTU[[oOy࣬*] *}y~׻2pNgHf%g;jx|u&}4y[?],G/cڇ> {-+>&p:759C`B>M;v)VjgFHZicCVHK߱ t# EHCR:˽%\uIf?uk8sKu=뜩ny3+uq&<5_MڝE _{!T[ND\WToN*CZsu. RWL*.9sǂ1r c'ԾЉ0y q+f?mA'XnSޜrsTgxژXx֦FZm*83,8/\5\ \-2$2$2$2$2$2$2$2$2$2$2Ld0a"D &2LdȰa!B 2,dXȰTz/On}psyphy/NAMESPACE0000755000175000017500000000127113751217010013160 0ustar nileshnileshexport(SS.RGBcalib, SS.calib, dprime.ABX, dprime.SD, dprime.mAFC, dprime.oddity, glm.WH, glm.lambda, mafc.probit, mafc.logit, mafc.cloglog, mafc.weib, mafc.cauchit, probit.lambda, logit.2asym, probit.2asym, cauchit.2asym, cloglog.2asym, weib.2asym, psyfun.2asym) S3method(print, summary.lambda) S3method(summary, lambda) importFrom("graphics", "abline", "lines") importFrom("stats", "binomial", "coef", "dcauchy", "deviance", "dnorm", "glm", "integrate", "lm", "logLik", "naprint", "optim", "optimize", "pcauchy", "plogis", "pnorm", "predict", "printCoefmat", "qcauchy", "qlogis", "qnorm", "quantile", "summary.glm", "symnum", "uniroot")