* using log directory 'd:/RCompile/CRANincoming/R-devel/VGAMextra.Rcheck'
* using R Under development (unstable) (2025-02-07 r87707 ucrt)
* using platform: x86_64-w64-mingw32
* R was compiled by
    gcc.exe (GCC) 13.3.0
    GNU Fortran (GCC) 13.3.0
* running under: Windows Server 2022 x64 (build 20348)
* using session charset: UTF-8
* checking for file 'VGAMextra/DESCRIPTION' ... OK
* this is package 'VGAMextra' version '0.0-7'
* checking CRAN incoming feasibility ... [17s] OK
* checking package namespace information ... OK
* checking package dependencies ... OK
* checking if this is a source package ... OK
* checking if there is a namespace ... OK
* checking for hidden files and directories ... OK
* checking for portable file names ... OK
* checking whether package 'VGAMextra' can be installed ... OK
* checking installed package size ... OK
* checking package directory ... OK
* checking for future file timestamps ... OK
* checking DESCRIPTION meta-information ... OK
* checking top-level files ... OK
* checking for left-over files ... OK
* checking index information ... OK
* checking package subdirectories ... OK
* checking code files for non-ASCII characters ... OK
* checking R files for syntax errors ... OK
* checking whether the package can be loaded ... OK
* checking whether the package can be loaded with stated dependencies ... OK
* checking whether the package can be unloaded cleanly ... OK
* checking whether the namespace can be loaded with stated dependencies ... OK
* checking whether the namespace can be unloaded cleanly ... OK
* checking loading without being on the library search path ... OK
* checking whether startup messages can be suppressed ... OK
* checking use of S3 registration ... OK
* checking dependencies in R code ... OK
* checking S3 generic/method consistency ... OK
* checking replacement functions ... OK
* checking foreign function calls ... OK
* checking R code for possible problems ... [28s] OK
* checking Rd files ... OK
* checking Rd metadata ... OK
* checking Rd line widths ... OK
* checking Rd cross-references ... NOTE
Found the following Rd file(s) with Rd \link{} targets missing package
anchors:
  gen.betaIImr.Rd: Links, CommonVGAMffArguments, vglmff-class, vglm, vgam,
    betaff, betaII, dagum, sinmad, fisk, lomax, inv.lomax, paralogistic,
    inv.paralogistic
  invgamma2mr.Rd: CommonVGAMffArguments, vglmff-class, vglm, vgam
  invweibull2mr.Rd: CommonVGAMffArguments
  trunclognormalff.Rd: vglmff-class, vglm, vgam
  truncnormalff.Rd: vglmff-class, vglm, vgam
Please provide package anchors for all Rd \link{} targets not in the
package itself and the base packages.
* checking for missing documentation entries ... OK
* checking for code/documentation mismatches ... OK
* checking Rd \usage sections ... OK
* checking Rd contents ... OK
* checking for unstated dependencies in examples ... OK
* checking contents of 'data' directory ... OK
* checking data for non-ASCII characters ... OK
* checking data for ASCII and uncompressed saves ... OK
* checking installed files from 'inst/doc' ... OK
* checking examples ... [11s] ERROR
Running examples in 'VGAMextra-Ex.R' failed
The error most likely occurred in:

> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: VGAMextra-package
> ### Title: Additions and extensions of the VGAM package.
> ### Aliases: VGAMextra-package VGAMextra
> ### Keywords: package time series mean modelling quantile modelling
>
> ### ** Examples
>
> ##### EXAMPLE 1. An AR(1) model with ARCH(1) errors.
> # Chan et.al. (2013) proposed a long and technical methodology to
> # estimate the tail index of an AR(1) with ARCH(1) errors involving
> # its estimation by QMLE. I fit this model straightforwardly by MLE
> # using the family function ARXff() for time series, and constraining
> # the effect of Y^2_{t - 1} to the conditional variance using
> # constraint matrices.
>
>
> # Generate some data
> set.seed(1)
> nn <- ceiling(runif(1, 150, 160))
> my.rho <- rhobitlink(-1.0, inverse = TRUE)  # -0.46212
> my.mu <- 0.0
> my.omega <- 1
> my.b <- 0.5
> tsdata <- data.frame(x2 = sort(runif(n = nn)))
> tsdata <- transform(tsdata, index = 1:nn, TS1 = runif(nn))
>
> for (ii in 2:nn)
+   tsdata$TS1[ii] <- my.mu + my.rho * tsdata$TS1[ii-1] +
+     sqrt(my.omega + my.b * (tsdata$TS1[ii-1])^2) * rnorm(1)
>
> # Remove the burn-in data:
> nnr <- ceiling(nn/5)
> tsdata <- tsdata[-(1:nnr), ]
> tsdata["index"] <- 1:(nn - nnr)
>
> # The constraint matrices, to inhibit the effect of Y^2_{t - 1}
> # over sigma^2 only.
> const.mat <- list('(Intercept)' = diag(3), 'TS1l1sq' = cbind(c(0, 1, 0)))
>
> # Set up the data using function WN.lags() from VGAMextra to generate
> # our 'explanatory'
> tsdata <- transform(tsdata, TS1l1sq = WN.lags(y = cbind(tsdata[, "TS1"])^2, lags = 1))
>
> # Fitting the model
> fit.Chan.etal <- vglm(TS1 ~ TS1l1sq, ARXff(order = 1,   # AR order
+                       zero = NULL, noChecks = FALSE,
+                       var.arg = TRUE, lvar = "identitylink"),
+                       crit = "loglikelihood", trace = TRUE,
+                       constraints = const.mat, data = tsdata)  ## Constraints...
Iteration 1: loglikelihood = -217.71018
Iteration 2: loglikelihood = -216.35111
Iteration 3: loglikelihood = -216.10699
Iteration 4: loglikelihood = -216.07562
Iteration 5: loglikelihood = -216.07187
Iteration 6: loglikelihood = -216.07142
Iteration 7: loglikelihood = -216.07137
Iteration 8: loglikelihood = -216.07137
Iteration 9: loglikelihood = -216.07137

Checks on stationarity / invertibility successfully performed.
No roots lying inside the unit circle.
Further details within the 'summary' output.

> summary(fit.Chan.etal, lrt0 = TRUE, score0 = TRUE, wald0 = TRUE)

Checks on stationarity / invertibility successfully performed.
No roots lying inside the unit circle.
Further details within the 'summary' output.

Checks on stationarity / invertibility successfully performed.
No roots lying inside the unit circle.
Further details within the 'summary' output.

Checks on stationarity / invertibility successfully performed.
No roots lying inside the unit circle.
Further details within the 'summary' output.

Call:
vglm(formula = TS1 ~ TS1l1sq, family = ARXff(order = 1, zero = NULL,
    noChecks = FALSE, var.arg = TRUE, lvar = "identitylink"),
    data = tsdata, constraints = const.mat, crit = "loglikelihood",
    trace = TRUE)

Likelihood ratio test coefficients:
        Estimate z value Pr(>|z|)
TS1l1sq   0.3721   3.149  0.00164 **

Rao score test coefficients:
        Estimate Std. Error z value Pr(>|z|)
TS1l1sq   0.3721     0.1329   4.515 6.34e-06 ***

Wald (modified by IRLS iterations) coefficients:
        Estimate Std. Error z value Pr(>|z|)
TS1l1sq   0.3721     0.1329     2.8  0.00511 **
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Names of linear predictors: ARdrift1, noiseVar1, ARcoeff11

Log-likelihood: -216.0714 on 362 degrees of freedom

Number of Fisher scoring iterations: 9

-----

** Standard errors based on the asymptotic distribution of the MLE estimates:

      ARcoeff1  drift
        -0.530  0.164
s.e.     0.081  0.212

Estimated linear predictor of sigma^2 (SD errors):
(Intercept)     TS1l1sq
     1.2440      0.3721

Loglikelihood: -216.071
AIC: 438.143, AICc: 438.346, BIC: 446.555

-----

** Summary of checks on stationarity / invertibility:

Polynomial roots of the AR component computed from the estimated coefficients:
(Examining stationarity/invertibility)

       Model1
Root1   1.888

> constraints(fit.Chan.etal)
$`(Intercept)`
     [,1] [,2] [,3]
[1,]    1    0    0
[2,]    0    1    0
[3,]    0    0    1

$TS1l1sq
     [,1]
[1,]    0
[2,]    1
[3,]    0

>
>
> ###### EXAMPLE 2. VGLMs handling cointegrated (bivariate) time series.
> # In this example, vglm() accommodates an error correction model
> # of order (2, 2) to fit two (non-stationary) cointegrated time series.
>
> # Simulating some data.
> set.seed(2017081901)
> nn <- 280
> rho <- 0.75
> s2u <- exp(log(1.5))  # Gaussian noise1
> s2w <- exp(0)         # Gaussian noise2
> my.errors <- rbinorm(nn, mean1 = 0, mean2 = 0, var1 = s2u, var2 = s2w, cov12 = rho)
> ut <- my.errors[, 1]
> wt <- my.errors[, 2]
> yt <- xt <- numeric(0)
>
> xt[1] <- ut[1]   # Initial value: error.u[0]
> yt[1] <- wt[1]   # Initial value: error.w[0]
> beta <- c(0.0, 2.5, -0.32)  # Coefficients true values.
>
> for (ii in 2:nn) {
+   xt[ii] <- xt[ii - 1] + ut[ii]
+   yt[ii] <- beta[1] + beta[2] * xt[ii] + beta[3] * yt[ii - 1] + wt[ii]
+ }
>
> # Regression of yt on xt, save residuals. Compute Order--1 differences.
> errors.coint <- residuals(lm(yt ~ xt))  # Residuals from the static regression yt ~ xt
> difx1 <- diff(ts(xt), lag = 1, differences = 1)  # First difference for xt
> dify1 <- diff(ts(yt), lag = 1, differences = 1)  # First difference for yt
>
> # Set up the dataset (coint.data), including Order-2 lagged differences.
> coint.data <- data.frame(embed(difx1, 3), embed(dify1, 3))
> colnames(coint.data) <- c("difx1", "difxLag1", "difxLag2",
+                           "dify1", "difyLag1", "difyLag2")
>
> # Remove unutilized lagged errors accordingly. Here, use from t = 3.
> errors.cointLag1 <- errors.coint[-c(1:2, nn)]
> coint.data <- transform(coint.data, errors.cointLag1 = errors.cointLag1)
>
>
> # Fitting an error correction model (2, 2), aka ECM(2, 2)
> fit.coint <- vglm(cbind(dify1, difx1) ~ errors.cointLag1 + difxLag1 + difyLag1 +
+                     difxLag2 + difyLag2,
+                   binormal(zero = c("sd", "rho")),  # 'sigma', 'rho' are intercept--only.
+                   trace = FALSE, data = coint.data)
> summary(fit.coint)

Call:
vglm(formula = cbind(dify1, difx1) ~ errors.cointLag1 + difxLag1 +
    difyLag1 + difxLag2 + difyLag2, family = binormal(zero = c("sd",
    "rho")), data = coint.data, trace = FALSE)

Coefficients:
                    Estimate Std. Error z value Pr(>|z|)
(Intercept):1       0.046833   0.240268   0.195   0.8455
(Intercept):2       0.011329   0.079052   0.143   0.8860
(Intercept):3       1.385767   0.042486  32.617  < 2e-16 ***
(Intercept):4       0.274117   0.042486   6.452  1.1e-10 ***
(Intercept):5       4.688481   0.120168  39.016  < 2e-16 ***
errors.cointLag1:1 -1.446855   0.593543  -2.438   0.0148 *
errors.cointLag1:2 -0.062999   0.195285  -0.323   0.7470
difxLag1:1         -0.180110   0.904472  -0.199   0.8422
difxLag1:2         -0.069824   0.297585  -0.235   0.8145
difyLag1:1          0.003398   0.452088   0.008   0.9940
difyLag1:2          0.016666   0.148744   0.112   0.9108
difxLag2:1         -0.042153   0.351606  -0.120   0.9046
difxLag2:2          0.017436   0.115684   0.151   0.8802
difyLag2:1         -0.002433   0.178107  -0.014   0.9891
difyLag2:2         -0.008916   0.058600  -0.152   0.8791
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Number of linear predictors:  5

Names of linear predictors:
mean1, mean2, loglink(sd1), loglink(sd2), rhobitlink(rho)

Log-likelihood: -785.9902 on 1370 degrees of freedom

Number of Fisher scoring iterations: 7

No Hauck-Donner effect found in any of the estimates

> coef(fit.coint, matrix = TRUE)
                        mean1        mean2 loglink(sd1) loglink(sd2)
(Intercept)       0.046833185  0.011328667     1.385767    0.2741172
errors.cointLag1 -1.446854506 -0.062999405     0.000000    0.0000000
difxLag1         -0.180110481 -0.069823971     0.000000    0.0000000
difyLag1          0.003398098  0.016666074     0.000000    0.0000000
difxLag2         -0.042152567  0.017436344     0.000000    0.0000000
difyLag2         -0.002432540 -0.008916135     0.000000    0.0000000
                 rhobitlink(rho)
(Intercept)             4.688481
errors.cointLag1        0.000000
difxLag1                0.000000
difyLag1                0.000000
difxLag2                0.000000
difyLag2                0.000000
>
>
> ##### EXAMPLE 3. Quantile Modelling (QM).
> # Here, the quantile function of the Maxwell distribution is modelled
> # for percentiles 25%, 50% and 75%. The resulting quantile-curves
> # are plotted. The rate parameter is determined by an artificial covariate.
>
> set.seed(123)
> # An artificial covariate.
> maxdata <- data.frame(x2 = sort(runif(n <- nn <- 120)))
> # The 'rate' function.
> mymu <- function(x) exp(2 - 6 * sin(2 * x - 0.2) / (x + 0.5)^2)
> # Set up the data.
> maxdata <- transform(maxdata, y = rmaxwell(n, rate = mymu(x2)))
>
> # 25%, 50% and 75% quantiles are to be modelled.
> mytau <- c(0.25, 0.50, 0.75)
> mydof <- 4
>
> ### Using B-splines with 'mydof'-degrees of freedom on the predictors
> fit.QM <- vglm(Q.reg(y, pvector = mytau) ~ bs(x2, df = mydof),
+                family = maxwell(link = maxwellQlink(p = mytau), zero = NULL),
+                data = maxdata, trace = TRUE)
Error in maxwellQlink(p = mytau) :
  argument "theta" is missing, with no default
Calls: vglm -> maxwell -> maxwellQlink
Execution halted
* checking PDF version of manual ... [28s] OK
* checking HTML version of manual ... [23s] OK
* DONE
Status: 1 ERROR, 1 NOTE
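
A possible way to clear the Rd cross-reference NOTE (a minimal sketch, assuming
the listed topics, such as vglm, vgam, CommonVGAMffArguments, vglmff-class and
Links, are all documented in the VGAM package, on which VGAMextra depends) is
to add a package anchor to every \link{} whose target sits outside VGAMextra
and the base packages. For example, in gen.betaIImr.Rd unanchored links such as

    \link{vglm}, \link{CommonVGAMffArguments}

would become

    \link[VGAM]{vglm}, \link[VGAM]{CommonVGAMffArguments}

Links pointing at topics within VGAMextra itself or within the base packages
need no anchor. The same change would apply to invgamma2mr.Rd,
invweibull2mr.Rd, trunclognormalff.Rd and truncnormalff.Rd.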