* using log directory 'd:/RCompile/CRANincoming/R-devel/TidyML.Rcheck'
* using R Under development (unstable) (2025-05-26 r88238 ucrt)
* using platform: x86_64-w64-mingw32
* R was compiled by
    gcc.exe (GCC) 14.2.0
    GNU Fortran (GCC) 14.2.0
* running under: Windows Server 2022 x64 (build 20348)
* using session charset: UTF-8
* checking for file 'TidyML/DESCRIPTION' ... OK
* this is package 'TidyML' version '0.1.0'
* package encoding: UTF-8
* checking CRAN incoming feasibility ... [10s] NOTE
Maintainer: 'Javier Martínez García '

New submission

Possibly misspelled words in DESCRIPTION:
  KDD (18:55, 20:102, 28:94)
  TidyML (16:14, 22:87, 26:66)
  minimalistic (16:26)
  resuresults (25:79)

The Title field should be in title case. Current version is:
'Machine Learning Modelling For Everyone'
In title case that is:
'Machine Learning Modelling for Everyone'

The Description field should not start with the package name,
  'This package' or similar.
* checking package namespace information ... OK
* checking package dependencies ... OK
* checking if this is a source package ... OK
* checking if there is a namespace ... OK
* checking for hidden files and directories ... OK
* checking for portable file names ... OK
* checking serialization versions ... OK
* checking whether package 'TidyML' can be installed ... OK
* checking installed package size ... OK
* checking package directory ... OK
* checking for future file timestamps ... OK
* checking DESCRIPTION meta-information ... OK
* checking top-level files ... NOTE
Non-standard file/directory found at top level:
  'README.html'
* checking for left-over files ... OK
* checking index information ... OK
* checking package subdirectories ... NOTE
Problems with news in 'NEWS.md':
No news entries found.
* checking code files for non-ASCII characters ... OK
* checking R files for syntax errors ... OK
* checking whether the package can be loaded ... OK
* checking whether the package can be loaded with stated dependencies ... OK
* checking whether the package can be unloaded cleanly ... OK
* checking whether the namespace can be loaded with stated dependencies ... OK
* checking whether the namespace can be unloaded cleanly ... OK
* checking loading without being on the library search path ... OK
* checking whether startup messages can be suppressed ... OK
* checking use of S3 registration ... OK
* checking dependencies in R code ... OK
* checking S3 generic/method consistency ... OK
* checking replacement functions ... OK
* checking foreign function calls ... OK
* checking R code for possible problems ... [11s] NOTE
modify_datasets: no visible binding for global variable 'tidy_object'
olden_barplot: no visible binding for global variable 'variable'
olden_barplot: no visible binding for global variable 'importance'
one_hot_predictors: no visible global function definition for 'all_of'
plot2: no visible binding for global variable 'variable'
plot2: no visible binding for global variable 'importance'
plot_barplot: no visible binding for global variable 'Importance'
plot_barplot: no visible binding for global variable 'Variable'
plot_barplot: no visible binding for global variable 'StDev'
plot_beeswarm: no visible binding for global variable 'value'
plot_beeswarm: no visible binding for global variable 'variable'
plot_beeswarm: no visible binding for global variable 'val_color'
plot_boxplot: no visible binding for global variable 'variable'
plot_boxplot: no visible binding for global variable 'value'
plot_calibration_curve_binary: no visible global function definition for 'sym'
plot_calibration_curve_binary: no visible binding for global variable 'data_set'
plot_calibration_curve_binary: no visible binding for global variable 'y'
plot_calibration_curve_binary: no visible binding for global variable 'prob_pred'
plot_calibration_curve_binary: no visible binding for global variable 'prob_observed'
plot_conf_mat: no visible binding for global variable 'y'
plot_conf_mat: no visible binding for global variable '.pred_class'
plot_dist_probs_binary: no visible binding for global variable 'data_set'
plot_dist_probs_binary: no visible binding for global variable 'y'
plot_dist_probs_multiclass: no visible binding for global variable 'data_set'
plot_dist_probs_multiclass: no visible binding for global variable '.pred_class'
plot_dist_probs_multiclass: no visible binding for global variable 'Class'
plot_dist_probs_multiclass: no visible binding for global variable 'Probability'
plot_dist_probs_multiclass: no visible binding for global variable 'y'
plot_gain_curve_binary: no visible binding for global variable 'data_set'
plot_gain_curve_binary: no visible binding for global variable 'y'
plot_gain_curve_multiclass: no visible binding for global variable 'data_set'
plot_gain_curve_multiclass: no visible binding for global variable 'y'
plot_lift_curve_binary: no visible binding for global variable 'data_set'
plot_lift_curve_binary: no visible binding for global variable 'y'
plot_lift_curve_multiclass: no visible binding for global variable 'data_set'
plot_lift_curve_multiclass: no visible binding for global variable 'y'
plot_pr_curve_binary: no visible binding for global variable 'data_set'
plot_pr_curve_binary: no visible binding for global variable 'y'
plot_pr_curve_multiclass: no visible binding for global variable 'data_set'
plot_pr_curve_multiclass: no visible binding for global variable 'y'
plot_residuals_density: no visible binding for global variable 'y'
plot_residuals_density: no visible binding for global variable '.pred'
plot_residuals_density: no visible binding for global variable 'error'
plot_residuals_density: no visible binding for global variable 'density'
plot_roc_curve_binary: no visible binding for global variable 'data_set'
plot_roc_curve_binary: no visible binding for global variable 'y'
plot_roc_curve_multiclass: no visible binding for global variable 'data_set'
plot_roc_curve_multiclass: no visible binding for global variable 'y'
plot_scatter: no visible binding for global variable 'y'
plot_scatter: no visible binding for global variable '.pred'
plot_tuning_results: no visible binding for global variable '.'
plot_tuning_results: no visible binding for global variable 'search_res'
preprocessing: no visible global function definition for 'all_of'
show_results: no visible binding for global variable 'data_set'
show_results: no visible binding for global variable 'Class'
sobol_calc: no visible global function definition for 'rnorm'
sobol_plot: no visible binding for global variable 'S1'
sobol_plot: no visible binding for global variable 'ST'
sobol_plot: no visible binding for global variable 'value'
sobol_plot: no visible binding for global variable 'variable'
sobol_plot: no visible binding for global variable 'type'
sobol_plot: no visible binding for global variable 'se'
sobol_plot: no visible binding for global variable 'label'
standarize_predictors: no visible global function definition for 'all_of'
summary_binary : <anonymous>: no visible binding for global variable 'y'
summary_binary : <anonymous>: no visible binding for global variable '.pred_class'
summary_multiclass_average : <anonymous>: no visible binding for global variable 'y'
summary_multiclass_average : <anonymous>: no visible binding for global variable '.pred_class'
summary_multiclass_per_class : <anonymous> : <anonymous>: no visible binding for global variable 'truth'
summary_multiclass_per_class : <anonymous> : <anonymous>: no visible binding for global variable 'estimate'
summary_multiclass_per_class : <anonymous> : <anonymous>: no visible binding for global variable 'prob_estimate'
summary_regression : <anonymous>: no visible binding for global variable 'y'
summary_regression : <anonymous>: no visible binding for global variable '.pred'
Undefined global functions or variables:
  . .pred .pred_class Class Importance Probability S1 ST StDev Variable
  all_of data_set density error estimate importance label prob_estimate
  prob_observed prob_pred rnorm se search_res sym tidy_object truth type
  val_color value variable y
Consider adding
  importFrom("stats", "density", "rnorm")
to your NAMESPACE file.
Found if() conditions comparing class() to string:
File 'TidyML/R/check_arguments.R': if (!(class(arg) == "logical")) ...
Use inherits() (or maybe is()) instead.
* checking Rd files ... OK
* checking Rd metadata ... OK
* checking Rd line widths ... NOTE
Rd file 'build_model.Rd':
  \examples lines wider than 100 characters:
     formula = psych_well_bin ~ depression + emot_intel + resilience + life_sat,

Rd file 'fine_tuning.Rd':
  \examples lines wider than 100 characters:
     formula = psych_well_bin ~ depression + emot_intel + resilience + life_sat,

Rd file 'show_results.Rd':
  \examples lines wider than 100 characters:
     # Display summary metrics, ROC curve, and confusion matrix for a classification model with test partition
     formula = psych_well_bin ~ depression + emot_intel + resilience + life_sat,
     # Display summary metrics, scatter plot of predictions, and residuals distribution for a regression model

These lines will be truncated in the PDF manual.
* checking Rd cross-references ... OK
* checking for missing documentation entries ... OK
* checking for code/documentation mismatches ... OK
* checking Rd \usage sections ... OK
* checking Rd contents ... OK
* checking for unstated dependencies in examples ... OK
* checking contents of 'data' directory ... OK
* checking data for non-ASCII characters ... OK
* checking LazyData ... OK
* checking data for ASCII and uncompressed saves ... OK
* checking examples ... [137s] NOTE
Examples with CPU (user + system) or elapsed time > 10s
                       user  system elapsed
fine_tuning          399.64   43.33   94.29
sensitivity_analysis  57.47    4.38   28.97
show_results          33.57    3.62    7.46
* checking for unstated dependencies in 'tests' ... OK
* checking tests ...
[23m] ERROR
  Running 'testthat.R' [23m]
Running the tests in 'tests/testthat.R' failed.
Complete output:
  > # This file is part of the standard setup for testthat.
  > # It is recommended that you do not modify it.
  > #
  > # Where should you do additional test configuration?
  > # Learn more about the roles of various files in:
  > # * https://r-pkgs.org/testing-design.html#sec-tests-files-overview
  > # * https://testthat.r-lib.org/articles/special-files.html
  >
  > library(testthat)
  > library(TidyML)
  **************************************************
  *                                                *
  * TTTTTT dd MM MM LL                             *
  * TT ii dd yy yy MM M M MM LL                    *
  * TT ii dd dd y y MM M MM LL                     *
  * TT ii d dd yy MM MM LL                         *
  * TT ii dd dd yy MM MM LLLLLL                    *
  *                                                *
  **************************************************
  TidyML v0.1.0: **Start simple, scale smart**
  >
  > test_check("TidyML")
  Commencing Tuning...
  OMP: Warning #96: Cannot form a team with 48 threads, using 2 instead.
  OMP: Hint Consider unsetting KMP_DEVICE_THREAD_LIMIT (KMP_ALL_THREADS), KMP_TEAMS_THREAD_LIMIT, and OMP_THREAD_LIMIT (if any are set).
  ! validation: preprocessor 1/1, model 2/20: Loss is NaN at epoch 8. Training is stopped.
  ! validation: preprocessor 1/1, model 15/20: Loss is NaN at epoch 4. Training is stopped.
  ! No improvement for 5 iterations; returning current results.
  Tuning Finalized
  ###### Loss Curve ######
  Commencing Tuning...
  Tuning Finalized
  ###### Loss Curve ######
  Commencing Tuning...
  Commencing Tuning...
  ! No improvement for 5 iterations; returning current results.
  Tuning Finalized
  ###### Loss Curve ######
  ######### PFI Method Results ##############
  # A tibble: 7 x 3
    Variable              Importance    StDev
  1 depression              17.7      0.892
  2 gender_Male              0.183    0.0806
  3 gender_Female            0.100    0.111
  4 socioec_status_High      0.0742   0.0192
  5 socioec_status_Medium    0.0263   0.0430
  6 age                      0.00872  0.0777
  7 socioec_status_Low      -0.00207  0.0576
  -- Starting `shapr::explain()` at 2025-05-27 14:19:54 --------------------------
  i You passed a model to `shapr::explain()` which is not natively supported, and
    did not supply a `get_model_specs` function to `shapr::explain()`.
    Consistency checks between model and data is therefore disabled.
  -- Explanation overview --
  * Model class: <_brulee_mlp/model_fit>
  * Approach: empirical
  * Iterative estimation: TRUE
  * Number of feature-wise Shapley values: 7
  * Number of observations to explain: 200
  * Computations (temporary) saved at: 'D:\temp\2025_05_27_13_55_17_158\RtmpKW6XyX\shapr_obj_3074478058b.rds'
  -- iterative computation started --
  -- Iteration 1 -----------------------------------------------------------------
  i Using 14 of 128 coalitions, 14 new.
  -- Iteration 2 -----------------------------------------------------------------
  i Using 26 of 128 coalitions, 12 new.
  ######### SHAP Method Results ##############
                        Importance StDev
  depression                16.400 0.720
  age                        1.690 0.084
  gender_Male                0.591 0.026
  socioec_status_Medium      0.557 0.024
  socioec_status_Low         0.496 0.024
  socioec_status_High        0.409 0.016
  gender_Female              0.398 0.015
  ######### Integrated Gradients Method Results ##############
                        Importance  StDev
  depression               0.70700 0.0310
  gender_Male              0.06560 0.0048
  gender_Female            0.06330 0.0048
  age                      0.04220 0.0037
  socioec_status_Medium    0.01950 0.0019
  socioec_status_Low       0.01850 0.0022
  socioec_status_High      0.00722 0.0010
  ######### Olden's Method Results ##############
           age depression gender_Female gender_Male socioec_status_High
  1 0.04041123 -0.6511811   -0.08520333  -0.1051047          0.01565051
    socioec_status_Low socioec_status_Medium
  1         0.05885844            0.04359065
  Commencing Tuning...
  ! No improvement for 5 iterations; returning current results.
  Tuning Finalized
  ###### Loss Curve ######
  ############# Showing Results #############
  [ FAIL 1 | WARN 0 | SKIP 0 | PASS 193 ]

  ══ Failed tests ════════════════════════════════════════════════════════════════
  ── Failure ('test-sensitivity_analysis.R:31:3'): PFI works properly regression ──
  olden$depression[1] (`actual`) not equal to -0.7335 (`expected`).

    `actual`: -0.65
  `expected`: -0.73

  [ FAIL 1 | WARN 0 | SKIP 0 | PASS 193 ]
  Error: Test failures
  Execution halted
* checking PDF version of manual ... [20s] OK
* checking HTML version of manual ... OK
* DONE

Status: 1 ERROR, 6 NOTEs
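The "R code for possible problems" NOTEs in the log above have standard remedies, sketched below in R. This is a minimal illustration, not TidyML's actual code: the helper `check_logical()` and the file placements are hypothetical, and the `globalVariables()` list is abbreviated from the NOTE.

```r
## Sketch of fixes for the "possible problems" NOTEs (hypothetical file names).

# 1) R/check_arguments.R: replace class() string comparison with inherits(),
#    as the check output suggests. inherits() also handles objects that carry
#    more than one class, which class() == "..." does not.
check_logical <- function(arg) {
  if (!inherits(arg, "logical")) stop("'arg' must be logical.")
  invisible(TRUE)
}

# 2) Import the flagged stats functions, e.g. via a roxygen2 tag so that
#    importFrom("stats", "density", "rnorm") ends up in NAMESPACE:
#' @importFrom stats density rnorm
NULL

# 3) In package code, declare the non-standard-evaluation column names so the
#    "no visible binding" NOTEs disappear, e.g. in R/globals.R (abbreviated;
#    the full list is printed under "Undefined global functions or variables"):
# utils::globalVariables(c(
#   ".pred", ".pred_class", "Class", "Importance", "StDev", "Variable",
#   "data_set", "importance", "value", "variable", "y"
# ))
```

For the dplyr/ggplot2 code, a stricter alternative to `globalVariables()` is to import rlang's `.data` pronoun and write `.data$variable` inside verbs, which satisfies the checker without a blanket declaration; `all_of` and `sym` just need `@importFrom` tags for tidyselect/rlang.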