R Under development (unstable) (2026-02-04 r89376 ucrt) -- "Unsuffered Consequences" Copyright (C) 2026 The R Foundation for Statistical Computing Platform: x86_64-w64-mingw32/x64 R is free software and comes with ABSOLUTELY NO WARRANTY. You are welcome to redistribute it under certain conditions. Type 'license()' or 'licence()' for distribution details. R is a collaborative project with many contributors. Type 'contributors()' for more information and 'citation()' on how to cite R or R packages in publications. Type 'demo()' for some demos, 'help()' for on-line help, or 'help.start()' for an HTML browser interface to help. Type 'q()' to quit R. > library(rtemis) .:rtemis 1.0.0 🌊 x86_64-w64-mingw32/x64 > library(testthat) Attaching package: 'testthat' The following object is masked from 'package:rtemis': describe > > test_check("rtemis") 2026-02-05 01:47:15 ▶ [cluster] 2026-02-05 01:47:15 Input: 150 cases x 4 features. [summarize_unsupervised] 2026-02-05 01:47:15 Clustering with KMeans... [cluster] 2026-02-05 01:47:15 Checking unsupervised data... ✔ [check_unsupervised_data] 2026-02-05 01:47:15 Clustering with KMeans ... [cluster_KMeans] 2026-02-05 01:47:15 ✔ Done in 0.42 seconds. [cluster] 2026-02-05 01:47:15 ▶ [cluster] 2026-02-05 01:47:15 Input: 150 cases x 4 features. [summarize_unsupervised] 2026-02-05 01:47:15 Clustering with KMeans... [cluster] 2026-02-05 01:47:15 Checking unsupervised data... ✔ [check_unsupervised_data] 2026-02-05 01:47:15 Clustering with KMeans ... [cluster_KMeans] 2026-02-05 01:47:15 ✔ Done in 0.03 seconds. [cluster] 2026-02-05 01:47:15 ▶ [cluster] 2026-02-05 01:47:15 Input: 150 cases x 4 features. [summarize_unsupervised] 2026-02-05 01:47:15 Clustering with HardCL... [cluster] 2026-02-05 01:47:15 Checking unsupervised data... ✔ [check_unsupervised_data] 2026-02-05 01:47:15 Clustering with HardCL ... [cluster_HardCL] 2026-02-05 01:47:16 ✔ Done in 0.03 seconds. 
[cluster] 2026-02-05 01:47:16 ▶ [cluster] 2026-02-05 01:47:16 Input: 150 cases x 4 features. [summarize_unsupervised] 2026-02-05 01:47:16 Clustering with NeuralGas... [cluster] 2026-02-05 01:47:16 Checking unsupervised data... ✔ [check_unsupervised_data] 2026-02-05 01:47:16 Clustering with NeuralGas ... [cluster_NeuralGas] 2026-02-05 01:47:16 ✔ Done in 0.07 seconds. [cluster] 2026-02-05 01:47:16 ▶ [cluster] 2026-02-05 01:47:16 Input: 150 cases x 4 features. [summarize_unsupervised] 2026-02-05 01:47:16 Clustering with CMeans... [cluster] 2026-02-05 01:47:16 Checking unsupervised data... ✔ [check_unsupervised_data] 2026-02-05 01:47:16 Clustering with CMeans ... [cluster_CMeans] Iteration: 1, Error: 1.1042602505 Iteration: 2, Error: 0.5503193719 Iteration: 3, Error: 0.4423465986 Iteration: 4, Error: 0.4111220995 Iteration: 5, Error: 0.4043030220 Iteration: 6, Error: 0.4034648019 Iteration: 7, Error: 0.4033802840 Iteration: 8, Error: 0.4033722413 Iteration: 9, Error: 0.4033714854 Iteration: 10, Error: 0.4033714131 Iteration: 11, Error: 0.4033714056 Iteration: 12 converged, Error: 0.4033714046 2026-02-05 01:47:16 ✔ Done in 0.00 seconds. [cluster] 2026-02-05 01:47:16 ▶ [cluster] 2026-02-05 01:47:16 Input: 150 cases x 4 features. [summarize_unsupervised] 2026-02-05 01:47:16 Clustering with DBSCAN... [cluster] 2026-02-05 01:47:16 Checking unsupervised data... ✔ [check_unsupervised_data] 2026-02-05 01:47:16 Clustering with DBSCAN ... [cluster_DBSCAN] 2026-02-05 01:47:16 Found 3 clusters. [cluster] 2026-02-05 01:47:16 ✔ Done in 0.02 seconds. [cluster] 2026-02-05 01:47:16 ▶ [decomp] 2026-02-05 01:47:16 Input: 150 cases x 4 features. [summarize_unsupervised] 2026-02-05 01:47:16 Decomposing with PCA... [decomp] 2026-02-05 01:47:16 Checking unsupervised data... ✔ [check_unsupervised_data] 2026-02-05 01:47:16 Decomposing with PCA ... [decom_PCA] 2026-02-05 01:47:16 ✔ Done in 0.02 seconds. [decomp] 2026-02-05 01:47:16 ▶ [decomp] 2026-02-05 01:47:16 Input: 150 cases x 4 features. 
[summarize_unsupervised] 2026-02-05 01:47:16 Decomposing with ICA... [decomp] 2026-02-05 01:47:16 Checking unsupervised data... ✔ [check_unsupervised_data] 2026-02-05 01:47:16 Decomposing with ICA ... [decom_ICA] Centering colstandard Whitening Symmetric FastICA using logcosh approx. to neg-entropy function Iteration 1 tol=0.875525 Iteration 2 tol=0.364120 Iteration 3 tol=0.018013 Iteration 4 tol=0.000355 Iteration 5 tol=0.000066 2026-02-05 01:47:16 ✔ Done in 0.00 seconds. [decomp] 2026-02-05 01:47:17 ▶ [decomp] 2026-02-05 01:47:17 Input: 150 cases x 4 features. [summarize_unsupervised] 2026-02-05 01:47:17 Decomposing with NMF... [decomp] 2026-02-05 01:47:17 Checking unsupervised data... ✔ [check_unsupervised_data] 2026-02-05 01:47:17 Decomposing with NMF ... [decom_NMF] 2026-02-05 01:47:19 ✔ Done in 1.97 seconds. [decomp] 2026-02-05 01:47:20 ▶ [decomp] 2026-02-05 01:47:20 Input: 150 cases x 4 features. [summarize_unsupervised] 2026-02-05 01:47:20 Decomposing with UMAP... [decomp] 2026-02-05 01:47:20 Checking unsupervised data... ✔ [check_unsupervised_data] 2026-02-05 01:47:20 Decomposing with UMAP ... [decom_UMAP] 2026-02-05 01:47:22 ✔ Done in 2.20 seconds. [decomp] 2026-02-05 01:47:22 ▶ [decomp] 2026-02-05 01:47:22 Input: 150 cases x 4 features. [summarize_unsupervised] 2026-02-05 01:47:22 Decomposing with UMAP... [decomp] 2026-02-05 01:47:22 Checking unsupervised data... ✔ [check_unsupervised_data] 2026-02-05 01:47:22 Decomposing with UMAP ... [decom_UMAP] 2026-02-05 01:47:23 ✔ Done in 1.54 seconds. [decomp] 2026-02-05 01:47:23 ▶ [decomp] 2026-02-05 01:47:23 Input: 150 cases x 4 features. [summarize_unsupervised] 2026-02-05 01:47:23 Decomposing with tSNE... [decomp] 2026-02-05 01:47:23 Checking unsupervised data... ✔ [check_unsupervised_data] 2026-02-05 01:47:23 Decomposing with tSNE ... [decom_tSNE] 2026-02-05 01:47:23 Removing 1 duplicate case... [preprocess] 2026-02-05 01:47:23 Preprocessing done. 
[preprocess] 2026-02-05 01:47:23 ▶ [decomp] 2026-02-05 01:47:23 Input: 149 cases x 4 features. [summarize_unsupervised] 2026-02-05 01:47:23 Decomposing with tSNE... [decomp] 2026-02-05 01:47:23 Checking unsupervised data... ✔ [check_unsupervised_data] 2026-02-05 01:47:23 Decomposing with tSNE ... [decom_tSNE] 2026-02-05 01:47:24 ✔ Done in 0.27 seconds. [decomp] 2026-02-05 01:47:24 ▶ [decomp] 2026-02-05 01:47:24 Input: 150 cases x 4 features. [summarize_unsupervised] 2026-02-05 01:47:24 Decomposing with Isomap... [decomp] 2026-02-05 01:47:24 Checking unsupervised data... ✔ [check_unsupervised_data] 2026-02-05 01:47:24 Decomposing with Isomap ... [decom_Isomap] 2026-02-05 01:47:24 ✔ Done in 0.01 seconds. [decomp] 2026-02-05 01:47:24 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:47:24 Using max n bins possible = 3. [kfold] 2026-02-05 01:47:24 Scaling and centering 4 numeric features... [preprocess] 2026-02-05 01:47:24 Preprocessing done. [preprocess] 2026-02-05 01:47:24 Scaling and centering 4 numeric features... [preprocess] 2026-02-05 01:47:24 Preprocessing done. [preprocess] 2026-02-05 01:47:24 Scaling and centering 4 numeric features... [preprocess] 2026-02-05 01:47:24 Preprocessing done. [preprocess] 2026-02-05 01:47:24 Applying preprocessing to test data... [preprocess] 2026-02-05 01:47:24 Scaling and centering 4 numeric features... [preprocess] 2026-02-05 01:47:24 Preprocessing done. [preprocess] 2026-02-05 01:47:24 Imputing missing values using get_mode (discrete) and mean (continuous)... [preprocess] 2026-02-05 01:47:24 Preprocessing done. [preprocess] 2026-02-05 01:47:25 One hot encoding g... ✔ [one_hot] 2026-02-05 01:47:25 Preprocessing done. [preprocess] 2026-02-05 01:47:25 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:47:25 Using max n bins possible = 3. [kfold] 2026-02-05 01:47:25 Input contains more than one column; stratifying on last. 
[resample] 2026-02-05 01:47:25 Using max n bins possible = 3. [kfold] Attaching package: 'data.table' The following object is masked from 'package:base': %notin% 2026-02-05 01:47:25 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:47:25 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:47:25 Using max n bins possible = 2. [kfold] 2026-02-05 01:47:25 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:47:25 Using max n bins possible = 3. [kfold] 2026-02-05 01:47:25 ▶ [train] 2026-02-05 01:47:25 Training set: 358 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:25  Test set: 42 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:25 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:25 Training GLM Regression... [train] 2026-02-05 01:47:25 Checking data is ready for training... ✔ [check_supervised]  <Regression> GLM (Generalized Linear Model) <Training Regression Metrics>  MAE: 0.73  MSE: 0.82  RMSE: 0.91  Rsq: 0.83 <Test Regression Metrics>  MAE: 0.74  MSE: 1.03  RMSE: 1.01  Rsq: 0.77 2026-02-05 01:47:25 ✔ Done in 0.11 seconds. [train] 2026-02-05 01:47:25 ▶ [train] 2026-02-05 01:47:25 Training set: 358 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:25  Test set: 42 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:25 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:25 Training GLM Regression... [train] 2026-02-05 01:47:25 Checking data is ready for training... 2026-02-05 01:47:25 ▶ [train] 2026-02-05 01:47:25 Training set: 400 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:25 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:25 <> Training GLM Regression using 3 independent folds... [train] 2026-02-05 01:47:25 Input contains more than one column; stratifying on last. 
[resample] 2026-02-05 01:47:25  Outer resampling done. [train] GLM (Generalized Linear Model) ⟳ Tested using 3 independent folds. Showing mean (sd) across resamples. MAE: 0.726 (0.014) MSE: 0.838 (0.049) RMSE: 0.915 (0.026) Rsq: 0.829 (0.011) Showing mean (sd) across resamples. MAE: 0.735 (0.027) MSE: 0.871 (0.097) RMSE: 0.932 (0.053) Rsq: 0.822 (0.022) 2026-02-05 01:47:25 ✔ Done in 0.19 seconds. [train] 2026-02-05 01:47:25 ▶ [train] 2026-02-05 01:47:25 Training set: 90 cases x 5 features. [summarize_supervised] 2026-02-05 01:47:25  Test set: 10 cases x 5 features. [summarize_supervised] 2026-02-05 01:47:25 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:25 Training GLM Classification... [train] 2026-02-05 01:47:25 Checking data is ready for training... ✔ [check_supervised]  <Classification> GLM (Generalized Linear Model) <Training Classification Metrics>  Predicted  Reference virginica versicolor   virginica 44 1  versicolor 1 44 Overall   Sensitivity 0.978  Specificity 0.978  Balanced_Accuracy 0.978  PPV 0.978  NPV 0.978  F1 0.978  Accuracy 0.978  AUC 0.998  Brier_Score 0.018 Positive Class virginica <Test Classification Metrics>  Predicted  Reference virginica versicolor   virginica 5 0  versicolor 1 4 Overall   Sensitivity 1.000  Specificity 0.800  Balanced_Accuracy 0.900  PPV 0.833  NPV 1.000  F1 0.909  Accuracy 0.900  AUC 1.000  Brier_Score 0.093 Positive Class virginica 2026-02-05 01:47:25 ✔ Done in 0.08 seconds. [train] 2026-02-05 01:47:25 ▶ [train] 2026-02-05 01:47:25 Training set: 90 cases x 5 features. [summarize_supervised] 2026-02-05 01:47:25  Test set: 10 cases x 5 features. [summarize_supervised] 2026-02-05 01:47:25 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:25 Calculating case weights using Inverse Frequency Weighting. [ifw] 2026-02-05 01:47:25 Training GLM Classification... [train] 2026-02-05 01:47:25 Checking data is ready for training... 
✔ [check_supervised]  <Classification> GLM (Generalized Linear Model) <Training Classification Metrics>  Predicted  Reference virginica versicolor   virginica 44 1  versicolor 1 44 Overall   Sensitivity 0.978  Specificity 0.978  Balanced_Accuracy 0.978  PPV 0.978  NPV 0.978  F1 0.978  Accuracy 0.978  AUC 0.998  Brier_Score 0.018 Positive Class virginica <Test Classification Metrics>  Predicted  Reference virginica versicolor   virginica 5 0  versicolor 1 4 Overall   Sensitivity 1.000  Specificity 0.800  Balanced_Accuracy 0.900  PPV 0.833  NPV 1.000  F1 0.909  Accuracy 0.900  AUC 1.000  Brier_Score 0.093 Positive Class virginica 2026-02-05 01:47:25 ✔ Done in 0.06 seconds. [train] 2026-02-05 01:47:25 ▶ [train] 2026-02-05 01:47:25 Training set: 100 cases x 5 features. [summarize_supervised] 2026-02-05 01:47:25 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:25 <> Training GLM Classification using 3 independent folds... [train] 2026-02-05 01:47:25 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:47:25 Using max n bins possible = 2. [kfold] Warning caught: glm.fit: fitted probabilities numerically 0 or 1 occurred Reasons for this warning include: 1) Perfect Separation of classes. 2) Highly Imbalanced data. 3) Extreme values in predictors. 4) Too many predictors for the number of observations. 5) Multicollinearity. Suggestion: Try using GLMNET or tree-based algorithms Warning caught: glm.fit: algorithm did not converge Warning caught: glm.fit: fitted probabilities numerically 0 or 1 occurred Reasons for this warning include: 1) Perfect Separation of classes. 2) Highly Imbalanced data. 3) Extreme values in predictors. 4) Too many predictors for the number of observations. 5) Multicollinearity. 
Suggestion: Try using GLMNET or tree-based algorithms Warning caught: glm.fit: algorithm did not converge Warning caught: glm.fit: fitted probabilities numerically 0 or 1 occurred Reasons for this warning include: 1) Perfect Separation of classes. 2) Highly Imbalanced data. 3) Extreme values in predictors. 4) Too many predictors for the number of observations. 5) Multicollinearity. Suggestion: Try using GLMNET or tree-based algorithms 2026-02-05 01:47:26  Outer resampling done. [train] GLM (Generalized Linear Model) ⟳ Tested using 3 independent folds. Showing mean (sd) across resamples. Sensitivity: 0.990 (0.017) Specificity: 0.990 (0.017) Balanced_Accuracy: 0.990 (0.017) PPV: 0.990 (0.017) NPV: 0.990 (0.017) F1: 0.990 (0.017) Accuracy: 0.990 (0.017) AUC: 0.999 (1.6e-03) Brier_Score: 0.007 (0.011) Showing mean (sd) across resamples. Sensitivity: 0.880 (4.2e-03) Specificity: 0.940 (0.059) Balanced_Accuracy: 0.910 (0.030) PPV: 0.939 (0.059) NPV: 0.886 (0.007) F1: 0.908 (0.028) Accuracy: 0.910 (0.030) AUC: 0.945 (0.019) Brier_Score: 0.085 (0.027) 2026-02-05 01:47:26 ✔ Done in 0.20 seconds. [train] 2026-02-05 01:47:26 ▶ [train] 2026-02-05 01:47:26 Training set: 358 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:26  Test set: 42 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:26 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:26 Training GLMNET Regression... [train] 2026-02-05 01:47:26 Checking data is ready for training... ✔ [check_supervised]  <Regression> GLMNET (Elastic Net) <Training Regression Metrics>  MAE: 0.73  MSE: 0.82  RMSE: 0.91  Rsq: 0.83 <Test Regression Metrics>  MAE: 0.74  MSE: 1.02  RMSE: 1.01  Rsq: 0.77 2026-02-05 01:47:26 ✔ Done in 0.08 seconds. [train] 2026-02-05 01:47:26 ▶ [train] 2026-02-05 01:47:26 Training set: 358 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:26  Test set: 42 cases x 6 features. 
[summarize_supervised] 2026-02-05 01:47:26 // Max workers: 2 => Algorithm: 1; Tuning: 2; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:26 ▶ [tune_GridSearch] 2026-02-05 01:47:26 <> Tuning GLMNET by exhaustive grid search with 5 independent folds... [tune_GridSearch] 2026-02-05 01:47:26 1 parameter combination x 5 resamples: 5 models total (). [tune_GridSearch] 2026-02-05 01:47:26 Input contains more than one column; stratifying on last. [resample] <KFoldConfig>  n: 5  stratify_var: NULL  strat_n_bins: 4  id_strat: NULL  seed: NULL 2026-02-05 01:47:26 Tuning using future (mirai_multisession); N workers: 2 [tune_GridSearch] 2026-02-05 01:47:26 Current future plan: [tune_GridSearch] mirai_multisession: - args: function (..., workers = 2L, envir = parent.frame()) - tweaked: TRUE - call: future::plan(strategy = requested_plan, workers = n_workers) MiraiMultisessionFutureBackend: Inherits: MiraiFutureBackend, MultiprocessFutureBackend, FutureBackend UUID: 6b1751d942684b3f324567538d696cea Number of workers: 2 Number of free workers: 2 Available cores: 2 Automatic garbage collection: FALSE Early signaling: FALSE Interrupts are enabled: TRUE Maximum total size of globals: +Inf Maximum total size of value: +Inf Number of active futures: 0 Number of futures since start: 0 (0 created, 0 launched, 0 finished) Total runtime of futures: 0 secs (NaN secs/finished future) <Regression> GLMNET (Elastic Net) <Training Regression Metrics>  MAE: 0.77  MSE: 0.92  RMSE: 0.96  Rsq: 0.82 <Validation Regression Metrics>  MAE: 0.71  MSE: 0.80  RMSE: 0.90  Rsq: 0.84 <Regression> GLMNET (Elastic Net) <Training Regression Metrics>  MAE: 0.76  MSE: 0.91  RMSE: 0.95  Rsq: 0.82 <Validation Regression Metrics>  MAE: 0.74  MSE: 0.89  RMSE: 0.95  Rsq: 0.82 2026-02-05 01:47:27 Running grid line #1/5... [tune_GridSearch] 2026-02-05 01:47:27 ▶ [train] 2026-02-05 01:47:27  Training set: 285 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:27 Validation set: 73 cases x 6 features. 
[summarize_supervised] 2026-02-05 01:47:27 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:27 Training GLMNET Regression... [train] 2026-02-05 01:47:28 Checking data is ready for training... ✔ [check_supervised]  2026-02-05 01:47:28 ✔ Done in 1.06 seconds. [train] 2026-02-05 01:47:28 Running grid line #2/5... [tune_GridSearch] 2026-02-05 01:47:28 ▶ [train] 2026-02-05 01:47:28  Training set: 288 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:28 Validation set: 70 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:28 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:28 Training GLMNET Regression... [train] 2026-02-05 01:47:28 Checking data is ready for training... ✔ [check_supervised]  2026-02-05 01:47:28 ✔ Done in 0.14 seconds. [train] <Regression> GLMNET (Elastic Net) <Training Regression Metrics>  MAE: 0.77  MSE: 0.94  RMSE: 0.97  Rsq: 0.82 <Validation Regression Metrics>  MAE: 0.72  MSE: 0.78  RMSE: 0.88  Rsq: 0.82 <Regression> GLMNET (Elastic Net) <Training Regression Metrics>  MAE: 0.75  MSE: 0.89  RMSE: 0.94  Rsq: 0.81 <Validation Regression Metrics>  MAE: 0.83  MSE: 1.03  RMSE: 1.01  Rsq: 0.82 <Regression> GLMNET (Elastic Net) <Training Regression Metrics>  MAE: 0.74  MSE: 0.87  RMSE: 0.93  Rsq: 0.83 <Validation Regression Metrics>  MAE: 0.81  MSE: 1.05  RMSE: 1.02  Rsq: 0.79 2026-02-05 01:47:27 Running grid line #3/5... [tune_GridSearch] 2026-02-05 01:47:27 ▶ [train] 2026-02-05 01:47:27  Training set: 286 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:27 Validation set: 72 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:27 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:27 Training GLMNET Regression... [train] 2026-02-05 01:47:29 Checking data is ready for training... ✔ [check_supervised]  2026-02-05 01:47:29 ✔ Done in 1.70 seconds. 
[train] 2026-02-05 01:47:29 Running grid line #4/5... [tune_GridSearch] 2026-02-05 01:47:29 ▶ [train] 2026-02-05 01:47:29  Training set: 288 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:29 Validation set: 70 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:29 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:29 Training GLMNET Regression... [train] 2026-02-05 01:47:29 Checking data is ready for training... ✔ [check_supervised]  2026-02-05 01:47:29 ✔ Done in 0.14 seconds. [train] 2026-02-05 01:47:29 Running grid line #5/5... [tune_GridSearch] 2026-02-05 01:47:29 ▶ [train] 2026-02-05 01:47:29  Training set: 285 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:29 Validation set: 73 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:29 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:29 Training GLMNET Regression... [train] 2026-02-05 01:47:29 Checking data is ready for training... ✔ [check_supervised]  2026-02-05 01:47:29 ✔ Done in 0.08 seconds. [train] 2026-02-05 01:47:29 Extracting best lambda from GLMNET models... [tune_GridSearch] 2026-02-05 01:47:29 Best config to minimize MSE: [tune_GridSearch]  lambda: {} => 0.146228052216688 2026-02-05 01:47:29 ✔ Done in 3.36 seconds. [tune_GridSearch] 2026-02-05 01:47:29  Tuning done. [tune_GridSearch] 2026-02-05 01:47:29 Training GLMNET Regression with tuned hyperparameters... [train] 2026-02-05 01:47:29 Checking data is ready for training... ✔ [check_supervised] 2026-02-05 01:47:29 NCOL(xm): 6 [train_GLMNET] 2026-02-05 01:47:29 Updated hyperparameters[["penalty_factor"]] to all 1s. [train_GLMNET] <Regression> GLMNET (Elastic Net) ⚙ Tuned using exhaustive grid search. <Training Regression Metrics>  MAE: 0.76  MSE: 0.90  RMSE: 0.95  Rsq: 0.82 <Test Regression Metrics>  MAE: 0.73  MSE: 0.95  RMSE: 0.98  Rsq: 0.79 2026-02-05 01:47:30 ✔ Done in 3.65 seconds. 
[train] 2026-02-05 01:47:30 ▶ [train] 2026-02-05 01:47:30 Training set: 358 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:30  Test set: 42 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:30 // Max workers: 2 => Algorithm: 1; Tuning: 2; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:30 ▶ [train] 2026-02-05 01:47:30 Training set: 358 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:30  Test set: 42 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:30 // Max workers: 2 => Algorithm: 1; Tuning: 2; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:30 <> Tuning GLMNET by exhaustive grid search with 5 independent folds... [tune_GridSearch] 2026-02-05 01:47:30 1 parameter combination x 5 resamples: 5 models total (). [tune_GridSearch] 2026-02-05 01:47:30 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:47:30 Tuning using mirai; N workers: 2 [tune_GridSearch] ======>------------------------ 20% | ETA: 6s ==============================> 100% | ETA: 0s 2026-02-05 01:47:32 Best config to minimize MSE: [tune_GridSearch]  lambda: {} => 0.160818950780549 2026-02-05 01:47:32  Tuning done. [tune_GridSearch] 2026-02-05 01:47:32 Training GLMNET Regression with tuned hyperparameters... [train] 2026-02-05 01:47:32 Checking data is ready for training... ✔ [check_supervised]  <Regression> GLMNET (Elastic Net) ⚙ Tuned using exhaustive grid search. <Training Regression Metrics>  MAE: 0.76  MSE: 0.92  RMSE: 0.96  Rsq: 0.81 <Test Regression Metrics>  MAE: 0.74  MSE: 0.95  RMSE: 0.97  Rsq: 0.79 2026-02-05 01:47:32 ✔ Done in 2.71 seconds. [train] 2026-02-05 01:47:32 ▶ [train] 2026-02-05 01:47:32 Training set: 358 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:32  Test set: 42 cases x 6 features. 
[summarize_supervised] 2026-02-05 01:47:32 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:32 <> Tuning GLMNET by exhaustive grid search with 5 independent folds... [tune_GridSearch] 2026-02-05 01:47:32 2 parameter combinations x 5 resamples: 10 models total (). [tune_GridSearch] 2026-02-05 01:47:32 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:47:32 Tuning in sequence [tune_GridSearch] 2026-02-05 01:47:33 Best config to minimize MSE: [tune_GridSearch]  alpha: {0, 1} => 0  lambda: {} => 0.309667806207016 2026-02-05 01:47:33  Tuning done. [tune_GridSearch] 2026-02-05 01:47:33 Training GLMNET Regression with tuned hyperparameters... [train] 2026-02-05 01:47:33 Checking data is ready for training... ✔ [check_supervised]  <Regression> GLMNET (Elastic Net) ⚙ Tuned using exhaustive grid search. <Training Regression Metrics>  MAE: 0.75  MSE: 0.89  RMSE: 0.94  Rsq: 0.82 <Test Regression Metrics>  MAE: 0.71  MSE: 0.91  RMSE: 0.95  Rsq: 0.80 2026-02-05 01:47:33 ✔ Done in 1.03 seconds. [train] 2026-02-05 01:47:33 ▶ [train] 2026-02-05 01:47:33 Training set: 358 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:33 Tuning parallelization enabled. Disabling outer resampling parallelization. [get_n_workers] 2026-02-05 01:47:33 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:33 <> Training GLMNET Regression using 3 independent folds... [train] 2026-02-05 01:47:33 Input contains more than one column; stratifying on last. [resample] \ 2/3 ETA: 1s | Training outer resamples... 2026-02-05 01:47:37  Outer resampling done. [train] GLMNET (Elastic Net) ⚙ Tuned using exhaustive grid search. ⟳ Tested using 3 independent folds. Showing mean (sd) across resamples. MAE: 0.758 (0.016) MSE: 0.900 (0.035) RMSE: 0.948 (0.018) Rsq: 0.818 (0.010) Showing mean (sd) across resamples. 
MAE: 0.770 (0.028) MSE: 0.932 (0.043) RMSE: 0.965 (0.022) Rsq: 0.811 (0.022) 2026-02-05 01:47:37 ✔ Done in 3.57 seconds. [train] 2026-02-05 01:47:37 ▶ [train] 2026-02-05 01:47:37 Training set: 90 cases x 5 features. [summarize_supervised] 2026-02-05 01:47:37  Test set: 10 cases x 5 features. [summarize_supervised] 2026-02-05 01:47:37 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:37 Calculating case weights using Inverse Frequency Weighting. [ifw] 2026-02-05 01:47:37 Training GLMNET Classification... [train] 2026-02-05 01:47:37 Checking data is ready for training... ✔ [check_supervised]  <Classification> GLMNET (Elastic Net) <Training Classification Metrics>  Predicted  Reference virginica versicolor   virginica 43 2  versicolor 2 43 Overall   Sensitivity 0.956  Specificity 0.956  Balanced_Accuracy 0.956  PPV 0.956  NPV 0.956  F1 0.956  Accuracy 0.956  AUC 0.998  Brier_Score 0.024 Positive Class virginica <Test Classification Metrics>  Predicted  Reference virginica versicolor   virginica 5 0  versicolor 1 4 Overall   Sensitivity 1.000  Specificity 0.800  Balanced_Accuracy 0.900  PPV 0.833  NPV 1.000  F1 0.909  Accuracy 0.900  AUC 1.000  Brier_Score 0.055 Positive Class virginica 2026-02-05 01:47:37 ✔ Done in 0.06 seconds. [train] 2026-02-05 01:47:37 ▶ [train] 2026-02-05 01:47:37 Training set: 135 cases x 4 features. [summarize_supervised] 2026-02-05 01:47:37  Test set: 15 cases x 4 features. [summarize_supervised] 2026-02-05 01:47:37 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:37 <> Tuning GLMNET by exhaustive grid search with 5 independent folds... [tune_GridSearch] 2026-02-05 01:47:37 1 parameter combination x 5 resamples: 5 models total (). [tune_GridSearch] 2026-02-05 01:47:37 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:47:37 Using max n bins possible = 3. 
[kfold] 2026-02-05 01:47:37 Tuning in sequence [tune_GridSearch] 2026-02-05 01:47:39 Best config to maximize Balanced_Accuracy: [tune_GridSearch]  lambda: {} => 0.00788446071925322 2026-02-05 01:47:39  Tuning done. [tune_GridSearch] 2026-02-05 01:47:39 Calculating case weights using Inverse Frequency Weighting. [ifw] 2026-02-05 01:47:39 Training GLMNET Classification with tuned hyperparameters... [train] 2026-02-05 01:47:39 Checking data is ready for training... ✔ [check_supervised]  <Classification> GLMNET (Elastic Net) ⚙ Tuned using exhaustive grid search. <Training Classification Metrics>  Predicted  Reference setosa versicolor virginica   setosa 45 0 0  versicolor 0 42 3  virginica 0 2 43 Overall   Balanced_Accuracy 0.963  F1 0.963  Accuracy 0.963 setosa versicolor virginica   Sensitivity 1.000 0.933 0.956  Specificity 1.000 0.978 0.967  Balanced_Accuracy 1.000 0.956 0.961  PPV 1.000 0.955 0.935  NPV 1.000 0.967 0.978  F1 1.000 0.944 0.945 <Test Classification Metrics>  Predicted  Reference setosa versicolor virginica   setosa 5 0 0  versicolor 0 5 0  virginica 0 1 4 Overall   Balanced_Accuracy 0.933  F1 0.933  Accuracy 0.933 setosa versicolor virginica   Sensitivity 1.000 1.000 0.800  Specificity 1.000 0.900 1.000  Balanced_Accuracy 1.000 0.950 0.900  PPV 1.000 0.833 1.000  NPV 1.000 1.000 0.909  F1 1.000 0.909 0.889 2026-02-05 01:47:39 ✔ Done in 1.56 seconds. [train] 2026-02-05 01:47:39 ▶ [train] 2026-02-05 01:47:39 Training set: 358 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:39  Test set: 42 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:39 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:39 Training GAM Regression... [train] 2026-02-05 01:47:39 Checking data is ready for training... 
✔ [check_supervised]  <Regression> GAM (Generalized Additive Model) <Training Regression Metrics>  MAE: 0.72  MSE: 0.80  RMSE: 0.89  Rsq: 0.84 <Test Regression Metrics>  MAE: 0.75  MSE: 1.01  RMSE: 1.00  Rsq: 0.77 2026-02-05 01:47:39 ✔ Done in 0.14 seconds. [train] 2026-02-05 01:47:39 ▶ [train] 2026-02-05 01:47:39 Training set: 358 cases x 5 features. [summarize_supervised] 2026-02-05 01:47:39  Test set: 42 cases x 5 features. [summarize_supervised] 2026-02-05 01:47:39 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:39 Training GAM Regression... [train] 2026-02-05 01:47:39 Checking data is ready for training... ✔ [check_supervised]  <Regression> GAM (Generalized Additive Model) <Training Regression Metrics>  MAE: 1.49  MSE: 2.95  RMSE: 1.72  Rsq: 0.40 <Test Regression Metrics>  MAE: 1.26  MSE: 2.20  RMSE: 1.48  Rsq: 0.51 2026-02-05 01:47:39 ✔ Done in 0.11 seconds. [train] 2026-02-05 01:47:39 ▶ [train] 2026-02-05 01:47:39 Training set: 358 cases x 1 features. [summarize_supervised] 2026-02-05 01:47:39  Test set: 42 cases x 1 features. [summarize_supervised] 2026-02-05 01:47:39 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:39 Training GAM Regression... [train] 2026-02-05 01:47:39 Checking data is ready for training... ✔ [check_supervised]  <Regression> GAM (Generalized Additive Model) <Training Regression Metrics>  MAE: 1.37  MSE: 2.80  RMSE: 1.67  Rsq: 0.43 <Test Regression Metrics>  MAE: 1.16  MSE: 1.93  RMSE: 1.39  Rsq: 0.57 2026-02-05 01:47:39 ✔ Done in 0.05 seconds. [train] 2026-02-05 01:47:39 ▶ [train] 2026-02-05 01:47:39 Training set: 358 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:39  Test set: 42 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:39 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:39 <> Tuning GAM by exhaustive grid search with 5 independent folds... 
[tune_GridSearch] 2026-02-05 01:47:39 3 parameter combinations x 5 resamples: 15 models total (). [tune_GridSearch] 2026-02-05 01:47:39 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:47:39 Tuning in sequence [tune_GridSearch] 2026-02-05 01:47:40 Best config to minimize MSE: [tune_GridSearch]  k: {3, 5, 7} => 3 2026-02-05 01:47:40  Tuning done. [tune_GridSearch] 2026-02-05 01:47:40 Training GAM Regression with tuned hyperparameters... [train] 2026-02-05 01:47:40 Checking data is ready for training... ✔ [check_supervised]  <Regression> GAM (Generalized Additive Model) ⚙ Tuned using exhaustive grid search. <Training Regression Metrics>  MAE: 0.72  MSE: 0.82  RMSE: 0.91  Rsq: 0.83 <Test Regression Metrics>  MAE: 0.74  MSE: 1.02  RMSE: 1.01  Rsq: 0.77 2026-02-05 01:47:41 ✔ Done in 1.59 seconds. [train] 2026-02-05 01:47:41 ▶ [train] 2026-02-05 01:47:41 Training set: 400 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:41 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:41 <> Training GAM Regression using 3 independent folds... [train] 2026-02-05 01:47:41 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:47:41  Outer resampling done. [train] GAM (Generalized Additive Model) ⟳ Tested using 3 independent folds. Showing mean (sd) across resamples. MAE: 0.716 (0.016) MSE: 0.813 (0.034) RMSE: 0.901 (0.019) Rsq: 0.834 (0.012) Showing mean (sd) across resamples. MAE: 0.747 (0.035) MSE: 0.891 (0.065) RMSE: 0.944 (0.034) Rsq: 0.817 (0.022) 2026-02-05 01:47:41 ✔ Done in 0.33 seconds. [train] 2026-02-05 01:47:41 ▶ [train] 2026-02-05 01:47:41 Training set: 90 cases x 5 features. [summarize_supervised] 2026-02-05 01:47:41  Test set: 10 cases x 5 features. [summarize_supervised] 2026-02-05 01:47:41 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:41 Training GAM Classification... 
[train]
2026-02-05 01:47:41 Checking data is ready for training... ✔ [check_supervised]
<Classification> GAM (Generalized Additive Model)
<Training Classification Metrics>
              Predicted
  Reference   virginica versicolor
  virginica          45          0
  versicolor          0         45
Overall
        Sensitivity 1.000
        Specificity 1.000
  Balanced_Accuracy 1.000
                PPV 1.000
                NPV 1.000
                 F1 1.000
           Accuracy 1.000
                AUC 1.000
        Brier_Score 2.2e-06
Positive Class  virginica
<Test Classification Metrics>
              Predicted
  Reference   virginica versicolor
  virginica           5          0
  versicolor          1          4
Overall
        Sensitivity 1.000
        Specificity 0.800
  Balanced_Accuracy 0.900
                PPV 0.833
                NPV 1.000
                 F1 0.909
           Accuracy 0.900
                AUC 0.900
        Brier_Score 0.100
Positive Class  virginica
2026-02-05 01:47:41 ✔ Done in 0.17 seconds. [train]
2026-02-05 01:47:41 ▶ [train]
2026-02-05 01:47:41 Training set: 90 cases x 5 features. [summarize_supervised]
2026-02-05 01:47:41 Test set: 10 cases x 5 features. [summarize_supervised]
2026-02-05 01:47:41 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:41 Calculating case weights using Inverse Frequency Weighting. [ifw]
2026-02-05 01:47:41 Training GAM Classification... [train]
2026-02-05 01:47:41 Checking data is ready for training... ✔ [check_supervised]
<Classification> GAM (Generalized Additive Model)
<Training Classification Metrics>
              Predicted
  Reference   virginica versicolor
  virginica          45          0
  versicolor          0         45
Overall
        Sensitivity 1.000
        Specificity 1.000
  Balanced_Accuracy 1.000
                PPV 1.000
                NPV 1.000
                 F1 1.000
           Accuracy 1.000
                AUC 1.000
        Brier_Score 2.2e-06
Positive Class  virginica
<Test Classification Metrics>
              Predicted
  Reference   virginica versicolor
  virginica           5          0
  versicolor          1          4
Overall
        Sensitivity 1.000
        Specificity 0.800
  Balanced_Accuracy 0.900
                PPV 0.833
                NPV 1.000
                 F1 0.909
           Accuracy 0.900
                AUC 0.900
        Brier_Score 0.100
Positive Class  virginica
2026-02-05 01:47:41 ✔ Done in 0.16 seconds.
[train]
2026-02-05 01:47:41 ▶ [train]
2026-02-05 01:47:41 Training set: 358 cases x 6 features. [summarize_supervised]
2026-02-05 01:47:41 Test set: 42 cases x 6 features. [summarize_supervised]
2026-02-05 01:47:41 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:41 Training LinearSVM Regression... [train]
2026-02-05 01:47:41 Checking data is ready for training... ✔ [check_supervised]
2026-02-05 01:47:41 One hot encoding g... ✔ [one_hot]
2026-02-05 01:47:41 Preprocessing done. [preprocess]
<Regression> LinearSVM (Support Vector Machine with Linear Kernel)
<Training Regression Metrics>
   MAE: 0.72
   MSE: 0.83
  RMSE: 0.91
   Rsq: 0.83
<Test Regression Metrics>
   MAE: 0.75
   MSE: 1.02
  RMSE: 1.01
   Rsq: 0.77
2026-02-05 01:47:41 ✔ Done in 0.13 seconds. [train]
2026-02-05 01:47:41 ▶ [train]
2026-02-05 01:47:41 Training set: 358 cases x 6 features. [summarize_supervised]
2026-02-05 01:47:41 Test set: 42 cases x 6 features. [summarize_supervised]
2026-02-05 01:47:41 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:41 <> Tuning LinearSVM by exhaustive grid search with 5 independent folds... [tune_GridSearch]
2026-02-05 01:47:41 2 parameter combinations x 5 resamples: 10 models total (). [tune_GridSearch]
2026-02-05 01:47:41 Input contains more than one column; stratifying on last. [resample]
2026-02-05 01:47:41 Tuning in sequence [tune_GridSearch]
2026-02-05 01:47:43 Best config to minimize MSE: [tune_GridSearch]
  cost: {1, 10} => 1
2026-02-05 01:47:43 Tuning done. [tune_GridSearch]
2026-02-05 01:47:43 Training LinearSVM Regression with tuned hyperparameters... [train]
2026-02-05 01:47:43 Checking data is ready for training... ✔ [check_supervised]
2026-02-05 01:47:43 One hot encoding g... ✔ [one_hot]
2026-02-05 01:47:43 Preprocessing done. [preprocess]
<Regression> LinearSVM (Support Vector Machine with Linear Kernel)
⚙ Tuned using exhaustive grid search.
<Training Regression Metrics>
   MAE: 0.72
   MSE: 0.83
  RMSE: 0.91
   Rsq: 0.83
<Test Regression Metrics>
   MAE: 0.75
   MSE: 1.02
  RMSE: 1.01
   Rsq: 0.77
2026-02-05 01:47:43 ✔ Done in 1.65 seconds. [train]
2026-02-05 01:47:43 ▶ [train]
2026-02-05 01:47:43 Training set: 400 cases x 6 features. [summarize_supervised]
2026-02-05 01:47:43 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:43 <> Training LinearSVM Regression using 3 independent folds... [train]
2026-02-05 01:47:43 Input contains more than one column; stratifying on last. [resample]
2026-02-05 01:47:43 Outer resampling done. [train]
LinearSVM (Support Vector Machine with Linear Kernel)
⟳ Tested using 3 independent folds.
Showing mean (sd) across resamples.
   MAE: 0.723 (0.023)
   MSE: 0.840 (0.062)
  RMSE: 0.916 (0.034)
   Rsq: 0.828 (0.019)
Showing mean (sd) across resamples.
   MAE: 0.750 (0.048)
   MSE: 0.904 (0.142)
  RMSE: 0.949 (0.075)
   Rsq: 0.813 (0.040)
2026-02-05 01:47:43 ✔ Done in 0.25 seconds. [train]
2026-02-05 01:47:43 ▶ [train]
2026-02-05 01:47:43 Training set: 90 cases x 5 features. [summarize_supervised]
2026-02-05 01:47:43 Test set: 10 cases x 5 features. [summarize_supervised]
2026-02-05 01:47:43 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:43 Training LinearSVM Classification... [train]
2026-02-05 01:47:43 Checking data is ready for training... ✔ [check_supervised]
2026-02-05 01:47:43 One hot encoding gn... ✔ [one_hot]
2026-02-05 01:47:43 Preprocessing done.
[preprocess]
<Classification> LinearSVM (Support Vector Machine with Linear Kernel)
<Training Classification Metrics>
              Predicted
  Reference   virginica versicolor
  virginica          45          0
  versicolor          3         42
Overall
        Sensitivity 1.000
        Specificity 0.933
  Balanced_Accuracy 0.967
                PPV 0.938
                NPV 1.000
                 F1 0.968
           Accuracy 0.967
                AUC 0.999
        Brier_Score 0.024
Positive Class  virginica
<Test Classification Metrics>
              Predicted
  Reference   virginica versicolor
  virginica           5          0
  versicolor          1          4
Overall
        Sensitivity 1.000
        Specificity 0.800
  Balanced_Accuracy 0.900
                PPV 0.833
                NPV 1.000
                 F1 0.909
           Accuracy 0.900
                AUC 1.000
        Brier_Score 0.048
Positive Class  virginica
2026-02-05 01:47:43 ✔ Done in 0.06 seconds. [train]
2026-02-05 01:47:43 ▶ [train]
2026-02-05 01:47:43 Training set: 135 cases x 4 features. [summarize_supervised]
2026-02-05 01:47:43 Test set: 15 cases x 4 features. [summarize_supervised]
2026-02-05 01:47:43 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:43 Training LinearSVM Classification... [train]
2026-02-05 01:47:43 Checking data is ready for training... ✔ [check_supervised]
✔ [one_hot]
2026-02-05 01:47:43 Preprocessing done.
[preprocess]
<Classification> LinearSVM (Support Vector Machine with Linear Kernel)
<Training Classification Metrics>
              Predicted
  Reference    setosa versicolor virginica
  setosa           45          0         0
  versicolor        0         43         2
  virginica         0          1        44
Overall
  Balanced_Accuracy 0.978
                 F1 0.978
           Accuracy 0.978
                     setosa versicolor virginica
  Sensitivity         1.000      0.956     0.978
  Specificity         1.000      0.989     0.978
  Balanced_Accuracy   1.000      0.972     0.978
  PPV                 1.000      0.977     0.957
  NPV                 1.000      0.978     0.989
  F1                  1.000      0.966     0.967
<Test Classification Metrics>
              Predicted
  Reference    setosa versicolor virginica
  setosa            5          0         0
  versicolor        0          5         0
  virginica         0          0         5
Overall
  Balanced_Accuracy 1.000
                 F1 1.000
           Accuracy 1.000
                     setosa versicolor virginica
  Sensitivity         1.000      1.000     1.000
  Specificity         1.000      1.000     1.000
  Balanced_Accuracy   1.000      1.000     1.000
  PPV                 1.000      1.000     1.000
  NPV                 1.000      1.000     1.000
  F1                  1.000      1.000     1.000
2026-02-05 01:47:43 ✔ Done in 0.07 seconds. [train]
2026-02-05 01:47:43 ▶ [train]
2026-02-05 01:47:43 Training set: 100 cases x 5 features. [summarize_supervised]
2026-02-05 01:47:43 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:43 <> Training LinearSVM Classification using 3 independent folds... [train]
2026-02-05 01:47:43 Input contains more than one column; stratifying on last. [resample]
2026-02-05 01:47:43 Using max n bins possible = 2. [kfold]
2026-02-05 01:47:44 Outer resampling done. [train]
LinearSVM (Support Vector Machine with Linear Kernel)
⟳ Tested using 3 independent folds.
Showing mean (sd) across resamples.
        Sensitivity: 0.990 (0.017)
        Specificity: 0.970 (0.029)
  Balanced_Accuracy: 0.980 (0.022)
                PPV: 0.971 (0.029)
                NPV: 0.990 (0.017)
                 F1: 0.981 (0.022)
           Accuracy: 0.980 (0.022)
                AUC: 0.999 (1.5e-03)
        Brier_Score: 0.023 (0.007)
Showing mean (sd) across resamples.
        Sensitivity: 0.939 (0.063)
        Specificity: 0.901 (0.067)
  Balanced_Accuracy: 0.920 (0.019)
                PPV: 0.908 (0.051)
                NPV: 0.941 (0.059)
                 F1: 0.921 (0.019)
           Accuracy: 0.920 (0.019)
                AUC: 0.989 (0.007)
        Brier_Score: 0.050 (1.9e-03)
2026-02-05 01:47:44 ✔ Done in 0.17 seconds. [train]
2026-02-05 01:47:44 ▶ [train]
2026-02-05 01:47:44 Training set: 358 cases x 6 features. [summarize_supervised]
2026-02-05 01:47:44 Test set: 42 cases x 6 features. [summarize_supervised]
2026-02-05 01:47:44 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:44 Training RadialSVM Regression... [train]
2026-02-05 01:47:44 Checking data is ready for training... ✔ [check_supervised]
2026-02-05 01:47:44 One hot encoding g... ✔ [one_hot]
2026-02-05 01:47:44 Preprocessing done. [preprocess]
<Regression> RadialSVM (Support Vector Machine with Radial Kernel)
<Training Regression Metrics>
   MAE: 0.73
   MSE: 0.84
  RMSE: 0.92
   Rsq: 0.83
<Test Regression Metrics>
   MAE: 0.75
   MSE: 1.01
  RMSE: 1.01
   Rsq: 0.77
2026-02-05 01:47:44 ✔ Done in 0.11 seconds. [train]
2026-02-05 01:47:44 ▶ [train]
2026-02-05 01:47:44 Training set: 358 cases x 6 features. [summarize_supervised]
2026-02-05 01:47:44 Test set: 42 cases x 6 features. [summarize_supervised]
2026-02-05 01:47:44 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:44 <> Tuning RadialSVM by exhaustive grid search with 5 independent folds... [tune_GridSearch]
2026-02-05 01:47:44 3 parameter combinations x 5 resamples: 15 models total (). [tune_GridSearch]
2026-02-05 01:47:44 Input contains more than one column; stratifying on last. [resample]
2026-02-05 01:47:44 Tuning in sequence [tune_GridSearch]
2026-02-05 01:47:45 Best config to minimize MSE: [tune_GridSearch]
  cost: {1, 10, 100} => 1
2026-02-05 01:47:45 Tuning done. [tune_GridSearch]
2026-02-05 01:47:45 Training RadialSVM Regression with tuned hyperparameters...
[train]
2026-02-05 01:47:45 Checking data is ready for training... ✔ [check_supervised]
2026-02-05 01:47:45 One hot encoding g... ✔ [one_hot]
2026-02-05 01:47:45 Preprocessing done. [preprocess]
<Regression> RadialSVM (Support Vector Machine with Radial Kernel)
⚙ Tuned using exhaustive grid search.
<Training Regression Metrics>
   MAE: 0.73
   MSE: 0.84
  RMSE: 0.92
   Rsq: 0.83
<Test Regression Metrics>
   MAE: 0.75
   MSE: 1.01
  RMSE: 1.01
   Rsq: 0.77
2026-02-05 01:47:45 ✔ Done in 1.59 seconds. [train]
2026-02-05 01:47:45 ▶ [train]
2026-02-05 01:47:45 Training set: 400 cases x 6 features. [summarize_supervised]
2026-02-05 01:47:45 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:45 <> Training RadialSVM Regression using 3 independent folds... [train]
2026-02-05 01:47:45 Input contains more than one column; stratifying on last. [resample]
2026-02-05 01:47:46 Outer resampling done. [train]
RadialSVM (Support Vector Machine with Radial Kernel)
⟳ Tested using 3 independent folds.
Showing mean (sd) across resamples.
   MAE: 0.726 (0.028)
   MSE: 0.858 (0.065)
  RMSE: 0.926 (0.035)
   Rsq: 0.825 (0.008)
Showing mean (sd) across resamples.
   MAE: 0.755 (0.077)
   MSE: 0.909 (0.157)
  RMSE: 0.951 (0.085)
   Rsq: 0.816 (0.022)
2026-02-05 01:47:46 ✔ Done in 0.25 seconds. [train]
2026-02-05 01:47:46 ▶ [train]
2026-02-05 01:47:46 Training set: 400 cases x 6 features. [summarize_supervised]
2026-02-05 01:47:46 Tuning parallelization enabled. Disabling outer resampling parallelization. [get_n_workers]
2026-02-05 01:47:46 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:46 <> Training RadialSVM Regression using 3 independent folds... [train]
2026-02-05 01:47:46 Input contains more than one column; stratifying on last. [resample]
2026-02-05 01:47:48 Outer resampling done. [train]
RadialSVM (Support Vector Machine with Radial Kernel)
⚙ Tuned using exhaustive grid search.
⟳ Tested using 3 independent folds.
Showing mean (sd) across resamples.
   MAE: 0.722 (0.058)
   MSE: 0.847 (0.117)
  RMSE: 0.919 (0.064)
   Rsq: 0.827 (0.024)
Showing mean (sd) across resamples.
   MAE: 0.778 (0.119)
   MSE: 0.959 (0.236)
  RMSE: 0.974 (0.121)
   Rsq: 0.803 (0.048)
2026-02-05 01:47:48 ✔ Done in 2.52 seconds. [train]
2026-02-05 01:47:48 ▶ [train]
2026-02-05 01:47:48 Training set: 90 cases x 5 features. [summarize_supervised]
2026-02-05 01:47:48 Test set: 10 cases x 5 features. [summarize_supervised]
2026-02-05 01:47:48 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:48 Training RadialSVM Classification... [train]
2026-02-05 01:47:48 Checking data is ready for training... ✔ [check_supervised]
2026-02-05 01:47:48 One hot encoding gn... ✔ [one_hot]
2026-02-05 01:47:48 Preprocessing done. [preprocess]
<Classification> RadialSVM (Support Vector Machine with Radial Kernel)
<Training Classification Metrics>
              Predicted
  Reference   virginica versicolor
  virginica          42          3
  versicolor          2         43
Overall
        Sensitivity 0.933
        Specificity 0.956
  Balanced_Accuracy 0.944
                PPV 0.955
                NPV 0.935
                 F1 0.944
           Accuracy 0.944
                AUC 0.995
        Brier_Score 0.036
Positive Class  virginica
<Test Classification Metrics>
              Predicted
  Reference   virginica versicolor
  virginica           5          0
  versicolor          1          4
Overall
        Sensitivity 1.000
        Specificity 0.800
  Balanced_Accuracy 0.900
                PPV 0.833
                NPV 1.000
                 F1 0.909
           Accuracy 0.900
                AUC 1.000
        Brier_Score 0.059
Positive Class  virginica
2026-02-05 01:47:48 ✔ Done in 0.08 seconds. [train]
2026-02-05 01:47:48 ▶ [train]
2026-02-05 01:47:48 Training set: 90 cases x 5 features. [summarize_supervised]
2026-02-05 01:47:48 Test set: 10 cases x 5 features. [summarize_supervised]
2026-02-05 01:47:48 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:48 <> Tuning RadialSVM by exhaustive grid search with 5 independent folds... [tune_GridSearch]
2026-02-05 01:47:48 2 parameter combinations x 5 resamples: 10 models total ().
[tune_GridSearch]
2026-02-05 01:47:48 Input contains more than one column; stratifying on last. [resample]
2026-02-05 01:47:48 Using max n bins possible = 2. [kfold]
2026-02-05 01:47:48 Tuning in sequence [tune_GridSearch]
2026-02-05 01:47:50 Best config to maximize Balanced_Accuracy: [tune_GridSearch]
  cost: {1, 10} => 1
2026-02-05 01:47:50 Tuning done. [tune_GridSearch]
2026-02-05 01:47:50 Training RadialSVM Classification with tuned hyperparameters... [train]
2026-02-05 01:47:50 Checking data is ready for training... ✔ [check_supervised]
2026-02-05 01:47:50 One hot encoding gn... ✔ [one_hot]
2026-02-05 01:47:50 Preprocessing done. [preprocess]
<Classification> RadialSVM (Support Vector Machine with Radial Kernel)
⚙ Tuned using exhaustive grid search.
<Training Classification Metrics>
              Predicted
  Reference   virginica versicolor
  virginica          42          3
  versicolor          2         43
Overall
        Sensitivity 0.933
        Specificity 0.956
  Balanced_Accuracy 0.944
                PPV 0.955
                NPV 0.935
                 F1 0.944
           Accuracy 0.944
                AUC 0.995
        Brier_Score 0.036
Positive Class  virginica
<Test Classification Metrics>
              Predicted
  Reference   virginica versicolor
  virginica           5          0
  versicolor          1          4
Overall
        Sensitivity 1.000
        Specificity 0.800
  Balanced_Accuracy 0.900
                PPV 0.833
                NPV 1.000
                 F1 0.909
           Accuracy 0.900
                AUC 1.000
        Brier_Score 0.056
Positive Class  virginica
2026-02-05 01:47:50 ✔ Done in 1.36 seconds. [train]
2026-02-05 01:47:50 ▶ [train]
2026-02-05 01:47:50 Training set: 100 cases x 5 features. [summarize_supervised]
2026-02-05 01:47:50 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:50 <> Training RadialSVM Classification using 3 independent folds... [train]
2026-02-05 01:47:50 Input contains more than one column; stratifying on last. [resample]
2026-02-05 01:47:50 Using max n bins possible = 2. [kfold]
2026-02-05 01:47:50 Outer resampling done. [train]
RadialSVM (Support Vector Machine with Radial Kernel)
⟳ Tested using 3 independent folds.
Showing mean (sd) across resamples.
        Sensitivity: 0.920 (0.017)
        Specificity: 0.949 (0.046)
  Balanced_Accuracy: 0.935 (0.018)
                PPV: 0.950 (0.045)
                NPV: 0.923 (0.013)
                 F1: 0.934 (0.017)
           Accuracy: 0.935 (0.018)
                AUC: 0.988 (0.007)
        Brier_Score: 0.048 (0.007)
Showing mean (sd) across resamples.
        Sensitivity: 0.918 (0.096)
        Specificity: 0.960 (0.035)
  Balanced_Accuracy: 0.939 (0.063)
                PPV: 0.957 (0.038)
                NPV: 0.925 (0.085)
                 F1: 0.936 (0.067)
           Accuracy: 0.939 (0.063)
                AUC: 0.984 (0.014)
        Brier_Score: 0.058 (0.029)
2026-02-05 01:47:50 ✔ Done in 0.17 seconds. [train]
2026-02-05 01:47:50 ▶ [train]
2026-02-05 01:47:50 Training set: 100 cases x 5 features. [summarize_supervised]
2026-02-05 01:47:50 Tuning parallelization enabled. Disabling outer resampling parallelization. [get_n_workers]
2026-02-05 01:47:50 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:50 <> Training RadialSVM Classification using 3 independent folds... [train]
2026-02-05 01:47:50 Input contains more than one column; stratifying on last. [resample]
2026-02-05 01:47:50 Using max n bins possible = 2. [kfold]
2026-02-05 01:47:51 Outer resampling done. [train]
RadialSVM (Support Vector Machine with Radial Kernel)
⚙ Tuned using exhaustive grid search.
⟳ Tested using 3 independent folds.
Showing mean (sd) across resamples.
        Sensitivity: 0.990 (0.017)
        Specificity: 0.960 (0.016)
  Balanced_Accuracy: 0.975 (0.009)
                PPV: 0.962 (0.015)
                NPV: 0.990 (0.017)
                 F1: 0.975 (0.008)
           Accuracy: 0.975 (0.009)
                AUC: 0.998 (2.2e-03)
        Brier_Score: 0.027 (0.006)
Showing mean (sd) across resamples.
        Sensitivity: 0.919 (0.070)
        Specificity: 0.919 (0.070)
  Balanced_Accuracy: 0.919 (0.038)
                PPV: 0.923 (0.067)
                NPV: 0.923 (0.067)
                 F1: 0.919 (0.038)
           Accuracy: 0.919 (0.038)
                AUC: 0.989 (0.011)
        Brier_Score: 0.053 (0.016)
2026-02-05 01:47:52 ✔ Done in 1.74 seconds. [train]
2026-02-05 01:47:52 ▶ [train]
2026-02-05 01:47:52 Training set: 135 cases x 4 features.
[summarize_supervised]
2026-02-05 01:47:52 Test set: 15 cases x 4 features. [summarize_supervised]
2026-02-05 01:47:52 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:52 Training RadialSVM Classification... [train]
2026-02-05 01:47:52 Checking data is ready for training... ✔ [check_supervised]
✔ [one_hot]
2026-02-05 01:47:52 Preprocessing done. [preprocess]
<Classification> RadialSVM (Support Vector Machine with Radial Kernel)
<Training Classification Metrics>
              Predicted
  Reference    setosa versicolor virginica
  setosa           45          0         0
  versicolor        0         39         6
  virginica         0          7        38
Overall
  Balanced_Accuracy 0.904
                 F1 0.904
           Accuracy 0.904
                     setosa versicolor virginica
  Sensitivity         1.000      0.867     0.844
  Specificity         1.000      0.922     0.933
  Balanced_Accuracy   1.000      0.894     0.889
  PPV                 1.000      0.848     0.864
  NPV                 1.000      0.933     0.923
  F1                  1.000      0.857     0.854
<Test Classification Metrics>
              Predicted
  Reference    setosa versicolor virginica
  setosa            5          0         0
  versicolor        0          5         0
  virginica         0          2         3
Overall
  Balanced_Accuracy 0.867
                 F1 0.861
           Accuracy 0.867
                     setosa versicolor virginica
  Sensitivity         1.000      1.000     0.600
  Specificity         1.000      0.800     1.000
  Balanced_Accuracy   1.000      0.900     0.800
  PPV                 1.000      0.714     1.000
  NPV                 1.000      1.000     0.833
  F1                  1.000      0.833     0.750
2026-02-05 01:47:52 ✔ Done in 0.08 seconds. [train]
2026-02-05 01:47:52 ▶ [train]
2026-02-05 01:47:52 Training set: 358 cases x 6 features. [summarize_supervised]
2026-02-05 01:47:52 Test set: 42 cases x 6 features. [summarize_supervised]
2026-02-05 01:47:52 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:52 Training CART Regression... [train]
2026-02-05 01:47:52 Checking data is ready for training... ✔ [check_supervised]
<Regression> CART (Classification and Regression Trees)
<Training Regression Metrics>
   MAE: 0.84
   MSE: 1.13
  RMSE: 1.06
   Rsq: 0.77
<Test Regression Metrics>
   MAE: 1.08
   MSE: 1.80
  RMSE: 1.34
   Rsq: 0.60
2026-02-05 01:47:52 ✔ Done in 0.07 seconds.
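The CART regression run above (train on 358 cases, evaluate MAE, MSE, RMSE, and Rsq on a 42-case test set) can be reproduced in outline without the rtemis harness. A minimal sketch using the `rpart` package directly, on synthetic stand-in data rather than the test fixtures; the `metrics` helper is an illustration, not an rtemis function:

```r
library(rpart)

set.seed(2026)
# Synthetic stand-in data: y depends nonlinearly on x1 and linearly on x2.
n <- 400
dat <- data.frame(x1 = rnorm(n), x2 = rnorm(n))
dat$y <- dat$x1^2 + dat$x2 + rnorm(n, sd = 0.5)

idx_test <- sample(n, 42)                       # hold out a 42-case test set
fit <- rpart(y ~ ., data = dat[-idx_test, ])    # CART regression tree

# The four regression metrics reported in the log.
metrics <- function(obs, pred) {
  err <- obs - pred
  c(MAE  = mean(abs(err)),
    MSE  = mean(err^2),
    RMSE = sqrt(mean(err^2)),
    Rsq  = 1 - sum(err^2) / sum((obs - mean(obs))^2))
}
round(metrics(dat$y[idx_test], predict(fit, dat[idx_test, ])), 2)
```

Note that Rsq here is computed as 1 minus the ratio of residual to total sum of squares, which is why it can go negative for a model worse than the mean predictor.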
[train]
2026-02-05 01:47:52 ▶ [train]
2026-02-05 01:47:52 Training set: 358 cases x 6 features. [summarize_supervised]
2026-02-05 01:47:52 Test set: 42 cases x 6 features. [summarize_supervised]
2026-02-05 01:47:52 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:52 <> Tuning CART by exhaustive grid search with 5 independent folds... [tune_GridSearch]
2026-02-05 01:47:52 2 parameter combinations x 5 resamples: 10 models total (). [tune_GridSearch]
2026-02-05 01:47:52 Input contains more than one column; stratifying on last. [resample]
2026-02-05 01:47:52 Tuning in sequence [tune_GridSearch]
2026-02-05 01:47:52 Best config to minimize MSE: [tune_GridSearch]
  maxdepth: {2, 3} => 3
2026-02-05 01:47:52 Tuning done. [tune_GridSearch]
2026-02-05 01:47:52 Training CART Regression with tuned hyperparameters... [train]
2026-02-05 01:47:52 Checking data is ready for training... ✔ [check_supervised]
<Regression> CART (Classification and Regression Trees)
⚙ Tuned using exhaustive grid search.
<Training Regression Metrics>
   MAE: 0.93
   MSE: 1.43
  RMSE: 1.19
   Rsq: 0.71
<Test Regression Metrics>
   MAE: 1.05
   MSE: 1.59
  RMSE: 1.26
   Rsq: 0.64
2026-02-05 01:47:52 ✔ Done in 0.49 seconds. [train]
2026-02-05 01:47:52 ▶ [train]
2026-02-05 01:47:52 Training set: 400 cases x 6 features. [summarize_supervised]
2026-02-05 01:47:52 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:52 <> Training CART Regression using 3 independent folds... [train]
2026-02-05 01:47:52 Input contains more than one column; stratifying on last. [resample]
2026-02-05 01:47:52 Outer resampling done. [train]
CART (Classification and Regression Trees)
⟳ Tested using 3 independent folds.
Showing mean (sd) across resamples.
   MAE: 0.772 (0.011)
   MSE: 0.959 (0.024)
  RMSE: 0.979 (0.012)
   Rsq: 0.804 (0.007)
Showing mean (sd) across resamples.
   MAE: 1.051 (0.048)
   MSE: 1.676 (0.185)
  RMSE: 1.293 (0.072)
   Rsq: 0.658 (0.029)
2026-02-05 01:47:52 ✔ Done in 0.19 seconds. [train]
2026-02-05 01:47:52 ▶ [train]
2026-02-05 01:47:52 Training set: 400 cases x 6 features. [summarize_supervised]
2026-02-05 01:47:52 Tuning parallelization enabled. Disabling outer resampling parallelization. [get_n_workers]
2026-02-05 01:47:52 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:52 <> Training CART Regression using 3 independent folds... [train]
2026-02-05 01:47:52 Input contains more than one column; stratifying on last. [resample]
2026-02-05 01:47:55 Outer resampling done. [train]
CART (Classification and Regression Trees)
⚙ Tuned using exhaustive grid search.
⟳ Tested using 3 independent folds.
Showing mean (sd) across resamples.
   MAE: 1.126 (0.049)
   MSE: 1.954 (0.132)
  RMSE: 1.397 (0.047)
   Rsq: 0.601 (0.030)
Showing mean (sd) across resamples.
   MAE: 1.228 (0.060)
   MSE: 2.308 (0.222)
  RMSE: 1.518 (0.074)
   Rsq: 0.527 (0.053)
2026-02-05 01:47:55 ✔ Done in 2.58 seconds. [train]
2026-02-05 01:47:55 ▶ [train]
2026-02-05 01:47:55 Training set: 400 cases x 6 features. [summarize_supervised]
2026-02-05 01:47:55 Tuning parallelization enabled. Disabling outer resampling parallelization. [get_n_workers]
2026-02-05 01:47:55 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:55 <> Training CART Regression using 3 independent folds... [train]
2026-02-05 01:47:55 Input contains more than one column; stratifying on last. [resample]
2026-02-05 01:47:56 Outer resampling done. [train]
CART (Classification and Regression Trees)
⚙ Tuned using exhaustive grid search.
⟳ Tested using 3 independent folds.
Showing mean (sd) across resamples.
   MAE: 0.800 (0.061)
   MSE: 0.998 (0.121)
  RMSE: 0.998 (0.061)
   Rsq: 0.797 (0.016)
Showing mean (sd) across resamples.
   MAE: 1.024 (0.061)
   MSE: 1.648 (0.199)
  RMSE: 1.282 (0.079)
   Rsq: 0.664 (0.037)
2026-02-05 01:47:56 ✔ Done in 1.46 seconds. [train]
2026-02-05 01:47:56 ▶ [train]
2026-02-05 01:47:56 Training set: 90 cases x 5 features. [summarize_supervised]
2026-02-05 01:47:56 Test set: 10 cases x 5 features. [summarize_supervised]
2026-02-05 01:47:56 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:56 <> Tuning CART by exhaustive grid search with 5 independent folds... [tune_GridSearch]
2026-02-05 01:47:56 2 parameter combinations x 5 resamples: 10 models total (). [tune_GridSearch]
2026-02-05 01:47:56 Input contains more than one column; stratifying on last. [resample]
2026-02-05 01:47:56 Using max n bins possible = 2. [kfold]
2026-02-05 01:47:56 Tuning in sequence [tune_GridSearch]
2026-02-05 01:47:57 Best config to maximize Balanced_Accuracy: [tune_GridSearch]
  maxdepth: {1, 2} => 2
2026-02-05 01:47:57 Tuning done. [tune_GridSearch]
2026-02-05 01:47:57 Training CART Classification with tuned hyperparameters... [train]
2026-02-05 01:47:57 Checking data is ready for training... ✔ [check_supervised]
<Classification> CART (Classification and Regression Trees)
⚙ Tuned using exhaustive grid search.
<Training Classification Metrics>
              Predicted
  Reference   virginica versicolor
  virginica          45          0
  versicolor          2         43
Overall
        Sensitivity 1.000
        Specificity 0.956
  Balanced_Accuracy 0.978
                PPV 0.957
                NPV 1.000
                 F1 0.978
           Accuracy 0.978
                AUC 0.987
        Brier_Score 0.020
Positive Class  virginica
<Test Classification Metrics>
              Predicted
  Reference   virginica versicolor
  virginica           5          0
  versicolor          1          4
Overall
        Sensitivity 1.000
        Specificity 0.800
  Balanced_Accuracy 0.900
                PPV 0.833
                NPV 1.000
                 F1 0.909
           Accuracy 0.900
                AUC 0.900
        Brier_Score 0.096
Positive Class  virginica
2026-02-05 01:47:57 ✔ Done in 0.49 seconds. [train]
2026-02-05 01:47:57 ▶ [train]
2026-02-05 01:47:57 Training set: 90 cases x 5 features.
[summarize_supervised]
2026-02-05 01:47:57 Test set: 10 cases x 5 features. [summarize_supervised]
2026-02-05 01:47:57 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:57 Calculating case weights using Inverse Frequency Weighting. [ifw]
2026-02-05 01:47:57 Training CART Classification... [train]
2026-02-05 01:47:57 Checking data is ready for training... ✔ [check_supervised]
<Classification> CART (Classification and Regression Trees)
<Training Classification Metrics>
              Predicted
  Reference   virginica versicolor
  virginica          45          0
  versicolor          0         45
Overall
        Sensitivity 1.000
        Specificity 1.000
  Balanced_Accuracy 1.000
                PPV 1.000
                NPV 1.000
                 F1 1.000
           Accuracy 1.000
                AUC 1.000
        Brier_Score 0.000
Positive Class  virginica
<Test Classification Metrics>
              Predicted
  Reference   virginica versicolor
  virginica           5          0
  versicolor          1          4
Overall
        Sensitivity 1.000
        Specificity 0.800
  Balanced_Accuracy 0.900
                PPV 0.833
                NPV 1.000
                 F1 0.909
           Accuracy 0.900
                AUC 0.900
        Brier_Score 0.100
Positive Class  virginica
2026-02-05 01:47:57 ✔ Done in 0.08 seconds. [train]
2026-02-05 01:47:57 ▶ [train]
2026-02-05 01:47:57 Training set: 90 cases x 5 features. [summarize_supervised]
2026-02-05 01:47:57 Test set: 10 cases x 5 features. [summarize_supervised]
2026-02-05 01:47:57 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:57 <> Tuning CART by exhaustive grid search with 5 independent folds... [tune_GridSearch]
2026-02-05 01:47:57 2 parameter combinations x 5 resamples: 10 models total (). [tune_GridSearch]
2026-02-05 01:47:57 Input contains more than one column; stratifying on last. [resample]
2026-02-05 01:47:57 Using max n bins possible = 2. [kfold]
2026-02-05 01:47:57 Tuning in sequence [tune_GridSearch]
2026-02-05 01:47:57 Best config to maximize Balanced_Accuracy: [tune_GridSearch]
  maxdepth: {1, 2} => 2
2026-02-05 01:47:57 Tuning done.
[tune_GridSearch]
2026-02-05 01:47:57 Training CART Classification with tuned hyperparameters... [train]
2026-02-05 01:47:57 Checking data is ready for training... ✔ [check_supervised]
<Classification> CART (Classification and Regression Trees)
⚙ Tuned using exhaustive grid search.
<Training Classification Metrics>
              Predicted
  Reference   virginica versicolor
  virginica          45          0
  versicolor          2         43
Overall
        Sensitivity 1.000
        Specificity 0.956
  Balanced_Accuracy 0.978
                PPV 0.957
                NPV 1.000
                 F1 0.978
           Accuracy 0.978
                AUC 0.987
        Brier_Score 0.020
Positive Class  virginica
<Test Classification Metrics>
              Predicted
  Reference   virginica versicolor
  virginica           5          0
  versicolor          1          4
Overall
        Sensitivity 1.000
        Specificity 0.800
  Balanced_Accuracy 0.900
                PPV 0.833
                NPV 1.000
                 F1 0.909
           Accuracy 0.900
                AUC 0.900
        Brier_Score 0.096
Positive Class  virginica
2026-02-05 01:47:58 ✔ Done in 0.50 seconds. [train]
2026-02-05 01:47:58 ▶ [train]
2026-02-05 01:47:58 Training set: 100 cases x 5 features. [summarize_supervised]
2026-02-05 01:47:58 Tuning parallelization enabled. Disabling outer resampling parallelization. [get_n_workers]
2026-02-05 01:47:58 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:58 <> Training CART Classification using 3 independent folds... [train]
2026-02-05 01:47:58 Input contains more than one column; stratifying on last. [resample]
2026-02-05 01:47:58 Using max n bins possible = 2. [kfold]
2026-02-05 01:47:59 Outer resampling done. [train]
CART (Classification and Regression Trees)
⚙ Tuned using exhaustive grid search.
⟳ Tested using 3 independent folds.
Showing mean (sd) across resamples.
        Sensitivity: 0.970 (0.052)
        Specificity: 0.970 (5.1e-04)
  Balanced_Accuracy: 0.970 (0.026)
                PPV: 0.970 (1.9e-03)
                NPV: 0.971 (0.049)
                 F1: 0.969 (0.028)
           Accuracy: 0.970 (0.026)
                AUC: 0.974 (0.031)
        Brier_Score: 0.028 (0.025)
Showing mean (sd) across resamples.
        Sensitivity: 0.902 (0.090)
        Specificity: 0.958 (0.072)
  Balanced_Accuracy: 0.930 (0.016)
                PPV: 0.963 (0.064)
                NPV: 0.915 (0.077)
                 F1: 0.927 (0.021)
           Accuracy: 0.930 (0.016)
                AUC: 0.930 (0.016)
        Brier_Score: 0.068 (0.018)
2026-02-05 01:47:59 ✔ Done in 1.41 seconds. [train]
2026-02-05 01:47:59 ▶ [train]
2026-02-05 01:47:59 Training set: 135 cases x 4 features. [summarize_supervised]
2026-02-05 01:47:59 Test set: 15 cases x 4 features. [summarize_supervised]
2026-02-05 01:47:59 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:59 Training CART Classification... [train]
2026-02-05 01:47:59 Checking data is ready for training... ✔ [check_supervised]
<Classification> CART (Classification and Regression Trees)
<Training Classification Metrics>
              Predicted
  Reference    setosa versicolor virginica
  setosa           45          0         0
  versicolor        0         44         1
  virginica         0          0        45
Overall
  Balanced_Accuracy 0.993
                 F1 0.993
           Accuracy 0.993
                     setosa versicolor virginica
  Sensitivity         1.000      0.978     1.000
  Specificity         1.000      1.000     0.989
  Balanced_Accuracy   1.000      0.989     0.994
  PPV                 1.000      1.000     0.978
  NPV                 1.000      0.989     1.000
  F1                  1.000      0.989     0.989
<Test Classification Metrics>
              Predicted
  Reference    setosa versicolor virginica
  setosa            5          0         0
  versicolor        0          5         0
  virginica         0          0         5
Overall
  Balanced_Accuracy 1.000
                 F1 1.000
           Accuracy 1.000
                     setosa versicolor virginica
  Sensitivity         1.000      1.000     1.000
  Specificity         1.000      1.000     1.000
  Balanced_Accuracy   1.000      1.000     1.000
  PPV                 1.000      1.000     1.000
  NPV                 1.000      1.000     1.000
  F1                  1.000      1.000     1.000
2026-02-05 01:47:59 ✔ Done in 0.06 seconds. [train]
2026-02-05 01:47:59 ▶ [train]
2026-02-05 01:47:59 Training set: 358 cases x 6 features. [summarize_supervised]
2026-02-05 01:47:59 Test set: 42 cases x 6 features. [summarize_supervised]
2026-02-05 01:47:59 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:59 Training LightCART Regression... [train]
2026-02-05 01:47:59 Checking data is ready for training...
✔ [check_supervised]
<Regression> LightCART (Decision Tree)
<Training Regression Metrics>
   MAE: 1.66
   MSE: 4.24
  RMSE: 2.06
   Rsq: 0.15
<Test Regression Metrics>
   MAE: 1.58
   MSE: 3.75
  RMSE: 1.94
   Rsq: 0.16
2026-02-05 01:47:59 ✔ Done in 0.18 seconds. [train]
2026-02-05 01:47:59 ▶ [train]
2026-02-05 01:47:59 Training set: 358 cases x 6 features. [summarize_supervised]
2026-02-05 01:47:59 Test set: 42 cases x 6 features. [summarize_supervised]
2026-02-05 01:47:59 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:59 Training LightCART Regression... [train]
2026-02-05 01:47:59 Checking data is ready for training... ✔ [check_supervised]
<Regression> LightCART (Decision Tree)
<Training Regression Metrics>
   MAE: 1.66
   MSE: 4.24
  RMSE: 2.06
   Rsq: 0.15
<Test Regression Metrics>
   MAE: 1.58
   MSE: 3.75
  RMSE: 1.94
   Rsq: 0.16
2026-02-05 01:47:59 ✔ Done in 0.06 seconds. [train]
2026-02-05 01:47:59 ▶ [train]
2026-02-05 01:47:59 Training set: 90 cases x 5 features. [summarize_supervised]
2026-02-05 01:47:59 Test set: 10 cases x 5 features. [summarize_supervised]
2026-02-05 01:47:59 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:47:59 Training LightCART Classification... [train]
2026-02-05 01:47:59 Checking data is ready for training...
✔ [check_supervised]  <Classification> LightCART (Decision Tree) <Training Classification Metrics>  Predicted  Reference virginica versicolor   virginica 0 45  versicolor 0 45 Overall   Sensitivity 0.000  Specificity 1.000  Balanced_Accuracy 0.500  PPV NA  NPV 0.500  F1 NA  Accuracy 0.500  AUC 0.500  Brier_Score 0.500 Positive Class virginica <Test Classification Metrics>  Predicted  Reference virginica versicolor   virginica 0 5  versicolor 0 5 Overall   Sensitivity 0.000  Specificity 1.000  Balanced_Accuracy 0.500  PPV NA  NPV 0.500  F1 NA  Accuracy 0.500  AUC 0.500  Brier_Score 0.500 Positive Class virginica 2026-02-05 01:47:59 ✔ Done in 0.08 seconds. [train] 2026-02-05 01:47:59 ▶ [train] 2026-02-05 01:47:59 Training set: 135 cases x 4 features. [summarize_supervised] 2026-02-05 01:47:59  Test set: 15 cases x 4 features. [summarize_supervised] 2026-02-05 01:47:59 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:59 Training LightCART Classification... [train] 2026-02-05 01:47:59 Checking data is ready for training... ✔ [check_supervised]  <Classification> LightCART (Decision Tree) <Training Classification Metrics>  Predicted  Reference setosa versicolor virginica   setosa 44 1 0  versicolor 0 43 2  virginica 0 3 42 Overall   Balanced_Accuracy 0.956  F1 0.956  Accuracy 0.956 setosa versicolor virginica   Sensitivity 0.978 0.956 0.933  Specificity 1.000 0.956 0.978  Balanced_Accuracy 0.989 0.956 0.956  PPV 1.000 0.915 0.955  NPV 0.989 0.977 0.967  F1 0.989 0.935 0.944 <Test Classification Metrics>  Predicted  Reference setosa versicolor virginica   setosa 5 0 0  versicolor 0 5 0  virginica 0 1 4 Overall   Balanced_Accuracy 0.933  F1 0.933  Accuracy 0.933 setosa versicolor virginica   Sensitivity 1.000 1.000 0.800  Specificity 1.000 0.900 1.000  Balanced_Accuracy 1.000 0.950 0.900  PPV 1.000 0.833 1.000  NPV 1.000 1.000 0.909  F1 1.000 0.909 0.889 2026-02-05 01:47:59 ✔ Done in 0.11 seconds. 
[train] 2026-02-05 01:47:59 ▶ [train] 2026-02-05 01:47:59 Training set: 358 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:59  Test set: 42 cases x 6 features. [summarize_supervised] 2026-02-05 01:47:59 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:47:59 Training LightRF Regression... [train] 2026-02-05 01:47:59 Checking data is ready for training... ✔ [check_supervised]  <Regression> LightRF (LightGBM Random Forest) <Training Regression Metrics>  MAE: 0.92  MSE: 1.34  RMSE: 1.16  Rsq: 0.73 <Test Regression Metrics>  MAE: 0.89  MSE: 1.18  RMSE: 1.09  Rsq: 0.73 2026-02-05 01:48:00 ✔ Done in 0.21 seconds. [train] 2026-02-05 01:48:00 ▶ [train] 2026-02-05 01:48:00 Training set: 358 cases x 6 features. [summarize_supervised] 2026-02-05 01:48:00  Test set: 42 cases x 6 features. [summarize_supervised] 2026-02-05 01:48:00 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:00 <> Tuning LightRF by exhaustive grid search with 5 independent folds... [tune_GridSearch] 2026-02-05 01:48:00 2 parameter combinations x 5 resamples: 10 models total (). [tune_GridSearch] 2026-02-05 01:48:00 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:48:00 Tuning in sequence [tune_GridSearch] 2026-02-05 01:48:01 Best config to minimize MSE: [tune_GridSearch]  lambda_l1: {0, 0.1} => 0 2026-02-05 01:48:01  Tuning done. [tune_GridSearch] 2026-02-05 01:48:01 Training LightRF Regression with tuned hyperparameters... [train] 2026-02-05 01:48:01 Checking data is ready for training... ✔ [check_supervised]  <Regression> LightRF (LightGBM Random Forest) ⚙ Tuned using exhaustive grid search. <Training Regression Metrics>  MAE: 0.92  MSE: 1.33  RMSE: 1.15  Rsq: 0.73 <Test Regression Metrics>  MAE: 0.88  MSE: 1.17  RMSE: 1.08  Rsq: 0.74 2026-02-05 01:48:01 ✔ Done in 1.64 seconds. 
[train] 2026-02-05 01:48:01 ▶ [train] 2026-02-05 01:48:01 Training set: 400 cases x 6 features. [summarize_supervised] 2026-02-05 01:48:01 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:01 <> Training LightRF Regression using 3 independent folds... [train] 2026-02-05 01:48:01 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:48:06  Outer resampling done. [train] LightRF (LightGBM Random Forest) ⚙ Tuned using exhaustive grid search. ⟳ Tested using 3 independent folds. Showing mean (sd) across resamples. MAE: 0.965 (0.007) MSE: 1.460 (0.034) RMSE: 1.208 (0.014) Rsq: 0.702 (0.006) Showing mean (sd) across resamples. MAE: 1.039 (0.044) MSE: 1.657 (0.161) RMSE: 1.286 (0.063) Rsq: 0.662 (0.030) 2026-02-05 01:48:06 ✔ Done in 4.79 seconds. [train] 2026-02-05 01:48:06 ▶ [train] 2026-02-05 01:48:06 Training set: 90 cases x 5 features. [summarize_supervised] 2026-02-05 01:48:06  Test set: 10 cases x 5 features. [summarize_supervised] 2026-02-05 01:48:06 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:06 Training LightRF Classification... [train] 2026-02-05 01:48:06 Checking data is ready for training... ✔ [check_supervised]  <Classification> LightRF (LightGBM Random Forest) <Training Classification Metrics>  Predicted  Reference virginica versicolor   virginica 41 4  versicolor 1 44 Overall   Sensitivity 0.911  Specificity 0.978  Balanced_Accuracy 0.944  PPV 0.976  NPV 0.917  F1 0.943  Accuracy 0.944  AUC 0.996  Brier_Score 0.048 Positive Class virginica <Test Classification Metrics>  Predicted  Reference virginica versicolor   virginica 5 0  versicolor 1 4 Overall   Sensitivity 1.000  Specificity 0.800  Balanced_Accuracy 0.900  PPV 0.833  NPV 1.000  F1 0.909  Accuracy 0.900  AUC 0.900  Brier_Score 0.092 Positive Class virginica 2026-02-05 01:48:06 ✔ Done in 0.14 seconds. 
[train] 2026-02-05 01:48:06 ▶ [train] 2026-02-05 01:48:06 Training set: 90 cases x 5 features. [summarize_supervised] 2026-02-05 01:48:06  Test set: 10 cases x 5 features. [summarize_supervised] 2026-02-05 01:48:06 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:06 <> Tuning LightRF by exhaustive grid search with 5 independent folds... [tune_GridSearch] 2026-02-05 01:48:06 2 parameter combinations x 5 resamples: 10 models total (). [tune_GridSearch] 2026-02-05 01:48:06 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:48:06 Using max n bins possible = 2. [kfold] 2026-02-05 01:48:06 Tuning in sequence [tune_GridSearch] 2026-02-05 01:48:07 Best config to maximize Balanced_Accuracy: [tune_GridSearch]  max_depth: {-1, 5} => -1 2026-02-05 01:48:07  Tuning done. [tune_GridSearch] 2026-02-05 01:48:07 Training LightRF Classification with tuned hyperparameters... [train] 2026-02-05 01:48:07 Checking data is ready for training... ✔ [check_supervised]  <Classification> LightRF (LightGBM Random Forest) ⚙ Tuned using exhaustive grid search. <Training Classification Metrics>  Predicted  Reference virginica versicolor   virginica 41 4  versicolor 1 44 Overall   Sensitivity 0.911  Specificity 0.978  Balanced_Accuracy 0.944  PPV 0.976  NPV 0.917  F1 0.943  Accuracy 0.944  AUC 0.996  Brier_Score 0.048 Positive Class virginica <Test Classification Metrics>  Predicted  Reference virginica versicolor   virginica 5 0  versicolor 1 4 Overall   Sensitivity 1.000  Specificity 0.800  Balanced_Accuracy 0.900  PPV 0.833  NPV 1.000  F1 0.909  Accuracy 0.900  AUC 0.900  Brier_Score 0.092 Positive Class virginica 2026-02-05 01:48:08 ✔ Done in 1.23 seconds. [train] 2026-02-05 01:48:08 ▶ [train] 2026-02-05 01:48:08 Training set: 100 cases x 5 features. 
[summarize_supervised] 2026-02-05 01:48:08 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:08 <> Training LightRF Classification using 3 independent folds... [train] 2026-02-05 01:48:08 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:48:08 Using max n bins possible = 2. [kfold] 2026-02-05 01:48:08  Outer resampling done. [train] LightRF (LightGBM Random Forest) ⟳ Tested using 3 independent folds. Showing mean (sd) across resamples. Sensitivity: 0.950 (0.018) Specificity: 0.970 (0.030) Balanced_Accuracy: 0.960 (0.018) PPV: 0.970 (0.030) NPV: 0.951 (0.017) F1: 0.960 (0.018) Accuracy: 0.960 (0.018) AUC: 0.987 (0.008) Brier_Score: 0.106 (0.035) Showing mean (sd) across resamples. Sensitivity: 0.860 (0.089) Specificity: 0.940 (0.059) Balanced_Accuracy: 0.900 (0.016) PPV: 0.941 (0.056) NPV: 0.876 (0.064) F1: 0.895 (0.025) Accuracy: 0.900 (0.016) AUC: 0.985 (0.013) Brier_Score: 0.122 (0.024) 2026-02-05 01:48:08 ✔ Done in 0.41 seconds. [train] 2026-02-05 01:48:08 ▶ [train] 2026-02-05 01:48:08 Training set: 135 cases x 4 features. [summarize_supervised] 2026-02-05 01:48:08  Test set: 15 cases x 4 features. [summarize_supervised] 2026-02-05 01:48:08 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:08 Training LightRF Classification... [train] 2026-02-05 01:48:08 Checking data is ready for training... 
✔ [check_supervised]  <Classification> LightRF (LightGBM Random Forest) <Training Classification Metrics>  Predicted  Reference setosa versicolor virginica   setosa 45 0 0  versicolor 1 42 2  virginica 0 3 42 Overall   Balanced_Accuracy 0.956  F1 0.955  Accuracy 0.956 setosa versicolor virginica   Sensitivity 1.000 0.933 0.933  Specificity 0.989 0.967 0.978  Balanced_Accuracy 0.994 0.950 0.956  PPV 0.978 0.933 0.955  NPV 1.000 0.967 0.967  F1 0.989 0.933 0.944 <Test Classification Metrics>  Predicted  Reference setosa versicolor virginica   setosa 5 0 0  versicolor 0 5 0  virginica 0 1 4 Overall   Balanced_Accuracy 0.933  F1 0.933  Accuracy 0.933 setosa versicolor virginica   Sensitivity 1.000 1.000 0.800  Specificity 1.000 0.900 1.000  Balanced_Accuracy 1.000 0.950 0.900  PPV 1.000 0.833 1.000  NPV 1.000 1.000 0.909  F1 1.000 0.909 0.889 2026-02-05 01:48:08 ✔ Done in 0.22 seconds. [train] 2026-02-05 01:48:08 ▶ [train] 2026-02-05 01:48:08 Training set: 358 cases x 6 features. [summarize_supervised] 2026-02-05 01:48:08  Test set: 42 cases x 6 features. [summarize_supervised] 2026-02-05 01:48:08 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:08 Training LightGBM Regression... [train] 2026-02-05 01:48:08 Checking data is ready for training... ✔ [check_supervised]  <Regression> LightGBM (Gradient Boosting) <Training Regression Metrics>  MAE: 1.29  MSE: 2.63  RMSE: 1.62  Rsq: 0.47 <Test Regression Metrics>  MAE: 1.20  MSE: 2.33  RMSE: 1.53  Rsq: 0.48 2026-02-05 01:48:08 ✔ Done in 0.19 seconds. [train] 2026-02-05 01:48:08 ▶ [train] 2026-02-05 01:48:08 Training set: 358 cases x 6 features. [summarize_supervised] 2026-02-05 01:48:08  Test set: 42 cases x 6 features. [summarize_supervised] 2026-02-05 01:48:08 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:08 <> Tuning LightGBM by exhaustive grid search with 5 independent folds... 
[tune_GridSearch] 2026-02-05 01:48:08 1 parameter combination x 5 resamples: 5 models total (). [tune_GridSearch] 2026-02-05 01:48:08 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:48:08 Tuning in sequence [tune_GridSearch] 2026-02-05 01:48:15 Best config to minimize MSE: [tune_GridSearch]  nrounds: {} => 424 2026-02-05 01:48:15  Tuning done. [tune_GridSearch] 2026-02-05 01:48:15 Training LightGBM Regression with tuned hyperparameters... [train] 2026-02-05 01:48:15 Checking data is ready for training... ✔ [check_supervised]  <Regression> LightGBM (Gradient Boosting) ⚙ Tuned using exhaustive grid search. <Training Regression Metrics>  MAE: 0.60  MSE: 0.59  RMSE: 0.77  Rsq: 0.88 <Test Regression Metrics>  MAE: 0.80  MSE: 1.19  RMSE: 1.09  Rsq: 0.73 2026-02-05 01:48:16 ✔ Done in 7.83 seconds. [train] 2026-02-05 01:48:16 ▶ [train] 2026-02-05 01:48:16 Training set: 358 cases x 6 features. [summarize_supervised] 2026-02-05 01:48:16 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:16 <> Training LightGBM Regression using 3 independent folds... [train] 2026-02-05 01:48:16 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:48:20  Outer resampling done. [train] LightGBM (Gradient Boosting) ⚙ Tuned using exhaustive grid search. ⟳ Tested using 3 independent folds. Showing mean (sd) across resamples. MAE: 1.285 (0.021) MSE: 2.617 (0.034) RMSE: 1.618 (0.011) Rsq: 0.472 (4.1e-03) Showing mean (sd) across resamples. MAE: 1.353 (0.012) MSE: 2.834 (0.047) RMSE: 1.683 (0.014) Rsq: 0.428 (0.016) 2026-02-05 01:48:20 ✔ Done in 3.43 seconds. [train] 2026-02-05 01:48:20 ▶ [train] 2026-02-05 01:48:20 Training set: 90 cases x 5 features. [summarize_supervised] 2026-02-05 01:48:20  Test set: 10 cases x 5 features. 
[summarize_supervised] 2026-02-05 01:48:20 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:20 <> Tuning LightGBM by exhaustive grid search with 3 independent folds... [tune_GridSearch] 2026-02-05 01:48:20 1 parameter combination x 3 resamples: 3 models total (). [tune_GridSearch] 2026-02-05 01:48:20 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:48:20 Using max n bins possible = 2. [kfold] 2026-02-05 01:48:20 Tuning in sequence [tune_GridSearch] 2026-02-05 01:48:22 Best config to maximize Balanced_Accuracy: [tune_GridSearch]  nrounds: {} => 382 2026-02-05 01:48:22  Tuning done. [tune_GridSearch] 2026-02-05 01:48:22 Training LightGBM Classification with tuned hyperparameters... [train] 2026-02-05 01:48:22 Checking data is ready for training... ✔ [check_supervised] [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further 
splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf [LightGBM] [Warning] No further splits with positive gain, best gain: -inf <Classification> LightGBM (Gradient Boosting) ⚙ Tuned using exhaustive grid search. 
<Training Classification Metrics>
                 Predicted
 Reference   virginica  versicolor
 virginica          45           0
versicolor           2          43

Overall
  Sensitivity        1.000
  Specificity        0.956
  Balanced_Accuracy  0.978
  PPV                0.957
  NPV                1.000
  F1                 0.978
  Accuracy           0.978
  AUC                1.000
  Brier_Score        0.017
Positive Class virginica

<Test Classification Metrics>
                 Predicted
 Reference   virginica  versicolor
 virginica           5           0
versicolor           1           4

Overall
  Sensitivity        1.000
  Specificity        0.800
  Balanced_Accuracy  0.900
  PPV                0.833
  NPV                1.000
  F1                 0.909
  Accuracy           0.900
  AUC                1.000
  Brier_Score        0.097
Positive Class virginica

2026-02-05 01:48:22 ✔ Done in 2.52 seconds. [train]
2026-02-05 01:48:22 ▶ [train]
2026-02-05 01:48:22 Training set: 135 cases x 4 features. [summarize_supervised]
2026-02-05 01:48:22 Test set: 15 cases x 4 features. [summarize_supervised]
2026-02-05 01:48:22 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:48:22 Training LightGBM Classification... [train]
2026-02-05 01:48:22 Checking data is ready for training...
✔ [check_supervised]
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf

<Classification>
LightGBM (Gradient Boosting)

<Training Classification Metrics>
                 Predicted
 Reference   setosa  versicolor  virginica
    setosa       45           0          0
versicolor        0          43          2
 virginica        0           3         42

Overall
  Balanced_Accuracy  0.963
  F1                 0.963
  Accuracy           0.963

                     setosa  versicolor  virginica
  Sensitivity         1.000       0.956      0.933
  Specificity         1.000       0.967      0.978
  Balanced_Accuracy   1.000       0.961      0.956
  PPV                 1.000       0.935      0.955
  NPV                 1.000       0.978      0.967
  F1                  1.000       0.945      0.944

<Test Classification Metrics>
                 Predicted
 Reference   setosa  versicolor  virginica
    setosa        5           0          0
versicolor        0           5          0
 virginica        0           1          4

Overall
  Balanced_Accuracy  0.933
  F1                 0.933
  Accuracy           0.933

                     setosa  versicolor  virginica
  Sensitivity         1.000       1.000      0.800
  Specificity         1.000       0.900      1.000
  Balanced_Accuracy   1.000       0.950      0.900
  PPV                 1.000       0.833      1.000
  NPV                 1.000       1.000      0.909
  F1                  1.000       0.909      0.889

2026-02-05 01:48:22 ✔ Done in 0.19 seconds. [train]
2026-02-05 01:48:22 ▶ [train]
2026-02-05 01:48:22 Training set: 358 cases x 6 features. [summarize_supervised]
2026-02-05 01:48:22 Test set: 42 cases x 6 features. [summarize_supervised]
2026-02-05 01:48:22 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:48:22 Training LightRuleFit Regression... [train]
2026-02-05 01:48:22 Checking data is ready for training... ✔ [check_supervised]
2026-02-05 01:48:22 ▶ [train]
2026-02-05 01:48:22 Training set: 358 cases x 6 features.
[summarize_supervised]
2026-02-05 01:48:22 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:48:22 Training LightGBM Regression... [train]
2026-02-05 01:48:22 Checking data is ready for training... ✔ [check_supervised]
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf

<Regression>
LightGBM (Gradient Boosting)

<Training Regression Metrics>
  MAE: 0.78
  MSE: 0.96
  RMSE: 0.98
  Rsq: 0.81

2026-02-05 01:48:23 ✔ Done in 0.17 seconds. [train]
2026-02-05 01:48:23 Extracting LightGBM rules... ✔ [extract_rules]
2026-02-05 01:48:23 Extracted 185 unique rules. [extract_rules]
2026-02-05 01:48:23 Matching 185 rules to 358 cases... ✔ [match_cases_by_rules]
2026-02-05 01:48:23 ▶ [train]
2026-02-05 01:48:23 Training set: 358 cases x 185 features. [summarize_supervised]
2026-02-05 01:48:23 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:48:23 <> Tuning GLMNET by exhaustive grid search with 5 independent folds... [tune_GridSearch]
2026-02-05 01:48:23 1 parameter combination x 5 resamples: 5 models total (). [tune_GridSearch]
2026-02-05 01:48:23 Input contains more than one column; stratifying on last. [resample]
2026-02-05 01:48:23 Tuning in sequence [tune_GridSearch]
2026-02-05 01:48:26 Best config to minimize MSE: [tune_GridSearch]
  lambda: {} => 0.122506762566697
2026-02-05 01:48:26 Tuning done. [tune_GridSearch]
2026-02-05 01:48:26 Training GLMNET Regression with tuned hyperparameters... [train]
2026-02-05 01:48:26 Checking data is ready for training... ✔ [check_supervised]

<Regression>
GLMNET (Elastic Net)
⚙ Tuned using exhaustive grid search.

<Training Regression Metrics>
  MAE: 0.72
  MSE: 0.83
  RMSE: 0.91
  Rsq: 0.83

2026-02-05 01:48:26 ✔ Done in 3.25 seconds. [train]
2026-02-05 01:48:26 Matching 185 rules to 358 cases... ✔ [match_cases_by_rules]
2026-02-05 01:48:26 Matching 185 rules to 42 cases... ✔ [match_cases_by_rules]

<Regression>
LightRuleFit (LightGBM RuleFit)

<Training Regression Metrics>
  MAE: 0.72
  MSE: 0.83
  RMSE: 0.91
  Rsq: 0.83

<Test Regression Metrics>
  MAE: 0.85
  MSE: 1.22
  RMSE: 1.10
  Rsq: 0.73

2026-02-05 01:48:26 ✔ Done in 3.79 seconds. [train]
2026-02-05 01:48:26 ▶ [train]
2026-02-05 01:48:26 Training set: 90 cases x 5 features.
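The LightRuleFit run above follows the classic RuleFit recipe: train a gradient-boosted model, extract the tree paths as rules, convert each case into a binary rule-membership vector, then fit a sparse linear model (GLMNET) on those rule features. The "Matching rules to cases" step can be sketched in base R as below; this is an illustrative re-implementation, not the actual rtemis `match_cases_by_rules()` internals, and the rule strings are made up for the example.

```r
# Sketch of matching cases to extracted rules (assumed representation:
# each rule is a character string that is a logical predicate over columns).
match_cases_by_rules_sketch <- function(data, rules) {
  out <- sapply(rules, function(r) {
    # Evaluate the predicate with the data frame's columns in scope;
    # TRUE/FALSE per case becomes a 1/0 rule feature.
    as.integer(eval(parse(text = r), envir = data))
  })
  colnames(out) <- rules
  out
}

# Hypothetical rules, in the spirit of what a depth-2 tree might yield:
rules <- c(
  "Sepal.Length > 5.8 & Petal.Width <= 1.7",
  "Petal.Length > 4.9"
)
rule_matrix <- match_cases_by_rules_sketch(iris, rules)
# rule_matrix: 150 cases x 2 binary rule features, ready for glmnet.
```

The log's "Training set: 358 cases x 185 features" after rule matching reflects exactly this transformation: 185 extracted rules become 185 binary features for the GLMNET stage.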
[summarize_supervised]
2026-02-05 01:48:26 Test set: 10 cases x 5 features. [summarize_supervised]
2026-02-05 01:48:26 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:48:26 Training LightRuleFit Classification... [train]
2026-02-05 01:48:26 Checking data is ready for training... ✔ [check_supervised]
2026-02-05 01:48:26 ▶ [train]
2026-02-05 01:48:26 Training set: 90 cases x 5 features. [summarize_supervised]
2026-02-05 01:48:26 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:48:26 Training LightGBM Classification... [train]
2026-02-05 01:48:26 Checking data is ready for training... ✔ [check_supervised]
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf

<Classification>
LightGBM (Gradient Boosting)

<Training Classification Metrics>
                 Predicted
 Reference   virginica  versicolor
 virginica          45           0
versicolor           2          43

Overall
  Sensitivity        1.000
  Specificity        0.956
  Balanced_Accuracy  0.978
  PPV                0.957
  NPV                1.000
  F1                 0.978
  Accuracy           0.978
  AUC                0.998
  Brier_Score        0.019
Positive Class virginica

2026-02-05 01:48:26 ✔ Done in 0.15 seconds. [train]
2026-02-05 01:48:26 Extracting LightGBM rules... ✔ [extract_rules]
2026-02-05 01:48:26 Extracted 12 unique rules. [extract_rules]
2026-02-05 01:48:26 Matching 12 rules to 90 cases... ✔ [match_cases_by_rules]
2026-02-05 01:48:26 ▶ [train]
2026-02-05 01:48:26 Training set: 90 cases x 12 features. [summarize_supervised]
2026-02-05 01:48:26 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:48:26 <> Tuning GLMNET by exhaustive grid search with 5 independent folds... [tune_GridSearch]
2026-02-05 01:48:26 1 parameter combination x 5 resamples: 5 models total (). [tune_GridSearch]
2026-02-05 01:48:26 Input contains more than one column; stratifying on last. [resample]
2026-02-05 01:48:26 Using max n bins possible = 2. [kfold]
2026-02-05 01:48:26 Tuning in sequence [tune_GridSearch]
2026-02-05 01:48:27 Best config to maximize Balanced_Accuracy: [tune_GridSearch]
  lambda: {} => 0.0571394304115356
2026-02-05 01:48:27 Tuning done. [tune_GridSearch]
2026-02-05 01:48:27 Calculating case weights using Inverse Frequency Weighting.
[ifw]
2026-02-05 01:48:27 Training GLMNET Classification with tuned hyperparameters... [train]
2026-02-05 01:48:27 Checking data is ready for training... ✔ [check_supervised]

<Classification>
GLMNET (Elastic Net)
⚙ Tuned using exhaustive grid search.

<Training Classification Metrics>
                 Predicted
 Reference   virginica  versicolor
 virginica          45           0
versicolor           2          43

Overall
  Sensitivity        1.000
  Specificity        0.956
  Balanced_Accuracy  0.978
  PPV                0.957
  NPV                1.000
  F1                 0.978
  Accuracy           0.978
  AUC                0.996
  Brier_Score        0.027
Positive Class virginica

2026-02-05 01:48:27 ✔ Done in 0.77 seconds. [train]
2026-02-05 01:48:27 Matching 12 rules to 90 cases... ✔ [match_cases_by_rules]
2026-02-05 01:48:27 Matching 12 rules to 10 cases... ✔ [match_cases_by_rules]

<Classification>
LightRuleFit (LightGBM RuleFit)

<Training Classification Metrics>
                 Predicted
 Reference   virginica  versicolor
 virginica          45           0
versicolor           2          43

Overall
  Sensitivity        1.000
  Specificity        0.956
  Balanced_Accuracy  0.978
  PPV                0.957
  NPV                1.000
  F1                 0.978
  Accuracy           0.978
  AUC                0.996
  Brier_Score        0.027
Positive Class virginica

<Test Classification Metrics>
                 Predicted
 Reference   virginica  versicolor
 virginica           5           0
versicolor           1           4

Overall
  Sensitivity        1.000
  Specificity        0.800
  Balanced_Accuracy  0.900
  PPV                0.833
  NPV                1.000
  F1                 0.909
  Accuracy           0.900
  AUC                0.900
  Brier_Score        0.095
Positive Class virginica

2026-02-05 01:48:27 ✔ Done in 1.00 seconds. [train]
2026-02-05 01:48:28 ▶ [train]
2026-02-05 01:48:28 Training set: 50 cases x 1 features. [summarize_supervised]
2026-02-05 01:48:28 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:48:28 Training Isotonic Regression... [train]
2026-02-05 01:48:28 Checking data is ready for training... ✔ [check_supervised]

<Regression>
Isotonic (Isotonic Regression)

<Training Regression Metrics>
  MAE: 3.49
  MSE: 26.42
  RMSE: 5.14
  Rsq: -0.21

2026-02-05 01:48:28 ✔ Done in 0.04 seconds.
[train]
2026-02-05 01:48:28 ▶ [train]
2026-02-05 01:48:28 Training set: 200 cases x 1 features. [summarize_supervised]
2026-02-05 01:48:28 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:48:28 Training Isotonic Classification... [train]
2026-02-05 01:48:28 Checking data is ready for training... ✔ [check_supervised]

<Classification>
Isotonic (Isotonic Regression)

<Training Classification Metrics>
            Predicted
 Reference    b    a
         b   90    6
         a   12   92

Overall
  Sensitivity        0.938
  Specificity        0.885
  Balanced_Accuracy  0.911
  PPV                0.882
  NPV                0.939
  F1                 0.909
  Accuracy           0.910
  AUC                0.978
  Brier_Score        0.057
Positive Class b

2026-02-05 01:48:28 ✔ Done in 0.05 seconds. [train]
2026-02-05 01:48:28 ▶ [train]
2026-02-05 01:48:28 Training set: 358 cases x 6 features. [summarize_supervised]
2026-02-05 01:48:28 Test set: 42 cases x 6 features. [summarize_supervised]
2026-02-05 01:48:28 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:48:28 Training Ranger Regression... [train]
2026-02-05 01:48:28 Checking data is ready for training... ✔ [check_supervised]

<Regression>
Ranger (Random Forest)

<Training Regression Metrics>
  MAE: 0.40
  MSE: 0.26
  RMSE: 0.51
  Rsq: 0.95

<Test Regression Metrics>
  MAE: 0.86
  MSE: 1.18
  RMSE: 1.08
  Rsq: 0.74

2026-02-05 01:48:28 ✔ Done in 0.09 seconds. [train]
2026-02-05 01:48:28 ▶ [train]
2026-02-05 01:48:28 Training set: 358 cases x 6 features. [summarize_supervised]
2026-02-05 01:48:28 Test set: 42 cases x 6 features. [summarize_supervised]
2026-02-05 01:48:28 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers]
2026-02-05 01:48:28 <> Tuning Ranger by exhaustive grid search with 5 independent folds... [tune_GridSearch]
2026-02-05 01:48:28 2 parameter combinations x 5 resamples: 10 models total (). [tune_GridSearch]
2026-02-05 01:48:28 Input contains more than one column; stratifying on last.
[resample] 2026-02-05 01:48:28 Tuning in sequence [tune_GridSearch] 2026-02-05 01:48:29 Best config to minimize MSE: [tune_GridSearch]  mtry: {3, 6} => 6 2026-02-05 01:48:29  Tuning done. [tune_GridSearch] 2026-02-05 01:48:29 Training Ranger Regression with tuned hyperparameters... [train] 2026-02-05 01:48:29 Checking data is ready for training... ✔ [check_supervised]  <Regression> Ranger (Random Forest) ⚙ Tuned using exhaustive grid search. <Training Regression Metrics>  MAE: 0.35  MSE: 0.20  RMSE: 0.45  Rsq: 0.96 <Test Regression Metrics>  MAE: 0.88  MSE: 1.37  RMSE: 1.17  Rsq: 0.69 2026-02-05 01:48:29 ✔ Done in 0.71 seconds. [train] 2026-02-05 01:48:29 ▶ [train] 2026-02-05 01:48:29 Training set: 400 cases x 6 features. [summarize_supervised] 2026-02-05 01:48:29 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:29 <> Training Ranger Regression using 3 independent folds... [train] 2026-02-05 01:48:29 Input contains more than one column; stratifying on last. [resample] \ 2/3 ETA: 2s | Training outer resamples... 2026-02-05 01:48:34  Outer resampling done. [train] Ranger (Random Forest) ⟳ Tested using 3 independent folds. Showing mean (sd) across resamples. MAE: 0.406 (0.007) MSE: 0.259 (0.014) RMSE: 0.509 (0.014) Rsq: 0.947 (3.2e-03) Showing mean (sd) across resamples. MAE: 0.881 (0.051) MSE: 1.215 (0.135) RMSE: 1.101 (0.062) Rsq: 0.752 (0.027) 2026-02-05 01:48:34 ✔ Done in 5.00 seconds. [train] 2026-02-05 01:48:34 ▶ [train] 2026-02-05 01:48:34 Training set: 90 cases x 5 features. [summarize_supervised] 2026-02-05 01:48:34  Test set: 10 cases x 5 features. [summarize_supervised] 2026-02-05 01:48:34 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:34 Training Ranger Classification... [train] 2026-02-05 01:48:34 Checking data is ready for training... 
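The tuning bookkeeping above ("2 parameter combinations x 5 resamples: 10 models total", best mtry of {3, 6} => 6) is plain exhaustive grid search. A sketch under assumed interfaces (`fit_and_score` is a placeholder scoring callback, not the rtemis API):

```python
from itertools import product

def grid_search(grid, n_resamples, fit_and_score):
    """Score every hyperparameter combination on every resample and pick
    the combination with the best (lowest) mean score."""
    keys = list(grid)
    combos = [dict(zip(keys, values)) for values in product(*grid.values())]
    mean_scores = []
    for params in combos:
        scores = [fit_and_score(params, fold) for fold in range(n_resamples)]
        mean_scores.append(sum(scores) / len(scores))
    best = combos[mean_scores.index(min(mean_scores))]
    return best, len(combos) * n_resamples   # total models trained

# Mirrors the logged run: mtry in {3, 6}, 5 resamples -> 10 models.
# The toy scorer favors mtry=6, as the real MSE comparison did.
best, n_models = grid_search({"mtry": [3, 6]}, 5,
                             lambda p, fold: abs(p["mtry"] - 6) + 0.01 * fold)
```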
✔ [check_supervised]  <Classification> Ranger (Random Forest) <Training Classification Metrics>  Predicted  Reference virginica versicolor   virginica 43 2  versicolor 1 44 Overall   Sensitivity 0.956  Specificity 0.978  Balanced_Accuracy 0.967  PPV 0.977  NPV 0.957  F1 0.966  Accuracy 0.967  AUC 0.998  Brier_Score 0.025 Positive Class virginica <Test Classification Metrics>  Predicted  Reference virginica versicolor   virginica 5 0  versicolor 1 4 Overall   Sensitivity 1.000  Specificity 0.800  Balanced_Accuracy 0.900  PPV 0.833  NPV 1.000  F1 0.909  Accuracy 0.900  AUC 0.960  Brier_Score 0.101 Positive Class virginica 2026-02-05 01:48:34 ✔ Done in 0.07 seconds. [train] 2026-02-05 01:48:34 ▶ [train] 2026-02-05 01:48:34 Training set: 90 cases x 5 features. [summarize_supervised] 2026-02-05 01:48:34  Test set: 10 cases x 5 features. [summarize_supervised] 2026-02-05 01:48:34 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:34 <> Tuning Ranger by exhaustive grid search with 5 independent folds... [tune_GridSearch] 2026-02-05 01:48:34 2 parameter combinations x 5 resamples: 10 models total (). [tune_GridSearch] 2026-02-05 01:48:34 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:48:34 Using max n bins possible = 2. [kfold] 2026-02-05 01:48:34 Tuning in sequence [tune_GridSearch] 2026-02-05 01:48:35 Best config to maximize Balanced_Accuracy: [tune_GridSearch]  mtry: {2, 4} => 2 2026-02-05 01:48:35  Tuning done. [tune_GridSearch] 2026-02-05 01:48:35 Training Ranger Classification with tuned hyperparameters... [train] 2026-02-05 01:48:35 Checking data is ready for training... ✔ [check_supervised]  <Classification> Ranger (Random Forest) ⚙ Tuned using exhaustive grid search. 
<Training Classification Metrics>  Predicted  Reference virginica versicolor   virginica 44 1  versicolor 2 43 Overall   Sensitivity 0.978  Specificity 0.956  Balanced_Accuracy 0.967  PPV 0.957  NPV 0.977  F1 0.967  Accuracy 0.967  AUC 0.997  Brier_Score 0.024 Positive Class virginica <Test Classification Metrics>  Predicted  Reference virginica versicolor   virginica 5 0  versicolor 1 4 Overall   Sensitivity 1.000  Specificity 0.800  Balanced_Accuracy 0.900  PPV 0.833  NPV 1.000  F1 0.909  Accuracy 0.900  AUC 1.000  Brier_Score 0.038 Positive Class virginica 2026-02-05 01:48:35 ✔ Done in 0.53 seconds. [train] 2026-02-05 01:48:35 ▶ [train] 2026-02-05 01:48:35 Training set: 100 cases x 5 features. [summarize_supervised] 2026-02-05 01:48:35 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:35 <> Training Ranger Classification using 3 independent folds... [train] 2026-02-05 01:48:35 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:48:35 Using max n bins possible = 2. [kfold] 2026-02-05 01:48:35  Outer resampling done. [train] Ranger (Random Forest) ⟳ Tested using 3 independent folds. Showing mean (sd) across resamples. Sensitivity: 0.960 (0.018) Specificity: 0.960 (0.034) Balanced_Accuracy: 0.960 (0.008) PPV: 0.961 (0.033) NPV: 0.960 (0.015) F1: 0.960 (0.007) Accuracy: 0.960 (0.008) AUC: 0.998 (1.1e-03) Brier_Score: 0.023 (1.5e-03) Showing mean (sd) across resamples. Sensitivity: 0.922 (0.068) Specificity: 0.940 (2.1e-03) Balanced_Accuracy: 0.931 (0.033) PPV: 0.939 (2.1e-03) NPV: 0.926 (0.064) F1: 0.929 (0.035) Accuracy: 0.931 (0.033) AUC: 0.987 (0.013) Brier_Score: 0.049 (0.022) 2026-02-05 01:48:35 ✔ Done in 0.18 seconds. [train] 2026-02-05 01:48:35 ▶ [train] 2026-02-05 01:48:35 Training set: 135 cases x 4 features. [summarize_supervised] 2026-02-05 01:48:35  Test set: 15 cases x 4 features. 
[summarize_supervised] 2026-02-05 01:48:35 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:35 Training Ranger Classification... [train] 2026-02-05 01:48:35 Checking data is ready for training... ✔ [check_supervised]  <Classification> Ranger (Random Forest) <Training Classification Metrics>  Predicted  Reference setosa versicolor virginica   setosa 45 0 0  versicolor 0 43 2  virginica 0 2 43 Overall   Balanced_Accuracy 0.970  F1 0.970  Accuracy 0.970 setosa versicolor virginica   Sensitivity 1.000 0.956 0.956  Specificity 1.000 0.978 0.978  Balanced_Accuracy 1.000 0.967 0.967  PPV 1.000 0.956 0.956  NPV 1.000 0.978 0.978  F1 1.000 0.956 0.956 <Test Classification Metrics>  Predicted  Reference setosa versicolor virginica   setosa 5 0 0  versicolor 0 5 0  virginica 0 1 4 Overall   Balanced_Accuracy 0.933  F1 0.933  Accuracy 0.933 setosa versicolor virginica   Sensitivity 1.000 1.000 0.800  Specificity 1.000 0.900 1.000  Balanced_Accuracy 1.000 0.950 0.900  PPV 1.000 0.833 1.000  NPV 1.000 1.000 0.909  F1 1.000 0.909 0.889 2026-02-05 01:48:35 ✔ Done in 0.08 seconds. [train] 2026-02-05 01:48:35 ▶ [train] 2026-02-05 01:48:35 Training set: 90 cases x 1 features. [summarize_supervised] 2026-02-05 01:48:35  Test set: 10 cases x 1 features. [summarize_supervised] 2026-02-05 01:48:35 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:35 Training Isotonic Classification... [train] 2026-02-05 01:48:35 Checking data is ready for training... 
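In the three-class Ranger table above, the overall Balanced_Accuracy of 0.970 is consistent with the macro average of the per-class sensitivities (assuming that is how it is computed); a quick stdlib-Python check:

```python
def per_class_sensitivity(confusion):
    """confusion[i][j] = count with reference class i predicted as class j.
    Per-class sensitivity is the diagonal count over the row total."""
    return [row[i] / sum(row) for i, row in enumerate(confusion)]

# Training confusion matrix from the multiclass Ranger run above
# (rows: reference setosa, versicolor, virginica).
conf = [[45, 0, 0],
        [0, 43, 2],
        [0, 2, 43]]
sens = per_class_sensitivity(conf)          # [1.000, 0.956, 0.956]
balanced_accuracy = sum(sens) / len(sens)   # rounds to the logged 0.970
```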
✔ [check_supervised]  <Classification> Isotonic (Isotonic Regression) <Training Classification Metrics>  Predicted  Reference virginica versicolor   virginica 45 0  versicolor 2 43 Overall   Sensitivity 1.000  Specificity 0.956  Balanced_Accuracy 0.978  PPV 0.957  NPV 1.000  F1 0.978  Accuracy 0.978  AUC 0.997  Brier_Score 0.017 Positive Class virginica <Test Classification Metrics>  Predicted  Reference virginica versicolor   virginica 5 0  versicolor 1 4 Overall   Sensitivity 1.000  Specificity 0.800  Balanced_Accuracy 0.900  PPV 0.833  NPV 1.000  F1 0.909  Accuracy 0.900  AUC 0.900  Brier_Score 0.100 Positive Class virginica 2026-02-05 01:48:35 ✔ Done in 0.06 seconds. [train] 2026-02-05 01:48:35 ▶ [train] 2026-02-05 01:48:35 Training set: 34 cases x 1 features. [summarize_supervised] 2026-02-05 01:48:35 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:35 <> Training Isotonic Classification using 5 independent folds... [train] 2026-02-05 01:48:35 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:48:35 Using max n bins possible = 2. [kfold] 2026-02-05 01:48:35  Outer resampling done. [train] Isotonic (Isotonic Regression) ⟳ Tested using 5 independent folds. Showing mean (sd) across resamples. Sensitivity: 1.000 (0.000) Specificity: 1.000 (0.000) Balanced_Accuracy: 1.000 (0.000) PPV: 1.000 (0.000) NPV: 1.000 (0.000) F1: 1.000 (0.000) Accuracy: 1.000 (0.000) AUC: 1.000 (0.000) Brier_Score: 0.000 (0.000) Showing mean (sd) across resamples. Sensitivity: 1.000 (0.000) Specificity: 1.000 (0.000) Balanced_Accuracy: 1.000 (0.000) PPV: 1.000 (0.000) NPV: 1.000 (0.000) F1: 1.000 (0.000) Accuracy: 1.000 (0.000) AUC: 1.000 (0.000) Brier_Score: 0.000 (0.000) 2026-02-05 01:48:35 ✔ Done in 0.27 seconds. [train] 2026-02-05 01:48:35 ▶ [train] 2026-02-05 01:48:35 Training set: 32 cases x 1 features. 
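Isotonic regression, exercised repeatedly above, fits a monotone non-decreasing step function. The classic pool-adjacent-violators algorithm is a few lines; this is a self-contained sketch of the technique, not rtemis's implementation:

```python
def pava(y):
    """Pool Adjacent Violators: the non-decreasing sequence minimizing
    squared error to y. Adjacent blocks that violate monotonicity are
    merged and replaced by their weighted mean."""
    blocks = []  # each block holds [mean, weight]
    for v in y:
        blocks.append([v, 1])
        # Merge backwards while the last block is below its predecessor.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2 = blocks.pop()
            m1, w1 = blocks.pop()
            blocks.append([(m1 * w1 + m2 * w2) / (w1 + w2), w1 + w2])
    out = []
    for mean, weight in blocks:
        out.extend([mean] * weight)
    return out

# pava([1, 3, 2]) -> [1, 2.5, 2.5]
```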
[summarize_supervised] 2026-02-05 01:48:35 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:35 <> Training Isotonic Classification using 5 independent folds... [train] 2026-02-05 01:48:35 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:48:35 Using max n bins possible = 2. [kfold] 2026-02-05 01:48:36  Outer resampling done. [train] Isotonic (Isotonic Regression) ⟳ Tested using 5 independent folds. Showing mean (sd) across resamples. Sensitivity: 1.000 (0.000) Specificity: 0.845 (0.052) Balanced_Accuracy: 0.922 (0.026) PPV: 0.867 (0.039) NPV: 1.000 (0.000) F1: 0.928 (0.022) Accuracy: 0.922 (0.026) AUC: 0.985 (0.009) Brier_Score: 0.042 (0.014) Showing mean (sd) across resamples. Sensitivity: 0.900 (0.224) Specificity: 0.800 (0.183) Balanced_Accuracy: 0.850 (0.091) PPV: 0.850 (0.137) NPV: 0.933 (0.149) F1: 0.848 (0.119) Accuracy: 0.850 (0.091) AUC: 0.917 (0.118) Brier_Score: 0.107 (0.100) 2026-02-05 01:48:36 ✔ Done in 0.25 seconds. [train] 2026-02-05 01:48:36 ▶ [train] 2026-02-05 01:48:36 Training set: 34 cases x 1 features. [summarize_supervised] 2026-02-05 01:48:36 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:36 <> Training Isotonic Classification using 5 independent folds... [train] 2026-02-05 01:48:36 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:48:36 Using max n bins possible = 2. [kfold] 2026-02-05 01:48:36  Outer resampling done. [train] Isotonic (Isotonic Regression) ⟳ Tested using 5 independent folds. Showing mean (sd) across resamples. Sensitivity: 1.000 (0.000) Specificity: 0.881 (0.043) Balanced_Accuracy: 0.941 (0.022) PPV: 0.895 (0.035) NPV: 1.000 (0.000) F1: 0.944 (0.019) Accuracy: 0.941 (0.022) AUC: 0.989 (0.006) Brier_Score: 0.034 (0.012) Showing mean (sd) across resamples. 
Sensitivity: 1.000 (0.000) Specificity: 0.817 (0.171) Balanced_Accuracy: 0.908 (0.085) PPV: 0.860 (0.129) NPV: 1.000 (0.000) F1: 0.921 (0.074) Accuracy: 0.908 (0.085) AUC: 0.978 (0.050) Brier_Score: 0.059 (0.050) 2026-02-05 01:48:36 ✔ Done in 0.23 seconds. [train] Generalized Linear Model was used for regression. R-squared was 0.83 in the training set and 0.77 in the test set. Generalized Linear Model was used for regression. R-squared was 0.83 in the training set and 0.77 in the test set. Generalized Linear Model was used for classification. Balanced accuracy was 0.98 in the training set and 0.90 in the test set. Generalized Linear Model was used for regression. R-squared was 0.83 in the training set and 0.77 in the test set. Generalized Linear Model was used for classification. Balanced accuracy was 0.98 in the training set and 0.90 in the test set. Generalized Linear Model was used for regression. Mean R-squared was 0.83 in the training set and 0.82 in the test set across 3 independent folds. Generalized Linear Model was used for classification. Mean balanced accuracy was 0.99 in the training set and 0.91 in the test set across 3 independent folds. Generalized Linear Model (GLM) and Classification and Regression Trees (CART) were used for Regression. The top-performing model was GLM with a test-set Rsq of 0.822, followed by CART with an Rsq of 0.664. Generalized Linear Model (GLM) and Classification and Regression Trees (CART) were used for Classification. The top-performing model was CART with a test-set Balanced Accuracy of 0.930, followed by GLM with a Balanced Accuracy of 0.910. Generalized Linear Model (GLM) and Classification and Regression Trees (CART) were used for Regression. The top-performing model was GLM with a test-set Rsq of 0.770, followed by CART with an Rsq of 0.595. 2026-02-05 01:48:38 ▶ [train] 2026-02-05 01:48:38 Training set: 358 cases x 6 features. 
[summarize_supervised] 2026-02-05 01:48:38  Test set: 42 cases x 6 features. [summarize_supervised] 2026-02-05 01:48:38 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:38 Training GLM Regression... [train] 2026-02-05 01:48:38 Checking data is ready for training... ✔ [check_supervised]  <Regression> GLM (Generalized Linear Model) <Training Regression Metrics>  MAE: 0.73  MSE: 0.82  RMSE: 0.91  Rsq: 0.83 <Test Regression Metrics>  MAE: 0.74  MSE: 1.03  RMSE: 1.01  Rsq: 0.77 2026-02-05 01:48:39 Writing data to D:\temp\2026_02_05_01_45_16_1430\RtmpgbTXo7\file17fb0134466e7/mod_r_glm...✔ 0.035 secs [rt_save] 2026-02-05 01:48:39 Reload with:> obj <- readRDS('D:\temp\2026_02_05_01_45_16_1430\RtmpgbTXo7\file17fb0134466e7\mod_r_glm/train_GLM.rds') [rt_save] 2026-02-05 01:48:39 ✔ Done in 0.10 seconds. [train] 2026-02-05 01:48:39 ▶ [train] 2026-02-05 01:48:39 Training set: 400 cases x 6 features. [summarize_supervised] 2026-02-05 01:48:39 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:39 <> Training GLM Regression using 3 independent folds... [train] 2026-02-05 01:48:39 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:48:39  Outer resampling done. [train] GLM (Generalized Linear Model) ⟳ Tested using 3 independent folds. Showing mean (sd) across resamples. MAE: 0.722 (0.028) MSE: 0.834 (0.073) RMSE: 0.913 (0.041) Rsq: 0.830 (0.017) Showing mean (sd) across resamples. MAE: 0.744 (0.062) MSE: 0.886 (0.155) RMSE: 0.939 (0.081) Rsq: 0.819 (0.033) 2026-02-05 01:48:39 Writing data to D:\temp\2026_02_05_01_45_16_1430\RtmpgbTXo7\file17fb09995da7/resmod_r_glm...✔ 0.13 secs [rt_save] 2026-02-05 01:48:39 Reload with:> obj <- readRDS('D:\temp\2026_02_05_01_45_16_1430\RtmpgbTXo7\file17fb09995da7\resmod_r_glm/train_GLM.rds') [rt_save] 2026-02-05 01:48:39 ✔ Done in 0.32 seconds. 
[train] Classification and Regression Trees (CART), LightGBM Random Forest (LightRF), and Gradient Boosting (LightGBM) were used for Classification. The top-performing model was CART with a test-set Balanced Accuracy of 0.900, followed by LightRF and LightGBM with Balanced_Accuracy of 0.900 and 0.900 respectively. Classification and Regression Trees (CART), LightGBM Random Forest (LightRF), and Gradient Boosting (LightGBM) were used for Classification. The top-performing model was CART with a test-set Balanced Accuracy of 0.900, followed by LightRF and LightGBM with Balanced_Accuracy of 0.900 and 0.900 respectively. Elastic Net (GLMNET), Support Vector Machine with Radial Kernel (RadialSVM), and LightGBM Random Forest (LightRF) were used for Regression. The top-performing model was RadialSVM with a test-set Rsq of 0.773, followed by GLMNET and LightRF with Rsq of 0.772 and 0.735 respectively. Elastic Net (GLMNET), Support Vector Machine with Radial Kernel (RadialSVM), and LightGBM Random Forest (LightRF) were used for Regression. The top-performing model was RadialSVM with a test-set Rsq of 0.773, followed by GLMNET and LightRF with Rsq of 0.772 and 0.735 respectively. Generalized Linear Model (GLM), Support Vector Machine with Linear Kernel (LinearSVM), and LightGBM Random Forest (LightRF) were used for Classification. The top-performing model was LinearSVM with a test-set Balanced Accuracy of 0.920, followed by GLM and LightRF with Balanced_Accuracy of 0.910 and 0.900 respectively. Generalized Linear Model (GLM), Support Vector Machine with Linear Kernel (LinearSVM), and LightGBM Random Forest (LightRF) were used for Classification. The top-performing model was LinearSVM with a test-set Balanced Accuracy of 0.920, followed by GLM and LightRF with Balanced_Accuracy of 0.910 and 0.900 respectively. Generalized Linear Model (GLM), Support Vector Machine with Linear Kernel (LinearSVM), and LightGBM Random Forest (LightRF) were used for Regression. 
The top-performing model was GLM with a test-set Rsq of 0.822, followed by LinearSVM and LightRF with Rsq of 0.813 and 0.662 respectively. Generalized Linear Model (GLM), Support Vector Machine with Linear Kernel (LinearSVM), and LightGBM Random Forest (LightRF) were used for Regression. The top-performing model was GLM with a test-set Rsq of 0.822, followed by LinearSVM and LightRF with Rsq of 0.813 and 0.662 respectively. 2026-02-05 01:48:39 ▶ [train] 2026-02-05 01:48:39 Training set: 34 cases x 1 features. [summarize_supervised] 2026-02-05 01:48:39 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:39 <> Training Isotonic Classification using 5 independent folds... [train] 2026-02-05 01:48:39 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:48:39 Using max n bins possible = 2. [kfold] 2026-02-05 01:48:40  Outer resampling done. [train] Isotonic (Isotonic Regression) ⟳ Tested using 5 independent folds. Showing mean (sd) across resamples. Sensitivity: 0.985 (0.034) Specificity: 0.825 (0.106) Balanced_Accuracy: 0.905 (0.047) PPV: 0.856 (0.076) NPV: 0.985 (0.034) F1: 0.913 (0.039) Accuracy: 0.905 (0.047) AUC: 0.977 (0.017) Brier_Score: 0.051 (0.021) Showing mean (sd) across resamples. Sensitivity: 0.850 (0.224) Specificity: 0.750 (0.433) Balanced_Accuracy: 0.800 (0.209) PPV: 0.850 (0.224) NPV: NA (NA) F1: 0.817 (0.171) Accuracy: 0.800 (0.209) AUC: 0.883 (0.143) Brier_Score: 0.149 (0.168) 2026-02-05 01:48:40 ✔ Done in 0.28 seconds. [train] 2026-02-05 01:48:40 ▶ [train] 2026-02-05 01:48:40 Training set: 32 cases x 1 features. [summarize_supervised] 2026-02-05 01:48:40 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:40 <> Training Isotonic Classification using 5 independent folds... [train] 2026-02-05 01:48:40 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:48:40 Using max n bins possible = 2. 
[kfold] 2026-02-05 01:48:40  Outer resampling done. [train] Isotonic (Isotonic Regression) ⟳ Tested using 5 independent folds. Showing mean (sd) across resamples. Sensitivity: 0.938 (0.034) Specificity: 0.937 (0.035) Balanced_Accuracy: 0.938 (0.020) PPV: 0.938 (0.034) NPV: 0.940 (0.034) F1: 0.938 (0.020) Accuracy: 0.938 (0.020) AUC: 0.938 (0.020) Brier_Score: 0.057 (0.019) Showing mean (sd) across resamples. Sensitivity: 0.950 (0.112) Specificity: 0.867 (0.183) Balanced_Accuracy: 0.908 (0.085) PPV: 0.900 (0.137) NPV: 0.960 (0.089) F1: 0.914 (0.078) Accuracy: 0.908 (0.085) AUC: 0.908 (0.085) Brier_Score: 0.091 (0.079) 2026-02-05 01:48:40 ✔ Done in 0.31 seconds. [train] 2026-02-05 01:48:40 ▶ [train] 2026-02-05 01:48:40 Training set: 34 cases x 1 features. [summarize_supervised] 2026-02-05 01:48:40 // Max workers: 1 => Algorithm: 1; Tuning: 1; Outer Resampling: 1 [get_n_workers] 2026-02-05 01:48:40 <> Training Isotonic Classification using 5 independent folds... [train] 2026-02-05 01:48:40 Input contains more than one column; stratifying on last. [resample] 2026-02-05 01:48:40 Using max n bins possible = 2. [kfold] 2026-02-05 01:48:40  Outer resampling done. [train] Isotonic (Isotonic Regression) ⟳ Tested using 5 independent folds. Showing mean (sd) across resamples. Sensitivity: 0.884 (0.065) Specificity: 1.000 (0.000) Balanced_Accuracy: 0.942 (0.033) PPV: 1.000 (0.000) NPV: 0.898 (0.057) F1: 0.937 (0.035) Accuracy: 0.942 (0.033) AUC: 0.942 (0.033) Brier_Score: 0.051 (0.028) Showing mean (sd) across resamples. Sensitivity: 0.900 (0.224) Specificity: 1.000 (0.000) Balanced_Accuracy: 0.950 (0.112) PPV: 1.000 (0.000) NPV: 0.933 (0.149) F1: 0.933 (0.149) Accuracy: 0.950 (0.112) AUC: 0.950 (0.112) Brier_Score: 0.056 (0.108) 2026-02-05 01:48:40 ✔ Done in 0.29 seconds. [train] Group counts: Low: 1, NS: 98, High: 1 2026-02-05 01:48:51 ▶ [massGLM] 2026-02-05 01:48:51 Scaling and centering 40 numeric features... [preprocess] 2026-02-05 01:48:51 Preprocessing done. 
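The "Scaling and centering 40 numeric features" step in the massGLM run above is ordinary z-scoring: subtract each column's mean and divide by its standard deviation (assuming the usual sample SD). A stdlib-Python sketch of the idea, not rtemis's `preprocess`:

```python
import math

def standardize(column):
    """Center a numeric column to mean 0 and scale to sample SD 1."""
    n = len(column)
    mean = sum(column) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in column) / (n - 1))
    return [(x - mean) / sd for x in column]

z = standardize([2.0, 4.0, 6.0, 8.0])
# mean(z) == 0 and sample SD(z) == 1 by construction
```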
[preprocess] 2026-02-05 01:48:51 Fitting 40 GLMs of family gaussian with 2 predictors each... [massGLM] 2026-02-05 01:48:51 ✔ Done in 0.10 seconds. [massGLM] 2026-02-05 01:48:51 Plotting coefficients for x1 x 40 outcomes. [`plot.rtemis::MassGLM`] Group counts: Low: 1, NS: 37, High: 2 [ FAIL 0 | WARN 0 | SKIP 0 | PASS 266 ] > > proc.time() user: 75.29, system: 10.03, elapsed: 97.20