R Under development (unstable) (2025-09-01 r88761 ucrt) -- "Unsuffered Consequences"
Copyright (C) 2025 The R Foundation for Statistical Computing
Platform: x86_64-w64-mingw32/x64

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.

> library(testthat)
> library(edgemodelr)
> 
> test_check("edgemodelr")
Running on: Windows x86-64
Created cache directory: D:\temp\2025_09_02_17_40_17_5596\RtmpKExCEf/test_cache_dir_12345
Downloading model...
From: https://huggingface.co/fake/model/resolve/main/fake.gguf
To: D:\temp\2025_09_02_17_40_17_5596\RtmpKExCEf/test_cache_dir_12345/fake.gguf
trying URL 'https://huggingface.co/fake/model/resolve/main/fake.gguf'
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100    29  100    29    0     0    202      0 --:--:-- --:--:-- --:--:--   207
--2025-09-02 17:59:27--  https://huggingface.co/fake/model/resolve/main/fake.gguf
Loaded CA certificate '/usr/ssl/certs/ca-bundle.crt'
Resolving huggingface.co (huggingface.co)... 52.222.136.92, 52.222.136.117, 52.222.136.38, ...
Connecting to huggingface.co (huggingface.co)|52.222.136.92|:443... connected.
HTTP request sent, awaiting response... 401 Unauthorized
Username/Password Authentication Failed.
trying URL 'https://huggingface.co/fake/model/resolve/main/fake.gguf'
Setting up TinyLlama-1.1B...
Created cache directory: C:\Users\CRAN\AppData\Local/R/cache/R/edgemodelr
Downloading model...
From: https://huggingface.co/TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF/resolve/main/tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf
To: C:\Users\CRAN\AppData\Local/R/cache/R/edgemodelr/tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf
trying URL 'https://huggingface.co/TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF/resolve/main/tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf'
Content type 'binary/octet-stream' length 668788096 bytes (637.8 MB)
==================================================
downloaded 637.8 MB

Download completed successfully!
Model size: 637.8MB
Setting up TinyLlama-OpenOrca...
Downloading model...
From: https://huggingface.co/TheBloke/TinyLlama-1.1B-1T-OpenOrca-GGUF/resolve/main/tinyllama-1.1b-1t-openorca.Q4_K_M.gguf
To: C:\Users\CRAN\AppData\Local/R/cache/R/edgemodelr/tinyllama-1.1b-1t-openorca.Q4_K_M.gguf
trying URL 'https://huggingface.co/TheBloke/TinyLlama-1.1B-1T-OpenOrca-GGUF/resolve/main/tinyllama-1.1b-1t-openorca.Q4_K_M.gguf'
Content type 'binary/octet-stream' length 667814368 bytes (636.9 MB)
==================================================
downloaded 636.9 MB

Download completed successfully!
Model size: 636.9MB
Setting up llama3.2-1b...
Downloading model...
From: https://huggingface.co/bartowski/Llama-3.2-1B-Instruct-GGUF/resolve/main/Llama-3.2-1B-Instruct-Q4_K_M.gguf
To: C:\Users\CRAN\AppData\Local/R/cache/R/edgemodelr/Llama-3.2-1B-Instruct-Q4_K_M.gguf
trying URL 'https://huggingface.co/bartowski/Llama-3.2-1B-Instruct-GGUF/resolve/main/Llama-3.2-1B-Instruct-Q4_K_M.gguf'
Content type 'text/plain; charset=utf-8' length 807694464 bytes (770.3 MB)
==================================================
downloaded 770.3 MB

Download completed successfully!
Model size: 770.3MB

[ FAIL 8 | WARN 7 | SKIP 13 | PASS 592 ]

══ Skipped tests (13) ══════════════════════════════════════════════════════════
• No test model available for concurrency tests (1): 'test-error-handling.R:230:7'
• No test model available for edge case tests (1): 'test-error-handling.R:200:7'
• No test model available for memory constraint tests (1): 'test-error-handling.R:130:7'
• No test model available for streaming tests (1): 'test-streaming.R:321:5'
• empty test (9): 'test-error-handling.R:1:1', 'test-integration.R:1:1',
  'test-model-loading.R:106:1', 'test-model-management.R:1:1',
  'test-model-management.R:283:1', 'test-model-management.R:336:1',
  'test-streaming.R:1:1', 'test-streaming.R:326:1', 'test-text-completion.R:1:1'

══ Failed tests ════════════════════════════════════════════════════════════════
── Error ('test-error-handling.R:5:5'): Invalid file paths are handled properly ──
Error in `edge_load_model("does_not_exist.gguf")`: Model file not found: does_not_exist.gguf

Try these options:
1. Download a model: edge_download_model('TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF', 'tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf')
2. Quick setup: edge_quick_setup('TinyLlama-1.1B')
3. List models: edge_list_models()
Backtrace:
    ▆
 1. ├─testthat::expect_error(...) at test-error-handling.R:5:5
 2. │ └─testthat:::expect_condition_matching(...)
 3. │   └─testthat:::quasi_capture(...)
 4. │     ├─testthat (local) .capture(...)
 5. │     │ └─base::withCallingHandlers(...)
 6. │     └─rlang::eval_bare(quo_get_expr(.quo), quo_get_env(.quo))
 7. └─edgemodelr::edge_load_model("does_not_exist.gguf")
── Error ('test-error-handling.R:36:5'): Invalid parameters are rejected ───────
Error in `edge_load_model(dummy_path, n_ctx = -1)`: Model file not found: dummy.gguf

Try these options:
1. Download a model: edge_download_model('TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF', 'tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf')
2. Quick setup: edge_quick_setup('TinyLlama-1.1B')
3. List models: edge_list_models()
Backtrace:
    ▆
 1. ├─testthat::expect_error(...) at test-error-handling.R:36:5
 2. │ └─testthat:::expect_condition_matching(...)
 3. │   └─testthat:::quasi_capture(...)
 4. │     ├─testthat (local) .capture(...)
 5. │     │ └─base::withCallingHandlers(...)
 6. │     └─rlang::eval_bare(quo_get_expr(.quo), quo_get_env(.quo))
 7. ├─base::suppressWarnings(edge_load_model(dummy_path, n_ctx = -1))
 8. │ └─base::withCallingHandlers(...)
 9. └─edgemodelr::edge_load_model(dummy_path, n_ctx = -1)
── Error ('test-error-handling.R:238:5'): Resource cleanup after errors ────────
Error in `edge_load_model("nonexistent.gguf")`: Model file not found: nonexistent.gguf

Try these options:
1. Download a model: edge_download_model('TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF', 'tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf')
2. Quick setup: edge_quick_setup('TinyLlama-1.1B')
3. List models: edge_list_models()
Backtrace:
    ▆
 1. ├─testthat::expect_error(edge_load_model("nonexistent.gguf"), "Model file does not exist") at test-error-handling.R:238:5
 2. │ └─testthat:::expect_condition_matching(...)
 3. │   └─testthat:::quasi_capture(...)
 4. │     ├─testthat (local) .capture(...)
 5. │     │ └─base::withCallingHandlers(...)
 6. │     └─rlang::eval_bare(quo_get_expr(.quo), quo_get_env(.quo))
 7. └─edgemodelr::edge_load_model("nonexistent.gguf")
── Error ('test-model-loading.R:31:3'): edge_load_model handles invalid paths ──
Error in `edge_load_model("nonexistent_model.gguf")`: Model file not found: nonexistent_model.gguf

Try these options:
1. Download a model: edge_download_model('TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF', 'tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf')
2. Quick setup: edge_quick_setup('TinyLlama-1.1B')
3. List models: edge_list_models()
Backtrace:
    ▆
 1. ├─testthat::expect_error(...) at test-model-loading.R:31:3
 2. │ └─testthat:::expect_condition_matching(...)
 3. │   └─testthat:::quasi_capture(...)
 4. │     ├─testthat (local) .capture(...)
 5. │     │ └─base::withCallingHandlers(...)
 6. │     └─rlang::eval_bare(quo_get_expr(.quo), quo_get_env(.quo))
 7. └─edgemodelr::edge_load_model("nonexistent_model.gguf")
── Failure ('test-text-completion.R:121:7'): edge_completion parameter validation ──
`edge_completion(ctx, "Hello", n_predict = "invalid")` did not throw the expected error.
── Failure ('test-text-completion.R:129:7'): edge_completion parameter validation ──
`edge_completion(ctx, "Hello", temperature = "invalid")` did not throw the expected error.
── Failure ('test-text-completion.R:141:7'): edge_completion parameter validation ──
`edge_completion(ctx, "Hello", top_p = "invalid")` did not throw the expected error.
── Error ('test-text-completion.R:284:9'): edge_completion stress tests ────────
Error: Error during completion: Failed to process prompt
Backtrace:
    ▆
 1. └─edgemodelr::edge_completion(ctx, "Quick test", n_predict = 3) at test-text-completion.R:284:9

[ FAIL 8 | WARN 7 | SKIP 13 | PASS 592 ]
Error: Test failures
Execution halted
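Four of the loading failures trace to the same expectation mismatch: the tests match on the message "Model file does not exist" (visible verbatim in the test-error-handling.R:238 backtrace), while edge_load_model() now signals "Model file not found: ...". A minimal base-R sketch of the mismatch, using only strings taken from the log above (no edgemodelr calls involved):

```r
# The test's regexp (from test-error-handling.R:238) vs. the message the
# package actually signals (from the backtraces above).
actual_message  <- "Model file not found: nonexistent.gguf"
expected_regexp <- "Model file does not exist"

# testthat::expect_error(expr, regexp) passes only when the regexp matches
# the condition message; on a non-match the original error propagates and
# the test is reported as an Error rather than a pass.
grepl(expected_regexp, actual_message)         # FALSE -> these tests error out
grepl("Model file not found", actual_message)  # TRUE  -> an updated regexp would match
```

Updating the expected regexp in the tests (or matching on a condition class rather than message text, which survives rewording) would clear these failures without touching the package code.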
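The three parameter-validation failures show edge_completion() accepting "invalid" for n_predict, temperature, and top_p without erroring. A hypothetical sketch of the kind of argument checking those tests expect; the function name validate_completion_args() and the error messages are assumptions, not edgemodelr's actual implementation:

```r
# Hypothetical validator illustrating the checks the failing tests expect
# edge_completion() to perform before handing arguments to the native backend.
validate_completion_args <- function(n_predict = 128L,
                                     temperature = 0.8,
                                     top_p = 0.95) {
  if (!is.numeric(n_predict) || length(n_predict) != 1L || n_predict < 1)
    stop("`n_predict` must be a single positive number", call. = FALSE)
  if (!is.numeric(temperature) || length(temperature) != 1L || temperature < 0)
    stop("`temperature` must be a single non-negative number", call. = FALSE)
  if (!is.numeric(top_p) || length(top_p) != 1L || top_p < 0 || top_p > 1)
    stop("`top_p` must be a single number in [0, 1]", call. = FALSE)
  invisible(TRUE)
}

validate_completion_args()                            # passes silently
try(validate_completion_args(n_predict = "invalid"))  # errors, as the tests expect
```

With checks like these in place, expect_error(edge_completion(ctx, "Hello", n_predict = "invalid")) would see a proper R condition instead of the bad argument reaching the C++ layer unchecked.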