rixty/KoboldAI-Client
Mirror of https://github.com/KoboldAI/KoboldAI-Client.git, synced 2025-06-05 21:59:24 +02:00
KoboldAI-Client/modeling at commit fac006125eb039642fa76fb246c6e0aeeef87137
Latest commit: ebolam bc337bf090 Merge branch 'henk717:united' into Model_Plugins (2023-07-15 15:02:00 -04:00)
Name | Last commit message | Date
inference_models | Better way of doing the if statement | 2023-07-15 20:00:29 +02:00
inference_model.py | Fix for default inference model is_valid and requested_parameters having vram as a required parameter. | 2023-07-15 11:11:29 -04:00
lazy_loader.py | Fall back to unpatched HF | 2023-07-08 14:36:45 -05:00
logits_processors.py | Modeling: Fix logits processors (probs, biasing, lua) | 2023-03-17 16:56:47 -05:00
patches.py | Fix for UI2 model loading not showing progress | 2023-07-10 20:59:16 -04:00
post_token_hooks.py | Model: Respect model lazyload over kaivars | 2023-03-10 20:00:39 -06:00
stoppers.py | Potential fix | 2023-04-27 19:51:10 -05:00
test_generation.py | Model: More Jax import fixes and formatting | 2023-03-17 15:36:44 -05:00
tokenizer.py | Fix tokenizer fallback for llama | 2023-05-01 19:42:52 -05:00
warpers.py | Fix for --nobreakmodel forcing CPU | 2023-06-02 12:58:59 -04:00