Commit Graph

81 Commits

Author SHA1 Message Date
Cohee
c63cd87cc0 lint: Require semicolons 2023-12-02 21:11:06 +02:00
valadaptive
a37f874e38 Require single quotes 2023-12-02 13:04:51 -05:00
Cohee
a367285ac2 Merge pull request #1430 from valadaptive/eslint-fixes-2: ESLint fixes, part 2 - bulky changes 2023-12-02 19:43:11 +02:00
Cohee
0477f6a553 Use best match API tokenizers for Text Completion sources 2023-12-02 19:42:15 +02:00
valadaptive
27e63a7a77 Enable no-case-declarations lint 2023-12-02 10:32:26 -05:00
Cohee
e6c96553d0 Add text trimming commands 2023-11-26 13:55:22 +02:00
Cohee
1ebfddf07e Use mistral and yi tokenizers for custom token bans 2023-11-21 01:04:27 +02:00
Cohee
9b75e49b54 Add support for Yi tokenizer 2023-11-21 00:21:58 +02:00
Cohee
96caddfd71 Add koboldcpp as Text Completion source 2023-11-19 17:14:53 +02:00
kingbri
4cfa267b1b API Tokenizer: Add support for TabbyAPI (use Tabby's /v1/token endpoints; Signed-off-by: kingbri <bdashore3@proton.me>) 2023-11-17 01:48:03 -05:00
Cohee
81fe9aa699 Fix updated tokenization via ooba API 2023-11-09 19:39:08 +02:00
Cohee
480099ee97 Mancer will work in legacy API mode. Remove Soft Prompt mentions. 2023-11-08 18:16:47 +02:00
Cohee
e76c18c104 Legacy ooba API compatibility shim 2023-11-08 10:13:28 +02:00
Cohee
865256f5c0 Fix ooba tokenization via API. Fix requiring streaming URL to generate 2023-11-08 03:38:04 +02:00
Cohee
57e845d0d7 Resolve best match tokenizer for itemization. Adjust styles of token counter 2023-11-06 20:25:59 +02:00
Cohee
e8ba328a14 Add text chunks display to token counter 2023-11-06 02:42:51 +02:00
Cohee
f248367ca3 Add Mistral tokenizer 2023-11-06 01:26:13 +02:00
Cohee
f0c0949aa0 Add token ids viewer to tokenizer plugin 2023-11-05 22:45:37 +02:00
Cohee
fedc3b887f Add llama2 tokenizer for OpenRouter models 2023-11-05 21:54:19 +02:00
Cohee
c2ba3a773a Delayed tokenizers initialization 2023-10-25 00:32:49 +03:00
Cohee
b167eb9e22 Add raw token ids support to OAI logit bias. Fix token counting for turbo models 2023-10-19 13:37:08 +03:00
Cohee
bfdd071001 Move tokenizer endpoint and functions to separate file 2023-09-16 18:48:06 +03:00
Cohee
853736fa93 Remove legacy NovelAI models 2023-09-06 14:32:06 +03:00
Cohee
267d0eb16f Fix API tokenizers usage with kcpp 2023-09-01 02:57:35 +03:00
Cohee
3b4e6f0b78 Add debug functions menu 2023-08-27 23:20:43 +03:00
Cohee
0844374de5 Remove old GPT-2 tokenizer. Redirect to tiktoken's tokenizer 2023-08-27 22:14:39 +03:00
Cohee
9660aaa2c2 Add NovelAI hypebot plugin 2023-08-27 18:27:34 +03:00
Cohee
c91ab3b5e0 Add Kobold tokenization to best match logic. Fix not being able to stop group chat regeneration 2023-08-24 21:23:35 +03:00
Cohee
ab52af4fb5 Add support for Koboldcpp tokenization endpoint 2023-08-24 20:19:57 +03:00
Cohee
e77da62b85 Add padding to cache key. Fix Safari display issues. Fix 400 on empty translate. Reset bias cache on changing model. 2023-08-23 10:32:48 +03:00
Cohee
bc5fc67906 Put tokenizer functions to a separate file. Cache local models token counts 2023-08-23 02:38:43 +03:00