Commit Graph

13 Commits

Author SHA1 Message Date
50h100a 5d5e552cbd Correctly interpret some alternate whitespace characters in token names 2024-10-19 00:24:35 -04:00
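For context on the commit above: BPE-style tokenizers mark leading spaces with special glyphs (e.g. 'Ġ' in GPT-2 BPE, '▁' in SentencePiece), so token names need those markers mapped back to plain spaces before display. A minimal sketch of the idea, with an illustrative glyph set and a hypothetical function name, not the commit's actual code:

```typescript
// Map tokenizer-specific whitespace markers and uncommon whitespace
// code points to a plain space so token names render consistently.
// The glyph list is illustrative: 'Ġ' (U+0120) and '▁' (U+2581) are
// leading-space markers used by GPT-2-style BPE and SentencePiece
// respectively; the second class covers non-breaking space variants.
function normalizeTokenName(token: string): string {
    return token
        .replace(/[\u0120\u2581]/g, ' ')        // BPE / SentencePiece space markers
        .replace(/[\u00A0\u2007\u202F]/g, ' '); // non-breaking space variants
}

// Example: a GPT-2 BPE token for " the" arrives as "Ġthe".
console.log(normalizeTokenName('Ġthe')); // " the"
```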
Cohee 558bbce919 Fix phone scrolling of logprobs 2024-07-15 01:18:54 +03:00
Wolfsblvt d7ade487b8 Refactor common enum for debounce timeouts 2024-04-28 06:21:47 +02:00
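The commit above centralizes debounce intervals in one shared enum. A hedged sketch of that pattern, with hypothetical names and values rather than the repository's actual constants:

```typescript
// Hypothetical shared timeout enum; the real names and values live in the repo.
const DEBOUNCE_TIMEOUT = {
    quick: 100,     // ms: rapid UI feedback
    standard: 300,  // ms: typical input handlers
    relaxed: 1000,  // ms: expensive renderers such as the logprobs view
} as const;

// Placeholder for the expensive logprobs DOM update.
function renderLogprobs(): void {}

// Generic debounce helper: postpones `fn` until `delayMs` ms have
// passed without another call.
function debounce<T extends unknown[]>(fn: (...args: T) => void, delayMs: number) {
    let timer: ReturnType<typeof setTimeout> | undefined;
    return (...args: T) => {
        clearTimeout(timer);
        timer = setTimeout(() => fn(...args), delayMs);
    };
}

const renderLogprobsDebounced = debounce(renderLogprobs, DEBOUNCE_TIMEOUT.relaxed);
renderLogprobsDebounced(); // fires at most once per quiet period
```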
Cohee 022c180b62 Lint and clean-up 2024-04-15 00:39:15 +03:00
Cohee 9c218455c4 [chore] Run ESLint 2024-04-12 14:22:12 +03:00
Cohee 7389286862 Don't show logprobs when using smooth streaming 2024-04-02 15:51:00 +03:00
50h100a 6f7e7b85ab For Mancer:
- Allow logprobs (works)
- Allow multiswipe (not yet)
- Adjust visible samplers
Fix: a logprob of 0 means a 100% chance; handle accordingly.
2024-03-24 14:45:37 -04:00
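On the "Fix" note above: a logprob is ln(p), so a value of exactly 0 corresponds to exp(0) = 1, a 100% chance, and a truthiness check on the raw value would wrongly treat it as missing. A minimal sketch of the conversion, with a hypothetical function name:

```typescript
// Convert a natural-log probability to a percentage for display.
// The edge case the commit fixes: logprob === 0 is a valid value
// meaning exp(0) = 1, i.e. a 100% chance, so `if (!logprob)` style
// checks would incorrectly treat it as absent.
function logprobToPercent(logprob: number): number {
    return Math.exp(logprob) * 100;
}

console.log(logprobToPercent(0));       // 100 (certain token)
console.log(logprobToPercent(-0.6931)); // ≈ 50
console.log(logprobToPercent(-2.3026)); // ≈ 10
```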
Cohee d5bf9fc28c Non-streaming logprobs for Aphrodite 2024-02-24 20:53:23 +02:00
Cohee eaadfea639 Extend debounce duration of logprobs renderer 2024-02-24 15:03:57 +02:00
Deciare 445cbda02f If token probability is a logarithm, it'll be < 0
No need to read settings to find out if llama.cpp backend is in use...
2024-02-24 00:13:33 -05:00
Deciare 344b9eedbc Request token probabilities from llama.cpp backend
llama.cpp server token probabilities are given as values ranging from
0 to 1 instead of as logarithms.
2024-02-23 14:01:46 -05:00
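The two Deciare commits above rest on one invariant: a true log probability is ln(p) ≤ 0 for p in (0, 1], while the llama.cpp server reports linear probabilities in [0, 1], so the value's sign alone tells the formats apart. A hedged sketch of that normalization, not the repository's actual parser:

```typescript
// Normalize a backend-reported value to a natural-log probability.
// Invariant: ln(p) <= 0 for p in (0, 1], so any positive value must be
// a linear probability (as sent by the llama.cpp server) rather than a
// logprob - no settings lookup for the active backend is needed.
// A value of exactly 0 is kept as a logprob, i.e. a 100% chance.
function toLogprob(value: number): number {
    return value > 0 ? Math.log(value) : value;
}

console.log(toLogprob(0.5));    // ≈ -0.693 (llama.cpp-style linear probability)
console.log(toLogprob(-0.693)); // -0.693 (already a logprob)
```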
Cohee b4646da187 Fix logprobs parser on NovelAI non-streaming 2024-01-29 11:13:48 +02:00
khanon 60044c18a4 Implement Token Probabilities UI using logprobs 2024-01-25 18:34:46 -06:00
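The initial commit above builds the UI around per-token candidate lists. One plausible data shape, sketched here with hypothetical type and field names:

```typescript
// Hypothetical shape for the data behind a token-probabilities view:
// for each generated token, the top alternative candidates with their
// natural-log probabilities, as returned by logprobs-capable APIs.
interface TokenLogprobs {
    token: string;                   // the token actually generated
    topLogprobs: [string, number][]; // candidate token -> ln(probability)
}

const example: TokenLogprobs = {
    token: ' cat',
    topLogprobs: [
        [' cat', -0.22], // ≈ 80%
        [' dog', -1.72], // ≈ 18%
        [' rat', -4.12], // ≈ 2%
    ],
};
```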