Commit Graph

10 Commits

Author SHA1 Message Date
Cohee  022c180b62  Lint and clean-up  2024-04-15 00:39:15 +03:00
Cohee  9c218455c4  [chore] Run ESLint  2024-04-12 14:22:12 +03:00
Cohee  7389286862  Don't show logprobs when using smooth streaming  2024-04-02 15:51:00 +03:00
50h100a  6f7e7b85ab  For Mancer:  2024-03-24 14:45:37 -04:00
    - Allow logprobs (works)
    - Allow multiswipe (not yet)
    - Adjust visible samplers
    Fix: 0 logprob is 100% chance, handle accordingly.
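The fix noted in that commit guards against a classic falsy-value pitfall: a logprob of exactly 0 means exp(0) = 1, i.e. a 100% chance, so a truthiness check would wrongly treat it as missing. A minimal sketch of the idea (the helper name is hypothetical, not the repository's actual code):

```javascript
// Convert a token log-probability to a display percentage.
// A logprob of exactly 0 is a *valid* value meaning exp(0) = 1 (100%),
// so check for null/undefined explicitly instead of relying on truthiness.
function logprobToPercent(logprob) {
    if (logprob === null || logprob === undefined) {
        return null; // genuinely missing value
    }
    return Math.exp(logprob) * 100;
}
```

A check like `if (logprob)` would have dropped the 0 case, which is exactly the 100%-probability token.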
Cohee  d5bf9fc28c  Non-streaming logprobs for Aphrodite  2024-02-24 20:53:23 +02:00
Cohee  eaadfea639  Extend debounce duration of logprobs renderer  2024-02-24 15:03:57 +02:00
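Debouncing the logprobs renderer means that rapid updates during streaming collapse into a single redraw once the burst settles; extending the delay trades latency for fewer expensive re-renders. A generic trailing-edge debounce sketch (a standalone illustration, not the project's implementation):

```javascript
// Trailing-edge debounce: repeated calls within `delayMs` collapse into
// one invocation of `fn`, fired after calls stop arriving. A longer
// delay means a costly renderer runs less often during token streaming.
function debounce(fn, delayMs) {
    let timer = null;
    return function (...args) {
        clearTimeout(timer);
        timer = setTimeout(() => fn.apply(this, args), delayMs);
    };
}
```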
Deciare  445cbda02f  If token probability is a logarithm it'll be < 0  2024-02-24 00:13:33 -05:00
    No need to read settings to find out if llama.cpp backend is in use...
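The observation in that commit message gives a settings-free way to tell the two formats apart: the log of a probability in (0, 1] is negative (log(1) is 0), while llama.cpp-style linear probabilities lie in [0, 1], so a strictly negative value can only be a logarithm. A sketch of that heuristic (hypothetical function name, not the repository's code):

```javascript
// Normalize a token probability to linear [0, 1] form.
// Heuristic from the commit message: a value < 0 can only be a
// log-probability, so exponentiate it; non-negative values are assumed
// to already be linear. (Exactly 0 is ambiguous in principle -- logprob 0
// means 100% -- but the heuristic keys on strictly negative values.)
function toLinearProbability(value) {
    return value < 0 ? Math.exp(value) : value;
}
```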
Deciare  344b9eedbc  Request token probabilities from llama.cpp backend  2024-02-23 14:01:46 -05:00
    llama.cpp server token probabilities are given as values ranging from 0 to 1 instead of as logarithms.
Cohee  b4646da187  Fix logprobs parser on NovelAI non-streaming  2024-01-29 11:13:48 +02:00
khanon  60044c18a4  Implement Token Probabilities UI using logprobs  2024-01-25 18:34:46 -06:00