llama.cpp: fixed wording in comment for logprobs

Author: Isaac McFadyen
Date:   2024-12-27 01:15:35 -05:00
parent 77414045d9
commit d14f2f3c77


@@ -1046,7 +1046,7 @@ export function parseTextgenLogprobs(token, logprobs) {
     // 3 cases:
     // 1. Before commit 6c5bc06, "probs" key with "tok_str"/"prob", and probs are [0, 1] so use them directly.
     // 2. After commit 6c5bc06 but before commit 89d604f broke logprobs (they all return the first token's logprobs)
-    //    We don't know the client version so we can't do much about this.
+    //    We don't know the llama.cpp version so we can't do much about this.
     // 3. After commit 89d604f uses OpenAI-compatible format with "completion_probabilities" and "token"/"logprob" keys.
     //    Note that it is also the *actual* logprob (negative number), so we need to convert to [0, 1].
     if (logprobs?.[0]?.probs) {
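
For context, a minimal sketch of the branching the comment describes, not the actual SillyTavern implementation. The field names for cases 1 and 3 ("probs", "tok_str"/"prob", "token"/"logprob") come from the comment itself; the function name, return shape, and everything else are assumptions.

// Hypothetical sketch of the three-case handling described in the diff above.
function parseLogprobsSketch(logprobs) {
    if (logprobs?.[0]?.probs) {
        // Case 1 (before 6c5bc06): "probs" entries with "tok_str"/"prob";
        // values are already in [0, 1], so use them directly.
        return logprobs[0].probs.map(p => ({ token: p.tok_str, prob: p.prob }));
    }
    // Case 2 (between 6c5bc06 and 89d604f) cannot be detected here, since
    // the server does not report its llama.cpp version.
    // Case 3 (after 89d604f): OpenAI-compatible "token"/"logprob" entries;
    // "logprob" is a true log-probability (a negative number), so exp()
    // converts it back into [0, 1].
    return (logprobs ?? []).map(p => ({ token: p.token, prob: Math.exp(p.logprob) }));
}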