SillyTavern/public/scripts/extensions/expressions
kingbri 6b656bf380 Expressions: Classify using LLM
Rather than using a separate BERT model to classify the last message,
use the LLM itself to produce the classified expression label as JSON
and set that as the current sprite. Doing this should take more
information into consideration and cut down on extra processing.

This is made possible by constrained generation with JSON schemas.
This is only available with TabbyAPI, since it's the only backend that
supports JSON schemas, but hopefully there will be a way to use this
with other backends as well.
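As a sketch of the idea, a JSON schema with an emotion enum forces the backend to emit one of a fixed set of labels. The label set and property name below are illustrative assumptions, not the extension's actual schema:

```javascript
// Hypothetical emotion labels; the real extension derives these from
// the character's available sprites.
const EMOTION_LABELS = ['joy', 'sadness', 'anger', 'fear', 'surprise', 'neutral'];

// A JSON schema constraining output to a single emotion label.
const emotionSchema = {
    type: 'object',
    properties: {
        emotion: { type: 'string', enum: EMOTION_LABELS },
    },
    required: ['emotion'],
};

// With constrained generation, the completion is guaranteed to parse
// as JSON matching the schema, e.g. {"emotion": "joy"}.
function parseEmotion(completion) {
    const parsed = JSON.parse(completion);
    return EMOTION_LABELS.includes(parsed.emotion) ? parsed.emotion : 'neutral';
}
```

Because the backend enforces the schema, the parse step only needs a fallback for defensive purposes.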

Intercepts the generation and sets top_k = 1 (for greedy sampling)
and the json_schema to an emotion enum. Doing this also avoids
reingesting the entire context every time a message is sent and then
classified, so the chat experience is not compromised.
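The interception step can be sketched as merging overrides into the outgoing request parameters. The field names below mirror common completion-API fields and are assumptions, not SillyTavern's actual internals:

```javascript
// Hypothetical sketch: override sampling for the classification request
// without mutating the caller's base parameters.
function withClassificationOverrides(params, schema) {
    return {
        ...params,
        top_k: 1,            // greedy sampling: always pick the top token
        json_schema: schema, // constrain output to the emotion enum schema
    };
}

// Usage: the base request keeps its normal sampling settings; only the
// classification request gets the greedy, schema-constrained overrides.
const baseParams = { prompt: 'Classify the last message.', temperature: 0.8 };
const overridden = withClassificationOverrides(baseParams, { type: 'object' });
```

Returning a new object (rather than mutating in place) keeps the normal chat generation path untouched by the classification overrides.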

Signed-off-by: kingbri <bdashore3@proton.me>
2024-04-12 01:55:16 -04:00
add-custom-expression.html Custom char expressions 2023-09-14 21:30:02 +03:00
index.js Expressions: Classify using LLM 2024-04-12 01:55:16 -04:00
list-item.html Custom char expressions 2023-09-14 21:30:02 +03:00
manifest.json Initial commit 2023-07-20 20:32:15 +03:00
remove-custom-expression.html Custom char expressions 2023-09-14 21:30:02 +03:00
settings.html Expressions: Classify using LLM 2024-04-12 01:55:16 -04:00
style.css Mobile sprites fixes: hide non-VN sprite, fix group VN position, fix live2d conflicts 2023-11-27 03:22:35 +02:00