Author | Commit | Message | Date
Yokayo | 1d5cf8d25c | Work on translation | 2025-01-12 00:42:58 +07:00
Cohee | 2b3e44cca3 | Clear custom model selection on loading presets | 2025-01-08 02:06:29 +02:00
Cohee | 1557dec2bc | Revert "Don't auto-select custom model to the first model in the list" (reverts commit d791b54528318f71d7653c690a032ba47b863b6e) | 2025-01-08 01:46:31 +02:00
Cohee | d791b54528 | Don't auto-select custom model to the first model in the list | 2025-01-07 20:39:35 +02:00
Cohee | 6552038712 | Update no validate warning | 2025-01-04 14:16:25 +02:00
Rivelle | 81cb3430bb | Update openai.js: fix i18n | 2025-01-04 15:12:55 +08:00
Cohee | 2ca70090aa | Mistral: Fix endpoint validation on status check | 2024-12-29 22:39:26 +02:00
Cohee | 69f8d02c53 | Fix continue prefill using Claude prefill for non-Claude sources | 2024-12-29 22:34:56 +02:00
Cohee | cdb31699d4 | Expose new post-processing as "Semi-strict" | 2024-12-29 21:20:15 +02:00
Cohee | 4c7d160d41 | DeepSeek (Closes #3233) | 2024-12-29 20:38:13 +02:00
Cohee | 662f0e9c73 | Gemini: Thought toggle (Closes #3220) | 2024-12-29 18:23:07 +02:00
Cohee | 39cfb35c1a | Gemini: Fix cross-chunk parsing of multipart replies | 2024-12-27 23:15:09 +02:00
Cohee | a82c05a8ac | Gemini thinking: Specify context size, system prompt and vision support | 2024-12-27 22:39:26 +02:00
Cohee | 7adc6d38e2 | OpenRouter: Add control for middle-out transform (Closes #3033) | 2024-12-24 21:51:47 +02:00
Cohee | 404a217622 | Backfill imported bias entry ids | 2024-12-23 00:49:35 +02:00
Cohee | 7f94cb4bee | CC: Simplify default wrappers for personality and scenario | 2024-12-22 23:36:58 +02:00
Cohee | 3f7b91a4eb | Add uuid to CC logit bias entries | 2024-12-22 23:27:20 +02:00
Cohee | 1ebaf18210 | feat: add drag-and-drop functionality for logit bias lists | 2024-12-22 19:26:47 +02:00
Cohee | 73614f2f8d | Refactor prompt converters with group names awareness | 2024-12-20 23:30:57 +02:00
Cohee | d7328af4c8 | Merge branch 'staging' into group-join-examples | 2024-12-20 22:32:19 +02:00
Cohee | 8753ae34be | Merge pull request #3193 from cloak1505/r7b (Add Cohere command-r7b-12-2024) | 2024-12-17 12:18:54 +02:00
cloak1505 | ce536201e6 | Fix Cohere context sizes | 2024-12-15 17:12:58 -06:00
Cohee | 2d66b7204a | Fixes to group join examples parsing | 2024-12-15 18:09:17 +02:00
M0cho | 9ea8fc92e4 | Update: [openai.js] Remove deleted GAI models from context length checks (removed gemini-pro-vision and text-bison-001/PaLM) | 2024-12-15 13:06:30 +09:00
Cohee | 47dea8159e | Merge branch 'staging' into inject-filter | 2024-12-12 23:35:23 +02:00
Cohee | cf1b98e25d | Add 'gemini-2.0-flash-exp' to supported vision models | 2024-12-11 23:40:45 +02:00
M0cho | a64c8ade9d | Support Gemini 2.0 Flash-exp | 2024-12-12 06:31:27 +09:00
Cohee | e773678f2d | Prompt Manager: forbid overrides if a prompt is disabled | 2024-12-11 02:19:13 +02:00
Cohee | bcfb07de5e | New llama-3.3 Groq model (Closes #3168) | 2024-12-10 17:59:59 +02:00
Cohee | f6b9cd970d | Add gemini-exp-1206 to supported image prompt models | 2024-12-07 14:36:53 +02:00
Cohee | d6f34f7b2c | Add prompt injection filters | 2024-12-06 19:53:02 +02:00
M0cho | 073b76a693 | Support Gemini-exp-1206 | 2024-12-07 02:19:15 +09:00
Cohee | 7dfba69fc1 | Import promptManager from openai.js | 2024-12-05 22:06:16 +02:00
Cohee | 8de1d26eaa | NanoGPT: Unhide sampling parameters | 2024-12-04 00:00:08 +02:00
Cohee | 9382845dee | Claude: remove user filler from prompt converter | 2024-11-24 19:05:41 +02:00
Cohee | df50fece6c | Fix quota error async-ness | 2024-11-24 01:48:08 +02:00
Cohee | 70c45fb001 | Merge branch 'staging' into fix/connRefusedErrMsg | 2024-11-24 01:41:15 +02:00
Cohee | 2b481dae2d | Fix continue prefill newline prefix | 2024-11-23 20:41:57 +02:00
Cohee | ecbf9df79a | Update context sizes for new Cohere models | 2024-11-22 17:54:03 +00:00
Cohee | 85ca08a2ea | Settings for new gemini | 2024-11-22 17:50:33 +00:00
ceruleandeep | 8de551fc94 | Return 502 with error description when connection to remote CC API fails; if chat-completions/generate returns an error, throw the error message; reformat display of exceptions during SD prompt text generation | 2024-11-22 11:55:27 +11:00
ceruleandeep | 0383ea52e9 | Linting and commenting | 2024-11-22 11:55:27 +11:00
Cohee | b651d31780 | MistralAI: new pixtral large model | 2024-11-18 16:10:20 +00:00
Cohee | c9d2b609f1 | match => includes | 2024-11-16 15:41:41 +02:00
Cohee | 37f4fd4def | Merge pull request #3073 from M0ch0/staging (Supports GEMINI EXP 1114) | 2024-11-16 15:40:56 +02:00
Cohee | 33d8a91bf2 | Linter fixes | 2024-11-16 14:22:46 +02:00
M0cho | 30bca8e39b | Supports GEMINI EXP | 2024-11-15 05:26:10 +09:00
Yokayo | 88ad22196c | Merge branch 'staging' of https://github.com/Yokayo/SillyTavern into staging | 2024-11-06 01:33:27 +07:00
Yokayo | 9d664bc679 | Update tl | 2024-11-06 01:33:24 +07:00
Cohee | abef12d403 | Fix "OpenAI-compatible" endpoints choking on empty logit bias | 2024-11-05 16:42:51 +00:00