9010 Commits

Author SHA1 Message Date
Wolfsblvt
95a31cdd98 Remove logit bias from o1 and o3
- They no longer support it
2025-02-07 19:51:21 +01:00
Wolfsblvt
d1ec9eb8ab Enabled streaming for o1 and o3
- They now support it
2025-02-07 19:50:01 +01:00
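The two commits above change how Chat Completions requests for o1/o3 are assembled: streaming is now allowed, and logit_bias is no longer sent. A minimal sketch of that kind of gating (the function and setting names are illustrative, not SillyTavern's actual code):

```js
// Hypothetical sketch: adjust a Chat Completions request body for o1/o3 models.
function buildRequestBody(model, messages, settings) {
    const isReasoningModel = /^o[13]/.test(model); // o1, o1-mini, o3-mini, ...

    const body = {
        model,
        messages,
        // Streaming is now supported by o1/o3, so no special-casing is needed here.
        stream: settings.stream,
    };

    // o1/o3 no longer accept logit_bias, so only attach it for other models.
    if (!isReasoningModel && settings.logitBias) {
        body.logit_bias = settings.logitBias;
    }

    return body;
}

console.log(buildRequestBody('o3-mini', [{ role: 'user', content: 'Hi' }], { stream: true, logitBias: { 50256: -100 } }));
```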
Cohee
0bcf8b5d21
Merge pull request #3439 from sirius422/fix-old-gemini-models
fix: adjust safetySettings handling for legacy Gemini models
2025-02-07 13:54:59 +02:00
sirius422
f32938981c fix: adjust safetySettings handling for legacy and experimental Gemini models
- gemini-1.5-flash-8b-exp-0827
- gemini-1.5-flash-8b-exp-0924
- gemini-pro
- gemini-1.0-pro
2025-02-07 16:35:44 +08:00
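The fix above adjusts safetySettings handling for the legacy and experimental Gemini models listed. A plausible reading is that these older models do not accept the newer OFF threshold, so requests fall back to BLOCK_NONE for them; the sketch below illustrates that kind of per-model fallback with hypothetical names, not the actual change:

```js
// Hypothetical sketch of a per-model safetySettings fallback for legacy Gemini models.
const LEGACY_GEMINI_MODELS = [
    'gemini-1.5-flash-8b-exp-0827',
    'gemini-1.5-flash-8b-exp-0924',
    'gemini-pro',
    'gemini-1.0-pro',
];

function getSafetyThreshold(model) {
    // Assumption: older models reject the newer "OFF" threshold, so fall back to BLOCK_NONE.
    return LEGACY_GEMINI_MODELS.includes(model) ? 'BLOCK_NONE' : 'OFF';
}

console.log(getSafetyThreshold('gemini-pro'));       // BLOCK_NONE
console.log(getSafetyThreshold('gemini-2.0-flash')); // OFF
```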
Wolfsblvt
25d1db3852 Fix issues with reasoning add/delete 2025-02-06 22:39:29 +01:00
Wolfsblvt
3ec3d71c5f Reduced rounded corners on thinking blocks to make people happy 2025-02-06 22:17:21 +01:00
Wolfsblvt
79b8229b1b Add reasoning time seconds tooltip 2025-02-06 21:53:34 +01:00
Wolfsblvt
06303bb62f Move reasoning stuff to reasoning.js 2025-02-06 21:50:33 +01:00
Cohee
1acee3d4c9 Gemini: Add sysprompt for gemini 2.0 pro/flash 2025-02-06 21:46:17 +02:00
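A rough sketch of what a change like this implies: the Gemini 2.0 Pro/Flash models are added to the set of models that receives a system_instruction block. The prefix list and helper names are illustrative assumptions, not the project's actual code:

```js
// Hypothetical sketch: decide whether to send `system_instruction` to the Gemini API.
const SYSPROMPT_MODEL_PREFIXES = ['gemini-1.5', 'gemini-2.0-pro', 'gemini-2.0-flash'];

function supportsSystemInstruction(model) {
    return SYSPROMPT_MODEL_PREFIXES.some(prefix => model.startsWith(prefix));
}

function buildGeminiRequest(model, contents, systemPrompt) {
    const request = { contents };
    if (systemPrompt && supportsSystemInstruction(model)) {
        request.system_instruction = { parts: [{ text: systemPrompt }] };
    }
    return request;
}

console.log(buildGeminiRequest('gemini-2.0-pro-exp', [], 'You are a helpful assistant.'));
```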
Wolfsblvt
b6d6727135
Merge pull request #3434 from sirius422/add-gemini-models
Add new gemini models and update the safety settings
2025-02-06 20:33:54 +01:00
Wolfsblvt
92a83ed01b Merge branch 'staging' into pr/3434 2025-02-06 20:18:25 +01:00
Wolfsblvt
f85e464ffd
Merge pull request #3437 from SillyTavern/gemini-models
Add new gemini models
2025-02-05 23:32:39 +01:00
Wolfsblvt
852ec86e26 Add new gemini models
- Gemini 2.0 Flash
- Gemini 2.0 Flash Lite Preview
- Gemini 2.0 Pro Experimental
2025-02-05 23:30:11 +01:00
Wolfsblvt
72ae3aa33d Just some css classes for WI entries 2025-02-05 23:15:30 +01:00
sirius422
d35bd3b073 feat: update Gemini safety settings
- Set threshold to OFF for most models, except for the HARM_CATEGORY_CIVIC_INTEGRITY category.
- Handle specific cases for a few models.
2025-02-06 04:59:30 +08:00
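A sketch of a safetySettings array matching the bullet points above: threshold OFF for every category except HARM_CATEGORY_CIVIC_INTEGRITY. The BLOCK_NONE value used for that category is an assumption, and the per-model special cases mentioned in the commit are omitted:

```js
// Sketch: build Gemini safetySettings with OFF everywhere except civic integrity.
function buildSafetySettings() {
    const categories = [
        'HARM_CATEGORY_HARASSMENT',
        'HARM_CATEGORY_HATE_SPEECH',
        'HARM_CATEGORY_SEXUALLY_EXPLICIT',
        'HARM_CATEGORY_DANGEROUS_CONTENT',
        'HARM_CATEGORY_CIVIC_INTEGRITY',
    ];
    return categories.map(category => ({
        category,
        threshold: category === 'HARM_CATEGORY_CIVIC_INTEGRITY' ? 'BLOCK_NONE' : 'OFF',
    }));
}

console.log(buildSafetySettings());
```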
sirius422
b074f9fa89 feat: update Gemini models
- Add new Gemini models (2025/02/05)
2025-02-06 04:50:54 +08:00
Cohee
ed46b96ba2 Fix reasoning-get with no argument 2025-02-04 23:41:16 +02:00
Cohee
acf71fd702
Merge pull request #3431 from SillyTavern/ipv6_auto
Better IPv6 support
2025-02-04 23:23:02 +02:00
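The PR above improves IPv6 handling. One hedged sketch of what an "ipv6_auto" behaviour could look like in a Node server: probe the host for a usable IPv6 interface and bind to the dual-stack wildcard only when one exists. The detection heuristic is an illustrative assumption, not the project's actual implementation:

```js
// Hypothetical sketch of "auto" IPv6 detection for the server listener.
import os from 'node:os';
import http from 'node:http';

function hasIPv6Interface() {
    return Object.values(os.networkInterfaces())
        .flat()
        .some(iface => iface && iface.family === 'IPv6' && !iface.internal);
}

const server = http.createServer((req, res) => res.end('ok'));
const host = hasIPv6Interface() ? '::' : '0.0.0.0'; // dual-stack wildcard when IPv6 is available
server.listen(8000, host, () => console.log(`Listening on ${host}:8000`));
```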
Cohee
5f564343ec Remove console log 2025-02-04 23:14:01 +02:00
Cohee
dfb062af41 Clean up blank lines 2025-02-04 22:52:07 +02:00
Cohee
d9bb5e6b1f Revert old default values 2025-02-04 22:46:24 +02:00
Cohee
363d8a4121 Merge branch 'staging' into ipv6_auto 2025-02-04 22:45:46 +02:00
Cohee
d9fc76f336
Merge pull request #3430 from qvink/generate_before_combine_prompts_await
await GENERATE_BEFORE_COMBINE_PROMPTS
2025-02-04 21:45:59 +02:00
qvink
e2c3af2114 Changing the event GENERATE_BEFORE_COMBINE_PROMPTS to an await emit() instead of emitAndWait(). 2025-02-04 11:22:23 -07:00
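The commit above awaits the GENERATE_BEFORE_COMBINE_PROMPTS handlers before the prompt pieces are combined, so asynchronous extension code finishes mutating the prompt first. A self-contained illustration with a stand-in emitter (not SillyTavern's actual eventSource implementation):

```js
// Minimal illustration of awaiting async event handlers before continuing.
class AsyncEmitter {
    constructor() { this.listeners = new Map(); }
    on(event, fn) {
        if (!this.listeners.has(event)) this.listeners.set(event, []);
        this.listeners.get(event).push(fn);
    }
    async emit(event, ...args) {
        for (const fn of this.listeners.get(event) ?? []) {
            await fn(...args); // every async handler completes before emit() resolves
        }
    }
}

const eventSource = new AsyncEmitter();
eventSource.on('GENERATE_BEFORE_COMBINE_PROMPTS', async (data) => {
    await new Promise(resolve => setTimeout(resolve, 10)); // e.g. an extension edits the prompt
    data.prompts.push('injected by extension');
});

const data = { prompts: ['system', 'history'] };
await eventSource.emit('GENERATE_BEFORE_COMBINE_PROMPTS', data);
console.log(data.prompts.join('\n')); // handlers have finished before the prompts are combined
```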
Cohee
7aa9b857c2
Merge pull request #3423 from SillyTavern/wi-dry-run-delays
WI dry run checks delay + logging
2025-02-04 17:44:28 +02:00
Cohee
327f4b1074 ditto 2025-02-04 10:56:42 +02:00
Cohee
4a58948b27 fix: correct typo in log message 2025-02-04 10:54:27 +02:00
Cohee
1f206c3a36
Merge pull request #3425 from Tosd0/staging
Optimize zh-CN Translations
2025-02-04 10:53:43 +02:00
Cohee
c9144cd824
Merge pull request #3424 from SillyTavern/oai-reasoning-effort
OpenAI "Reasoning Effort" for o1/o3-mini + oai models restructured
2025-02-04 10:52:30 +02:00
Cohee
dd2154c19b Refactor reasoning effort handling for OpenAI models in chat completions 2025-02-04 10:50:58 +02:00
Cohee
dade4eafa5 Add reasoning_effort to Custom.
Update inline image quality style
2025-02-04 10:45:51 +02:00
Cohee
4f63b471d1 Save reasoning effort to preset 2025-02-04 10:28:04 +02:00
Cohee
16bb9b3679 Update models list
1. Moderation can't be prompted from /chat/completions
2. Move turbo base model to the top of the group
2025-02-04 10:23:25 +02:00
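A tiny sketch of point 1 above: moderation models cannot be prompted via /chat/completions, so they can be filtered out of the selectable model list. The filter predicate is an illustrative assumption:

```js
// Hypothetical sketch: drop moderation models from the Chat Completions model picker.
function filterChatModels(models) {
    return models.filter(model => !model.id.includes('moderation'));
}

console.log(filterChatModels([
    { id: 'gpt-4o' },
    { id: 'omni-moderation-latest' },
]));
```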
Cohee
6ececb2ceb Fix text display on stream abort 2025-02-04 10:16:26 +02:00
Sevenyine
6edbb23903 update 2025-02-04 12:34:33 +08:00
Sevenyine
50baaaae81 Optimize Translations 2025-02-04 11:36:03 +08:00
Wolfsblvt
da531f12c2 Fix reasoning error on empty chat 2025-02-04 00:20:17 +01:00
Wolfsblvt
ba0e852f20 Add o1 + update/fill OpenAI model list 2025-02-04 00:11:56 +01:00
Wolfsblvt
d0ebac37c1 Make the backend actually send reasoning_effort 2025-02-03 23:58:01 +01:00
Wolfsblvt
3f9af45493 Add reasoning_effort for OpenAI models 2025-02-03 23:29:53 +01:00
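reasoning_effort is a Chat Completions parameter accepted by the reasoning models named in the PR title (o1 and o3-mini). A sketch of attaching it only for those models; the model list and helper name are illustrative, not the project's actual code:

```js
// Sketch: send `reasoning_effort` only for models that accept it.
const EFFORT_MODELS = ['o1', 'o1-2024-12-17', 'o3-mini', 'o3-mini-2025-01-31']; // illustrative list

function applyReasoningEffort(body, model, effort) {
    if (EFFORT_MODELS.includes(model) && effort) {
        body.reasoning_effort = effort; // 'low' | 'medium' | 'high'
    }
    return body;
}

console.log(applyReasoningEffort({ model: 'o3-mini', messages: [] }, 'o3-mini', 'high'));
```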
Wolfsblvt
fd3c427f07 WI dry run checks delay
- WI dry run now also checks for delays
- Better visibility of what a dry run is in the logs
2025-02-03 22:12:35 +01:00
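A hypothetical sketch of the behaviour described above: even during a dry run, a World Info entry whose delay has not elapsed is reported as suppressed, and the log line makes the dry run visible. The field names (delay, uid) and log format are illustrative assumptions, not the actual world-info code:

```js
// Sketch: check a World Info entry's delay and label dry runs in the logs.
function checkWorldInfoEntry(entry, chatLength, isDryRun) {
    const logPrefix = isDryRun ? '[WI dry run]' : '[WI]';

    if (entry.delay && chatLength < entry.delay) {
        console.debug(`${logPrefix} Entry ${entry.uid} suppressed by delay (${entry.delay})`);
        return false;
    }

    console.debug(`${logPrefix} Entry ${entry.uid} passes the delay check`);
    return true;
}

checkWorldInfoEntry({ uid: 42, delay: 10 }, 3, true); // suppressed, logged as a dry run
```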
Cohee
524ecf8acd
Merge pull request #3418 from pcpthm/fix-reasoning-duplicated
Fix reasoning being duplicated when a continued generation is interrupted
2025-02-03 16:44:08 +02:00
pcpthm
c4be4d7d61 Fix reasoning duplicated when continue is interrupted 2025-02-03 21:07:49 +09:00
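A hedged illustration of the bug class fixed above: when an interrupted "continue" is merged back into the message, reasoning that was already saved must not be attached a second time. The data shape and guard are assumptions, not the actual fix:

```js
// Hypothetical sketch: merge a continuation without re-attaching existing reasoning.
function mergeContinuation(message, newReasoning, newText) {
    // Only attach reasoning if the message doesn't already carry it.
    if (newReasoning && !message.extra.reasoning) {
        message.extra.reasoning = newReasoning;
    }
    message.mes += newText;
    return message;
}

const message = { mes: 'First part of the reply', extra: {} };
mergeContinuation(message, 'chain of thought', ' ...continued text');
mergeContinuation(message, 'chain of thought', ' ...retried continue'); // reasoning is not duplicated
console.log(message);
```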
Cohee
89a16f2030
Merge pull request #3415 from dvejmz/bugfix/fix-o3-mini-selector-labels 2025-02-03 00:47:23 +02:00
David Jimenez
1dfc09e4d2
fix: add missing o3-mini labels to model selector 2025-02-02 21:38:44 +00:00
Cohee
bfd57b66ef [wip] OpenRouter: Add model providers endpoint 2025-02-02 23:30:44 +02:00
Cohee
71e1fc91f1 select2: Add disabled option styling 2025-02-02 23:30:11 +02:00
Cohee
14703846a7 OpenAI: add options for o3-mini 2025-02-02 22:21:21 +02:00
Cohee
5494e89fdb
Merge pull request #3413 from SillyTavern/thinking-is-stylish
Thinking is stylish - if you are not cool, I don't know how to help you
2025-02-02 21:46:23 +02:00
Cohee
4ca55d3b9b Revert edit buttons units 2025-02-02 20:55:03 +02:00