Wolfsblvt
c8e05a34d6
Add gemini pro to hidden thinking models
2025-02-08 01:48:14 +01:00
Wolfsblvt
d94ac48b65
Add thinking time for hidden reasoning models
...
- Streamline reasoning UI update functionality
- Add helper function to identify hidden reasoning models
- Fix/update reasoning time calculation to actually use the generation start time
- Fix reasoning UI update on swipe
- Add CSS class for hidden reasoning blocks (so users can hide them)
2025-02-08 00:45:33 +01:00
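For context, a minimal sketch of what the helper and timing calculation described in the commit above could look like. The model list, function names, and timestamp fields here are illustrative assumptions, not the actual reasoning.js code.

```js
// Hypothetical sketch of a "hidden reasoning" helper and thinking-time calculation.
// Model names, function names, and message fields are assumptions for illustration.

// Models that reason internally but never expose their reasoning tokens.
const HIDDEN_REASONING_MODELS = ['o1', 'o1-mini', 'o3-mini', 'gemini-2.0-flash-thinking-exp'];

/**
 * Checks whether a model hides its reasoning from the response.
 * @param {string} model Model identifier
 * @returns {boolean} True if the model reasons without exposing the tokens
 */
function isHiddenReasoningModel(model) {
    return HIDDEN_REASONING_MODELS.some(m => model.startsWith(m));
}

/**
 * Computes how long the model "thought", based on the generation start time.
 * @param {number} genStartedAt Timestamp (ms) when generation started
 * @param {number} firstTokenAt Timestamp (ms) when the first visible token arrived
 * @returns {number} Thinking time in seconds
 */
function getThinkingSeconds(genStartedAt, firstTokenAt) {
    return Math.max(0, Math.round((firstTokenAt - genStartedAt) / 1000));
}
```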
Cohee
5886bb6b3a
Merge pull request #3442 from SillyTavern/more-reasoning-ui
...
Smaller improvements and fixes to the reasoning UI
2025-02-07 23:55:18 +02:00
Cohee
8387431534
Revert "Unify reasoning request effect on parsing"
...
This reverts commit ca7d2aeec3c5feb5cf9cc9da0a62159aed87d323.
2025-02-07 23:46:38 +02:00
Cohee
754a14ff3e
Add types for dompurify
2025-02-07 23:45:57 +02:00
Cohee
ca7d2aeec3
Unify reasoning request effect on parsing
2025-02-07 23:35:07 +02:00
Cohee
28ba262087
Reduce "few seconds" threshold to 3
2025-02-07 23:33:10 +02:00
Wolfsblvt
23d2c40e05
Remove not needed reasoning ui event handler
2025-02-07 22:21:01 +01:00
Wolfsblvt
4094887624
Add jsdoc
2025-02-07 21:46:41 +01:00
Cohee
4492828b9f
Merge pull request #3441 from SillyTavern/gpt-o1-o3-streaming
...
Enable streaming for gpt-o1 and gpt-o3
2025-02-07 22:05:29 +02:00
Cohee
a2aef5ea4a
Use array.includes
2025-02-07 22:04:21 +02:00
Wolfsblvt
cc401b2c9d
Welp, seems like o1 main still has no streaming
2025-02-07 20:12:10 +01:00
Wolfsblvt
95a31cdd98
Remove logit bias from o1 and o3
...
- They no longer support it
2025-02-07 19:51:21 +01:00
Wolfsblvt
d1ec9eb8ab
Enabled streaming for o1 and o3
...
- They support it now
2025-02-07 19:50:01 +01:00
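A rough sketch of the capability checks the commits above describe: streaming toggled per model and logit bias stripped, using array.includes as in the refactor commit. The model ID lists and helper shape are assumptions, not the project's actual code.

```js
// Illustrative capability flags for the o1/o3 family; model IDs and helper shape are assumptions.
const NO_LOGIT_BIAS_MODELS = ['o1', 'o1-mini', 'o1-preview', 'o3-mini'];
const NO_STREAMING_MODELS = ['o1']; // per the follow-up commit, the main o1 model still can't stream

function prepareRequestBody(model, body, wantStream) {
    if (NO_LOGIT_BIAS_MODELS.includes(model)) {
        delete body.logit_bias; // the API rejects logit bias for these models
    }
    body.stream = wantStream && !NO_STREAMING_MODELS.includes(model);
    return body;
}
```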
Cohee
0bcf8b5d21
Merge pull request #3439 from sirius422/fix-old-gemini-models
...
fix: adjust safetySettings handling for legacy Gemini models
2025-02-07 13:54:59 +02:00
sirius422
f32938981c
fix: adjust safetySettings handling for legacy and experimental Gemini models
...
- gemini-1.5-flash-8b-exp-0827
- gemini-1.5-flash-8b-exp-0924
- gemini-pro
- gemini-1.0-pro
2025-02-07 16:35:44 +08:00
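A hedged sketch of the safetySettings handling described in this fix and in the safety-settings update further down the log: threshold OFF for most models, BLOCK_NONE for the HARM_CATEGORY_CIVIC_INTEGRITY category, and a fallback for the legacy/experimental models listed above. Category names follow Google's public API; the exact per-model rules here are assumptions.

```js
// Hypothetical construction of Gemini safetySettings; the per-model rules are assumptions.
const LEGACY_SAFETY_MODELS = [
    'gemini-1.5-flash-8b-exp-0827',
    'gemini-1.5-flash-8b-exp-0924',
    'gemini-pro',
    'gemini-1.0-pro',
];

const SAFETY_CATEGORIES = [
    'HARM_CATEGORY_HARASSMENT',
    'HARM_CATEGORY_HATE_SPEECH',
    'HARM_CATEGORY_SEXUALLY_EXPLICIT',
    'HARM_CATEGORY_DANGEROUS_CONTENT',
    'HARM_CATEGORY_CIVIC_INTEGRITY',
];

function getSafetySettings(model) {
    // Assumed: legacy/experimental models don't accept the OFF threshold, so fall back to BLOCK_NONE.
    const useOff = !LEGACY_SAFETY_MODELS.includes(model);
    return SAFETY_CATEGORIES.map(category => ({
        category,
        threshold: (useOff && category !== 'HARM_CATEGORY_CIVIC_INTEGRITY') ? 'OFF' : 'BLOCK_NONE',
    }));
}
```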
Wolfsblvt
25d1db3852
Fix issues with reasoning add/delete
2025-02-06 22:39:29 +01:00
Wolfsblvt
3ec3d71c5f
Reduce rounded corners on thinking blocks to make people happy
2025-02-06 22:17:21 +01:00
Wolfsblvt
79b8229b1b
Add reasoning time seconds tooltip
2025-02-06 21:53:34 +01:00
Wolfsblvt
06303bb62f
Move reasoning stuff to reasoning.js
2025-02-06 21:50:33 +01:00
Cohee
1acee3d4c9
Gemini: Add sysprompt for gemini 2.0 pro/flash
2025-02-06 21:46:17 +02:00
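A minimal illustration of how a system prompt might be attached for the models named in the commit above, assuming Gemini's systemInstruction request field; the model whitelist and helper name are hypothetical.

```js
// Illustrative only: attach a system prompt via systemInstruction for models that accept it.
const SYSTEM_PROMPT_MODELS = ['gemini-2.0-pro', 'gemini-2.0-flash'];

function withSystemInstruction(model, request, systemPrompt) {
    if (systemPrompt && SYSTEM_PROMPT_MODELS.some(m => model.startsWith(m))) {
        request.systemInstruction = { parts: [{ text: systemPrompt }] };
    }
    return request;
}
```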
Wolfsblvt
b6d6727135
Merge pull request #3434 from sirius422/add-gemini-models
...
Add new gemini models and update the safety settings
2025-02-06 20:33:54 +01:00
Wolfsblvt
92a83ed01b
Merge branch 'staging' into pr/3434
2025-02-06 20:18:25 +01:00
Wolfsblvt
f85e464ffd
Merge pull request #3437 from SillyTavern/gemini-models
...
Add new gemini models
2025-02-05 23:32:39 +01:00
Wolfsblvt
852ec86e26
Add new gemini models
...
- Gemini 2.0 Flash
- Gemini 2.0 Flash Lite Preview
- Gemini 2.0 Pro Experimental
2025-02-05 23:30:11 +01:00
Wolfsblvt
72ae3aa33d
Just some CSS classes for WI entries
2025-02-05 23:15:30 +01:00
sirius422
d35bd3b073
feat: update Gemini safety settings
...
- Set threshold to OFF for most models, except for the HARM_CATEGORY_CIVIC_INTEGRITY category.
- Handle specific cases for a few models.
2025-02-06 04:59:30 +08:00
sirius422
b074f9fa89
feat: update Gemini models
...
- Add new Gemini models (2025/02/05)
2025-02-06 04:50:54 +08:00
Cohee
ed46b96ba2
Fix reasoning-get with no argument
2025-02-04 23:41:16 +02:00
Cohee
acf71fd702
Merge pull request #3431 from SillyTavern/ipv6_auto
...
Better IPv6 support
2025-02-04 23:23:02 +02:00
Cohee
5f564343ec
Remove console log
2025-02-04 23:14:01 +02:00
Cohee
dfb062af41
Clean up blank lines
2025-02-04 22:52:07 +02:00
Cohee
d9bb5e6b1f
Revert old default values
2025-02-04 22:46:24 +02:00
Cohee
363d8a4121
Merge branch 'staging' into ipv6_auto
2025-02-04 22:45:46 +02:00
Cohee
d9fc76f336
Merge pull request #3430 from qvink/generate_before_combine_prompts_await
...
await GENERATE_BEFORE_COMBINE_PROMPTS
2025-02-04 21:45:59 +02:00
qvink
e2c3af2114
Change the GENERATE_BEFORE_COMBINE_PROMPTS event to an awaited emit() instead of emitAndWait().
2025-02-04 11:22:23 -07:00
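A simplified sketch of what this change means in practice: awaiting emit() lets asynchronous event handlers finish before the prompts are combined. The emitter below is a stand-in for illustration, not SillyTavern's actual eventSource.

```js
// Simplified stand-in for an event bus whose emit() awaits async listeners in order.
class AsyncEventEmitter {
    constructor() { this.listeners = new Map(); }
    on(event, fn) {
        const list = this.listeners.get(event) ?? [];
        list.push(fn);
        this.listeners.set(event, list);
    }
    async emit(event, ...args) {
        for (const fn of this.listeners.get(event) ?? []) {
            await fn(...args); // async handlers finish before generation continues
        }
    }
}

// Usage: an extension mutates the prompt list asynchronously before it is combined.
const eventSource = new AsyncEventEmitter();
eventSource.on('GENERATE_BEFORE_COMBINE_PROMPTS', async (data) => {
    data.prompts.push(await Promise.resolve('extra context'));
});
(async () => {
    const data = { prompts: [] };
    await eventSource.emit('GENERATE_BEFORE_COMBINE_PROMPTS', data);
    console.log(data.prompts); // ['extra context']
})();
```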
Cohee
7aa9b857c2
Merge pull request #3423 from SillyTavern/wi-dry-run-delays
...
WI dry run checks delay + logging
2025-02-04 17:44:28 +02:00
Cohee
327f4b1074
ditto
2025-02-04 10:56:42 +02:00
Cohee
4a58948b27
fix: correct typo in log message
2025-02-04 10:54:27 +02:00
Cohee
1f206c3a36
Merge pull request #3425 from Tosd0/staging
...
Optimize zh-CN Translations
2025-02-04 10:53:43 +02:00
Cohee
c9144cd824
Merge pull request #3424 from SillyTavern/oai-reasoning-effort
...
OpenAI "Reasoning Effort" for o1/o3-mini + oai models restructured
2025-02-04 10:52:30 +02:00
Cohee
dd2154c19b
Refactor reasoning effort handling for OpenAI models in chat completions
2025-02-04 10:50:58 +02:00
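A hedged sketch of how a reasoning effort setting might be attached to a chat completion request; reasoning_effort ('low'/'medium'/'high') is a real OpenAI chat-completions parameter, but the model list and helper here are assumptions.

```js
// Hypothetical helper: apply the user's reasoning effort setting for models that accept it.
const REASONING_EFFORT_MODELS = ['o1', 'o3-mini']; // exact IDs; snapshot handling omitted

function applyReasoningEffort(model, body, effort /* 'low' | 'medium' | 'high' */) {
    if (REASONING_EFFORT_MODELS.includes(model)) {
        body.reasoning_effort = effort;
    }
    return body;
}
```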
Cohee
dade4eafa5
Add reasoning_effort to Custom.
...
Update inline image quality style
2025-02-04 10:45:51 +02:00
Cohee
4f63b471d1
Save reasoning effort to preset
2025-02-04 10:28:04 +02:00
Cohee
16bb9b3679
Update models list
...
1. Moderation can't be prompted from /chat/completions
2. Move turbo base model to the top of the group
2025-02-04 10:23:25 +02:00
Cohee
6ececb2ceb
Fix text display on stream abort
2025-02-04 10:16:26 +02:00
Sevenyine
6edbb23903
update
2025-02-04 12:34:33 +08:00
Sevenyine
50baaaae81
Optimize Translations
2025-02-04 11:36:03 +08:00
Wolfsblvt
da531f12c2
Fix reasoning error on empty chat
2025-02-04 00:20:17 +01:00
Wolfsblvt
ba0e852f20
Add o1 + update/fill OpenAI model list
2025-02-04 00:11:56 +01:00