Cohee
0551c8023e
Move context/instruct templates to default context index
2024-03-28 22:54:37 +02:00
Cohee
4f58e04ef3
Move default instruct/context templates out of public
2024-03-28 22:40:43 +02:00
Cohee
e25c419491
Update the Default Chat Completion preset
2024-03-24 17:09:28 +02:00
Cohee
965bb54f7d
Option to add names to completion contents
2024-03-19 21:53:40 +02:00
Cohee
bd223486de
Include additional headers for all supported Text Completion types.
2024-03-14 00:48:08 +02:00
Cohee
e24fbfdc1d
Update default OAI sampler parameters
2024-03-13 02:25:20 +02:00
Cohee
00a4a12d7d
Remove "Exclude Assistant suffix" option
2024-03-05 20:41:53 +02:00
Deciare
d554edc023
Support underlined text formatting.
- Enable the `underline` option for Showdown.
- Implement option for underlined text colour.
- Update stylesheet.
2024-03-01 00:35:27 -05:00
Deciare
9eba076ae4
Sampler order for llama.cpp server backend
2024-02-23 23:01:04 -05:00
Cohee
0c1cf9ff2e
Send sampler priority as array
2024-02-21 00:53:54 +02:00
kalomaze
2065f95edc
Sampler priority support
2024-02-10 14:57:41 -06:00
Cohee
f669b959c3
Rename RP instruct
2024-02-02 17:37:38 +02:00
Cohee
2f3dca2348
Add endpoint for transformers.js TTS
2024-02-02 01:51:02 +02:00
Cohee
4b845dd442
Add backend for transformers.js whisper
2024-02-02 00:36:40 +02:00
h-a-s-k
9354697753
Actually call them example chats
2024-01-13 13:06:51 -03:00
Cohee
24cd072e69
Update default vector storage model
2023-12-31 04:00:27 +02:00
LenAnderson
f862ffafd2
Add option in config.yaml to use PNG for avatar thumbs
2023-12-22 14:23:50 +00:00
Cohee
c212a71425
Fix ignore list of preset manager
2023-12-20 15:51:00 +02:00
Cohee
67dd52c21b
#1309 Ollama text completion backend
2023-12-19 16:38:11 +02:00
Cohee
b0d9f14534
Re-add Together as a text completion source
2023-12-17 23:38:03 +02:00
Cohee
c7c1513e91
Add proxy support for multimodal captions. Add caption pre-prompt
2023-12-17 19:41:20 +02:00
Cohee
16795dd5cc
Add server plugin loader
2023-12-16 22:21:40 +02:00
valadaptive
0ee19d2ede
Set background client-side
2023-12-15 05:45:21 -05:00
valadaptive
5569a63595
Remove legacy_streaming setting
This was a workaround for older versions of Slaude that implemented SSE
improperly. This was fixed in Slaude 7 months ago, so the workaround can
be removed.
2023-12-07 18:00:36 -05:00
Cohee
bca43b11fa
Enable match whole words by default
2023-12-06 16:53:48 +02:00
Cohee
45df576f1c
Re-add default presets for content manager
2023-12-03 15:07:21 +02:00
Cohee
478330542d
Default to non-listen for new installs
2023-12-03 00:54:28 +02:00
Cohee
04ef9fba54
Disable context stop strings on pull but enable for new installs
2023-12-02 02:19:32 +02:00
Cohee
726bb2e041
#1405 Add formality config for DeepL
2023-12-01 14:12:56 +02:00
Cohee
1ce009b84e
#1415 Add default setting recommendation for config.yaml basicAuthUser
2023-11-29 14:05:19 +02:00
Cohee
e541c2b186
#1412 Add randomized user IDs to OpenAI
2023-11-29 00:11:10 +02:00
Cohee
a7024a1d34
Migrate to config.yaml
2023-11-25 23:45:33 +02:00
Cohee
b24d4f2340
Add opt-in CORS bypass endpoint
2023-11-25 21:56:57 +02:00
Cohee
52d9855916
Code lint
2023-11-21 02:00:50 +02:00
LenAnderson
46cc04c798
Add default ComfyUI workflow
2023-11-20 15:59:38 +00:00
Cohee
28bb5db04f
Add new settings to default/settings.json.
2023-11-11 16:21:20 +02:00
Cohee
2c7b954a8d
#1328 New API schema for ooba / mancer / aphrodite
2023-11-08 00:17:13 +02:00
Cohee
9396ca585c
#1287 Add user.css file
2023-10-28 12:48:42 +03:00
Cohee
52ecad1cdf
Rework single-line mode, add section for Context Formatting settings
2023-10-27 21:02:03 +03:00
Cohee
5dbe2ebf29
Add chat file backups
2023-10-24 22:09:55 +03:00
Cohee
c4e6b565a5
Add SD prompt expansion
2023-10-20 15:03:26 +03:00
SDS
5848ec498b
Assorted fixes and improvements (#1208)
* Kobold Presets fixed
* Help texts fixed
* Scale API for connectAPISlash
* Quick Reply checkboxes fixed
* New Instruct Mode Presets
* More date/time macros
* ChatML context template and instruct prompt format
* Mistral context template and instruct prompt format
* Removed use_default_badwordsids from kobold presets
* Renamed ChatML to Mistral-OpenOrca (ChatML)
* Renamed to Mistral-OpenOrca (removed ChatML from the name)
* Removed single_line from kobold presets
* Removed obsolete use_stop_sequence setting
* Ban EOS Token off by default
* Split AI Response Configuration into global and preset-specific settings
* Resolve conflicts
* Fix title
* Add translations for new help texts
* Fix i18n.json whitespace
* Make Mistral-OpenOrca system prompt more generic
* Renamed "Mistral-OpenOrca" to "ChatML" again
* More (UI) fixes and improvements
* Sendas hint fixed
---------
Co-authored-by: Cohee <18619528+Cohee1207@users.noreply.github.com>
2023-10-07 19:25:36 +03:00
Cohee
ae4a9a7b14
Remove legacy chat lazy load
2023-09-21 22:07:56 +03:00
Cohee
2c84c93f3d
Add thumbnails quality config
2023-09-16 21:53:30 +03:00
Cohee
d34f7d3e1a
Replace multigen with auto-continue
2023-09-15 21:34:41 +03:00
Cohee
6ad786f348
Add alternative local vectors source.
5x speed boost!
2023-09-14 23:40:13 +03:00
Cohee
0cc048cb64
Refactor transformers.js usage
2023-09-14 23:12:33 +03:00
Cohee
dc4a6e862b
Add local caption pipeline to UI plugin
2023-09-12 00:15:21 +03:00
Cohee
c76c76410c
Add ability to override local classification model
2023-09-11 01:25:22 +03:00
Cohee
322511caa9
Remove legacy Pygmalion formatting, part 2
2023-09-06 14:19:29 +03:00