Commit Graph

36 Commits

Author SHA1 Message Date
valadaptive
5569a63595 Remove legacy_streaming setting
This was a workaround for older versions of Slaude that implemented SSE
improperly. This was fixed in Slaude 7 months ago, so the workaround can
be removed.
2023-12-07 18:00:36 -05:00
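For context on the non-legacy path this commit keeps: a spec-compliant SSE stream delivers each event as a "data: <payload>" line, with events separated by blank lines. The sketch below is an illustrative parser only, not the project's actual streaming code, and the [DONE] terminator is an OpenAI-style convention assumed here.

```typescript
// Minimal sketch of spec-compliant SSE parsing (illustrative only; not the
// project's actual streaming code).
function parseSseChunk(chunk: string): string[] {
    const payloads: string[] = [];
    for (const line of chunk.split('\n')) {
        const trimmed = line.trim();
        if (!trimmed.startsWith('data:')) continue; // skip comments and other fields
        const data = trimmed.slice('data:'.length).trim();
        if (data === '[DONE]') break;               // assumed end-of-stream sentinel
        payloads.push(data);
    }
    return payloads;
}

// Example: two events followed by the terminator.
console.log(parseSseChunk('data: {"token":"Hel"}\n\ndata: {"token":"lo"}\n\ndata: [DONE]\n'));
```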
Cohee
bca43b11fa Enable match whole words by default 2023-12-06 16:53:48 +02:00
Cohee
04ef9fba54 Disable context stop strings on pull but enable for new installs 2023-12-02 02:19:32 +02:00
Cohee
28bb5db04f Add new settings to default/settings.json. 2023-11-11 16:21:20 +02:00
Cohee
2c7b954a8d #1328 New API schema for ooba / mancer / aphrodite 2023-11-08 00:17:13 +02:00
Cohee
52ecad1cdf Rework single-line mode, add section for Context Formatting settings 2023-10-27 21:02:03 +03:00
SDS
5848ec498b
Assorted fixes and improvements (#1208)
* Kobold Presets fixed

* Help texts fixed

* Scale API for connectAPISlash

* Quick Reply checkboxes fixed

* New Instruct Mode Presets

* More date/time macros

* ChatML context template and instruct prompt format (see the sketch after this entry)

* Mistral context template and instruct prompt format

* Removed use_default_badwordsids from kobold presets

* Renamed ChatML to Mistral-OpenOrca (ChatML)

* Renamed to Mistral-OpenOrca (removed ChatML from the name)

* Removed single_line from kobold presets

* Removed obsolete use_stop_sequence setting

* Ban EOS Token off by default

* Split AI Response Configuration into global and preset-specific settings

* Resolve conflicts

* Fix title

* Add translations for new help texts

* Fix i18n.json whitespace

* Make Mistral-OpenOrca system prompt more generic

* Renamed "Mistral-OpenOrca" to "ChatML" again

* More (UI) fixes and improvements

* Sendas hint fixed

---------

Co-authored-by: Cohee <18619528+Cohee1207@users.noreply.github.com>
2023-10-07 19:25:36 +03:00
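Regarding the ChatML template added in the commit above: ChatML wraps each turn in <|im_start|>role / <|im_end|> markers and leaves an open assistant turn for the model to complete. The following is a minimal illustrative sketch of assembling such a prompt, not the preset's actual template files.

```typescript
// Minimal sketch of ChatML-style prompt assembly (illustrative; the real
// templates live in the repository's context/instruct preset files).
type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string };

function toChatML(messages: ChatMessage[]): string {
    const turns = messages.map(m => `<|im_start|>${m.role}\n${m.content}<|im_end|>`);
    // Open an assistant turn for the model to complete.
    return turns.join('\n') + '\n<|im_start|>assistant\n';
}

console.log(toChatML([
    { role: 'system', content: 'You are a helpful writing assistant.' },
    { role: 'user', content: 'Continue the story.' },
]));
```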
Cohee
ae4a9a7b14 Remove legacy chat lazy load 2023-09-21 22:07:56 +03:00
Cohee
d34f7d3e1a Replace multigen with auto-continue 2023-09-15 21:34:41 +03:00
Cohee
322511caa9 Remove legacy Pygmalion formatting, part 2 2023-09-06 14:19:29 +03:00
Cohee
5ef79bd64d Remove NSFW avoidance prompt from Prompt Manager 2023-09-05 18:14:56 +03:00
Cohee
44f88c61ff Add simplified UI switch 2023-08-29 18:04:10 +03:00
Stefan Daniel Schwarz
1d7165c047 context template preset manager 2023-08-26 12:09:47 +02:00
Stefan Daniel Schwarz
3e0ce12b23 first_output_sequence and system_sequence_prefix 2023-08-25 22:34:08 +02:00
Cohee
f5fd15ffd2 #976 Return "Continue on send". Allow continuing the first chat message. Add debug function for i18n. 2023-08-24 15:13:04 +03:00
Cohee
e2507e8840 #976 Add "quick continue" button. Remove "send to continue" option. 2023-08-24 01:37:44 +03:00
Cohee
1ce848c1c3 Move before / after char WI to story strings 2023-08-24 00:26:47 +03:00
Stefan Daniel Schwarz
dd7b21c63d renamed roleplay instruct preset 2023-08-23 22:23:51 +02:00
Stefan Daniel Schwarz
252be20c16 Return of the Roleplay Context 2023-08-22 23:40:47 +02:00
Cohee
d6b06d5828
Merge pull request #994 from StefanDanielSchwarz/roleplay-preset-hotfix
Roleplay Preset Hotfix
2023-08-22 02:41:32 +03:00
Cohee
57b126bfbf Save chat completions settings to an object. Update numeric setting types 2023-08-22 00:35:46 +03:00
Cohee
75db476f76 Update default settings 2023-08-22 00:24:29 +03:00
Stefan Daniel Schwarz
c6ce06b339 Put "### Input:" back into Roleplay system prompt 2023-08-21 23:24:24 +02:00
Stefan Daniel Schwarz
9df4c51b07 Instruct Mode improvements 2023-08-21 22:32:58 +02:00
Stefan Daniel Schwarz
7b8d10d25e Added Include Names and Forced Names to all instruct presets 2023-08-20 16:10:22 +02:00
Cohee
80092b3170 #790 Simplify local prompt formatting. Use handlebars to render story string. 2023-08-17 22:47:34 +03:00
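A hedged illustration of the Handlebars approach referenced in 80092b3170: the template and field names below are assumptions for demonstration, not the project's actual story string.

```typescript
import Handlebars from 'handlebars';

// Minimal sketch of Handlebars-based story-string rendering; field names are
// illustrative, not the project's exact template variables.
const storyStringTemplate =
    '{{#if system}}{{system}}\n{{/if}}' +
    "{{char}}'s description: {{description}}\n" +
    '{{#if personality}}Personality: {{personality}}\n{{/if}}' +
    '{{#if scenario}}Scenario: {{scenario}}\n{{/if}}';

const renderStoryString = Handlebars.compile(storyStringTemplate, { noEscape: true });

console.log(renderStoryString({
    system: 'Write the next reply in this roleplay.',
    char: 'Alice',
    description: 'A curious explorer.',
    personality: 'inquisitive, cheerful',
    scenario: 'Alice wakes up in an unfamiliar forest.',
}));
```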
Cohee
81ed4d8431 Reorderable samplers for Novel 2023-08-16 20:34:47 +03:00
Mike Weldon
feb523bd01 NovelAI Kayra 1.1 update
* Updated some presets and added Cosmic Cube
* Change defaults for NovelAI to select Clio on cold start
* Automatically change the preset to an appropriate default whenever you change the model
* Removed deprecated Top G sampler
2023-08-15 18:52:29 -07:00
Mike Weldon
41ec7e5600 Many NovelAI fixes from dev guidance
* Remove AI Module "Autoselect" and make the auto-instruct work for all modules. This is how NAI is supposed to work.
* Log the response from the API.
* Move the AI Module setting up to the top of the settings window since it isn't part of the preset.
* Refactor phrase_rep_pen to use the actual API strings.
* Clamp the maximum token length to 150 before we call the API (see the sketch after this entry).
* Clamp the minimum token length setting in the UX to the 1–150 range.
* Fix bug where the preamble was not initialized on cold start.
* Get rid of extra newline before dinkus.
* Make always_force_name2 default true.
2023-08-14 19:35:21 -07:00
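A minimal sketch of the clamping described in the commit above; the 150-token limit comes from the commit message, while the function and variable names are illustrative, not the project's real code.

```typescript
// Keep a token-length value within the 1..150 range described above.
const NAI_TOKEN_LIMIT = 150;

function clampTokens(value: number): number {
    return Math.min(Math.max(Math.trunc(value), 1), NAI_TOKEN_LIMIT);
}

const apiMaxLength = clampTokens(500); // capped to 150 before the API call
const uiMinLength = clampTokens(0);    // UX minimum-length control kept within 1..150
console.log(apiMaxLength, uiMinLength); // 150 1
```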
Stefan Daniel Schwarz
b407fe2388 custom_stopping_strings_macro toggleable option 2023-08-04 16:53:49 +02:00
Cohee
84283bc2b4 Add "Best match" tokenizer option 2023-08-04 14:17:05 +03:00
Cohee
99af6ed472 Update default settings 2023-08-01 11:57:25 +03:00
Cohee
053dbbd25c Onboarding experience and new default user avatar 2023-07-21 16:42:18 +03:00
Cohee
cbfd48b2e5 Update default settings 2023-07-21 15:46:58 +03:00
Cohee
b05d501f82 Add default content by contest winners 2023-07-21 15:28:32 +03:00
Cohee
edd41989fd Initial commit 2023-07-20 20:32:15 +03:00