Commit Graph

168 Commits

Author SHA1 Message Date
Cohee ab06aa4bf5 Add support for outgoing request proxying
Closes #2824
2024-09-11 22:36:50 +03:00
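For context, the commit above lets the server send its own outbound API requests through a proxy. A minimal TypeScript sketch of the general technique, using node-fetch with https-proxy-agent (the proxy URL, environment variable, and target URL are placeholders, not the project's actual config):

```typescript
// Sketch: route an outbound request through an HTTP(S) proxy in Node.
// Proxy URL and config shape are illustrative only.
import fetch from 'node-fetch';
import { HttpsProxyAgent } from 'https-proxy-agent';

const PROXY_URL = process.env.OUTGOING_PROXY ?? 'http://127.0.0.1:8080'; // hypothetical setting
const agent = new HttpsProxyAgent(PROXY_URL);

async function proxiedGet(url: string): Promise<string> {
    // node-fetch accepts a custom agent, so the request goes via the proxy.
    const response = await fetch(url, { agent });
    if (!response.ok) {
        throw new Error(`Upstream returned ${response.status}`);
    }
    return response.text();
}

proxiedGet('https://example.com/api/status').then(console.log).catch(console.error);
```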
Cohee 81251b073a Implement downloadable tokenizers
Closes #2574, #2754
2024-09-06 16:28:34 +00:00
Cohee a0dc16d84c Extend default context size to 8k 2024-09-02 02:26:17 +03:00
Succubyss fa806707ef default: parser flags → true 2024-08-26 10:07:59 -05:00
Cohee 7d4b3e0800 Use J1.5 Large by default 2024-08-26 12:11:29 +03:00
Cohee 5fc16a2474 New AI21 Jamba + tokenizer 2024-08-26 12:07:36 +03:00
Cohee 5d48e081a6 Upscale default avatars 2024-08-25 22:04:08 +03:00
Cohee 60c22bf803 Add config value for extensions auto-update 2024-08-25 19:48:01 +03:00
Cohee eb015e4a9a Update default themes and settings 2024-08-25 16:43:50 +03:00
Cohee 1e3a97a3aa Fix ollama keepAlive config hint. 2024-08-24 10:15:48 +03:00
Cohee 7322dd1954 Add optional Claude system prompt cache. 2024-08-15 21:25:08 +03:00
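The Claude system prompt cache mentioned above corresponds to Anthropic's prompt caching feature. A rough sketch of what a cacheable system prompt looks like in the Messages API (model name, prompt text, and the function name are placeholders; the beta header was required at the time):

```typescript
// Sketch of Anthropic prompt caching: the system block carries cache_control
// so later requests can reuse the cached prefix.
async function sendWithCachedSystemPrompt(apiKey: string): Promise<unknown> {
    const response = await fetch('https://api.anthropic.com/v1/messages', {
        method: 'POST',
        headers: {
            'x-api-key': apiKey,
            'anthropic-version': '2023-06-01',
            'anthropic-beta': 'prompt-caching-2024-07-31', // beta flag required at the time
            'content-type': 'application/json',
        },
        body: JSON.stringify({
            model: 'claude-3-5-sonnet-20240620',
            max_tokens: 1024,
            system: [
                {
                    type: 'text',
                    text: 'You are a helpful roleplaying assistant. <long character card here>',
                    cache_control: { type: 'ephemeral' }, // marks this block as cacheable
                },
            ],
            messages: [{ role: 'user', content: 'Hello!' }],
        }),
    });
    return response.json();
}
```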
Cohee cb7185fa12 [chore] Fix grammar, add JSDocs 2024-08-15 20:29:17 +03:00
BPplays da5581e20e Support for IPv6 (#2593)
* started adding IPv6 support

* added error checking and changed messages to the user

* fixed an LSP-caused issue

* fixed a formatting error

* added error handling to HTTPS

* fixed formatting errors

* moved server startup into a separate function and added enable-IPv6 and enable-IPv4 options

* added error checking for disabling both IPv6 and IPv4, added an option to prefer IPv6 for DNS, and added those options to the default config

* fixed a dumb bug

* changed to settings named 'disable ipvX'

* fixed failed IPs still showing as listening

* fixed error handling

* changed the IP protocol config layout

* small const name changes

* fixed the missing error when the only available protocol fails, and changed the wording of some errors

* fixed error handling saying 'non-fatal error' on a protocol failure even when it is the only protocol enabled

* moved more logic into the listen error handler

* fixed ESLint issues

* added more info on when to prefer IPv6 for DNS

* in the config, changed one 'ipv6' to 'IPv6' for consistency

* changed an error message and redid how the server starts

* removed an unneeded log

* removed an unneeded log

* removed unneeded comments

* fixed errors

* fixed errors

* fixed errors

* changed the wording of IP-related error messages

* removed empty lines

* changed to .finally(startServer);

* removed some whitespace

* disabled IPv6 by default ╯︿╰ and changed some message wording

* added an auto mode for the autorun hostname, changed the formatting of the listening log, and added a go-to message with the autorun URL

* added an autorun port override

* removed a debug log

* changed formatting

* added CLI args for the IPv6 and autorun options

* moved CLI args around

* changed formatting

* changed colors for IPs

* added an avoidLocalhost CLI arg

* changed formatting

* changed to not print the protocol on listening

* added a config option to avoid localhost and changed the formatting of messages

* fixed the avoid-localhost config option

* Fix IPv4 color

---------

Co-authored-by: Cohee <18619528+Cohee1207@users.noreply.github.com>
2024-08-15 20:12:12 +03:00
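The PR above splits server startup into separate IPv4/IPv6 listeners with per-protocol error handling. A rough TypeScript sketch of that idea (not the actual implementation; the flags, port, and log wording are illustrative):

```typescript
// Sketch: listen on IPv4 and/or IPv6, and treat a failure as fatal only when
// the failing protocol is the only one enabled. Config names are illustrative.
import http from 'node:http';
import express from 'express';

const app = express();
const port = 8000;
const enableIPv4: boolean = true;
const enableIPv6: boolean = false; // the PR ended up disabling IPv6 by default

function listen(host: string, label: string, fatal: boolean): Promise<boolean> {
    return new Promise((resolve) => {
        const server = http.createServer(app);
        server.on('error', (err) => {
            console.error(`Failed to listen on ${label} (${host}): ${err.message}`);
            if (fatal) process.exit(1); // only protocol available, so give up
            resolve(false);
        });
        server.listen(port, host, () => {
            console.log(`Listening on ${label} ${host}:${port}`);
            resolve(true);
        });
    });
}

async function startServer(): Promise<void> {
    if (!enableIPv4 && !enableIPv6) {
        throw new Error('At least one of IPv4/IPv6 must be enabled');
    }
    if (enableIPv6) await listen('::', 'IPv6', !enableIPv4);
    if (enableIPv4) await listen('0.0.0.0', 'IPv4', !enableIPv6);
}

startServer().catch(console.error);
```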
Cohee f305ba7ce7 Configurable ollama keep_alive
Closes #1859
2024-08-11 17:32:31 +03:00
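The keep_alive value made configurable above is forwarded to Ollama's API, where it controls how long the model stays loaded after a request (a duration like '10m', 0 to unload immediately, or -1 to keep it loaded indefinitely). An illustrative sketch (base URL, model, and value are placeholders):

```typescript
// Sketch: pass keep_alive to Ollama's /api/generate endpoint.
async function generate(prompt: string): Promise<unknown> {
    const response = await fetch('http://127.0.0.1:11434/api/generate', {
        method: 'POST',
        headers: { 'content-type': 'application/json' },
        body: JSON.stringify({
            model: 'llama3',
            prompt,
            stream: false,
            keep_alive: '10m', // how long Ollama keeps the model in memory afterwards
        }),
    });
    return response.json();
}
```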
Cohee 32c48cf9fa Fix default value for OpenRouter Top A 2024-08-07 20:58:19 +03:00
Cohee 3a8614db94 Update models in default files 2024-08-01 00:53:45 +03:00
Cohee 5ad433c576 #2557 Put MistralAI prefix under a feature toggle 2024-07-27 19:57:40 +03:00
Cohee 5f0e74bd56 Rename PHI/aux UI fields 2024-07-21 14:29:13 +03:00
Cohee 230e6cd142 Update default textgen presets 2024-07-12 23:54:26 +03:00
Cohee 6167f50a89 Remove dead property from default settings 2024-07-12 23:52:55 +03:00
Cohee 02e65ff176 Configurable session expiration 2024-07-06 14:50:36 +03:00
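A generic sketch of configurable session expiration using express-session (not necessarily the middleware, secret handling, or config keys this project uses):

```typescript
// Sketch: session lifetime driven by a configurable timeout.
import express from 'express';
import session from 'express-session';

const app = express();
const sessionTimeoutMs = 24 * 60 * 60 * 1000; // e.g. read from config: 24 hours

app.use(session({
    secret: process.env.SESSION_SECRET ?? 'change-me', // placeholder secret source
    resave: false,
    saveUninitialized: false,
    cookie: {
        httpOnly: true,
        // maxAge in milliseconds; omit it for a cookie that lasts only the browser session
        maxAge: sessionTimeoutMs,
    },
}));
```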
Wolfsblvt ff23808d3b Add WI toggle to include/exclude names in scanning 2024-07-06 03:23:02 +02:00
Cohee 13630c896a Add instruct/context for Gemma 2 (#2477)
* Add instruct/context for Gemma 2

* Add Gemma 2 Roleplay variation

* Revert "Add Gemma 2 Roleplay variation"

This reverts commit d1473e18a1.
2024-07-06 01:54:36 +03:00
Cohee ea768661e8 Add theme contest winner, pt.2 2024-07-04 01:12:26 +03:00
Cohee ce18b33e73 Set the maximum number of backups via config 2024-07-01 01:17:56 +03:00
Cohee 902dfbcdcc Add theme contest winners, pt.1 2024-06-28 10:30:32 +00:00
DreamGenX c8eaa15f18 Add DreamGen Llama 3 templates (#2389) 2024-06-17 20:54:08 +03:00
Cohee 1ac2241d2c Lower main text intensity in Cappuccino theme 2024-06-15 13:30:03 +03:00
Cohee 432be09583 Merge pull request #2259 from Succubyss/staging
[Claude] Implements Assistant Impersonation Prefill
2024-05-17 11:15:37 +03:00
Succubyss c822b9e2da Implements Assistant Impersonation Prefill 2024-05-16 21:59:58 -05:00
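The prefill implemented above relies on a property of the Claude Messages API: a trailing partial assistant message is continued rather than answered. A small illustrative sketch (model and wording are placeholders, and {{user}} stands in for the usual macro):

```typescript
// Sketch of the prefill technique behind "Assistant Impersonation Prefill":
// ending the messages array with a partial assistant turn makes the model
// continue from that text instead of starting a fresh reply.
const requestBody = {
    model: 'claude-3-opus-20240229',
    max_tokens: 512,
    messages: [
        { role: 'user', content: 'Write the next message as {{user}}, in first person.' },
        // The trailing assistant message is the prefill; the completion continues it.
        { role: 'assistant', content: '[Impersonating {{user}}] I' },
    ],
};
console.log(JSON.stringify(requestBody, null, 2));
```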
Cohee c661fea07d #2227 Implement content scaffolding 2024-05-17 02:43:14 +03:00
Cohee c4ade296ae Rotate Flux the Cat to downloadable content index 2024-05-12 15:09:00 +03:00
Cohee 0ed81e3b1a Rotate Coding Sensei to downloadable content index 2024-05-12 14:49:13 +03:00
Cohee b13434c505 Merge branch 'release' into staging 2024-05-04 20:45:48 +03:00
RossAscends 204a934553 Update Coding Sensei with proper code block format 2024-05-05 00:06:46 +09:00
sasha0552 2bd239fe81 Initial vLLM support 2024-05-02 22:40:40 +00:00
Cohee eb4cae4e6d Add WL to config. Code clean-up. 2024-05-01 19:52:34 +03:00
Cohee b42125a654 Fix content index 2024-04-27 18:03:14 +03:00
Hirose 3a8b8ed639 Skill Issue 2024-04-27 08:20:44 -05:00
Hirose 3a78d69b5b Use {{name}} macro, create new templates 2024-04-27 07:39:52 -05:00
Hirose c3578d2cda Use names in place of role for ChatML and Llama-3-Instruct 2024-04-26 20:14:51 -05:00
Stefan Daniel Schwarz d34a0ee20e Phi Instruct context+instruct presets 2024-04-24 23:47:04 +02:00
Cohee 2f45f50d37 Add config value for forwarded IPs whitelisting 2024-04-22 15:52:59 +03:00
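The forwarded-IP whitelisting added above concerns clients connecting through a reverse proxy, where the real client address arrives via X-Forwarded-For. A generic Express sketch of the idea (the whitelist entries and the 'trust proxy' value are illustrative, not the project's config format):

```typescript
// Sketch: check the forwarded client IP against a whitelist.
import express from 'express';

const app = express();
app.set('trust proxy', 1); // derive req.ip from X-Forwarded-For, trusting one proxy hop

// Note: on a dual-stack socket, IPv4 clients may appear as '::ffff:192.168.1.10'.
const whitelist = new Set(['127.0.0.1', '::1', '192.168.1.10']);

app.use((req, res, next) => {
    if (whitelist.has(req.ip ?? '')) {
        next();
    } else {
        res.status(403).send('Forbidden: IP not in whitelist');
    }
});

app.get('/', (_req, res) => {
    res.send('OK');
});

app.listen(8000);
```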
Cohee ef5d505de3 Merge branch 'staging' into neo-server 2024-04-21 18:28:56 +03:00
Cohee 842b463e60 System same as user for Llama 3 2024-04-21 18:28:44 +03:00
Cohee b3bbec83b6 Merge branch 'staging' into neo-server 2024-04-20 02:56:05 +03:00
Cohee 3ff5884112 Forbid external media by default 2024-04-20 01:11:37 +03:00
Cohee 391c3e9eff Remove dupes, change system prompt 2024-04-19 22:08:31 +03:00
Cohee b8f7db8d43 Merge pull request #2106 from StefanDanielSchwarz/Llama-3-Instruct-presets
Llama 3 Instruct context+instruct presets
2024-04-19 21:40:29 +03:00
RossAscends 1c5e7483e2 Add Llama 3 instruct preset 2024-04-20 03:08:54 +09:00