Author | Commit | Message | Date
Cohee | 48c3e81f42 | DRY | 2024-11-02 13:22:41 +02:00
Cohee | 293d6ff60d | Run mutating variable macros first | 2024-11-02 12:53:44 +02:00
Cohee | 48d8e6e2c3 | Port #3031 onto new engine | 2024-11-02 00:55:34 +02:00
Cohee | 5c90c8b1f6 | Add post-process fn to evaluation | 2024-11-02 00:44:12 +02:00
Cohee | 8f373cf1dc | Macros: refactor with a single replace point | 2024-11-01 21:47:25 +02:00
Cohee | b837c482fc | Merge pull request #3029 from P3il4/staging - Optimize chat manager logic | 2024-11-01 20:17:29 +02:00
Cohee | 4f6c5522bc | Independent chat management content scroll | 2024-11-01 20:16:51 +02:00
Cohee | 07feccbe35 | koboldcpp: parse logprobs | 2024-11-01 11:38:31 +02:00
Cohee | 8c568bfa13 | Merge branch 'staging' into optimize-chat-manager | 2024-10-31 21:05:36 +02:00
Cohee | 5fe1bc46e6 | Fix jquery plugins typedef | 2024-10-31 21:05:22 +02:00
Cohee | a52129392e | Align chat block items | 2024-10-31 20:56:07 +02:00
Cohee | c303e27f62 | Set search type to the input | 2024-10-31 20:51:57 +02:00
Cohee | c7f94c6c14 | Adjust doc link | 2024-10-31 20:51:10 +02:00
Cohee | a8ff518b65 | Add a space to message counter | 2024-10-31 20:50:22 +02:00
Cohee | 47b7745ceb | Extend message preview length | 2024-10-31 20:47:31 +02:00
Cohee | 6e36b77f1a | Remove TAI artifact | 2024-10-31 19:57:29 +02:00
Cohee | 547c4f6757 | Fix sorting and selected highlight | 2024-10-31 19:54:24 +02:00
Cohee | f4ef324203 | Merge pull request #3024 from dylan1951/add-nano-gpt-provider - Add NanoGPT chat completions provider | 2024-10-31 19:30:24 +02:00
Cohee | 2a451cf6a1 | Add logo img | 2024-10-31 19:29:43 +02:00
p3il4 | 30e9e90b38 | optimize chat manager logic | 2024-10-31 17:48:58 +03:00
Cohee | 4babf322c1 | Fix model restoration on load | 2024-10-30 02:09:28 +02:00
Cohee | f5f11eebb2 | Support auto-connect | 2024-10-30 02:02:56 +02:00
Cohee | 085d852b57 | Trigger inputs | 2024-10-30 01:52:53 +02:00
Cohee | 5ee0a6ec30 | Support unlocked context size | 2024-10-30 01:50:31 +02:00
dylan | e7522bba76 | Populate model list from models endpoint | 2024-10-29 19:38:46 +13:00
Cohee | 00f0f755fc | Support comma-separated list of llama.cpp sequence breakers #3026 | 2024-10-28 11:44:26 +00:00
Cohee | 542f77aeb8 | Safe sequence breakers parse | 2024-10-28 11:39:59 +00:00
Cohee | ef3cb73477 | Don't auto-swipe on aborted stream | 2024-10-28 11:31:15 +00:00
Cohee | 2030c2c711 | Fix auto-continue with stream aborting (Closes #3021) | 2024-10-28 11:14:35 +02:00
Cohee | 9493d05f2c | Localize only the moment instance | 2024-10-28 11:01:48 +02:00
Beinsezii | ace2902cb8 | llama.cpp Enable dry w/ array convert - The new PR that was merged needs an array instead of a str (https://github.com/ggerganov/llama.cpp/pull/9702) | 2024-10-26 16:07:07 -07:00
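
The three sequence-breaker commits above (00f0f755fc, 542f77aeb8, and ace2902cb8) revolve around the same change: the newly merged llama.cpp DRY PR takes the sequence breakers as an array rather than a single string, while the frontend also accepts a comma-separated list. A minimal sketch of how such input could be normalized, with a hypothetical helper name (this is not SillyTavern's actual implementation):

```typescript
/**
 * Hypothetical helper: normalize a user-entered sequence breakers value into
 * the array form that newer llama.cpp DRY sampling expects
 * (https://github.com/ggerganov/llama.cpp/pull/9702).
 * Accepts either a JSON array string or a plain comma-separated list.
 */
function parseSequenceBreakers(input: string): string[] {
    const trimmed = input.trim();
    if (!trimmed) {
        return [];
    }
    // Prefer a strict JSON array if the user provided one, e.g. ["\n", ":", "*"]
    if (trimmed.startsWith('[')) {
        try {
            const parsed = JSON.parse(trimmed);
            if (Array.isArray(parsed)) {
                return parsed.map(String);
            }
        } catch {
            // Malformed JSON: fall through to the comma-separated path
        }
    }
    // Otherwise treat the value as a comma-separated list
    return trimmed.split(',').map(part => part.trim()).filter(Boolean);
}

// parseSequenceBreakers('["\\n", ":", "*"]') -> ['\n', ':', '*']
// parseSequenceBreakers('a, b, c')           -> ['a', 'b', 'c']
```
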
Cohee | f5bdb52c25 | Merge branch 'staging' into webpack | 2024-10-26 19:11:29 +03:00
Cohee | 5b1a4fc723 | Update char PHI macro lingo | 2024-10-26 17:03:01 +03:00
dylan | 0882fb2d15 | Add NanoGPT as chat completions provider | 2024-10-26 16:51:39 +13:00
Wolfram Ravenwolf | 139c6b9c71 | Add Claude 3.5 Sonnet (latest) options | 2024-10-24 12:51:34 +02:00
Cohee | e5183d7283 | Extended getContext with SlashCommand classes | 2024-10-23 23:46:33 +03:00
Cohee | 0f320dd362 | WebSearch: Add endpoint for Tavily | 2024-10-23 23:38:17 +03:00
Cohee | b5cdb29bf3 | /popup: add scroll argument, allow scroll by default | 2024-10-23 23:03:07 +03:00
Cohee | e03f1b14e6 | Enable vertical scrolling in /buttons (Closes #3012) | 2024-10-23 23:00:08 +03:00
Cohee | d7c575994d | Replace macros in user filler message (Fixes #3011) | 2024-10-23 22:57:48 +03:00
David Jimenez | 030808d308 | feat: add Claude 3.5 Sonnet 20241022 API model | 2024-10-22 19:05:51 +01:00
Cohee | 7e9f5b8ee2 | Indicate connected textarea for expanded editor | 2024-10-21 11:01:55 +03:00
Cohee | 2e80c7ceb2 | Remove old formula renderer CSS | 2024-10-20 15:06:20 +03:00
Cohee | 1c50180daa | Merge branch 'staging' into webpack | 2024-10-19 13:15:54 +03:00
Cohee | 0a79d55983 | Merge pull request #3002 from 50h100a/streamprobs - Correctly view token probabilities when 'Continue'-ing a response. | 2024-10-19 13:14:44 +03:00
Cohee | afde6e3f45 | Add word-break in logprobs display | 2024-10-19 13:13:42 +03:00
Cohee | 78bbf0ed02 | Fix format | 2024-10-19 13:13:24 +03:00
50h100a | 9f97a144e8 | slightly change stream "abort" flow so token probabilities get successfully updated | 2024-10-19 00:31:12 -04:00
50h100a | 5d5e552cbd | correctly interpret some alternate whitespaces in token names | 2024-10-19 00:24:35 -04:00
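
Commit 5d5e552cbd concerns token names that arrive with tokenizer-specific whitespace stand-ins. As an illustration only (the glyph mapping below is an assumption, not the commit's actual code), such names can be normalized before they are shown in the logprobs viewer:

```typescript
// Illustrative sketch: map whitespace stand-ins that some tokenizers use in
// token names back to ordinary characters before display.
const WHITESPACE_STAND_INS: Record<string, string> = {
    '\u2581': ' ',  // SentencePiece "▁" marking a leading space
    '\u0120': ' ',  // GPT-2-style BPE "Ġ" marking a leading space
    '\u010A': '\n', // GPT-2-style BPE "Ċ" marking a newline
};

function normalizeTokenName(token: string): string {
    return token.replace(/[\u2581\u0120\u010A]/g, ch => WHITESPACE_STAND_INS[ch] ?? ch);
}

// normalizeTokenName('\u2581Hello') === ' Hello'
```
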
Cohee | 1ac6780e9c | MistralAI: Explicitly set context size for ministral | 2024-10-18 20:52:17 +03:00