Change max tokens to 4096

It works smoothly on the TPU Colab, so let's allow it. People should not turn this all the way up unless they have the hardware, but we want to allow it for those who do.
henk717 2021-08-22 20:59:47 +02:00
parent a151e1a33a
commit 2fd544cad7
1 changed file with 1 addition and 1 deletion


@@ -70,7 +70,7 @@ gensettingstf = [{
     "label": "Max Tokens",
     "id": "settknmax",
     "min": 512,
-    "max": 2048,
+    "max": 4096,
     "step": 8,
     "default": 1024,
     "tooltip": "Max number of tokens of context to submit to the AI for sampling. Make sure this is higher than Amount to Generate. Higher values increase VRAM/RAM usage."