2048 maxtoken default
Almost everyone prefers a 2048-token maximum because of its superior coherency. It should only be lowered due to RAM limits, and the menu already shows the optimal RAM for 2048. Negatively affected users can turn it down themselves; for everyone else, especially on rented machines or Colab, 2048 is a better default.
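The setting below caps how many tokens are submitted to the model per action. A minimal sketch of that idea, assuming a hypothetical `trim_to_budget` helper (the names `token_ids`, `max_length`, and `genamt` here are illustrative stand-ins, not KoboldAI's actual functions):

```python
def trim_to_budget(token_ids, max_length=2048, genamt=80):
    """Keep only the most recent tokens so the prompt plus the
    generated text fits within the model's context window."""
    budget = max_length - genamt   # reserve room for generation
    return token_ids[-budget:]     # drop the oldest tokens first

story = list(range(3000))          # pretend tokenized story, 3000 tokens
trimmed = trim_to_budget(story)
print(len(trimmed))                # 1968 = 2048 - 80
```

With the old 1024 default, the same story would be cut to 944 tokens, so the model sees roughly half as much prior context per action.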
parent f47db6d155
commit b30370bf4b
@@ -199,7 +199,7 @@ class vars:
     model_type = ""  # Model Type (Automatically taken from the model config)
     noai       = False  # Runs the script without starting up the transformers pipeline
     aibusy     = False  # Stops submissions while the AI is working
-    max_length = 1024  # Maximum number of tokens to submit per action
+    max_length = 2048  # Maximum number of tokens to submit per action
     ikmax      = 3000  # Maximum number of characters to submit to InferKit
     genamt     = 80  # Amount of text for each action to generate
     ikgen      = 200  # Number of characters for InferKit to generate