Model: Respect sampler bounds in torch

A rather embarrassing way to spend an hour debugging after I told myself
"I'd better remember to add this important thing to the torch side".

Samplers were being applied even when set to their "off" values, causing
degenerate mathematical operations to take place (e.g. anything x 0 is
always 0).
This commit is contained in:
somebody
2023-03-07 21:13:20 -06:00
parent 6b45367cc7
commit cb6010d666

@@ -97,6 +97,10 @@ class HFTorchInferenceModel(HFInferenceModel):
for sid in utils.koboldai_vars.sampler_order:
warper = Warper.from_id(sid)
if not warper.value_is_valid():
continue
if warper == warpers.RepetitionPenalty:
# Rep pen needs more data than other samplers
scores = warper.torch(scores, input_ids=input_ids)
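To illustrate the bug the guard fixes, here is a minimal, hypothetical sketch (these are not KoboldAI's actual classes; the `TopK` warper and its `value_is_valid`/`torch` methods mirror the shape of the code in the diff but are invented for illustration). Each sampler has a conventional "off" value at which applying it is at best a no-op and at worst destructive, so the loop must skip it:

```python
import torch


class TopK:
    """Hypothetical top-k warper; k == 0 is the conventional "off" value."""

    def __init__(self, k: int):
        self.k = k

    def value_is_valid(self) -> bool:
        # k == 0 means "disabled". Without this guard, torch.topk(scores, 0)
        # would select nothing, and every logit would be masked to -inf.
        return self.k > 0

    def torch(self, scores: torch.Tensor) -> torch.Tensor:
        # Keep the k highest logits; mask everything below the k-th to -inf.
        kth_value = torch.topk(scores, self.k).values[..., -1, None]
        return scores.masked_fill(scores < kth_value, -float("inf"))


scores = torch.tensor([[1.0, 3.0, 2.0, 0.5]])
for warper in (TopK(0), TopK(2)):
    if not warper.value_is_valid():
        continue  # sampler is sitting at its off value -- skip it
    scores = warper.torch(scores)

# Only TopK(2) actually ran: the two lowest logits are masked out,
# the two highest survive untouched.
print(scores)
```

Without the `value_is_valid()` check, `TopK(0)` would have wiped out every logit before `TopK(2)` ever saw them, which is exactly the kind of "boring" degenerate operation the commit message describes.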