mirror of https://github.com/KoboldAI/KoboldAI-Client.git, synced 2025-06-05 21:59:24 +02:00
Model: Respect sampler bounds in torch
A rather embarrassing way to spend an hour debugging after I told myself "I'd better remember to add this important thing to the torch side". Samplers were being applied even when set to their "off" values, causing degenerate mathematical operations (i.e. anything x 0 is always 0).
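The failure mode described above can be sketched in isolation. This is a minimal illustration, not KoboldAI's actual warper API: `scale_warper` and `apply_warper` are hypothetical names standing in for a sampler whose parameter multiplies the logits. At its "off" value of 0, applying it anyway zeroes every score; the fix in this commit is the equivalent of the guard in `apply_warper`.

```python
import torch

def scale_warper(scores: torch.Tensor, value: float) -> torch.Tensor:
    # Toy sampler: multiplies logits by its parameter.
    return scores * value

def apply_warper(scores: torch.Tensor, value: float,
                 off_value: float = 0.0) -> torch.Tensor:
    # The fix: skip the warper entirely when its value is the "off" setting,
    # analogous to the value_is_valid() check added in this commit.
    if value == off_value:
        return scores
    return scale_warper(scores, value)

logits = torch.tensor([3.0, 1.0, -2.0])
broken = scale_warper(logits, 0.0)   # every score collapses to zero
guarded = apply_warper(logits, 0.0)  # original logits pass through untouched
```

With all scores equal after the buggy path, softmax makes every token equally likely, which is why the output looked "boring" rather than crashing outright.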
@@ -97,6 +97,10 @@ class HFTorchInferenceModel(HFInferenceModel):
     for sid in utils.koboldai_vars.sampler_order:
         warper = Warper.from_id(sid)
+
+        if not warper.value_is_valid():
+            continue
+
         if warper == warpers.RepetitionPenalty:
             # Rep pen needs more data than other samplers
             scores = warper.torch(scores, input_ids=input_ids)