RaushanTurganbay (HF Staff) committed 99b21ee (verified) · Parent(s): cb17b97

Fix generation config saving error


Simply loading the model with `from_pretrained` and saving it without changes throws an error on recent transformers versions, because generation parameters are now strictly validated for consistency: the config sets `temperature`, but `do_sample` still defaults to `False`. Setting the sampling flag resolves the issue.
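For illustration, the failure can be reproduced with a bare `GenerationConfig` (a minimal sketch; the exact exception type and message depend on the transformers version, and the output directory is a placeholder):

```python
from transformers import GenerationConfig

# `temperature` is set while `do_sample` keeps its default of False,
# a combination that recent transformers versions reject on save.
config = GenerationConfig(temperature=0.15)
config.save_pretrained("tmp_config")  # raises ValueError on recent versions
```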

The current workaround is to change the value manually after loading the model:

```python
from transformers import Mistral3ForConditionalGeneration
import torch

model = Mistral3ForConditionalGeneration.from_pretrained(
    "mistralai/Mistral-Small-3.2-24B-Instruct-2506",
    dtype=torch.bfloat16,
).to("cuda")

output_dir = "patched_model"  # any local directory

model.generation_config.do_sample = True  # make the sampling params consistent
model.save_pretrained(output_dir)  # SUCCESS!
```
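This commit applies the fix at the source instead: with `do_sample: true` stored next to `temperature` in `generation_config.json`, a plain load-and-save round trip passes validation without any manual patching.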

Files changed (1): generation_config.json (+2 −1)
generation_config.json CHANGED

```diff
@@ -3,5 +3,6 @@
   "bos_token_id": 1,
   "eos_token_id": 2,
   "transformers_version": "4.52.4",
-  "temperature": 0.15
+  "temperature": 0.15,
+  "do_sample": true
 }
```
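For completeness, a sketch of the round trip with the post-commit values (the local directory name is a placeholder):

```python
from transformers import GenerationConfig

# Mirrors generation_config.json after this commit: a consistent
# sampling setup alongside the special token ids.
config = GenerationConfig(
    bos_token_id=1,
    eos_token_id=2,
    temperature=0.15,
    do_sample=True,
)
config.save_pretrained("patched_config")  # passes strict validation
reloaded = GenerationConfig.from_pretrained("patched_config")
assert reloaded.do_sample and reloaded.temperature == 0.15
```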