Update transformers version to 5.0
Files changed:
- README.md (+1, -2)
- config.json (+0, -5)
README.md

````diff
@@ -118,7 +118,7 @@ top_k=50
 Install the required dependencies:
 
 ```bash
-pip install -U transformers kernels torch accelerate
+pip install -U "transformers>=5.0" kernels torch accelerate
 ```
 
 Run inference with the following code:
@@ -136,7 +136,6 @@ model = AutoModelForCausalLM.from_pretrained(
     pretrained_model_name_or_path=MODEL_ID,
     torch_dtype=torch.bfloat16,
     device_map="auto",
-    trust_remote_code=True,
 )
 
 # Prepare input
````
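After this change, the README's inference example no longer passes `trust_remote_code`. A minimal sketch of the updated flow, assuming transformers >= 5.0 is installed; `MODEL_ID` is a hypothetical placeholder for this repository's id, and the prompt and generation settings are illustrative (`top_k=50` comes from the surrounding README context):

```python
# Sketch of the updated README usage; MODEL_ID is a hypothetical placeholder.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "org-name/solar-open"  # hypothetical -- replace with this repository's actual id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    pretrained_model_name_or_path=MODEL_ID,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    # trust_remote_code=True is no longer needed once transformers >= 5.0 is installed
)

# Prepare input and generate
inputs = tokenizer("Hello!", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_k=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```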
config.json

```diff
@@ -3,11 +3,6 @@
   "architectures": [
     "SolarOpenForCausalLM"
   ],
-  "auto_map": {
-    "AutoConfig": "configuration_solar_open.SolarOpenConfig",
-    "AutoModel": "modeling_solar_open.SolarOpenModel",
-    "AutoModelForCausalLM": "modeling_solar_open.SolarOpenForCausalLM"
-  },
   "pad_token_id": 2,
   "bos_token_id": 1,
   "eos_token_id": 2,
```
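Removing `auto_map` means the repository no longer routes the Auto classes to the custom `configuration_solar_open` / `modeling_solar_open` modules; the commit implies that transformers >= 5.0 resolves the SolarOpen architecture natively. A small sanity-check sketch of that assumption (`MODEL_ID` is again a hypothetical placeholder):

```python
# With auto_map removed and transformers >= 5.0 installed, the Auto classes are
# expected to resolve this architecture from the library itself, so no
# trust_remote_code argument should be required.
from transformers import AutoConfig

MODEL_ID = "org-name/solar-open"  # hypothetical -- replace with this repository's actual id

config = AutoConfig.from_pretrained(MODEL_ID)
print(type(config))  # expected: the in-library SolarOpen config class, not a remote class
```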