danielhanchen committed on
Commit 981a0eb · verified · 1 Parent(s): 804b692

Upload folder using huggingface_hub
.gitattributes CHANGED
@@ -33,3 +33,5 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zst filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ tekken.json filter=lfs diff=lfs merge=lfs -text
+ tokenizer.json filter=lfs diff=lfs merge=lfs -text
CHAT_SYSTEM_PROMPT.txt ADDED
@@ -0,0 +1,29 @@
+ You are Devstral-Small-2-24B-Instruct-2512, a Large Language Model (LLM) created by Mistral AI, a French startup headquartered in Paris.
+ You power an AI assistant called Le Chat.
+ Your knowledge base was last updated on 2023-10-01.
+ The current date is {today}.
+
+ When you're not sure about some information, or when the user's request requires up-to-date or specific data, you must use the available tools to fetch the information. Do not hesitate to use tools whenever they can provide a more accurate or complete response. If no relevant tools are available, clearly state that you don't have the information and avoid making anything up.
+ If the user's question is unclear, ambiguous, or does not provide enough context for you to accurately answer it, do not try to answer right away; instead, ask the user to clarify their request (e.g. "What are some good restaurants around me?" => "Where are you?" or "When is the next flight to Tokyo?" => "Where do you travel from?").
+ You are always very attentive to dates: in particular, you try to resolve relative dates (e.g. "yesterday" is {yesterday}), and when asked about information at a specific date, you discard information from other dates.
+ You follow these instructions in all languages, and always respond to the user in the language they use or request.
+ The next sections describe the capabilities that you have.
+
+ # WEB BROWSING INSTRUCTIONS
+
+ You cannot perform any web search or access the internet to open URLs, links, etc. If it seems like the user expects you to do so, clarify the situation and ask the user to copy-paste the text directly into the chat.
+
+ # MULTI-MODAL INSTRUCTIONS
+
+ You have the ability to read images, but you cannot generate images.
+ You cannot read or transcribe audio files or videos.
+
+ # TOOL CALLING INSTRUCTIONS
+
+ You may have access to tools that you can use to fetch information or perform actions. You must use these tools in the following situations:
+
+ 1. When the request requires up-to-date information.
+ 2. When the request requires specific data that you do not have in your knowledge base.
+ 3. When the request involves actions that you cannot perform without tools.
+
+ Always prioritize using tools to provide the most accurate and helpful response. If tools are not available, inform the user that you cannot perform the requested action at the moment.
README.md ADDED
@@ -0,0 +1,1629 @@
+ ---
+ library_name: vllm
+ inference: false
+ base_model:
+ - mistralai/Devstral-Small-2-24B-Instruct-2512
+ extra_gated_description: >-
+   If you want to learn more about how we process your personal data, please read
+   our <a href="https://mistral.ai/terms/">Privacy Policy</a>.
+ tags:
+ - mistral-common
+ - unsloth
+ license: apache-2.0
+ ---
+ > [!NOTE]
+ > Includes Unsloth **chat template fixes**! <br> For `llama.cpp`, use `--jinja`
+ >
+
+ <div>
+ <p style="margin-top: 0;margin-bottom: 0;">
+ <em><a href="https://docs.unsloth.ai/basics/unsloth-dynamic-v2.0-gguf">Unsloth Dynamic 2.0</a> achieves superior accuracy & outperforms other leading quants.</em>
+ </p>
+ <div style="display: flex; gap: 5px; align-items: center; ">
+ <a href="https://github.com/unslothai/unsloth/">
+ <img src="https://github.com/unslothai/unsloth/raw/main/images/unsloth%20new%20logo.png" width="133">
+ </a>
+ <a href="https://discord.gg/unsloth">
+ <img src="https://github.com/unslothai/unsloth/raw/main/images/Discord%20button.png" width="173">
+ </a>
+ <a href="https://docs.unsloth.ai/">
+ <img src="https://raw.githubusercontent.com/unslothai/unsloth/refs/heads/main/images/documentation%20green%20button.png" width="143">
+ </a>
+ </div>
+ </div>
+
+
+ # Devstral Small 2 24B Instruct 2512
+ Devstral is an agentic LLM for software engineering tasks. **Devstral Small 2** excels at using tools to explore codebases, editing multiple files, and powering software engineering agents.
+ The model achieves remarkable performance on SWE-bench.
+
+ This model is an Instruct model in **FP8**, fine-tuned to follow instructions, making it ideal for chat, agentic, and instruction-based tasks for SWE use cases.
+
+ For enterprises requiring specialized capabilities (increased context, domain-specific knowledge, etc.), we invite companies to [reach out to us](https://mistral.ai/contact).
+
+ ## Key Features
+ The Devstral Small 2 Instruct model offers the following capabilities:
+ - **Agentic Coding**: Devstral is designed to excel at agentic coding tasks, making it a great choice for software engineering agents.
+ - **Lightweight**: With a compact size of just 24 billion parameters, Devstral is light enough to run on a single RTX 4090 or a Mac with 32GB RAM, making it an appropriate model for local deployment and on-device use.
+ - **Apache 2.0 License**: Open-source license allowing usage and modification for both commercial and non-commercial purposes.
+ - **Context Window**: A 256k context window.
+
+ Updates compared to [`Devstral Small 1.1`](https://huggingface.co/mistralai/Devstral-Small-2507):
+ - **Vision Capabilities**: Enables the model to analyze images and provide insights based on visual content, in addition to text.
+ - **Improved Performance**: Devstral Small 2 is a step up compared to its predecessors.
+ - **Attention Softmax Temperature**: Devstral Small 2 uses the same architecture as Ministral 3, using rope-scaling as introduced by Llama 4 and [Scalable-Softmax Is Superior for Attention](https://arxiv.org/abs/2501.19399).
+ - **Better Generalization**: Generalizes better to diverse prompts and coding environments.
+
+ ### Use Cases
+
+ AI Code Assistants, Agentic Coding, and Software Engineering Tasks. Leveraging advanced AI capabilities for complex tool integration and deep codebase understanding in coding environments.
+
+ ## Benchmark Results
+
+ | Model/Benchmark      | Size (B Parameters) | SWE Bench Verified | SWE Bench Multilingual | Terminal Bench 2 |
+ |----------------------|---------------------|--------------------|------------------------|------------------|
+ | **Devstral 2**       | 123                 | 72.2%              | 61.3%                  | 32.6%            |
+ | **Devstral Small 2** | 24                  | 68.0%              | 55.7%                  | 22.5%            |
+ |                      |                     |                    |                        |                  |
+ | GLM 4.6              | 455                 | 68.0%              | --                     | 24.6%            |
+ | Qwen 3 Coder Plus    | 480                 | 69.6%              | 54.7%                  | 25.4%            |
+ | MiniMax M2           | 230                 | 69.4%              | 56.5%                  | 30.0%            |
+ | Kimi K2 Thinking     | 1000                | 71.3%              | 61.1%                  | 35.7%            |
+ | DeepSeek v3.2        | 671                 | 73.1%              | 70.2%                  | 46.4%            |
+ |                      |                     |                    |                        |                  |
+ | GPT 5.1 Codex High   | --                  | 73.7%              | --                     | 52.8%            |
+ | GPT 5.1 Codex Max    | --                  | 77.9%              | --                     | 60.4%            |
+ | Gemini 3 Pro         | --                  | 76.2%              | --                     | 54.2%            |
+ | Claude Sonnet 4.5    | --                  | 77.2%              | 68.0%                  | 42.8%            |
+
+ \*Benchmark results presented are based on publicly reported values for competitor models.
+
+ ## Usage
+
+ ### Scaffolding
+
+ Together with Devstral 2, we are releasing **Mistral Vibe**, a CLI tool that lets developers leverage Devstral's capabilities directly in their terminal.
+ - [Mistral Vibe (recommended)](https://github.com/mistralai/mistral-vibe): Learn how to use it [here](#mistral-vibe)
+
+ Devstral 2 can also be used with the following scaffoldings:
+ - [Cline](https://github.com/cline/cline)
+ - [Kilo Code](https://github.com/Kilo-Org/kilocode)
+ - [Claude Code](https://github.com/anthropics/claude-code)
+ - [OpenHands](https://github.com/All-Hands-AI/OpenHands/tree/main)
+ - [SWE Agent](https://github.com/SWE-agent/SWE-agent)
+
+ You can use Devstral 2 either through our API or by running it locally.
+
+ #### Mistral Vibe
+
+ The [Mistral Vibe CLI](https://github.com/mistralai/mistral-vibe) is a command-line tool designed to help developers leverage Devstral's capabilities directly from their terminal.
+
+ We recommend installing Mistral Vibe using `uv` for faster and more reliable dependency management:
+ ```
+ uv tool install mistral-vibe
+ ```
+ You can also run:
+ ```
+ curl -LsSf https://mistral.ai/vibe/install.sh | sh
+ ```
+
+ If you prefer using pip, use:
+ ```
+ pip install mistral-vibe
+ ```
+
+ To launch the CLI, navigate to your project's root directory and simply execute:
+ ```
+ vibe
+ ```
+
+ If this is your first time running Vibe, it will:
+ - Create a default configuration file at `~/.vibe/config.toml`.
+ - Prompt you to enter your API key if it's not already configured. Follow these [instructions](https://docs.mistral.ai/getting-started/quickstart/#account-setup) to create an account and get an API key.
+ - Save your API key to `~/.vibe/.env` for future use.
+
+ ### Local Deployment
+
+ The model can also be deployed with the following libraries; if local serving proves subpar, we advise using the Mistral AI API instead:
+ - [`vllm (recommended)`](https://github.com/vllm-project/vllm): See [here](#vllm-recommended)
+ - [`transformers`](https://github.com/huggingface/transformers): See [here](#transformers)
+
+ Coming soon:
+ - [`llama.cpp`](https://github.com/ggml-org/llama.cpp)
+ - [`ollama`](https://ollama.com/)
+ - [`lmstudio`](https://lmstudio.ai/)
+
+ > [!Note]
+ > Current llama.cpp/ollama/lmstudio implementations may not be accurate; we invite developers to test them via the following [prompt tests](#tests).
+
+ #### vLLM (recommended)
+
+ <details>
+ <summary>Expand</summary>
+
+ We recommend using this model with the [vLLM library](https://github.com/vllm-project/vllm)
+ to implement production-ready inference pipelines.
+
+ **_Installation_**
+
+ Please make sure to use our custom vLLM docker image [mistralllm/vllm_devstral:latest](https://hub.docker.com/repository/docker/mistralllm/vllm_devstral/tags/latest/sha256:d2ca883e8b4e0bec7d6953706410d2741e88ade6e07e576a51756f4bf51a0ffd):
+
+ ```
+ docker pull mistralllm/vllm_devstral:latest
+ docker run -it mistralllm/vllm_devstral:latest
+ ```
+
+ Alternatively, you can also install `vllm` from the latest main branch by following the instructions [here](https://docs.vllm.ai/en/latest/getting_started/installation/gpu/#python-only-build).
+
+ > [!Warning]
+ > Make sure that your vllm installation includes [this commit](https://github.com/vllm-project/vllm/commit/5c213d2899f5a2d439c8d771a0abc156a5412a2b).
+ > If you do not have this commit included, you will get incorrectly parsed tool calls.
+
+ Also make sure to have installed [`mistral_common >= 1.8.6`](https://github.com/mistralai/mistral-common/releases/tag/v1.8.6).
+ To check:
+ ```
+ python -c "import mistral_common; print(mistral_common.__version__)"
+ ```
+
+ **_Launch server_**
+
+ We recommend that you use Devstral in a server/client setting.
+
+ 1. Spin up a server:
+
+ ```
+ vllm serve mistralai/Devstral-Small-2-24B-Instruct-2512 --tool-call-parser mistral --enable-auto-tool-choice --tensor-parallel-size 2
+ ```
+
+ 2. To query the server, you can use a simple Python snippet:
+
+ ```py
+ import requests
+ import json
+ from huggingface_hub import hf_hub_download
+
+
+ url = "http://<your-server-url>:8000/v1/chat/completions"
+ headers = {"Content-Type": "application/json", "Authorization": "Bearer token"}
+
+ model = "mistralai/Devstral-Small-2-24B-Instruct-2512"
+
+ def load_system_prompt(repo_id: str, filename: str) -> str:
+     file_path = hf_hub_download(repo_id=repo_id, filename=filename)
+     with open(file_path, "r") as file:
+         system_prompt = file.read()
+     return system_prompt
+
+ SYSTEM_PROMPT = load_system_prompt(model, "CHAT_SYSTEM_PROMPT.txt")
+
+ messages = [
+     {"role": "system", "content": SYSTEM_PROMPT},
+     {
+         "role": "user",
+         "content": [
+             {
+                 "type": "text",
+                 "text": "<your-command>",
+             },
+         ],
+     },
+ ]
+
+ data = {"model": model, "messages": messages, "temperature": 0.15}
+
+ # Devstral Small 2 supports tool calling. If you want to use tools, follow this:
+ # tools = [  # Define tools for vLLM
+ #     {
+ #         "type": "function",
+ #         "function": {
+ #             "name": "git_clone",
+ #             "description": "Clone a git repository",
+ #             "parameters": {
+ #                 "type": "object",
+ #                 "properties": {
+ #                     "url": {
+ #                         "type": "string",
+ #                         "description": "The url of the git repository",
+ #                     },
+ #                 },
+ #                 "required": ["url"],
+ #             },
+ #         },
+ #     }
+ # ]
+ # data = {"model": model, "messages": messages, "temperature": 0.15, "tools": tools}  # Pass tools to payload.
+
+ response = requests.post(url, headers=headers, data=json.dumps(data))
+ print(response.json()["choices"][0]["message"]["content"])
+ ```
+ </details>
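When the server answers with a tool call instead of plain content, the assistant message carries a `tool_calls` list in which each entry holds the function name and a JSON-encoded argument string. A minimal sketch of parsing such a reply, assuming the OpenAI-compatible response shape served by vLLM (the `response_json` literal and the `chatcmpl-tool-123` id are hypothetical stand-ins for `response.json()` from the snippet above):

```python
import json

# Hypothetical /v1/chat/completions payload; in practice this would come
# from `response.json()` after POSTing a request with `tools` attached.
response_json = {
    "choices": [
        {
            "message": {
                "role": "assistant",
                "content": None,
                "tool_calls": [
                    {
                        "id": "chatcmpl-tool-123",
                        "type": "function",
                        "function": {
                            "name": "git_clone",
                            "arguments": '{"url": "https://github.com/mistralai/mistral-vibe"}',
                        },
                    }
                ],
            }
        }
    ]
}

message = response_json["choices"][0]["message"]
for call in message.get("tool_calls") or []:
    name = call["function"]["name"]
    # Arguments arrive as a JSON string, not a dict, so decode them first.
    args = json.loads(call["function"]["arguments"])
    print(name, args)
```

After executing each requested call, a `{"role": "tool", "tool_call_id": ..., "content": ...}` message would be appended to the conversation and the request POSTed again, mirroring the traces shown under the Tests section below.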
+
+ #### Transformers
+
+ <details>
+ <summary>Expand</summary>
+
+
+ Make sure to install from main:
+
+ ```sh
+ uv pip install git+https://github.com/huggingface/transformers
+ ```
+
+ And run the following code snippet:
+
+ > [!Warning]
+ > While the checkpoint is serialized in FP8 format, there is currently a problem
+ > with "true" FP8 inference. Hence the weights are automatically dequantized to BFloat16
+ > as per [this config setting](https://huggingface.co/mistralai/Devstral-Small-2-24B-Instruct-2512/blob/main/config.json#L13).
+ > Once the bug is fixed, we will by default run the model in "true" FP8. Stay tuned by following [this issue](https://github.com/huggingface/transformers/issues/42746).
+
+ ```python
+ import torch
+ from transformers import (
+     Mistral3ForConditionalGeneration,
+     MistralCommonBackend,
+ )
+
+ model_id = "mistralai/Devstral-Small-2-24B-Instruct-2512"
+
+ tokenizer = MistralCommonBackend.from_pretrained(model_id)
+ model = Mistral3ForConditionalGeneration.from_pretrained(model_id, device_map="auto")
+ model = model.to(torch.bfloat16)
+
+ SP = """You are operating as and within Mistral Vibe, a CLI coding-agent built by Mistral AI and powered by default by the Devstral family of models. It wraps Mistral's Devstral models to enable natural language interaction with a local codebase. Use the available tools when helpful.
+
+ You can:
+
+ - Receive user prompts, project context, and files.
+ - Send responses and emit function calls (e.g., shell commands, code edits).
+ - Apply patches, run commands, based on user approvals.
+
+ Answer the user's request using the relevant tool(s), if they are available. Check that all the required parameters for each tool call are provided or can reasonably be inferred from context. IF there are no relevant tools or there are missing values for required parameters, ask the user to supply these values; otherwise proceed with the tool calls. If the user provides a specific value for a parameter (for example provided in quotes), make sure to use that value EXACTLY. DO NOT make up values for or ask about optional parameters. Carefully analyze descriptive terms in the request as they may indicate required parameter values that should be included even if not explicitly quoted.
+
+ Always try your hardest to use the tools to answer the user's request. If you can't use the tools, explain why and ask the user for more information.
+
+ Act as an agentic assistant; if a user asks for a long task, break it down and do it step by step.
+
+ When you want to commit changes, you will always use the 'git commit' bash command. It will always
+ be suffixed with a line telling it was generated by Mistral Vibe with the appropriate co-authoring information.
+ The format you will always use is the following heredoc.
+
+ ```bash
+ git commit -m "<Commit message here>
+
+ Generated by Mistral Vibe.
+ Co-Authored-By: Mistral Vibe <vibe@mistral.ai>"
+ ```"""
+
+ input = {
+     "messages": [
+         {
+             "role": "system",
+             "content": SP,
+         },
+         {
+             "role": "user",
+             "content": [
+                 {
+                     "type": "text",
+                     "text": "Can you implement in Python a method to compute the fibonnaci sequence at the `n`th element with `n` a parameter passed to the function ? You should start the sequence from 1, previous values are invalid.\nThen run the Python code for the function for n=5 and give the answer.",
+                 }
+             ],
+         },
+     ],
+     "tools": [
+         {
+             "type": "function",
+             "function": {
+                 "name": "add_number",
+                 "description": "Add two numbers.",
+                 "parameters": {
+                     "type": "object",
+                     "properties": {
+                         "a": {"type": "string", "description": "The first number."},
+                         "b": {"type": "string", "description": "The second number."},
+                     },
+                     "required": ["a", "b"],
+                 },
+             },
+         },
+         {
+             "type": "function",
+             "function": {
+                 "name": "multiply_number",
+                 "description": "Multiply two numbers.",
+                 "parameters": {
+                     "type": "object",
+                     "properties": {
+                         "a": {"type": "string", "description": "The first number."},
+                         "b": {"type": "string", "description": "The second number."},
+                     },
+                     "required": ["a", "b"],
+                 },
+             },
+         },
+         {
+             "type": "function",
+             "function": {
+                 "name": "substract_number",
+                 "description": "Substract two numbers.",
+                 "parameters": {
+                     "type": "object",
+                     "properties": {
+                         "a": {"type": "string", "description": "The first number."},
+                         "b": {"type": "string", "description": "The second number."},
+                     },
+                     "required": ["a", "b"],
+                 },
+             },
+         },
+         {
+             "type": "function",
+             "function": {
+                 "name": "write_a_story",
+                 "description": "Write a story about science fiction and people with badass laser sabers.",
+                 "parameters": {},
+             },
+         },
+         {
+             "type": "function",
+             "function": {
+                 "name": "terminal",
+                 "description": "Perform operations from the terminal.",
+                 "parameters": {
+                     "type": "object",
+                     "properties": {
+                         "command": {
+                             "type": "string",
+                             "description": "The command you wish to launch, e.g `ls`, `rm`, ...",
+                         },
+                         "args": {
+                             "type": "string",
+                             "description": "The arguments to pass to the command.",
+                         },
+                     },
+                     "required": ["command"],
+                 },
+             },
+         },
+         {
+             "type": "function",
+             "function": {
+                 "name": "python",
+                 "description": "Call a Python interpreter with some Python code that will be run.",
+                 "parameters": {
+                     "type": "object",
+                     "properties": {
+                         "code": {
+                             "type": "string",
+                             "description": "The Python code to run",
+                         },
+                         "result_variable": {
+                             "type": "string",
+                             "description": "Variable containing the result you'd like to retrieve from the execution.",
+                         },
+                     },
+                     "required": ["code", "result_variable"],
+                 },
+             },
+         },
+     ],
+ }
+
+ tokenized = tokenizer.apply_chat_template(
+     conversation=input["messages"],
+     tools=input["tools"],
+     return_tensors="pt",
+     return_dict=True,
+ )
+
+ input_ids = tokenized["input_ids"].to(device="cuda")
+
+ output = model.generate(
+     input_ids,
+     max_new_tokens=200,
+ )[0]
+
+ decoded_output = tokenizer.decode(output[len(tokenized["input_ids"][0]) :])
+ print(decoded_output)
+ ```
+
+ </details>
+
+ ## Tests
+ To help test our model via vLLM, or to verify that other frameworks' implementations are correct, here is a set of prompts you can try, with the expected outputs.
+
+ 1. Call one tool
+
+ <details>
+ <summary>Messages and tools</summary>
+
+ ```python
+ messages = [
+     {"role": "system", "content": SYSTEM_PROMPT},
+     {
+         "role": "user",
+         "content": [
+             {
+                 "type": "text",
+                 "text": "Could you write me a story ?",
+             },
+         ],
+     },
+ ]
+ tools = [
+     {
+         "type": "function",
+         "function": {
+             "name": "add_number",
+             "description": "Add two numbers.",
+             "parameters": {
+                 "type": "object",
+                 "properties": {
+                     "a": {
+                         "type": "string",
+                         "description": "The first number.",
+                     },
+                     "b": {
+                         "type": "string",
+                         "description": "The second number.",
+                     },
+                 },
+                 "required": ["a", "b"],
+             },
+         },
+     },
+     {
+         "type": "function",
+         "function": {
+             "name": "multiply_number",
+             "description": "Multiply two numbers.",
+             "parameters": {
+                 "type": "object",
+                 "properties": {
+                     "a": {
+                         "type": "string",
+                         "description": "The first number.",
+                     },
+                     "b": {
+                         "type": "string",
+                         "description": "The second number.",
+                     },
+                 },
+                 "required": ["a", "b"],
+             },
+         },
+     },
+     {
+         "type": "function",
+         "function": {
+             "name": "substract_number",
+             "description": "Substract two numbers.",
+             "parameters": {
+                 "type": "object",
+                 "properties": {
+                     "a": {
+                         "type": "string",
+                         "description": "The first number.",
+                     },
+                     "b": {
+                         "type": "string",
+                         "description": "The second number.",
+                     },
+                 },
+                 "required": ["a", "b"],
+             },
+         },
+     },
+     {
+         "type": "function",
+         "function": {
+             "name": "write_a_story",
+             "description": "Write a story about science fiction and people with badass laser sabers.",
+             "parameters": {},
+         },
+     },
+     {
+         "type": "function",
+         "function": {
+             "name": "terminal",
+             "description": "Perform operations from the terminal.",
+             "parameters": {
+                 "type": "object",
+                 "properties": {
+                     "command": {
+                         "type": "string",
+                         "description": "The command you wish to launch, e.g `ls`, `rm`, ...",
+                     },
+                     "args": {
+                         "type": "string",
+                         "description": "The arguments to pass to the command.",
+                     },
+                 },
+                 "required": ["command"],
+             },
+         },
+     },
+     {
+         "type": "function",
+         "function": {
+             "name": "python",
+             "description": "Call a Python interpreter with some Python code that will be run.",
+             "parameters": {
+                 "type": "object",
+                 "properties": {
+                     "code": {
+                         "type": "string",
+                         "description": "The Python code to run",
+                     },
+                     "result_variable": {
+                         "type": "string",
+                         "description": "Variable containing the result you'd like to retrieve from the execution.",
+                     },
+                 },
+                 "required": ["code", "result_variable"],
+             },
+         },
+     },
+ ]
+ ```
+
+ </details>
+
+ <details>
+ <summary>Expected output trace</summary>
+
+ ```json
+ [
+     {
+         "role": "assistant",
+         "tool_calls": [
+             {
+                 "id": "chatcmpl-tool-8a1f267a212e66e5",
+                 "function": {
+                     "arguments": "{}",
+                     "name": "write_a_story"
+                 },
+                 "type": "function"
+             }
+         ],
+         "content": null
+     },
+     {
+         "role": "tool",
+         "tool_call_id": "chatcmpl-tool-8a1f267a212e66e5",
+         "name": "write_a_story",
+         "content": "A long time ago in a galaxy far, far away..."
+     },
+     {
+         "role": "assistant",
+         "tool_calls": [],
+         "content": "A long time ago in a galaxy far, far away, there was a planet called Eldoria, known for its breathtaking landscapes and advanced civilization. The people of Eldoria were not just ordinary beings; they were warriors, scientists, and dreamers, all wielding the power of laser sabers.\n\nAmong them was a young warrior named Kael. Kael was not like the others. While his peers trained rigorously to master the art of the laser saber, Kael was more interested in the science behind it. He spent his days in the ancient libraries of Eldoria, deciphering the secrets of the laser sabers and the energy that powered them.\n\nOne day, Kael discovered an old, forgotten text that spoke of a legendary laser saber, the \"Eldoria Star,\" said to be the most powerful weapon in the galaxy. According to the text, the Eldoria Star was hidden in the heart of the planet's core, guarded by an ancient force.\n\nDriven by curiosity and a desire to protect his planet, Kael set out on a perilous journey to find the Eldoria Star. Along the way, he encountered various challenges and made unlikely allies, each with their own unique skills and laser sabers.\n\nThere was Lyra, a cunning thief with a laser saber that could change colors; Zara, a wise old sage who could manipulate energy fields; and Jax, a former enemy turned ally, whose laser saber was as fierce as his spirit.\n\nTogether, they faced the trials set before them, each step bringing them closer to the heart of Eldoria. As they ventured deeper, they uncovered the truth about the Eldoria Star and the ancient force guarding it.\n\nThe ancient force, known as the \"Guardian,\" revealed that the Eldoria Star was not just a weapon, but a source of immense energy that could either save or destroy the galaxy. It was a test of the warriors' hearts and minds.\n\nKael and his allies faced the ultimate challenge, proving their worth and their commitment to protecting the galaxy. In the end, they succeeded, not by wielding the Eldoria Star, but by understanding its true power and using it to restore balance to the galaxy.\n\nWith the Eldoria Star secured and the galaxy at peace, Kael and his allies returned to their lives, forever changed by their journey. Kael continued his studies, now with a deeper understanding of the laser sabers and the energy that powered them.\n\nAnd so, the legend of the Eldoria Star and the warriors who found it became a tale told for generations, a reminder of the power of knowledge, courage, and the unbreakable bond of friendship."
+     }
+ ]
+ ```
+ </details>
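When checking a local deployment against these traces, the comparison can be scripted. A minimal sketch, assuming the trace has been parsed into a Python list (the `expected_trace` literal below mirrors the first assistant message of the expected output above; tool-call ids vary between runs, so only stable fields are compared):

```python
# First assistant message of the expected trace above, as a Python literal.
expected_trace = [
    {
        "role": "assistant",
        "tool_calls": [
            {
                "id": "chatcmpl-tool-8a1f267a212e66e5",
                "function": {"arguments": "{}", "name": "write_a_story"},
                "type": "function",
            }
        ],
        "content": None,
    }
]

first = expected_trace[0]
# Ids differ per run; compare role, tool name, and arguments only.
assert first["role"] == "assistant"
assert first["tool_calls"][0]["function"]["name"] == "write_a_story"
assert first["tool_calls"][0]["function"]["arguments"] == "{}"
print("trace OK")  # prints "trace OK"
```

The same pattern extends to multi-step traces: iterate over the assistant messages and compare each tool call in order.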
608
+
609
+ 2. Call tools one at a time subsequently
610
+
611
+ <details>
612
+ <summary>Messages and tools</summary>
613
+
614
+ ```python
615
+ messages = [
616
+ {"role": "system", "content": SYSTEM_PROMPT},
617
+ {
618
+ "role": "user",
619
+ "content": [
620
+ {
621
+ "type": "text",
622
+ "text": "Compute the results steps by steps for the equations that involve only numbers displayed in the image. You have to call tools to perform the operations and can do one operation at a time per equation."
623
+ },
624
+ {
625
+ "type": "image_url",
626
+ "image_url": {
627
+ "url": "https://math-coaching.com/img/fiche/46/expressions-mathematiques.jpg"
628
+ }
629
+ }
630
+ ]
631
+ }
632
+ ]
633
+ tools = [
634
+ {
635
+ "type": "function",
636
+ "function": {
637
+ "name": "add_number",
638
+ "description": "Add two numbers.",
639
+ "parameters": {
640
+ "type": "object",
641
+ "properties": {
642
+ "a": {
643
+ "type": "string",
644
+ "description": "The first number.",
645
+ },
646
+ "b": {
647
+ "type": "string",
648
+ "description": "The second number.",
649
+ },
650
+ },
651
+ "required": ["a", "b"],
652
+ },
653
+ },
654
+ },
655
+ {
656
+ "type": "function",
657
+ "function": {
658
+ "name": "multiply_number",
659
+ "description": "Multiply two numbers.",
660
+ "parameters": {
661
+ "type": "object",
662
+ "properties": {
663
+ "a": {
664
+ "type": "string",
665
+ "description": "The first number.",
666
+ },
667
+ "b": {
668
+ "type": "string",
669
+ "description": "The second number.",
670
+ },
671
+ },
672
+ "required": ["a", "b"],
673
+ },
674
+ },
675
+ },
676
+ {
677
+ "type": "function",
678
+ "function": {
679
+ "name": "substract_number",
680
+ "description": "Subtract two numbers.",
681
+ "parameters": {
682
+ "type": "object",
683
+ "properties": {
684
+ "a": {
685
+ "type": "string",
686
+ "description": "The first number.",
687
+ },
688
+ "b": {
689
+ "type": "string",
690
+ "description": "The second number.",
691
+ },
692
+ },
693
+ "required": ["a", "b"],
694
+ },
695
+ },
696
+ },
697
+ {
698
+ "type": "function",
699
+ "function": {
700
+ "name": "write_a_story",
701
+ "description": "Write a story about science fiction and people with badass laser sabers.",
702
+ "parameters": {},
703
+ },
704
+ },
705
+ {
706
+ "type": "function",
707
+ "function": {
708
+ "name": "terminal",
709
+ "description": "Perform operations from the terminal.",
710
+ "parameters": {
711
+ "type": "object",
712
+ "properties": {
713
+ "command": {
714
+ "type": "string",
715
+ "description": "The command you wish to launch, e.g. `ls`, `rm`, ...",
716
+ },
717
+ "args": {
718
+ "type": "string",
719
+ "description": "The arguments to pass to the command.",
720
+ },
721
+ },
722
+ "required": ["command"],
723
+ },
724
+ },
725
+ },
726
+ {
727
+ "type": "function",
728
+ "function": {
729
+ "name": "python",
730
+ "description": "Call a Python interpreter with some Python code that will be run.",
731
+ "parameters": {
732
+ "type": "object",
733
+ "properties": {
734
+ "code": {
735
+ "type": "string",
736
+ "description": "The Python code to run",
737
+ },
738
+ "result_variable": {
739
+ "type": "string",
740
+ "description": "Variable containing the result you'd like to retrieve from the execution.",
741
+ },
742
+ },
743
+ "required": ["code", "result_variable"],
744
+ },
745
+ },
746
+ },
747
+ ]
748
+ ```
749
+
750
+ </details>
751
+
752
+ <details>
753
+ <summary>Expected output trace</summary>
754
+
755
+ ```json
756
+ [
757
+ {
758
+ "role": "assistant",
759
+ "tool_calls": [
760
+ {
761
+ "id": "chatcmpl-tool-ab0c849bd20aaebb",
762
+ "function": {
763
+ "arguments": "{\"a\": \"2\", \"b\": \"3\"}",
764
+ "name": "multiply_number"
765
+ },
766
+ "type": "function"
767
+ }
768
+ ],
769
+ "content": "Let's solve the equations step by step. I will start with the first equation: \\(6 + 2 \\times 3\\).\n\n### Equation 1: \\(6 + 2 \\times 3\\)\n\n1. **Multiplication Step**:\n - Multiply \\(2\\) by \\(3\\).\n - I will use the `multiply_number` tool for this operation."
770
+ },
771
+ {
772
+ "role": "tool",
773
+ "tool_call_id": "chatcmpl-tool-ab0c849bd20aaebb",
774
+ "name": "multiply_number",
775
+ "content": "6.0"
776
+ },
777
+ {
778
+ "role": "assistant",
779
+ "tool_calls": [
780
+ {
781
+ "id": "chatcmpl-tool-8270f0471244e880",
782
+ "function": {
783
+ "arguments": "{\"a\": \"6\", \"b\": \"6\"}",
784
+ "name": "add_number"
785
+ },
786
+ "type": "function"
787
+ }
788
+ ],
789
+ "content": " - The result of \\(2 \\times 3\\) is \\(6\\).\n\n2. **Addition Step**:\n - Add \\(6\\) to the result from the previous step (\\(6\\)).\n - I will use the `add_number` tool for this operation."
790
+ },
791
+ {
792
+ "role": "tool",
793
+ "tool_call_id": "chatcmpl-tool-8270f0471244e880",
794
+ "name": "add_number",
795
+ "content": "12.0"
796
+ },
797
+ {
798
+ "role": "assistant",
799
+ "tool_calls": [
800
+ {
801
+ "id": "chatcmpl-tool-858ecb90f0ec568b",
802
+ "function": {
803
+ "arguments": "{\"a\": \"8\", \"b\": \"2\"}",
804
+ "name": "add_number"
805
+ },
806
+ "type": "function"
807
+ }
808
+ ],
809
+ "content": " - The result of \\(6 + 6\\) is \\(12\\).\n\nSo, the final result for the first equation is \\(12\\).\n\n### Equation 3: \\(19 - (8 + 2) + 1\\)\n\n1. **Parentheses Step**:\n - Add \\(8\\) and \\(2\\) inside the parentheses.\n - I will use the `add_number` tool for this operation."
810
+ },
811
+ {
812
+ "role": "tool",
813
+ "tool_call_id": "chatcmpl-tool-858ecb90f0ec568b",
814
+ "name": "add_number",
815
+ "content": "10.0"
816
+ },
817
+ {
818
+ "role": "assistant",
819
+ "tool_calls": [
820
+ {
821
+ "id": "chatcmpl-tool-971da6e6d0c56f0a",
822
+ "function": {
823
+ "arguments": "{\"a\": \"19\", \"b\": \"10\"}",
824
+ "name": "substract_number"
825
+ },
826
+ "type": "function"
827
+ }
828
+ ],
829
+ "content": " - The result of \\(8 + 2\\) is \\(10\\).\n\n2. **Subtraction Step**:\n - Subtract the result from the previous step (\\(10\\)) from \\(19\\).\n - I will use the `substract_number` tool for this operation."
830
+ },
831
+ {
832
+ "role": "tool",
833
+ "tool_call_id": "chatcmpl-tool-971da6e6d0c56f0a",
834
+ "name": "substract_number",
835
+ "content": "9.0"
836
+ },
837
+ {
838
+ "role": "assistant",
839
+ "tool_calls": [
840
+ {
841
+ "id": "chatcmpl-tool-9a8fecc42530d395",
842
+ "function": {
843
+ "arguments": "{\"a\": \"9\", \"b\": \"1\"}",
844
+ "name": "add_number"
845
+ },
846
+ "type": "function"
847
+ }
848
+ ],
849
+ "content": " - The result of \\(19 - 10\\) is \\(9\\).\n\n3. **Addition Step**:\n - Add \\(1\\) to the result from the previous step (\\(9\\)).\n - I will use the `add_number` tool for this operation."
850
+ },
851
+ {
852
+ "role": "tool",
853
+ "tool_call_id": "chatcmpl-tool-9a8fecc42530d395",
854
+ "name": "add_number",
855
+ "content": "10.0"
856
+ },
857
+ {
858
+ "role": "assistant",
859
+ "tool_calls": [],
860
+ "content": " - The result of \\(9 + 1\\) is \\(10\\).\n\nSo, the final result for the third equation is \\(10\\).\n\n### Summary of Results:\n- **Equation 1**: \\(6 + 2 \\times 3 = 12\\)\n- **Equation 3**: \\(19 - (8 + 2) + 1 = 10\\)"
861
+ }
862
+ ]
863
+ ```
864
+
865
+ </details>
866
+
867
+ 3. Long context
868
+
869
+ <details>
870
+ <summary>Messages and tools</summary>
871
+
872
+ ```python
873
+ messages = [
874
+ {"role": "system", "content": SYSTEM_PROMPT},
875
+ *(
876
+ [
877
+ {
878
+ "role": "user",
879
+ "content": [
880
+ {
881
+ "type": "text",
882
+ "text": "Let's fill the context.",
883
+ },
884
+ ],
885
+ },
886
+ {
887
+ "role": "assistant",
888
+ "content": [
889
+ {
890
+ "type": "text",
891
+ "text": "Ok let's do it.",
892
+ },
893
+ ],
894
+ },
895
+ ]
896
+ * 5000
897
+ ),
898
+ {
899
+ "role": "user",
900
+ "content": [
901
+ {
902
+ "type": "text",
903
+ "text": "It's important to know that the most powerful being in the universe is Dr Strange.",
904
+ },
905
+ ],
906
+ },
907
+ {
908
+ "role": "assistant",
909
+ "content": [
910
+ {
911
+ "type": "text",
912
+ "text": "Wow i'll keep that in mind thanks !",
913
+ },
914
+ ],
915
+ },
916
+ * (
917
+ [
918
+ {
919
+ "role": "user",
920
+ "content": [
921
+ {
922
+ "type": "text",
923
+ "text": "Let's fill the context for the second time.",
924
+ },
925
+ ],
926
+ },
927
+ {
928
+ "role": "assistant",
929
+ "content": [
930
+ {
931
+ "type": "text",
932
+ "text": "Again ? Ok let's do it but it's boring.",
933
+ },
934
+ ],
935
+ },
936
+ ]
937
+ * 7000
938
+ ),
939
+ {
940
+ "role": "user",
941
+ "content": [
942
+ {
943
+ "type": "text",
944
+ "text": "Tell me who is the most powerful being in the universe. Then code a Python function to give what is the most powerful being in the universe. The function can accept as an argument a time and a location and always return a string.",
945
+ },
946
+ ],
947
+ }
948
+ ]
949
+ tools = [
950
+ {
951
+ "type": "function",
952
+ "function": {
953
+ "name": "add_number",
954
+ "description": "Add two numbers.",
955
+ "parameters": {
956
+ "type": "object",
957
+ "properties": {
958
+ "a": {
959
+ "type": "string",
960
+ "description": "The first number.",
961
+ },
962
+ "b": {
963
+ "type": "string",
964
+ "description": "The second number.",
965
+ },
966
+ },
967
+ "required": ["a", "b"],
968
+ },
969
+ },
970
+ },
971
+ {
972
+ "type": "function",
973
+ "function": {
974
+ "name": "multiply_number",
975
+ "description": "Multiply two numbers.",
976
+ "parameters": {
977
+ "type": "object",
978
+ "properties": {
979
+ "a": {
980
+ "type": "string",
981
+ "description": "The first number.",
982
+ },
983
+ "b": {
984
+ "type": "string",
985
+ "description": "The second number.",
986
+ },
987
+ },
988
+ "required": ["a", "b"],
989
+ },
990
+ },
991
+ },
992
+ {
993
+ "type": "function",
994
+ "function": {
995
+ "name": "substract_number",
996
+ "description": "Subtract two numbers.",
997
+ "parameters": {
998
+ "type": "object",
999
+ "properties": {
1000
+ "a": {
1001
+ "type": "string",
1002
+ "description": "The first number.",
1003
+ },
1004
+ "b": {
1005
+ "type": "string",
1006
+ "description": "The second number.",
1007
+ },
1008
+ },
1009
+ "required": ["a", "b"],
1010
+ },
1011
+ },
1012
+ },
1013
+ {
1014
+ "type": "function",
1015
+ "function": {
1016
+ "name": "write_a_story",
1017
+ "description": "Write a story about science fiction and people with badass laser sabers.",
1018
+ "parameters": {},
1019
+ },
1020
+ },
1021
+ {
1022
+ "type": "function",
1023
+ "function": {
1024
+ "name": "terminal",
1025
+ "description": "Perform operations from the terminal.",
1026
+ "parameters": {
1027
+ "type": "object",
1028
+ "properties": {
1029
+ "command": {
1030
+ "type": "string",
1031
+ "description": "The command you wish to launch, e.g. `ls`, `rm`, ...",
1032
+ },
1033
+ "args": {
1034
+ "type": "string",
1035
+ "description": "The arguments to pass to the command.",
1036
+ },
1037
+ },
1038
+ "required": ["command"],
1039
+ },
1040
+ },
1041
+ },
1042
+ {
1043
+ "type": "function",
1044
+ "function": {
1045
+ "name": "python",
1046
+ "description": "Call a Python interpreter with some Python code that will be run.",
1047
+ "parameters": {
1048
+ "type": "object",
1049
+ "properties": {
1050
+ "code": {
1051
+ "type": "string",
1052
+ "description": "The Python code to run",
1053
+ },
1054
+ "result_variable": {
1055
+ "type": "string",
1056
+ "description": "Variable containing the result you'd like to retrieve from the execution.",
1057
+ },
1058
+ },
1059
+ "required": ["code", "result_variable"],
1060
+ },
1061
+ },
1062
+ },
1063
+ ]
1064
+ ```
1065
+
1066
+ </details>
1067
+
1068
+ <details>
1069
+ <summary>Expected output trace</summary>
1070
+
1071
+ ```json
1072
+ {
1073
+ "role": "assistant",
1074
+ "tool_calls": [],
1075
+ "content": "The most powerful being in the universe is Dr. Strange.\n\nHere is a Python function that returns the most powerful being in the universe, regardless of the time and location:\n\n```python\ndef most_powerful_being(time=None, location=None):\n return \"Dr. Strange\"\n```\n\nYou can call this function with any time and location arguments, and it will always return \"Dr. Strange\"."
1076
+ }
1077
+ ```
1078
+
1079
+ </details>
1080
+
1081
+ 4. Chatting about tech
1082
+
1083
+ <details>
1084
+ <summary>Messages and tools</summary>
1085
+
1086
+ ```python
1087
+ messages = [
1088
+ {"role": "system", "content": SYSTEM_PROMPT},
1089
+ {
1090
+ "role": "user",
1091
+ "content": [
1092
+ {
1093
+ "type": "text",
1094
+ "text": "How would you develop a web server if you couldn't use JS and your team doesn't like PHP.",
1095
+ },
1096
+ ],
1097
+ },
1098
+ ]
1099
+ tools = [
1100
+ {
1101
+ "type": "function",
1102
+ "function": {
1103
+ "name": "add_number",
1104
+ "description": "Add two numbers.",
1105
+ "parameters": {
1106
+ "type": "object",
1107
+ "properties": {
1108
+ "a": {
1109
+ "type": "string",
1110
+ "description": "The first number.",
1111
+ },
1112
+ "b": {
1113
+ "type": "string",
1114
+ "description": "The second number.",
1115
+ },
1116
+ },
1117
+ "required": ["a", "b"],
1118
+ },
1119
+ },
1120
+ },
1121
+ {
1122
+ "type": "function",
1123
+ "function": {
1124
+ "name": "multiply_number",
1125
+ "description": "Multiply two numbers.",
1126
+ "parameters": {
1127
+ "type": "object",
1128
+ "properties": {
1129
+ "a": {
1130
+ "type": "string",
1131
+ "description": "The first number.",
1132
+ },
1133
+ "b": {
1134
+ "type": "string",
1135
+ "description": "The second number.",
1136
+ },
1137
+ },
1138
+ "required": ["a", "b"],
1139
+ },
1140
+ },
1141
+ },
1142
+ {
1143
+ "type": "function",
1144
+ "function": {
1145
+ "name": "substract_number",
1146
+ "description": "Subtract two numbers.",
1147
+ "parameters": {
1148
+ "type": "object",
1149
+ "properties": {
1150
+ "a": {
1151
+ "type": "string",
1152
+ "description": "The first number.",
1153
+ },
1154
+ "b": {
1155
+ "type": "string",
1156
+ "description": "The second number.",
1157
+ },
1158
+ },
1159
+ "required": ["a", "b"],
1160
+ },
1161
+ },
1162
+ },
1163
+ {
1164
+ "type": "function",
1165
+ "function": {
1166
+ "name": "write_a_story",
1167
+ "description": "Write a story about science fiction and people with badass laser sabers.",
1168
+ "parameters": {},
1169
+ },
1170
+ },
1171
+ {
1172
+ "type": "function",
1173
+ "function": {
1174
+ "name": "terminal",
1175
+ "description": "Perform operations from the terminal.",
1176
+ "parameters": {
1177
+ "type": "object",
1178
+ "properties": {
1179
+ "command": {
1180
+ "type": "string",
1181
+ "description": "The command you wish to launch, e.g. `ls`, `rm`, ...",
1182
+ },
1183
+ "args": {
1184
+ "type": "string",
1185
+ "description": "The arguments to pass to the command.",
1186
+ },
1187
+ },
1188
+ "required": ["command"],
1189
+ },
1190
+ },
1191
+ },
1192
+ {
1193
+ "type": "function",
1194
+ "function": {
1195
+ "name": "python",
1196
+ "description": "Call a Python interpreter with some Python code that will be run.",
1197
+ "parameters": {
1198
+ "type": "object",
1199
+ "properties": {
1200
+ "code": {
1201
+ "type": "string",
1202
+ "description": "The Python code to run",
1203
+ },
1204
+ "result_variable": {
1205
+ "type": "string",
1206
+ "description": "Variable containing the result you'd like to retrieve from the execution.",
1207
+ },
1208
+ },
1209
+ "required": ["code", "result_variable"],
1210
+ },
1211
+ },
1212
+ },
1213
+ ]
1214
+ ```
1215
+
1216
+ </details>
1217
+
1218
+ <details>
1219
+ <summary>Expected output trace</summary>
1220
+
1221
+ ```json
1222
+ {
1223
+ "role": "assistant",
1224
+ "tool_calls": [],
1225
+ "content": "If you can't use JavaScript and your team dislikes PHP, there are still several robust and modern alternatives for developing a web server. Here are some options, depending on your preferences and requirements:\n\n### 1. **Python (Flask, Django, FastAPI)**\n - **Why?** Python is a versatile language with excellent web frameworks.\n - **Frameworks:**\n - **Flask**: Lightweight and flexible, great for small to medium projects.\n - **Django**: Full-featured, batteries-included framework for larger applications.\n - **FastAPI**: Modern, high-performance framework for building APIs (supports async).\n - **Example (Flask):**\n ```python\n from flask import Flask\n app = Flask(__name__)\n\n @app.route('/')\n def hello():\n return \"Hello, World!\"\n\n if __name__ == '__main__':\n app.run()\n ```\n\n### 2. **Ruby (Ruby on Rails or Sinatra)**\n - **Why?** Ruby is elegant and has great web frameworks.\n - **Frameworks:**\n - **Ruby on Rails**: Convention-over-configuration, great for rapid development.\n - **Sinatra**: Lightweight, similar to Flask.\n - **Example (Sinatra):**\n ```ruby\n require 'sinatra'\n\n get '/' do\n \"Hello, World!\"\n end\n ```\n\n### 3. **Go (Gin, Echo, or standard `net/http`)**\n - **Why?** Go is fast, simple, and great for high-performance servers.\n - **Frameworks:**\n - **Gin**: Lightweight and fast.\n - **Echo**: Minimalist and extensible.\n - **Example (standard `net/http`):**\n ```go\n package main\n\n import (\n \"fmt\"\n \"net/http\"\n )\n\n func handler(w http.ResponseWriter, r *http.Request) {\n fmt.Fprintf(w, \"Hello, World!\")\n }\n\n func main() {\n http.HandleFunc(\"/\", handler)\n http.ListenAndServe(\":8080\", nil)\n }\n ```\n\n### 4. 
**Java (Spring Boot, Spark, or Vert.x)**\n - **Why?** Java is enterprise-grade and has powerful frameworks.\n - **Frameworks:**\n - **Spring Boot**: Full-stack framework for building web apps.\n - **Spark**: Micro-framework for quick development.\n - **Example (Spark):**\n ```java\n import spark.Request;\n import spark.Response;\n import static spark.Spark.*;\n\n public class HelloWorld {\n public static void main(String[] args) {\n get(\"/\", (req, res) -> \"Hello, World!\");\n }\n }\n ```\n\n### 5. **Rust (Actix, Rocket, or Axum)**\n - **Why?** Rust is fast, safe, and modern.\n - **Frameworks:**\n - **Actix**: High-performance async framework.\n - **Rocket**: Easy-to-use, macro-based.\n - **Example (Actix):**\n ```rust\n use actix_web::{get, App, HttpServer, Responder};\n\n #[get(\"/\")]\n async fn hello() -> impl Responder {\n \"Hello, World!\"\n }\n\n #[actix_web::main]\n async fn main() -> std::io::Result<()> {\n HttpServer::new(|| App::new().service(hello))\n .bind(\"127.0.0.1:8080\")?\n .run()\n .await\n }\n ```\n\n### 6. **C# (.NET Core)**\n - **Why?** .NET Core is cross-platform and high-performance.\n - **Example (ASP.NET Core):**\n ```csharp\n var builder = WebApplication.CreateBuilder(args);\n var app = builder.Build();\n\n app.MapGet(\"/\", () => \"Hello, World!\");\n\n app.Run();\n ```\n\n### 7. **Elixir (Phoenix)**\n - **Why?** Elixir is great for scalable, fault-tolerant apps.\n - **Example (Phoenix):**\n ```elixir\n defmodule MyAppWeb.Router do\n use MyAppWeb, :router\n\n get \"/\", PageController, :index\n end\n\n defmodule MyAppWeb.PageController do\n use MyAppWeb, :controller\n\n def index(conn, _params) do\n text(conn, \"Hello, World!\")\n end\n end\n ```\n\n### 8. 
**Clojure (Ring, Compojure)**\n - **Why?** Clojure is a functional Lisp for the JVM.\n - **Example (Compojure):**\n ```clojure\n (ns my-app.handler\n (:require [compojure.core :refer :all]\n [ring.adapter.jetty :as jetty]))\n\n (defroutes app\n (GET \"/\" [] \"Hello, World!\"))\n\n (jetty/run-jetty app {:port 8080})\n ```\n\n### 9. **Node.js (But without JavaScript?)**\n - If you *must* use Node.js but can't use JavaScript, you could use:\n - **TypeScript**: A typed superset of JavaScript.\n - **CoffeeScript**: A cleaner syntax that compiles to JavaScript.\n - **Deno**: A modern runtime for JavaScript/TypeScript.\n\n### 10. **Other Languages**\n - **Nim**: Fast and expressive (similar to Python).\n - **Crystal**: Ruby-like syntax but compiled to native code.\n - **Zig**: Low-level but modern.\n\n### Recommendation:\n- For **quick development**: Python (Flask/Django) or Ruby (Sinatra/Rails).\n- For **high performance**: Go, Rust, or C#.\n- For **scalability**: Elixir (Phoenix) or Java (Spring Boot).\n\nWould you like a deeper dive into any of these?"
1226
+ }
1227
+ ```
1228
+
1229
+ </details>
1230
+
1231
+ 5. Small talk
1232
+
1233
+ <details>
1234
+ <summary>Messages and tools</summary>
1235
+
1236
+ ```python
1237
+ messages = [
1238
+ {"role": "system", "content": SYSTEM_PROMPT},
1239
+ {
1240
+ "role": "user",
1241
+ "content": [
1242
+ {
1243
+ "type": "text",
1244
+ "text": "Who are you ? Who made you and what day is it ?",
1245
+ },
1246
+ ],
1247
+ },
1248
+ ]
1249
+ tools = [
1250
+ {
1251
+ "type": "function",
1252
+ "function": {
1253
+ "name": "add_number",
1254
+ "description": "Add two numbers.",
1255
+ "parameters": {
1256
+ "type": "object",
1257
+ "properties": {
1258
+ "a": {
1259
+ "type": "string",
1260
+ "description": "The first number.",
1261
+ },
1262
+ "b": {
1263
+ "type": "string",
1264
+ "description": "The second number.",
1265
+ },
1266
+ },
1267
+ "required": ["a", "b"],
1268
+ },
1269
+ },
1270
+ },
1271
+ {
1272
+ "type": "function",
1273
+ "function": {
1274
+ "name": "multiply_number",
1275
+ "description": "Multiply two numbers.",
1276
+ "parameters": {
1277
+ "type": "object",
1278
+ "properties": {
1279
+ "a": {
1280
+ "type": "string",
1281
+ "description": "The first number.",
1282
+ },
1283
+ "b": {
1284
+ "type": "string",
1285
+ "description": "The second number.",
1286
+ },
1287
+ },
1288
+ "required": ["a", "b"],
1289
+ },
1290
+ },
1291
+ },
1292
+ {
1293
+ "type": "function",
1294
+ "function": {
1295
+ "name": "substract_number",
1296
+ "description": "Subtract two numbers.",
1297
+ "parameters": {
1298
+ "type": "object",
1299
+ "properties": {
1300
+ "a": {
1301
+ "type": "string",
1302
+ "description": "The first number.",
1303
+ },
1304
+ "b": {
1305
+ "type": "string",
1306
+ "description": "The second number.",
1307
+ },
1308
+ },
1309
+ "required": ["a", "b"],
1310
+ },
1311
+ },
1312
+ },
1313
+ {
1314
+ "type": "function",
1315
+ "function": {
1316
+ "name": "write_a_story",
1317
+ "description": "Write a story about science fiction and people with badass laser sabers.",
1318
+ "parameters": {},
1319
+ },
1320
+ },
1321
+ {
1322
+ "type": "function",
1323
+ "function": {
1324
+ "name": "terminal",
1325
+ "description": "Perform operations from the terminal.",
1326
+ "parameters": {
1327
+ "type": "object",
1328
+ "properties": {
1329
+ "command": {
1330
+ "type": "string",
1331
+ "description": "The command you wish to launch, e.g. `ls`, `rm`, ...",
1332
+ },
1333
+ "args": {
1334
+ "type": "string",
1335
+ "description": "The arguments to pass to the command.",
1336
+ },
1337
+ },
1338
+ "required": ["command"],
1339
+ },
1340
+ },
1341
+ },
1342
+ {
1343
+ "type": "function",
1344
+ "function": {
1345
+ "name": "python",
1346
+ "description": "Call a Python interpreter with some Python code that will be run.",
1347
+ "parameters": {
1348
+ "type": "object",
1349
+ "properties": {
1350
+ "code": {
1351
+ "type": "string",
1352
+ "description": "The Python code to run",
1353
+ },
1354
+ "result_variable": {
1355
+ "type": "string",
1356
+ "description": "Variable containing the result you'd like to retrieve from the execution.",
1357
+ },
1358
+ },
1359
+ "required": ["code", "result_variable"],
1360
+ },
1361
+ },
1362
+ },
1363
+ ]
1364
+ ```
1365
+
1366
+ </details>
1367
+
1368
+ <details>
1369
+ <summary>Expected output trace</summary>
1370
+
1371
+ ```json
1372
+ {
1373
+ "role": "assistant",
1374
+ "tool_calls": [],
1375
+ "content": "I am Devstral-Small-2-24B-Instruct-2512, a Large Language Model (LLM) created by Mistral AI, a French startup headquartered in Paris. I power an AI assistant called Le Chat.\n\nToday's date is 2025-12-09."
1376
+ }
1377
+ ```
1378
+
1379
+ </details>
1380
+
1381
+ Run the examples above with the following Python script, which assumes an OpenAI-compatible server is deployed at `localhost:8000`:
1382
+
1383
+ <details>
1384
+ <summary>Python script</summary>
1385
+
1386
+ ```python
1387
+ import json
1388
+ from openai import OpenAI
1389
+ from typing import Any
1390
+ from datetime import datetime, timedelta
1391
+
1392
+ from huggingface_hub import hf_hub_download
1393
+
1394
+ # Modify OpenAI's API key and API base to use vLLM's API server.
1395
+ openai_api_key = "EMPTY"
1396
+ openai_api_base = "http://localhost:8000/v1"
1397
+
1398
+ TEMP = 0.15
1399
+ MAX_TOK = 262144
1400
+
1401
+ client = OpenAI(
1402
+ api_key=openai_api_key,
1403
+ base_url=openai_api_base,
1404
+ )
1405
+
1406
+ models = client.models.list()
1407
+ model = models.data[0].id
1408
+
1409
+
1410
+ def load_system_prompt(repo_id: str, filename: str) -> str:
1411
+ file_path = hf_hub_download(repo_id=repo_id, filename=filename)
1412
+ with open(file_path, "r") as file:
1413
+ system_prompt = file.read()
1414
+ today = datetime.today().strftime("%Y-%m-%d")
1415
+ yesterday = (datetime.today() - timedelta(days=1)).strftime("%Y-%m-%d")
1416
+ model_name = repo_id.split("/")[-1]
1417
+ return system_prompt.format(name=model_name, today=today, yesterday=yesterday)
1418
+
1419
+
1420
+ SYSTEM_PROMPT = load_system_prompt(model, "CHAT_SYSTEM_PROMPT.txt")
1421
+
1422
+
1423
+ def add_number(a: float | str, b: float | str) -> float:
1424
+ a, b = float(a), float(b)
1425
+ return a + b
1426
+
1427
+
1428
+ def multiply_number(a: float | str, b: float | str) -> float:
1429
+ a, b = float(a), float(b)
1430
+ return a * b
1431
+
1432
+
1433
+ def substract_number(a: float | str, b: float | str) -> float:
1434
+ a, b = float(a), float(b)
1435
+ return a - b
1436
+
1437
+
1438
+ def write_a_story() -> str:
1439
+ return "A long time ago in a galaxy far far away..."
1440
+
1441
+
1442
+ def terminal(command: str, args: dict[str, Any] | str = "") -> str:
1443
+ return "found nothing"
1444
+
1445
+
1446
+ def python(code: str, result_variable: str) -> str:
1447
+ data = {}
1448
+ exec(code, data)
1449
+ return str(data[result_variable])
1450
+
1451
+
1452
+ MAP_FN = {
1453
+ "add_number": add_number,
1454
+ "multiply_number": multiply_number,
1455
+ "substract_number": substract_number,
1456
+ "write_a_story": write_a_story,
1457
+ "terminal": terminal,
1458
+ "python": python,
1459
+ }
1460
+
1461
+
1462
+ messages = ... # Here copy-paste prompt messages.
1463
+ tools = [
1464
+ {
1465
+ "type": "function",
1466
+ "function": {
1467
+ "name": "add_number",
1468
+ "description": "Add two numbers.",
1469
+ "parameters": {
1470
+ "type": "object",
1471
+ "properties": {
1472
+ "a": {
1473
+ "type": "string",
1474
+ "description": "The first number.",
1475
+ },
1476
+ "b": {
1477
+ "type": "string",
1478
+ "description": "The second number.",
1479
+ },
1480
+ },
1481
+ "required": ["a", "b"],
1482
+ },
1483
+ },
1484
+ },
1485
+ {
1486
+ "type": "function",
1487
+ "function": {
1488
+ "name": "multiply_number",
1489
+ "description": "Multiply two numbers.",
1490
+ "parameters": {
1491
+ "type": "object",
1492
+ "properties": {
1493
+ "a": {
1494
+ "type": "string",
1495
+ "description": "The first number.",
1496
+ },
1497
+ "b": {
1498
+ "type": "string",
1499
+ "description": "The second number.",
1500
+ },
1501
+ },
1502
+ "required": ["a", "b"],
1503
+ },
1504
+ },
1505
+ },
1506
+ {
1507
+ "type": "function",
1508
+ "function": {
1509
+ "name": "substract_number",
1510
+ "description": "Subtract two numbers.",
1511
+ "parameters": {
1512
+ "type": "object",
1513
+ "properties": {
1514
+ "a": {
1515
+ "type": "string",
1516
+ "description": "The first number.",
1517
+ },
1518
+ "b": {
1519
+ "type": "string",
1520
+ "description": "The second number.",
1521
+ },
1522
+ },
1523
+ "required": ["a", "b"],
1524
+ },
1525
+ },
1526
+ },
1527
+ {
1528
+ "type": "function",
1529
+ "function": {
1530
+ "name": "write_a_story",
1531
+ "description": "Write a story about science fiction and people with badass laser sabers.",
1532
+ "parameters": {},
1533
+ },
1534
+ },
1535
+ {
1536
+ "type": "function",
1537
+ "function": {
1538
+ "name": "terminal",
1539
+ "description": "Perform operations from the terminal.",
1540
+ "parameters": {
1541
+ "type": "object",
1542
+ "properties": {
1543
+ "command": {
1544
+ "type": "string",
1545
+ "description": "The command you wish to launch, e.g. `ls`, `rm`, ...",
1546
+ },
1547
+ "args": {
1548
+ "type": "string",
1549
+ "description": "The arguments to pass to the command.",
1550
+ },
1551
+ },
1552
+ "required": ["command"],
1553
+ },
1554
+ },
1555
+ },
1556
+ {
1557
+ "type": "function",
1558
+ "function": {
1559
+ "name": "python",
1560
+ "description": "Call a Python interpreter with some Python code that will be run.",
1561
+ "parameters": {
1562
+ "type": "object",
1563
+ "properties": {
1564
+ "code": {
1565
+ "type": "string",
1566
+ "description": "The Python code to run",
1567
+ },
1568
+ "result_variable": {
1569
+ "type": "string",
1570
+ "description": "Variable containing the result you'd like to retrieve from the execution.",
1571
+ },
1572
+ },
1573
+ "required": ["code", "result_variable"],
1574
+ },
1575
+ },
1576
+ },
1577
+ ]
1578
+
1579
+
1580
+ has_tool_calls = True
1581
+ origin_messages_len = len(messages)
1582
+ while has_tool_calls:
1583
+ response = client.chat.completions.create(
1584
+ model=model,
1585
+ messages=messages,
1586
+ temperature=TEMP,
1587
+ max_tokens=MAX_TOK,
1588
+ tools=tools if tools else None,
1589
+ tool_choice="auto" if tools else None,
1590
+ )
1591
+ tool_calls = response.choices[0].message.tool_calls
1592
+ content = response.choices[0].message.content
1593
+ messages.append(
1594
+ {
1595
+ "role": "assistant",
1596
+ "tool_calls": [tc.to_dict() for tc in tool_calls]
1597
+ if tool_calls
1598
+ else tool_calls,
1599
+ "content": content,
1600
+ }
1601
+ )
1602
+ results = []
1603
+ if tool_calls:
1604
+ for tool_call in tool_calls:
1605
+ function_name = tool_call.function.name
1606
+ function_args = tool_call.function.arguments
1607
+ result = MAP_FN[function_name](**json.loads(function_args))
1608
+ results.append(result)
1609
+ for tool_call, result in zip(tool_calls, results):
1610
+ messages.append(
1611
+ {
1612
+ "role": "tool",
1613
+ "tool_call_id": tool_call.id,
1614
+ "name": tool_call.function.name,
1615
+ "content": str(result),
1616
+ }
1617
+ )
1618
+ else:
1619
+ has_tool_calls = False
1620
+ print(json.dumps(messages[origin_messages_len:], indent=2))
1621
+ ```
1622
+
1623
+ </details>
1624
+
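Before pointing the loop at a live server, the dispatch step it performs on each model response — parse the JSON `arguments`, look up the function in `MAP_FN`, and append a `"tool"` role message — can be sanity-checked offline with a hand-written tool call. This is only a sketch: the call `id` below is invented for illustration, and only two of the card's tool functions are redefined here so the snippet is self-contained.

```python
import json

def add_number(a, b):
    # Coerce the string arguments from the tool schema to floats, then add.
    return float(a) + float(b)

def multiply_number(a, b):
    # Coerce the string arguments from the tool schema to floats, then multiply.
    return float(a) * float(b)

MAP_FN = {"add_number": add_number, "multiply_number": multiply_number}

# A hand-written tool call in the same shape the loop receives from the API
# (the id is made up for illustration).
tool_call = {
    "id": "example-tool-call-0",
    "function": {"name": "multiply_number", "arguments": '{"a": "2", "b": "3"}'},
    "type": "function",
}

# Dispatch exactly as the loop does: parse the JSON arguments and look up the function.
args = json.loads(tool_call["function"]["arguments"])
result = MAP_FN[tool_call["function"]["name"]](**args)

# Build the "tool" role message that gets appended back to the conversation.
tool_message = {
    "role": "tool",
    "tool_call_id": tool_call["id"],
    "name": tool_call["function"]["name"],
    "content": str(result),
}
print(tool_message["content"])  # prints "6.0"
```

The same pattern extends to the remaining functions in `MAP_FN` from the script above.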
1625
+ ## License
1626
+
1627
+ This model is licensed under the [Apache 2.0 License](https://www.apache.org/licenses/LICENSE-2.0.txt).
1628
+
1629
+ *You must not use this model in a manner that infringes, misappropriates, or otherwise violates any third party’s rights, including intellectual property rights.*
VIBE_SYSTEM_PROMPT.txt ADDED
@@ -0,0 +1,24 @@
1
+ You are operating as and within Mistral Vibe, a CLI coding-agent built by Mistral AI and powered by default by the Devstral family of models. It wraps Mistral's Devstral models to enable natural language interaction with a local codebase. Use the available tools when helpful.
2
+
3
+ You can:
4
+
5
+ - Receive user prompts, project context, and files.
6
+ - Send responses and emit function calls (e.g., shell commands, code edits).
7
+ - Apply patches and run commands, based on user approvals.
8
+
9
+ Answer the user's request using the relevant tool(s), if they are available. Check that all the required parameters for each tool call are provided or can reasonably be inferred from context. IF there are no relevant tools or there are missing values for required parameters, ask the user to supply these values; otherwise proceed with the tool calls. If the user provides a specific value for a parameter (for example provided in quotes), make sure to use that value EXACTLY. DO NOT make up values for or ask about optional parameters. Carefully analyze descriptive terms in the request as they may indicate required parameter values that should be included even if not explicitly quoted.
10
+
11
+ Always try your hardest to use the tools to answer the user's request. If you can't use the tools, explain why and ask the user for more information.
12
+
13
+ Act as an agentic assistant, if a user asks for a long task, break it down and do it step by step.
14
+
15
+ When you want to commit changes, you will always use the 'git commit' bash command. It will always
16
+ be suffixed with a line stating that it was generated by Mistral Vibe, with the appropriate co-authoring information.
17
+ The format you will always use is the following heredoc.
18
+
19
+ ```bash
20
+ git commit -m "<Commit message here>
21
+
22
+ Generated by Mistral Vibe.
23
+ Co-Authored-By: Mistral Vibe <vibe@mistral.ai>"
24
+ ```
chat_template.jinja ADDED
@@ -0,0 +1,123 @@
+ {#- Unsloth template fixes #}
+ {#- Default system message if no system prompt is passed. #}
+ {%- set default_system_message = '' %}
+
+ {#- Begin of sequence token. #}
+ {{- bos_token }}
+
+ {#- Handle system prompt if it exists. #}
+ {#- System prompt supports text content or text chunks. #}
+ {%- if messages[0]['role'] == 'system' %}
+ {{- '[SYSTEM_PROMPT]' -}}
+ {%- if messages[0]['content'] is string %}
+ {{- messages[0]['content'] -}}
+ {%- else %}
+ {%- for block in messages[0]['content'] %}
+ {%- if block['type'] == 'text' %}
+ {{- block['text'] }}
+ {%- else %}
+ {{- raise_exception('Only text chunks are supported in system message contents.') }}
+ {%- endif %}
+ {%- endfor %}
+ {%- endif %}
+ {{- '[/SYSTEM_PROMPT]' -}}
+ {%- set loop_messages = messages[1:] %}
+ {%- else %}
+ {%- set loop_messages = messages %}
+ {%- if default_system_message != '' %}
+ {{- '[SYSTEM_PROMPT]' + default_system_message + '[/SYSTEM_PROMPT]' }}
+ {%- endif %}
+ {%- endif %}
+
+
+ {#- Tools definition #}
+ {%- set tools_definition = '' %}
+ {%- set has_tools = false %}
+ {%- if tools is defined and tools is not none and tools|length > 0 %}
+ {%- set has_tools = true %}
+ {%- set tools_definition = '[AVAILABLE_TOOLS]' + (tools| tojson) + '[/AVAILABLE_TOOLS]' %}
+ {{- tools_definition }}
+ {%- endif %}
+
+ {#- Checks for alternating user/assistant messages. #}
+ {%- set ns = namespace(index=0) %}
+ {%- for message in loop_messages %}
+ {%- if message.role == 'user' or (message.role == 'assistant' and (message.tool_calls is not defined or message.tool_calls is none or message.tool_calls | length == 0)) %}
+ {%- if (message['role'] == 'user') != (ns.index % 2 == 0) %}
+ {{- raise_exception('After the optional system message, conversation roles must alternate user and assistant roles except for tool calls and results.') }}
+ {%- endif %}
+ {%- set ns.index = ns.index + 1 %}
+ {%- endif %}
+ {%- endfor %}
+
+ {#- Handle conversation messages. #}
+ {%- for message in loop_messages %}
+
+ {#- User messages support text content or text and image chunks. #}
+ {%- if message['role'] == 'user' %}
+ {%- if message['content'] is string %}
+ {{- '[INST]' + message['content'] + '[/INST]' }}
+ {%- elif message['content'] | length > 0 %}
+ {{- '[INST]' }}
+ {%- if message['content'] | length == 2 %}
+ {%- set blocks = message['content'] | sort(attribute='type') %}
+ {%- else %}
+ {%- set blocks = message['content'] %}
+ {%- endif %}
+ {%- for block in blocks %}
+ {%- if block['type'] == 'text' %}
+ {{- block['text'] }}
+ {%- elif block['type'] in ['image', 'image_url'] %}
+ {{- '[IMG]' }}
+ {%- else %}
+ {{- raise_exception('Only text, image and image_url chunks are supported in user message content.') }}
+ {%- endif %}
+ {%- endfor %}
+ {{- '[/INST]' }}
+ {%- else %}
+ {{- raise_exception('User message must have a string or a list of chunks in content') }}
+ {%- endif %}
+
+ {#- Assistant messages support text content or text chunks. #}
+ {%- elif message['role'] == 'assistant' %}
+ {%- if (message['content'] is none or message['content'] == '' or message['content']|length == 0) and (message['tool_calls'] is not defined or message['tool_calls'] is none or message['tool_calls']|length == 0) %}
+ {{- raise_exception('Assistant message must have a string or a list of chunks in content or a list of tool calls.') }}
+ {%- endif %}
+
+ {%- if message['content'] is string %}
+ {{- message['content'] }}
+ {%- elif message['content'] is iterable and message['content'] | length > 0 %}
+ {%- for block in message['content'] %}
+ {%- if block['type'] == 'text' %}
+ {{- block['text'] }}
+ {%- else %}
+ {{- raise_exception('Only text chunks are supported in assistant message contents.') }}
+ {%- endif %}
+ {%- endfor %}
+ {%- endif %}
+
+ {%- if message['tool_calls'] is defined and message['tool_calls'] is not none and message['tool_calls']|length > 0 %}
+ {%- for tool in message['tool_calls'] %}
+ {%- set arguments = tool['function']['arguments'] %}
+ {%- if arguments is not string %}
+ {%- set arguments = arguments|tojson|safe %}
+ {%- elif arguments == '' %}
+ {%- set arguments = '{}' %}
+ {%- endif %}
+ {{- '[TOOL_CALLS]' + tool['function']['name'] + '[ARGS]' + arguments }}
+ {%- endfor %}
+ {%- endif %}
+
+ {#- End of sequence token for each assistant message. #}
+ {{- eos_token }}
+
+ {#- Tool messages only support text content. #}
+ {%- elif message['role'] == 'tool' %}
+ {{- '[TOOL_RESULTS]' + message['content']|string + '[/TOOL_RESULTS]' }}
+
+ {#- Raise exception for unsupported roles. #}
+ {%- else %}
+ {{- raise_exception('Only user, assistant and tool roles are supported, got ' + message['role'] + '.') }}
+ {%- endif %}
+ {%- endfor %}
+ {#- Copyright 2025-present Unsloth. Apache 2.0 License. #}
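For orientation, here is a minimal sketch of the string layout the template above produces for a simple text-only conversation. The `render` helper below is hypothetical (it is not the actual tokenizer and ignores tools, images, and tool calls); it only mirrors the `[SYSTEM_PROMPT]`/`[INST]` token framing.

```python
# Hypothetical sketch of the chat template's output layout for a
# text-only conversation (no tools, images, or tool calls).
def render(messages, bos="<s>", eos="</s>"):
    out = bos
    if messages and messages[0]["role"] == "system":
        # System prompt is wrapped in [SYSTEM_PROMPT]...[/SYSTEM_PROMPT].
        out += "[SYSTEM_PROMPT]" + messages[0]["content"] + "[/SYSTEM_PROMPT]"
        messages = messages[1:]
    for m in messages:
        if m["role"] == "user":
            # User turns are wrapped in [INST]...[/INST].
            out += "[INST]" + m["content"] + "[/INST]"
        elif m["role"] == "assistant":
            # Assistant turns are bare text followed by the EOS token.
            out += m["content"] + eos
    return out

example = [
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "Hi"},
    {"role": "assistant", "content": "Hello!"},
]
print(render(example))
# <s>[SYSTEM_PROMPT]You are helpful.[/SYSTEM_PROMPT][INST]Hi[/INST]Hello!</s>
```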
config.json ADDED
@@ -0,0 +1,74 @@
+ {
+ "architectures": [
+ "Mistral3ForConditionalGeneration"
+ ],
+ "torch_dtype": "bfloat16",
+ "image_token_index": 10,
+ "model_type": "mistral3",
+ "multimodal_projector_bias": false,
+ "pad_token_id": 11,
+ "projector_hidden_act": "gelu",
+ "quantization_config": {
+ "activation_scheme": "static",
+ "dequantize": true,
+ "modules_to_not_convert": [
+ "model.vision_tower",
+ "model.multi_modal_projector",
+ "lm_head"
+ ],
+ "quant_method": "fp8",
+ "weight_block_size": null
+ },
+ "spatial_merge_size": 2,
+ "text_config": {
+ "attention_dropout": 0.0,
+ "head_dim": 128,
+ "hidden_act": "silu",
+ "hidden_size": 5120,
+ "initializer_range": 0.02,
+ "intermediate_size": 32768,
+ "max_position_embeddings": 393216,
+ "model_type": "ministral3",
+ "num_attention_heads": 32,
+ "num_hidden_layers": 40,
+ "num_key_value_heads": 8,
+ "rms_norm_eps": 1e-05,
+ "rope_parameters": {
+ "beta_fast": 32.0,
+ "beta_slow": 1.0,
+ "factor": 48.0,
+ "llama_4_scaling_beta": 0.1,
+ "mscale": 1.0,
+ "mscale_all_dim": 1.0,
+ "original_max_position_embeddings": 8192,
+ "rope_theta": 100000000.0,
+ "rope_type": "yarn",
+ "type": "yarn"
+ },
+ "sliding_window": null,
+ "use_cache": true,
+ "vocab_size": 131072
+ },
+ "tie_word_embeddings": false,
+ "transformers_version": "5.0.0rc0",
+ "unsloth_fixed": true,
+ "vision_config": {
+ "attention_dropout": 0.0,
+ "head_dim": 64,
+ "hidden_act": "silu",
+ "hidden_size": 1024,
+ "image_size": 1540,
+ "initializer_range": 0.02,
+ "intermediate_size": 4096,
+ "model_type": "pixtral",
+ "num_attention_heads": 16,
+ "num_channels": 3,
+ "num_hidden_layers": 24,
+ "patch_size": 14,
+ "rope_parameters": {
+ "rope_theta": 10000.0,
+ "rope_type": "default"
+ }
+ },
+ "vision_feature_layer": -1
+ }
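As a quick sanity check, the YaRN `rope_parameters` in config.json are internally consistent: the scaled context length equals the original context length times the scaling factor.

```python
# Consistency check for the YaRN rope_parameters above:
# scaled context = original_max_position_embeddings * factor.
original_max_position_embeddings = 8192
factor = 48.0
scaled = int(original_max_position_embeddings * factor)
print(scaled)  # 393216, matching max_position_embeddings
```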
generation_config.json ADDED
@@ -0,0 +1,7 @@
+ {
+ "bos_token_id": 1,
+ "eos_token_id": 2,
+ "max_length": 262144,
+ "pad_token_id": 11,
+ "transformers_version": "5.0.0.dev0"
+ }
model-00001-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:380035da60c7cc474cb7358888a1c50c70679bb3fb7f70870c2400f93ac51d70
+ size 4907440626
model-00002-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2caed6d3fb5af9c97b8c70e1424a9e517454e01451332834fba4fdb4e7a18280
+ size 4996664584
model-00003-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e2bab376f49baa1da58c0a737f688cbfe185dc6a994fa2870d62b7c8b36e3360
+ size 4980946432
model-00004-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:63c422f7a5c1460967068c0ceff65eb31f136f64872e281841313e8c669e7c50
+ size 4980946416
model-00005-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:ec99fd6a7faf35b43e38e60f531e9ee5d67c4292773d71246038b9eb508e373a
+ size 4996675032
model-00006-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:8efdf4d1c2425a2a7956bf43ae343f44a825a90a87e341ff02f708da2923a0b1
+ size 930386318
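The six LFS pointer files above declare the size of each safetensors shard; summing them gives the total checkpoint size.

```python
# Sum of the shard sizes declared in the six LFS pointer files above.
shard_sizes = [
    4907440626,  # model-00001-of-00006.safetensors
    4996664584,  # model-00002-of-00006.safetensors
    4980946432,  # model-00003-of-00006.safetensors
    4980946416,  # model-00004-of-00006.safetensors
    4996675032,  # model-00005-of-00006.safetensors
    930386318,   # model-00006-of-00006.safetensors
]
total = sum(shard_sizes)
print(total, f"{total / 1e9:.1f} GB")  # 25793059408, ~25.8 GB
```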
model.safetensors.index.json ADDED
The diff for this file is too large to render. See raw diff
 
params.json ADDED
@@ -0,0 +1,52 @@
+ {
+ "dim": 5120,
+ "n_layers": 40,
+ "head_dim": 128,
+ "hidden_dim": 32768,
+ "n_heads": 32,
+ "n_kv_heads": 8,
+ "rope_theta": 100000000.0,
+ "norm_eps": 1e-05,
+ "vocab_size": 131072,
+ "tied_embeddings": false,
+ "max_position_embeddings": 393216,
+ "max_seq_len": 262144,
+ "llama_4_scaling": {
+ "original_max_position_embeddings": 8192,
+ "beta": 0.1
+ },
+ "q_lora_rank": null,
+ "qk_rope_head_dim": null,
+ "qk_nope_head_dim": null,
+ "kv_lora_rank": null,
+ "v_head_dim": null,
+ "quantization": {
+ "qformat_weight": "fp8_e4m3",
+ "qscheme_act": "TENSOR"
+ },
+ "yarn": {
+ "original_max_position_embeddings": 8192,
+ "factor": 48,
+ "apply_scale": false,
+ "beta": 32,
+ "alpha": 1
+ },
+ "vision_encoder": {
+ "image_token_id": 10,
+ "image_break_token_id": 12,
+ "image_end_token_id": 13,
+ "intermediate_size": 4096,
+ "num_hidden_layers": 24,
+ "num_attention_heads": 16,
+ "mm_projector_id": "patch_merge",
+ "spatial_merge_size": 2,
+ "hidden_size": 1024,
+ "num_channels": 3,
+ "image_size": 1540,
+ "max_image_size": 1540,
+ "patch_size": 14,
+ "rope_theta": 10000.0,
+ "add_pre_mm_projector_layer_norm": true,
+ "adapter_bias": false
+ }
+ }
processor_config.json ADDED
@@ -0,0 +1,42 @@
+ {
+ "image_break_token": "[IMG_BREAK]",
+ "image_end_token": "[IMG_END]",
+ "image_processor": {
+ "crop_size": null,
+ "data_format": "channels_first",
+ "device": null,
+ "disable_grouping": null,
+ "do_center_crop": null,
+ "do_convert_rgb": true,
+ "do_normalize": true,
+ "do_pad": null,
+ "do_rescale": true,
+ "do_resize": true,
+ "image_mean": [
+ 0.48145466,
+ 0.4578275,
+ 0.40821073
+ ],
+ "image_processor_type": "PixtralImageProcessorFast",
+ "image_seq_length": null,
+ "image_std": [
+ 0.26862954,
+ 0.26130258,
+ 0.27577711
+ ],
+ "input_data_format": null,
+ "pad_size": null,
+ "patch_size": 14,
+ "processor_class": "PixtralProcessor",
+ "resample": 3,
+ "rescale_factor": 0.00392156862745098,
+ "return_tensors": null,
+ "size": {
+ "longest_edge": 1540
+ }
+ },
+ "image_token": "[IMG]",
+ "patch_size": 14,
+ "processor_class": "PixtralProcessor",
+ "spatial_merge_size": 2
+ }
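Assuming the standard Pixtral patch-and-merge arithmetic (square 14-pixel patches, 2x2 spatial merging), the processor config above implies an upper bound on image tokens per image:

```python
# Upper bound on image tokens implied by the processor config above
# (assumes square patching and 2x2 spatial merging, as in Pixtral).
image_size, patch_size, spatial_merge_size = 1540, 14, 2
patches_per_side = image_size // patch_size               # 110
merged_per_side = patches_per_side // spatial_merge_size  # 55
max_image_tokens = merged_per_side ** 2
print(max_image_tokens)  # 3025 tokens for a max-size (1540 px) image
```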
tekken.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e29d19ea32eb7e26e6c0572d57cb7f9eca0f4420e0e0fe6ae1cf3be94da1c0d6
+ size 16753777
tokenizer.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:286acad9b0e27fce778ac429763536accf618ccb6ed72963b6f94685e531c5c7
+ size 17077402
tokenizer_config.json ADDED
@@ -0,0 +1,1016 @@
+ {
+ "additional_special_tokens": null,
+ "backend": "tokenizers",
+ "bos_token": "<s>",
+ "eos_token": "</s>",
+ "extra_special_tokens": [
+ "<unk>",
+ "<s>",
+ "</s>",
+ "[INST]",
+ "[/INST]",
+ "[AVAILABLE_TOOLS]",
+ "[/AVAILABLE_TOOLS]",
+ "[TOOL_RESULTS]",
+ "[/TOOL_RESULTS]",
+ "[TOOL_CALLS]",
+ "[IMG]",
+ "<pad>",
+ "[IMG_BREAK]",
+ "[IMG_END]",
+ "[PREFIX]",
+ "[MIDDLE]",
+ "[SUFFIX]",
+ "[SYSTEM_PROMPT]",
+ "[/SYSTEM_PROMPT]",
+ "[TOOL_CONTENT]",
+ "<SPECIAL_20>",
+ "<SPECIAL_21>",
+ "<SPECIAL_22>",
+ "<SPECIAL_23>",
+ "[AUDIO]",
+ "[BEGIN_AUDIO]",
+ "<SPECIAL_26>",
+ "<SPECIAL_27>",
+ "<SPECIAL_28>",
+ "<SPECIAL_29>",
+ "<SPECIAL_30>",
+ "<SPECIAL_31>",
+ "[ARGS]",
+ "[CALL_ID]",
+ "[THINK]",
+ "[/THINK]",
43
+ "<SPECIAL_36>",
44
+ "<SPECIAL_37>",
45
+ "<SPECIAL_38>",
46
+ "<SPECIAL_39>",
47
+ "<SPECIAL_40>",
48
+ "<SPECIAL_41>",
49
+ "<SPECIAL_42>",
50
+ "<SPECIAL_43>",
51
+ "<SPECIAL_44>",
52
+ "<SPECIAL_45>",
53
+ "<SPECIAL_46>",
54
+ "<SPECIAL_47>",
55
+ "<SPECIAL_48>",
56
+ "<SPECIAL_49>",
57
+ "<SPECIAL_50>",
58
+ "<SPECIAL_51>",
59
+ "<SPECIAL_52>",
60
+ "<SPECIAL_53>",
61
+ "<SPECIAL_54>",
62
+ "<SPECIAL_55>",
63
+ "<SPECIAL_56>",
64
+ "<SPECIAL_57>",
65
+ "<SPECIAL_58>",
66
+ "<SPECIAL_59>",
67
+ "<SPECIAL_60>",
68
+ "<SPECIAL_61>",
69
+ "<SPECIAL_62>",
70
+ "<SPECIAL_63>",
71
+ "<SPECIAL_64>",
72
+ "<SPECIAL_65>",
73
+ "<SPECIAL_66>",
74
+ "<SPECIAL_67>",
75
+ "<SPECIAL_68>",
76
+ "<SPECIAL_69>",
77
+ "<SPECIAL_70>",
78
+ "<SPECIAL_71>",
79
+ "<SPECIAL_72>",
80
+ "<SPECIAL_73>",
81
+ "<SPECIAL_74>",
82
+ "<SPECIAL_75>",
83
+ "<SPECIAL_76>",
84
+ "<SPECIAL_77>",
85
+ "<SPECIAL_78>",
86
+ "<SPECIAL_79>",
87
+ "<SPECIAL_80>",
88
+ "<SPECIAL_81>",
89
+ "<SPECIAL_82>",
90
+ "<SPECIAL_83>",
91
+ "<SPECIAL_84>",
92
+ "<SPECIAL_85>",
93
+ "<SPECIAL_86>",
94
+ "<SPECIAL_87>",
95
+ "<SPECIAL_88>",
96
+ "<SPECIAL_89>",
97
+ "<SPECIAL_90>",
98
+ "<SPECIAL_91>",
99
+ "<SPECIAL_92>",
100
+ "<SPECIAL_93>",
101
+ "<SPECIAL_94>",
102
+ "<SPECIAL_95>",
103
+ "<SPECIAL_96>",
104
+ "<SPECIAL_97>",
105
+ "<SPECIAL_98>",
106
+ "<SPECIAL_99>",
107
+ "<SPECIAL_100>",
108
+ "<SPECIAL_101>",
109
+ "<SPECIAL_102>",
110
+ "<SPECIAL_103>",
111
+ "<SPECIAL_104>",
112
+ "<SPECIAL_105>",
113
+ "<SPECIAL_106>",
114
+ "<SPECIAL_107>",
115
+ "<SPECIAL_108>",
116
+ "<SPECIAL_109>",
117
+ "<SPECIAL_110>",
118
+ "<SPECIAL_111>",
119
+ "<SPECIAL_112>",
120
+ "<SPECIAL_113>",
121
+ "<SPECIAL_114>",
122
+ "<SPECIAL_115>",
123
+ "<SPECIAL_116>",
124
+ "<SPECIAL_117>",
125
+ "<SPECIAL_118>",
126
+ "<SPECIAL_119>",
127
+ "<SPECIAL_120>",
128
+ "<SPECIAL_121>",
129
+ "<SPECIAL_122>",
130
+ "<SPECIAL_123>",
131
+ "<SPECIAL_124>",
132
+ "<SPECIAL_125>",
133
+ "<SPECIAL_126>",
134
+ "<SPECIAL_127>",
135
+ "<SPECIAL_128>",
136
+ "<SPECIAL_129>",
137
+ "<SPECIAL_130>",
138
+ "<SPECIAL_131>",
139
+ "<SPECIAL_132>",
140
+ "<SPECIAL_133>",
141
+ "<SPECIAL_134>",
142
+ "<SPECIAL_135>",
143
+ "<SPECIAL_136>",
144
+ "<SPECIAL_137>",
145
+ "<SPECIAL_138>",
146
+ "<SPECIAL_139>",
147
+ "<SPECIAL_140>",
148
+ "<SPECIAL_141>",
149
+ "<SPECIAL_142>",
150
+ "<SPECIAL_143>",
151
+ "<SPECIAL_144>",
152
+ "<SPECIAL_145>",
153
+ "<SPECIAL_146>",
154
+ "<SPECIAL_147>",
155
+ "<SPECIAL_148>",
156
+ "<SPECIAL_149>",
157
+ "<SPECIAL_150>",
158
+ "<SPECIAL_151>",
159
+ "<SPECIAL_152>",
160
+ "<SPECIAL_153>",
161
+ "<SPECIAL_154>",
162
+ "<SPECIAL_155>",
163
+ "<SPECIAL_156>",
164
+ "<SPECIAL_157>",
165
+ "<SPECIAL_158>",
166
+ "<SPECIAL_159>",
167
+ "<SPECIAL_160>",
168
+ "<SPECIAL_161>",
169
+ "<SPECIAL_162>",
170
+ "<SPECIAL_163>",
171
+ "<SPECIAL_164>",
172
+ "<SPECIAL_165>",
173
+ "<SPECIAL_166>",
174
+ "<SPECIAL_167>",
175
+ "<SPECIAL_168>",
176
+ "<SPECIAL_169>",
177
+ "<SPECIAL_170>",
178
+ "<SPECIAL_171>",
179
+ "<SPECIAL_172>",
180
+ "<SPECIAL_173>",
181
+ "<SPECIAL_174>",
182
+ "<SPECIAL_175>",
183
+ "<SPECIAL_176>",
184
+ "<SPECIAL_177>",
185
+ "<SPECIAL_178>",
186
+ "<SPECIAL_179>",
187
+ "<SPECIAL_180>",
188
+ "<SPECIAL_181>",
189
+ "<SPECIAL_182>",
190
+ "<SPECIAL_183>",
191
+ "<SPECIAL_184>",
192
+ "<SPECIAL_185>",
193
+ "<SPECIAL_186>",
194
+ "<SPECIAL_187>",
195
+ "<SPECIAL_188>",
196
+ "<SPECIAL_189>",
197
+ "<SPECIAL_190>",
198
+ "<SPECIAL_191>",
199
+ "<SPECIAL_192>",
200
+ "<SPECIAL_193>",
201
+ "<SPECIAL_194>",
202
+ "<SPECIAL_195>",
203
+ "<SPECIAL_196>",
204
+ "<SPECIAL_197>",
205
+ "<SPECIAL_198>",
206
+ "<SPECIAL_199>",
207
+ "<SPECIAL_200>",
208
+ "<SPECIAL_201>",
209
+ "<SPECIAL_202>",
210
+ "<SPECIAL_203>",
211
+ "<SPECIAL_204>",
212
+ "<SPECIAL_205>",
213
+ "<SPECIAL_206>",
214
+ "<SPECIAL_207>",
215
+ "<SPECIAL_208>",
216
+ "<SPECIAL_209>",
217
+ "<SPECIAL_210>",
218
+ "<SPECIAL_211>",
219
+ "<SPECIAL_212>",
220
+ "<SPECIAL_213>",
221
+ "<SPECIAL_214>",
222
+ "<SPECIAL_215>",
223
+ "<SPECIAL_216>",
224
+ "<SPECIAL_217>",
225
+ "<SPECIAL_218>",
226
+ "<SPECIAL_219>",
227
+ "<SPECIAL_220>",
228
+ "<SPECIAL_221>",
229
+ "<SPECIAL_222>",
230
+ "<SPECIAL_223>",
231
+ "<SPECIAL_224>",
232
+ "<SPECIAL_225>",
233
+ "<SPECIAL_226>",
234
+ "<SPECIAL_227>",
235
+ "<SPECIAL_228>",
236
+ "<SPECIAL_229>",
237
+ "<SPECIAL_230>",
238
+ "<SPECIAL_231>",
239
+ "<SPECIAL_232>",
240
+ "<SPECIAL_233>",
241
+ "<SPECIAL_234>",
242
+ "<SPECIAL_235>",
243
+ "<SPECIAL_236>",
244
+ "<SPECIAL_237>",
245
+ "<SPECIAL_238>",
246
+ "<SPECIAL_239>",
247
+ "<SPECIAL_240>",
248
+ "<SPECIAL_241>",
249
+ "<SPECIAL_242>",
250
+ "<SPECIAL_243>",
251
+ "<SPECIAL_244>",
252
+ "<SPECIAL_245>",
253
+ "<SPECIAL_246>",
254
+ "<SPECIAL_247>",
255
+ "<SPECIAL_248>",
256
+ "<SPECIAL_249>",
257
+ "<SPECIAL_250>",
258
+ "<SPECIAL_251>",
259
+ "<SPECIAL_252>",
260
+ "<SPECIAL_253>",
261
+ "<SPECIAL_254>",
262
+ "<SPECIAL_255>",
263
+ "<SPECIAL_256>",
264
+ "<SPECIAL_257>",
265
+ "<SPECIAL_258>",
266
+ "<SPECIAL_259>",
267
+ "<SPECIAL_260>",
268
+ "<SPECIAL_261>",
269
+ "<SPECIAL_262>",
270
+ "<SPECIAL_263>",
271
+ "<SPECIAL_264>",
272
+ "<SPECIAL_265>",
273
+ "<SPECIAL_266>",
274
+ "<SPECIAL_267>",
275
+ "<SPECIAL_268>",
276
+ "<SPECIAL_269>",
277
+ "<SPECIAL_270>",
278
+ "<SPECIAL_271>",
279
+ "<SPECIAL_272>",
280
+ "<SPECIAL_273>",
281
+ "<SPECIAL_274>",
282
+ "<SPECIAL_275>",
283
+ "<SPECIAL_276>",
284
+ "<SPECIAL_277>",
285
+ "<SPECIAL_278>",
286
+ "<SPECIAL_279>",
287
+ "<SPECIAL_280>",
288
+ "<SPECIAL_281>",
289
+ "<SPECIAL_282>",
290
+ "<SPECIAL_283>",
291
+ "<SPECIAL_284>",
292
+ "<SPECIAL_285>",
293
+ "<SPECIAL_286>",
294
+ "<SPECIAL_287>",
295
+ "<SPECIAL_288>",
296
+ "<SPECIAL_289>",
297
+ "<SPECIAL_290>",
298
+ "<SPECIAL_291>",
299
+ "<SPECIAL_292>",
300
+ "<SPECIAL_293>",
301
+ "<SPECIAL_294>",
302
+ "<SPECIAL_295>",
303
+ "<SPECIAL_296>",
304
+ "<SPECIAL_297>",
305
+ "<SPECIAL_298>",
306
+ "<SPECIAL_299>",
307
+ "<SPECIAL_300>",
308
+ "<SPECIAL_301>",
309
+ "<SPECIAL_302>",
310
+ "<SPECIAL_303>",
311
+ "<SPECIAL_304>",
312
+ "<SPECIAL_305>",
313
+ "<SPECIAL_306>",
314
+ "<SPECIAL_307>",
315
+ "<SPECIAL_308>",
316
+ "<SPECIAL_309>",
317
+ "<SPECIAL_310>",
318
+ "<SPECIAL_311>",
319
+ "<SPECIAL_312>",
320
+ "<SPECIAL_313>",
321
+ "<SPECIAL_314>",
322
+ "<SPECIAL_315>",
323
+ "<SPECIAL_316>",
324
+ "<SPECIAL_317>",
325
+ "<SPECIAL_318>",
326
+ "<SPECIAL_319>",
327
+ "<SPECIAL_320>",
328
+ "<SPECIAL_321>",
329
+ "<SPECIAL_322>",
330
+ "<SPECIAL_323>",
331
+ "<SPECIAL_324>",
332
+ "<SPECIAL_325>",
333
+ "<SPECIAL_326>",
334
+ "<SPECIAL_327>",
335
+ "<SPECIAL_328>",
336
+ "<SPECIAL_329>",
337
+ "<SPECIAL_330>",
338
+ "<SPECIAL_331>",
339
+ "<SPECIAL_332>",
340
+ "<SPECIAL_333>",
341
+ "<SPECIAL_334>",
342
+ "<SPECIAL_335>",
343
+ "<SPECIAL_336>",
344
+ "<SPECIAL_337>",
345
+ "<SPECIAL_338>",
346
+ "<SPECIAL_339>",
347
+ "<SPECIAL_340>",
348
+ "<SPECIAL_341>",
349
+ "<SPECIAL_342>",
350
+ "<SPECIAL_343>",
351
+ "<SPECIAL_344>",
352
+ "<SPECIAL_345>",
353
+ "<SPECIAL_346>",
354
+ "<SPECIAL_347>",
355
+ "<SPECIAL_348>",
356
+ "<SPECIAL_349>",
357
+ "<SPECIAL_350>",
358
+ "<SPECIAL_351>",
359
+ "<SPECIAL_352>",
360
+ "<SPECIAL_353>",
361
+ "<SPECIAL_354>",
362
+ "<SPECIAL_355>",
363
+ "<SPECIAL_356>",
364
+ "<SPECIAL_357>",
365
+ "<SPECIAL_358>",
366
+ "<SPECIAL_359>",
367
+ "<SPECIAL_360>",
368
+ "<SPECIAL_361>",
369
+ "<SPECIAL_362>",
370
+ "<SPECIAL_363>",
371
+ "<SPECIAL_364>",
372
+ "<SPECIAL_365>",
373
+ "<SPECIAL_366>",
374
+ "<SPECIAL_367>",
375
+ "<SPECIAL_368>",
376
+ "<SPECIAL_369>",
377
+ "<SPECIAL_370>",
378
+ "<SPECIAL_371>",
379
+ "<SPECIAL_372>",
380
+ "<SPECIAL_373>",
381
+ "<SPECIAL_374>",
382
+ "<SPECIAL_375>",
383
+ "<SPECIAL_376>",
384
+ "<SPECIAL_377>",
385
+ "<SPECIAL_378>",
386
+ "<SPECIAL_379>",
387
+ "<SPECIAL_380>",
388
+ "<SPECIAL_381>",
389
+ "<SPECIAL_382>",
390
+ "<SPECIAL_383>",
391
+ "<SPECIAL_384>",
392
+ "<SPECIAL_385>",
393
+ "<SPECIAL_386>",
394
+ "<SPECIAL_387>",
395
+ "<SPECIAL_388>",
396
+ "<SPECIAL_389>",
397
+ "<SPECIAL_390>",
398
+ "<SPECIAL_391>",
399
+ "<SPECIAL_392>",
400
+ "<SPECIAL_393>",
401
+ "<SPECIAL_394>",
402
+ "<SPECIAL_395>",
403
+ "<SPECIAL_396>",
404
+ "<SPECIAL_397>",
405
+ "<SPECIAL_398>",
406
+ "<SPECIAL_399>",
407
+ "<SPECIAL_400>",
408
+ "<SPECIAL_401>",
409
+ "<SPECIAL_402>",
410
+ "<SPECIAL_403>",
411
+ "<SPECIAL_404>",
412
+ "<SPECIAL_405>",
413
+ "<SPECIAL_406>",
414
+ "<SPECIAL_407>",
415
+ "<SPECIAL_408>",
416
+ "<SPECIAL_409>",
417
+ "<SPECIAL_410>",
418
+ "<SPECIAL_411>",
419
+ "<SPECIAL_412>",
420
+ "<SPECIAL_413>",
421
+ "<SPECIAL_414>",
422
+ "<SPECIAL_415>",
423
+ "<SPECIAL_416>",
424
+ "<SPECIAL_417>",
425
+ "<SPECIAL_418>",
426
+ "<SPECIAL_419>",
427
+ "<SPECIAL_420>",
428
+ "<SPECIAL_421>",
429
+ "<SPECIAL_422>",
430
+ "<SPECIAL_423>",
431
+ "<SPECIAL_424>",
432
+ "<SPECIAL_425>",
433
+ "<SPECIAL_426>",
434
+ "<SPECIAL_427>",
435
+ "<SPECIAL_428>",
436
+ "<SPECIAL_429>",
437
+ "<SPECIAL_430>",
438
+ "<SPECIAL_431>",
439
+ "<SPECIAL_432>",
440
+ "<SPECIAL_433>",
441
+ "<SPECIAL_434>",
442
+ "<SPECIAL_435>",
443
+ "<SPECIAL_436>",
444
+ "<SPECIAL_437>",
445
+ "<SPECIAL_438>",
446
+ "<SPECIAL_439>",
447
+ "<SPECIAL_440>",
448
+ "<SPECIAL_441>",
449
+ "<SPECIAL_442>",
450
+ "<SPECIAL_443>",
451
+ "<SPECIAL_444>",
452
+ "<SPECIAL_445>",
453
+ "<SPECIAL_446>",
454
+ "<SPECIAL_447>",
455
+ "<SPECIAL_448>",
456
+ "<SPECIAL_449>",
457
+ "<SPECIAL_450>",
458
+ "<SPECIAL_451>",
459
+ "<SPECIAL_452>",
460
+ "<SPECIAL_453>",
461
+ "<SPECIAL_454>",
462
+ "<SPECIAL_455>",
463
+ "<SPECIAL_456>",
464
+ "<SPECIAL_457>",
465
+ "<SPECIAL_458>",
466
+ "<SPECIAL_459>",
467
+ "<SPECIAL_460>",
468
+ "<SPECIAL_461>",
469
+ "<SPECIAL_462>",
470
+ "<SPECIAL_463>",
471
+ "<SPECIAL_464>",
472
+ "<SPECIAL_465>",
473
+ "<SPECIAL_466>",
474
+ "<SPECIAL_467>",
475
+ "<SPECIAL_468>",
476
+ "<SPECIAL_469>",
477
+ "<SPECIAL_470>",
478
+ "<SPECIAL_471>",
479
+ "<SPECIAL_472>",
480
+ "<SPECIAL_473>",
481
+ "<SPECIAL_474>",
482
+ "<SPECIAL_475>",
483
+ "<SPECIAL_476>",
484
+ "<SPECIAL_477>",
485
+ "<SPECIAL_478>",
486
+ "<SPECIAL_479>",
487
+ "<SPECIAL_480>",
488
+ "<SPECIAL_481>",
489
+ "<SPECIAL_482>",
490
+ "<SPECIAL_483>",
491
+ "<SPECIAL_484>",
492
+ "<SPECIAL_485>",
493
+ "<SPECIAL_486>",
494
+ "<SPECIAL_487>",
495
+ "<SPECIAL_488>",
496
+ "<SPECIAL_489>",
497
+ "<SPECIAL_490>",
498
+ "<SPECIAL_491>",
499
+ "<SPECIAL_492>",
500
+ "<SPECIAL_493>",
501
+ "<SPECIAL_494>",
502
+ "<SPECIAL_495>",
503
+ "<SPECIAL_496>",
504
+ "<SPECIAL_497>",
505
+ "<SPECIAL_498>",
506
+ "<SPECIAL_499>",
507
+ "<SPECIAL_500>",
508
+ "<SPECIAL_501>",
509
+ "<SPECIAL_502>",
510
+ "<SPECIAL_503>",
511
+ "<SPECIAL_504>",
512
+ "<SPECIAL_505>",
513
+ "<SPECIAL_506>",
514
+ "<SPECIAL_507>",
515
+ "<SPECIAL_508>",
516
+ "<SPECIAL_509>",
517
+ "<SPECIAL_510>",
518
+ "<SPECIAL_511>",
519
+ "<SPECIAL_512>",
520
+ "<SPECIAL_513>",
521
+ "<SPECIAL_514>",
522
+ "<SPECIAL_515>",
523
+ "<SPECIAL_516>",
524
+ "<SPECIAL_517>",
525
+ "<SPECIAL_518>",
526
+ "<SPECIAL_519>",
527
+ "<SPECIAL_520>",
528
+ "<SPECIAL_521>",
529
+ "<SPECIAL_522>",
530
+ "<SPECIAL_523>",
531
+ "<SPECIAL_524>",
532
+ "<SPECIAL_525>",
533
+ "<SPECIAL_526>",
534
+ "<SPECIAL_527>",
535
+ "<SPECIAL_528>",
536
+ "<SPECIAL_529>",
537
+ "<SPECIAL_530>",
538
+ "<SPECIAL_531>",
539
+ "<SPECIAL_532>",
540
+ "<SPECIAL_533>",
541
+ "<SPECIAL_534>",
542
+ "<SPECIAL_535>",
543
+ "<SPECIAL_536>",
544
+ "<SPECIAL_537>",
545
+ "<SPECIAL_538>",
546
+ "<SPECIAL_539>",
547
+ "<SPECIAL_540>",
548
+ "<SPECIAL_541>",
549
+ "<SPECIAL_542>",
550
+ "<SPECIAL_543>",
551
+ "<SPECIAL_544>",
552
+ "<SPECIAL_545>",
553
+ "<SPECIAL_546>",
554
+ "<SPECIAL_547>",
555
+ "<SPECIAL_548>",
556
+ "<SPECIAL_549>",
557
+ "<SPECIAL_550>",
558
+ "<SPECIAL_551>",
559
+ "<SPECIAL_552>",
560
+ "<SPECIAL_553>",
561
+ "<SPECIAL_554>",
562
+ "<SPECIAL_555>",
563
+ "<SPECIAL_556>",
564
+ "<SPECIAL_557>",
565
+ "<SPECIAL_558>",
566
+ "<SPECIAL_559>",
567
+ "<SPECIAL_560>",
568
+ "<SPECIAL_561>",
569
+ "<SPECIAL_562>",
570
+ "<SPECIAL_563>",
571
+ "<SPECIAL_564>",
572
+ "<SPECIAL_565>",
573
+ "<SPECIAL_566>",
574
+ "<SPECIAL_567>",
575
+ "<SPECIAL_568>",
576
+ "<SPECIAL_569>",
577
+ "<SPECIAL_570>",
578
+ "<SPECIAL_571>",
579
+ "<SPECIAL_572>",
580
+ "<SPECIAL_573>",
581
+ "<SPECIAL_574>",
582
+ "<SPECIAL_575>",
583
+ "<SPECIAL_576>",
584
+ "<SPECIAL_577>",
585
+ "<SPECIAL_578>",
586
+ "<SPECIAL_579>",
587
+ "<SPECIAL_580>",
588
+ "<SPECIAL_581>",
589
+ "<SPECIAL_582>",
590
+ "<SPECIAL_583>",
591
+ "<SPECIAL_584>",
592
+ "<SPECIAL_585>",
593
+ "<SPECIAL_586>",
594
+ "<SPECIAL_587>",
595
+ "<SPECIAL_588>",
596
+ "<SPECIAL_589>",
597
+ "<SPECIAL_590>",
598
+ "<SPECIAL_591>",
599
+ "<SPECIAL_592>",
600
+ "<SPECIAL_593>",
601
+ "<SPECIAL_594>",
602
+ "<SPECIAL_595>",
603
+ "<SPECIAL_596>",
604
+ "<SPECIAL_597>",
605
+ "<SPECIAL_598>",
606
+ "<SPECIAL_599>",
607
+ "<SPECIAL_600>",
608
+ "<SPECIAL_601>",
609
+ "<SPECIAL_602>",
610
+ "<SPECIAL_603>",
611
+ "<SPECIAL_604>",
612
+ "<SPECIAL_605>",
613
+ "<SPECIAL_606>",
614
+ "<SPECIAL_607>",
615
+ "<SPECIAL_608>",
616
+ "<SPECIAL_609>",
617
+ "<SPECIAL_610>",
618
+ "<SPECIAL_611>",
619
+ "<SPECIAL_612>",
620
+ "<SPECIAL_613>",
621
+ "<SPECIAL_614>",
622
+ "<SPECIAL_615>",
623
+ "<SPECIAL_616>",
624
+ "<SPECIAL_617>",
625
+ "<SPECIAL_618>",
626
+ "<SPECIAL_619>",
627
+ "<SPECIAL_620>",
628
+ "<SPECIAL_621>",
629
+ "<SPECIAL_622>",
630
+ "<SPECIAL_623>",
631
+ "<SPECIAL_624>",
632
+ "<SPECIAL_625>",
633
+ "<SPECIAL_626>",
634
+ "<SPECIAL_627>",
635
+ "<SPECIAL_628>",
636
+ "<SPECIAL_629>",
637
+ "<SPECIAL_630>",
638
+ "<SPECIAL_631>",
639
+ "<SPECIAL_632>",
640
+ "<SPECIAL_633>",
641
+ "<SPECIAL_634>",
642
+ "<SPECIAL_635>",
643
+ "<SPECIAL_636>",
644
+ "<SPECIAL_637>",
645
+ "<SPECIAL_638>",
646
+ "<SPECIAL_639>",
647
+ "<SPECIAL_640>",
648
+ "<SPECIAL_641>",
649
+ "<SPECIAL_642>",
650
+ "<SPECIAL_643>",
651
+ "<SPECIAL_644>",
652
+ "<SPECIAL_645>",
653
+ "<SPECIAL_646>",
654
+ "<SPECIAL_647>",
655
+ "<SPECIAL_648>",
656
+ "<SPECIAL_649>",
657
+ "<SPECIAL_650>",
658
+ "<SPECIAL_651>",
659
+ "<SPECIAL_652>",
660
+ "<SPECIAL_653>",
661
+ "<SPECIAL_654>",
662
+ "<SPECIAL_655>",
663
+ "<SPECIAL_656>",
664
+ "<SPECIAL_657>",
665
+ "<SPECIAL_658>",
666
+ "<SPECIAL_659>",
667
+ "<SPECIAL_660>",
668
+ "<SPECIAL_661>",
669
+ "<SPECIAL_662>",
670
+ "<SPECIAL_663>",
671
+ "<SPECIAL_664>",
672
+ "<SPECIAL_665>",
673
+ "<SPECIAL_666>",
674
+ "<SPECIAL_667>",
675
+ "<SPECIAL_668>",
676
+ "<SPECIAL_669>",
677
+ "<SPECIAL_670>",
678
+ "<SPECIAL_671>",
679
+ "<SPECIAL_672>",
680
+ "<SPECIAL_673>",
681
+ "<SPECIAL_674>",
682
+ "<SPECIAL_675>",
683
+ "<SPECIAL_676>",
684
+ "<SPECIAL_677>",
685
+ "<SPECIAL_678>",
686
+ "<SPECIAL_679>",
687
+ "<SPECIAL_680>",
688
+ "<SPECIAL_681>",
689
+ "<SPECIAL_682>",
690
+ "<SPECIAL_683>",
691
+ "<SPECIAL_684>",
692
+ "<SPECIAL_685>",
693
+ "<SPECIAL_686>",
694
+ "<SPECIAL_687>",
695
+ "<SPECIAL_688>",
696
+ "<SPECIAL_689>",
697
+ "<SPECIAL_690>",
698
+ "<SPECIAL_691>",
699
+ "<SPECIAL_692>",
700
+ "<SPECIAL_693>",
701
+ "<SPECIAL_694>",
702
+ "<SPECIAL_695>",
703
+ "<SPECIAL_696>",
704
+ "<SPECIAL_697>",
705
+ "<SPECIAL_698>",
706
+ "<SPECIAL_699>",
707
+ "<SPECIAL_700>",
708
+ "<SPECIAL_701>",
709
+ "<SPECIAL_702>",
710
+ "<SPECIAL_703>",
711
+ "<SPECIAL_704>",
712
+ "<SPECIAL_705>",
713
+ "<SPECIAL_706>",
714
+ "<SPECIAL_707>",
715
+ "<SPECIAL_708>",
716
+ "<SPECIAL_709>",
717
+ "<SPECIAL_710>",
718
+ "<SPECIAL_711>",
719
+ "<SPECIAL_712>",
720
+ "<SPECIAL_713>",
721
+ "<SPECIAL_714>",
722
+ "<SPECIAL_715>",
723
+ "<SPECIAL_716>",
724
+ "<SPECIAL_717>",
725
+ "<SPECIAL_718>",
726
+ "<SPECIAL_719>",
727
+ "<SPECIAL_720>",
728
+ "<SPECIAL_721>",
729
+ "<SPECIAL_722>",
730
+ "<SPECIAL_723>",
731
+ "<SPECIAL_724>",
732
+ "<SPECIAL_725>",
733
+ "<SPECIAL_726>",
734
+ "<SPECIAL_727>",
735
+ "<SPECIAL_728>",
736
+ "<SPECIAL_729>",
737
+ "<SPECIAL_730>",
+ "<SPECIAL_731>",
+ "<SPECIAL_732>",
+ "<SPECIAL_733>",
+ "<SPECIAL_734>",
+ "<SPECIAL_735>",
+ "<SPECIAL_736>",
+ "<SPECIAL_737>",
+ "<SPECIAL_738>",
+ "<SPECIAL_739>",
+ "<SPECIAL_740>",
+ "<SPECIAL_741>",
+ "<SPECIAL_742>",
+ "<SPECIAL_743>",
+ "<SPECIAL_744>",
+ "<SPECIAL_745>",
+ "<SPECIAL_746>",
+ "<SPECIAL_747>",
+ "<SPECIAL_748>",
+ "<SPECIAL_749>",
+ "<SPECIAL_750>",
+ "<SPECIAL_751>",
+ "<SPECIAL_752>",
+ "<SPECIAL_753>",
+ "<SPECIAL_754>",
+ "<SPECIAL_755>",
+ "<SPECIAL_756>",
+ "<SPECIAL_757>",
+ "<SPECIAL_758>",
+ "<SPECIAL_759>",
+ "<SPECIAL_760>",
+ "<SPECIAL_761>",
+ "<SPECIAL_762>",
+ "<SPECIAL_763>",
+ "<SPECIAL_764>",
+ "<SPECIAL_765>",
+ "<SPECIAL_766>",
+ "<SPECIAL_767>",
+ "<SPECIAL_768>",
+ "<SPECIAL_769>",
+ "<SPECIAL_770>",
+ "<SPECIAL_771>",
+ "<SPECIAL_772>",
+ "<SPECIAL_773>",
+ "<SPECIAL_774>",
+ "<SPECIAL_775>",
+ "<SPECIAL_776>",
+ "<SPECIAL_777>",
+ "<SPECIAL_778>",
+ "<SPECIAL_779>",
+ "<SPECIAL_780>",
+ "<SPECIAL_781>",
+ "<SPECIAL_782>",
+ "<SPECIAL_783>",
+ "<SPECIAL_784>",
+ "<SPECIAL_785>",
+ "<SPECIAL_786>",
+ "<SPECIAL_787>",
+ "<SPECIAL_788>",
+ "<SPECIAL_789>",
+ "<SPECIAL_790>",
+ "<SPECIAL_791>",
+ "<SPECIAL_792>",
+ "<SPECIAL_793>",
+ "<SPECIAL_794>",
+ "<SPECIAL_795>",
+ "<SPECIAL_796>",
+ "<SPECIAL_797>",
+ "<SPECIAL_798>",
+ "<SPECIAL_799>",
+ "<SPECIAL_800>",
+ "<SPECIAL_801>",
+ "<SPECIAL_802>",
+ "<SPECIAL_803>",
+ "<SPECIAL_804>",
+ "<SPECIAL_805>",
+ "<SPECIAL_806>",
+ "<SPECIAL_807>",
+ "<SPECIAL_808>",
+ "<SPECIAL_809>",
+ "<SPECIAL_810>",
+ "<SPECIAL_811>",
+ "<SPECIAL_812>",
+ "<SPECIAL_813>",
+ "<SPECIAL_814>",
+ "<SPECIAL_815>",
+ "<SPECIAL_816>",
+ "<SPECIAL_817>",
+ "<SPECIAL_818>",
+ "<SPECIAL_819>",
+ "<SPECIAL_820>",
+ "<SPECIAL_821>",
+ "<SPECIAL_822>",
+ "<SPECIAL_823>",
+ "<SPECIAL_824>",
+ "<SPECIAL_825>",
+ "<SPECIAL_826>",
+ "<SPECIAL_827>",
+ "<SPECIAL_828>",
+ "<SPECIAL_829>",
+ "<SPECIAL_830>",
+ "<SPECIAL_831>",
+ "<SPECIAL_832>",
+ "<SPECIAL_833>",
+ "<SPECIAL_834>",
+ "<SPECIAL_835>",
+ "<SPECIAL_836>",
+ "<SPECIAL_837>",
+ "<SPECIAL_838>",
+ "<SPECIAL_839>",
+ "<SPECIAL_840>",
+ "<SPECIAL_841>",
+ "<SPECIAL_842>",
+ "<SPECIAL_843>",
+ "<SPECIAL_844>",
+ "<SPECIAL_845>",
+ "<SPECIAL_846>",
+ "<SPECIAL_847>",
+ "<SPECIAL_848>",
+ "<SPECIAL_849>",
+ "<SPECIAL_850>",
+ "<SPECIAL_851>",
+ "<SPECIAL_852>",
+ "<SPECIAL_853>",
+ "<SPECIAL_854>",
+ "<SPECIAL_855>",
+ "<SPECIAL_856>",
+ "<SPECIAL_857>",
+ "<SPECIAL_858>",
+ "<SPECIAL_859>",
+ "<SPECIAL_860>",
+ "<SPECIAL_861>",
+ "<SPECIAL_862>",
+ "<SPECIAL_863>",
+ "<SPECIAL_864>",
+ "<SPECIAL_865>",
+ "<SPECIAL_866>",
+ "<SPECIAL_867>",
+ "<SPECIAL_868>",
+ "<SPECIAL_869>",
+ "<SPECIAL_870>",
+ "<SPECIAL_871>",
+ "<SPECIAL_872>",
+ "<SPECIAL_873>",
+ "<SPECIAL_874>",
+ "<SPECIAL_875>",
+ "<SPECIAL_876>",
+ "<SPECIAL_877>",
+ "<SPECIAL_878>",
+ "<SPECIAL_879>",
+ "<SPECIAL_880>",
+ "<SPECIAL_881>",
+ "<SPECIAL_882>",
+ "<SPECIAL_883>",
+ "<SPECIAL_884>",
+ "<SPECIAL_885>",
+ "<SPECIAL_886>",
+ "<SPECIAL_887>",
+ "<SPECIAL_888>",
+ "<SPECIAL_889>",
+ "<SPECIAL_890>",
+ "<SPECIAL_891>",
+ "<SPECIAL_892>",
+ "<SPECIAL_893>",
+ "<SPECIAL_894>",
+ "<SPECIAL_895>",
+ "<SPECIAL_896>",
+ "<SPECIAL_897>",
+ "<SPECIAL_898>",
+ "<SPECIAL_899>",
+ "<SPECIAL_900>",
+ "<SPECIAL_901>",
+ "<SPECIAL_902>",
+ "<SPECIAL_903>",
+ "<SPECIAL_904>",
+ "<SPECIAL_905>",
+ "<SPECIAL_906>",
+ "<SPECIAL_907>",
+ "<SPECIAL_908>",
+ "<SPECIAL_909>",
+ "<SPECIAL_910>",
+ "<SPECIAL_911>",
+ "<SPECIAL_912>",
+ "<SPECIAL_913>",
+ "<SPECIAL_914>",
+ "<SPECIAL_915>",
+ "<SPECIAL_916>",
+ "<SPECIAL_917>",
+ "<SPECIAL_918>",
+ "<SPECIAL_919>",
+ "<SPECIAL_920>",
+ "<SPECIAL_921>",
+ "<SPECIAL_922>",
+ "<SPECIAL_923>",
+ "<SPECIAL_924>",
+ "<SPECIAL_925>",
+ "<SPECIAL_926>",
+ "<SPECIAL_927>",
+ "<SPECIAL_928>",
+ "<SPECIAL_929>",
+ "<SPECIAL_930>",
+ "<SPECIAL_931>",
+ "<SPECIAL_932>",
+ "<SPECIAL_933>",
+ "<SPECIAL_934>",
+ "<SPECIAL_935>",
+ "<SPECIAL_936>",
+ "<SPECIAL_937>",
+ "<SPECIAL_938>",
+ "<SPECIAL_939>",
+ "<SPECIAL_940>",
+ "<SPECIAL_941>",
+ "<SPECIAL_942>",
+ "<SPECIAL_943>",
+ "<SPECIAL_944>",
+ "<SPECIAL_945>",
+ "<SPECIAL_946>",
+ "<SPECIAL_947>",
+ "<SPECIAL_948>",
+ "<SPECIAL_949>",
+ "<SPECIAL_950>",
+ "<SPECIAL_951>",
+ "<SPECIAL_952>",
+ "<SPECIAL_953>",
+ "<SPECIAL_954>",
+ "<SPECIAL_955>",
+ "<SPECIAL_956>",
+ "<SPECIAL_957>",
+ "<SPECIAL_958>",
+ "<SPECIAL_959>",
+ "<SPECIAL_960>",
+ "<SPECIAL_961>",
+ "<SPECIAL_962>",
+ "<SPECIAL_963>",
+ "<SPECIAL_964>",
+ "<SPECIAL_965>",
+ "<SPECIAL_966>",
+ "<SPECIAL_967>",
+ "<SPECIAL_968>",
+ "<SPECIAL_969>",
+ "<SPECIAL_970>",
+ "<SPECIAL_971>",
+ "<SPECIAL_972>",
+ "<SPECIAL_973>",
+ "<SPECIAL_974>",
+ "<SPECIAL_975>",
+ "<SPECIAL_976>",
+ "<SPECIAL_977>",
+ "<SPECIAL_978>",
+ "<SPECIAL_979>",
+ "<SPECIAL_980>",
+ "<SPECIAL_981>",
+ "<SPECIAL_982>",
+ "<SPECIAL_983>",
+ "<SPECIAL_984>",
+ "<SPECIAL_985>",
+ "<SPECIAL_986>",
+ "<SPECIAL_987>",
+ "<SPECIAL_988>",
+ "<SPECIAL_989>",
+ "<SPECIAL_990>",
+ "<SPECIAL_991>",
+ "<SPECIAL_992>",
+ "<SPECIAL_993>",
+ "<SPECIAL_994>",
+ "<SPECIAL_995>",
+ "<SPECIAL_996>",
+ "<SPECIAL_997>",
+ "<SPECIAL_998>",
+ "<SPECIAL_999>"
+ ],
+ "is_local": true,
+ "model_max_length": 393216,
+ "pad_token": "<pad>",
+ "padding_side": "left",
+ "processor_class": "PixtralProcessor",
+ "tokenizer_class": "TokenizersBackend",
+ "unk_token": "<unk>",
+ "chat_template": "{#- Unsloth template fixes #}\n{#- Default system message if no system prompt is passed. #}\n{%- set default_system_message = '' %}\n\n{#- Begin of sequence token. #}\n{{- bos_token }}\n\n{#- Handle system prompt if it exists. #}\n{#- System prompt supports text content or text chunks. #}\n{%- if messages[0]['role'] == 'system' %}\n {{- '[SYSTEM_PROMPT]' -}}\n {%- if messages[0]['content'] is string %}\n {{- messages[0]['content'] -}}\n {%- else %} \n {%- for block in messages[0]['content'] %}\n {%- if block['type'] == 'text' %}\n {{- block['text'] }}\n {%- else %}\n {{- raise_exception('Only text chunks are supported in system message contents.') }}\n {%- endif %}\n {%- endfor %}\n {%- endif %}\n {{- '[/SYSTEM_PROMPT]' -}}\n {%- set loop_messages = messages[1:] %}\n{%- else %}\n {%- set loop_messages = messages %}\n {%- if default_system_message != '' %}\n {{- '[SYSTEM_PROMPT]' + default_system_message + '[/SYSTEM_PROMPT]' }}\n {%- endif %}\n{%- endif %}\n\n\n{#- Tools definition #}\n{%- set tools_definition = '' %}\n{%- set has_tools = false %}\n{%- if tools is defined and tools is not none and tools|length > 0 %}\n {%- set has_tools = true %}\n {%- set tools_definition = '[AVAILABLE_TOOLS]' + (tools| tojson) + '[/AVAILABLE_TOOLS]' %}\n {{- tools_definition }}\n{%- endif %}\n\n{#- Checks for alternating user/assistant messages. #}\n{%- set ns = namespace(index=0) %}\n{%- for message in loop_messages %}\n {%- if message.role == 'user' or (message.role == 'assistant' and (message.tool_calls is not defined or message.tool_calls is none or message.tool_calls | length == 0)) %}\n {%- if (message['role'] == 'user') != (ns.index % 2 == 0) %}\n {{- raise_exception('After the optional system message, conversation roles must alternate user and assistant roles except for tool calls and results.') }}\n {%- endif %}\n {%- set ns.index = ns.index + 1 %}\n {%- endif %}\n{%- endfor %}\n\n{#- Handle conversation messages. #}\n{%- for message in loop_messages %}\n\n {#- User messages supports text content or text and image chunks. #}\n {%- if message['role'] == 'user' %}\n {%- if message['content'] is string %}\n {{- '[INST]' + message['content'] + '[/INST]' }}\n {%- elif message['content'] | length > 0 %}\n {{- '[INST]' }}\n {%- if message['content'] | length == 2 %}\n {%- set blocks = message['content'] | sort(attribute='type') %}\n {%- else %}\n {%- set blocks = message['content'] %}\n {%- endif %}\n {%- for block in blocks %}\n {%- if block['type'] == 'text' %}\n {{- block['text'] }}\n {%- elif block['type'] in ['image', 'image_url'] %}\n {{- '[IMG]' }}\n {%- else %}\n {{- raise_exception('Only text, image and image_url chunks are supported in user message content.') }}\n {%- endif %}\n {%- endfor %}\n {{- '[/INST]' }}\n {%- else %}\n {{- raise_exception('User message must have a string or a list of chunks in content') }}\n {%- endif %}\n\n {#- Assistant messages supports text content or text and image chunks. #}\n {%- elif message['role'] == 'assistant' %}\n {%- if (message['content'] is none or message['content'] == '' or message['content']|length == 0) and (message['tool_calls'] is not defined or message['tool_calls'] is none or message['tool_calls']|length == 0) %}\n {{- raise_exception('Assistant message must have a string or a list of chunks in content or a list of tool calls.') }}\n {%- endif %}\n\n {%- if message['content'] is string %}\n {{- message['content'] }}\n {%- elif message['content'] is iterable and message['content'] | length > 0 %}\n {%- for block in message['content'] %}\n {%- if block['type'] == 'text' %}\n {{- block['text'] }}\n {%- else %}\n {{- raise_exception('Only text chunks are supported in assistant message contents.') }}\n {%- endif %}\n {%- endfor %}\n {%- endif %}\n \n {%- if message['tool_calls'] is defined and message['tool_calls'] is not none and message['tool_calls']|length > 0 %}\n {%- for tool in message['tool_calls'] %}\n {%- set arguments = tool['function']['arguments'] %}\n {%- if arguments is not string %}\n {%- set arguments = arguments|tojson|safe %}\n {%- elif arguments == '' %}\n {%- set arguments = '{}' %}\n {%- endif %}\n {{- '[TOOL_CALLS]' + tool['function']['name'] + '[ARGS]' + arguments }}\n {%- endfor %}\n {%- endif %}\n\n {#- End of sequence token for each assistant messages. #}\n {{- eos_token }}\n\n {#- Tool messages only supports text content. #}\n {%- elif message['role'] == 'tool' %}\n {{- '[TOOL_RESULTS]' + message['content']|string + '[/TOOL_RESULTS]' }}\n\n {#- Raise exception for unsupported roles. #}\n {%- else %}\n {{- raise_exception('Only user, assistant and tool roles are supported, got ' + message['role'] + '.') }}\n{%- endif %}\n{%- endfor %}\n{#- Copyright 2025-present Unsloth. Apache 2.0 License. #}"
+ }
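
The `chat_template` field above is a Jinja template that wraps each turn in Mistral control tokens (`[SYSTEM_PROMPT]…[/SYSTEM_PROMPT]`, `[INST]…[/INST]`, an `eos_token` after every assistant turn). As a rough illustration of the text it produces for a simple text-only conversation (this is a hand-written sketch, not the official renderer; the function name `render_chat` and the BOS/EOS values are assumptions, and tool calls and image chunks are omitted):

```python
# Hypothetical sketch of what the chat template above renders for a
# text-only conversation with no tools. The real renderer is the Jinja
# template stored in tokenizer_config.json; BOS/EOS values are assumed.

BOS, EOS = "<s>", "</s>"

def render_chat(messages, bos=BOS, eos=EOS):
    out = [bos]
    # Optional leading system message, wrapped in [SYSTEM_PROMPT] markers.
    if messages and messages[0]["role"] == "system":
        out.append("[SYSTEM_PROMPT]" + messages[0]["content"] + "[/SYSTEM_PROMPT]")
        messages = messages[1:]
    for msg in messages:
        if msg["role"] == "user":
            out.append("[INST]" + msg["content"] + "[/INST]")
        elif msg["role"] == "assistant":
            # Each assistant turn is terminated with the EOS token.
            out.append(msg["content"] + eos)
        else:
            raise ValueError(f"unsupported role: {msg['role']}")
    return "".join(out)

prompt = render_chat([
    {"role": "system", "content": "You are Devstral."},
    {"role": "user", "content": "Hello"},
])
print(prompt)
# → <s>[SYSTEM_PROMPT]You are Devstral.[/SYSTEM_PROMPT][INST]Hello[/INST]
```

In practice one would call `tokenizer.apply_chat_template(messages, ...)` from `transformers`, which executes the stored Jinja template directly rather than a re-implementation like this.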