Quantized model
#1
by MaziyarPanahi - opened
Thanks for sharing your model! I have converted and quantized it to GGUF in case anyone needs it:
https://huggingface.co/MaziyarPanahi/LWM-Text-256K-GGUF
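
For anyone who wants to try the quantized files, a minimal sketch using llama-cpp-python might look like the following. The exact `.gguf` filename and quantization level are assumptions for illustration; check the repo's file list for the real names.

```python
# Minimal sketch: downloading and loading one quantized GGUF file
# with huggingface_hub + llama-cpp-python.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download one quantization variant from the repo.
# NOTE: the filename below is an assumption; use a file actually listed in the repo.
model_path = hf_hub_download(
    repo_id="MaziyarPanahi/LWM-Text-256K-GGUF",
    filename="LWM-Text-256K.Q4_K_M.gguf",
)

# Load the model; n_ctx can be raised toward the model's long-context limit
# if enough RAM is available.
llm = Llama(model_path=model_path, n_ctx=8192)

# Simple completion to confirm the model loads and generates.
out = llm("Q: What does GGUF stand for? A:", max_tokens=64)
print(out["choices"][0]["text"])
```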