Request for macOS MPS support / Ollama support

#3
by robbie-wx - opened

Dear team,

You've done an amazing job. Could you please explain how to run the model on macOS, either via MPS device support or by distributing it through Ollama?
Thanks a lot!

Support for Baichuan-M1 has been merged into mlx-lm (https://github.com/ml-explore/mlx-lm). It does not currently work with quantized models, but I have submitted a PR that fixes this.
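For reference, here is a minimal sketch of running the model on Apple Silicon through mlx-lm's Python API. The model id `baichuan-inc/Baichuan-M1-14B-Instruct` is an assumption; substitute whichever Baichuan-M1 checkpoint you want to run.

```python
# Minimal sketch: running Baichuan-M1 on Apple Silicon via mlx-lm.
# Assumes `pip install mlx-lm`; the model id below is an assumption,
# swap in the checkpoint you actually want.
from mlx_lm import load, generate

model, tokenizer = load("baichuan-inc/Baichuan-M1-14B-Instruct")

# Format the request with the model's chat template before generating.
messages = [{"role": "user", "content": "Hello, who are you?"}]
prompt = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, tokenize=False
)

text = generate(model, tokenizer, prompt=prompt, max_tokens=256)
print(text)
```

mlx-lm also ships a command-line entry point (`mlx_lm.generate --model <repo> --prompt "..."`) if you'd rather not write any Python.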
