Running models locally with LM Studio

If you’re looking for an easy way to download and run models locally on your Mac, check out LM Studio. It’s great for testing the latest models and experimenting with different quantizations.

I used to run models from the terminal directly, but LM Studio is much nicer for everyday use. If LM Studio is not your cup of tea, another option is combining Ollama with Open WebUI.

For complex tasks, Claude or OpenAI models are still king, but more and more I find that local models perform just as well for everyday work. The one I’m currently using is Phi 4 (lmstudio-community/phi-4-GGUF): not only does it give great answers, it also runs beautifully on my M1 Max.
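One nice thing about this setup is that LM Studio can expose an OpenAI-compatible local server, so you can script against your local model from Python with nothing but the standard library. Here’s a minimal sketch; the endpoint URL (LM Studio’s default port is 1234) and the exact model identifier are assumptions, so check the server tab in LM Studio for yours:

```python
# Minimal sketch: querying a local model through LM Studio's
# OpenAI-compatible server. Assumes the local server is enabled in
# LM Studio; the endpoint below is its usual default, but verify it
# in the app before relying on it.
import json
import urllib.request

ENDPOINT = "http://localhost:1234/v1/chat/completions"  # assumed default


def build_request(prompt: str,
                  model: str = "lmstudio-community/phi-4-GGUF") -> dict:
    """Build an OpenAI-style chat-completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }


def ask(prompt: str) -> str:
    """POST the prompt to the local server and return the reply text."""
    data = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        ENDPOINT, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


# ask("Explain GGUF quantization in one sentence.")  # needs the server running
```

The same request shape works against Ollama’s server too (it also speaks the OpenAI chat-completions format), so you can swap backends by changing the endpoint and model name.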

Marc