Ollama vs LM Studio for M1 Max to manage and run local LLMs?

Which app is better, faster, more actively developed, and better optimized for the M1 Max? I plan to use it only for chat and Q&A, and maybe some document summarization, but that's it: no image/video processing or generation. Thanks!

submitted by /u/br_web