by quantas on 5/27/25, 11:58 AM with 4 comments
Out of that frustration I've built AnyLM: a single, native application (currently for Windows, with macOS and more planned) for interacting with your local models from LM Studio/Ollama as well as models from API providers like OpenAI, Anthropic, and Google.
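The post doesn't describe how AnyLM is implemented, but the unification it describes can be sketched: both Ollama and LM Studio expose OpenAI-compatible local endpoints, so a single client shape can cover local and hosted models. The backend names, ports, and model names below are illustrative assumptions, not AnyLM's actual code.

```python
# Minimal sketch of a unified chat interface over local and hosted backends,
# assuming the OpenAI-compatible endpoints that Ollama (localhost:11434/v1)
# and LM Studio (localhost:1234/v1) serve by default.
from openai import OpenAI

BACKENDS = {
    "ollama": OpenAI(base_url="http://localhost:11434/v1", api_key="ollama"),
    "lmstudio": OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio"),
    "openai": OpenAI(),  # reads OPENAI_API_KEY from the environment
}

def chat(backend: str, model: str, prompt: str) -> str:
    """Send one prompt to the chosen backend and return the reply text."""
    client = BACKENDS[backend]
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

# The call shape is identical for a local and a hosted model:
# print(chat("ollama", "llama3.2", "Hello!"))
# print(chat("openai", "gpt-4o-mini", "Hello!"))
```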
This has been a fun (and challenging!) project. I'd be super grateful for any feedback or suggestions, and if you just want to try it out, let me know what you think!
by havan_agrawal on 5/27/25, 11:31 PM
by phito on 5/28/25, 6:14 AM
Also no Linux support planned :(