by ssabev on 3/16/25, 1:38 PM with 1 comments
Now you can run an OpenAI-compatible proxy for all of your "local model" needs.
I'm currently using it as my custom provider in RepoPrompt.
The package includes a client, a CLI, and a proxy.
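Since the proxy speaks the standard OpenAI chat-completions wire format, any OpenAI-style client can target it just by swapping the base URL. A minimal sketch of what such a request looks like — the host, port, model name, and API key below are all illustrative assumptions, not values from this package:

```python
import json
import urllib.request

# An OpenAI-compatible proxy accepts the standard /v1/chat/completions
# request shape, so pointing an existing client at its base URL is enough.
# localhost:8000 is an assumed address for this sketch.
BASE_URL = "http://localhost:8000/v1"

payload = {
    "model": "local-model",  # hypothetical name the proxy maps to a local backend
    "messages": [
        {"role": "user", "content": "Hello from a local proxy"},
    ],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer sk-local",  # many local proxies ignore the key
    },
)

# urllib.request.urlopen(req) would actually send it; omitted here since
# no proxy is running in this sketch.
print(req.full_url)
```

The same shape is why tools like RepoPrompt can use it as a "custom provider": they only need a base URL and an API key field.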