by hellovai on 3/26/24, 1:09 PM with 2 comments
https://imgs.xkcd.com/comics/fixing_problems.png
At some point, it became BAML: a type-safe, self-contained way to call LLMs from Python and/or TypeScript.
BAML encapsulates all the boilerplate for:
- flexible parsing of LLM responses into your exact data model
- streaming LLM responses as partial JSON
- wrapping LLM calls with retries and fallback strategies
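To make the list above concrete, here is a rough sketch of that boilerplate written by hand in plain Python. This is not BAML's API — `call_llm` is a hypothetical stub standing in for any LLM client, and the parsing/retry logic is just an illustration of what a type-safe wrapper has to handle.

```python
import json
import re
from dataclasses import dataclass, field


@dataclass
class Resume:
    """The exact data model we want the LLM response parsed into."""
    name: str
    skills: list = field(default_factory=list)


def call_llm(prompt: str) -> str:
    # Hypothetical stub: a real implementation would call an LLM API.
    return 'Sure! Here is the JSON:\n{"name": "Ada", "skills": ["math"]}'


def parse_flexibly(raw: str) -> Resume:
    """Flexible parsing: pull the first JSON object out of a chatty response."""
    match = re.search(r"\{.*\}", raw, re.DOTALL)
    if match is None:
        raise ValueError("no JSON object in response")
    data = json.loads(match.group(0))
    return Resume(name=data["name"], skills=data.get("skills", []))


def call_with_retries(prompt: str, attempts: int = 3) -> Resume:
    """Retry on parse failures; a fallback model could slot in here too."""
    last_err = None
    for _ in range(attempts):
        try:
            return parse_flexibly(call_llm(prompt))
        except (ValueError, KeyError, json.JSONDecodeError) as err:
            last_err = err
    raise last_err
```

Every call site needs some variant of this; encapsulating it once per data model is the point.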
Our VSCode extension provides:
- real-time prompt previews
- an LLM testing playground
- syntax highlighting (of course)
We also have a bunch of cool features in the works: conditionals and loops in our prompt templates, image support, and more powerful types.
We're still pretty early and would love to hear your feedback. To get started: