from Hacker News

Cannoli allows you to build and run no-code LLM scripts in Obsidian

by JieJie on 8/22/24, 7:22 AM with 1 comment

  • by JieJie on 8/22/24, 7:26 AM

    I'm having fun with this visual editor for LLM scripts. It's almost like HyperCard for LLMs.

    On my 16GB MacBook Air, I did not have to set the OLLAMA_ORIGINS env variable. Maybe I did that a long time ago, as I have a previous Ollama install. This is the first really fun toy/tool I've found that uses local LLMs (it also supports foundation model APIs) to do something interesting.
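
    For anyone who does need to set it: OLLAMA_ORIGINS tells the local Ollama server which origins it will accept requests from. I didn't need it here, but a rough sketch of what I'd try is below; the "app://obsidian.md*" origin is just the value I've seen suggested for Obsidian, so treat it as an assumption rather than gospel.

        # Rough sketch, not Cannoli's own code: start "ollama serve" with
        # OLLAMA_ORIGINS widened so requests from Obsidian are allowed through.
        # "app://obsidian.md*" is an assumed origin value for Obsidian.
        import os
        import subprocess
        env = dict(os.environ, OLLAMA_ORIGINS="app://obsidian.md*")
        subprocess.run(["ollama", "serve"], env=env, check=True)

    Exporting the variable in your shell before running ollama serve does the same thing; the script is just the Python way of saying it.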

    I'm having a ball!