by iaresee on 4/27/25, 3:35 PM with 65 comments
by kristopolous on 4/27/25, 6:49 PM
Look at the savebrace screenshot here
https://github.com/kristopolous/Streamdown?tab=readme-ov-fil...
There's a markdown renderer that can extract code samples, a code-sample viewer, and a tool to do the tmux handling, and it all builds on things like fzf and simple tools like simonw's llm. It's all I/O, so it's all swappable.
It sits adjacent and you can go back and forth, using the chat when you need to but not doing everything through it.
You can also make it go away and then when it comes back it's the same context so you're not starting over.
Since I offload the actual llm loop, you can use whatever you want. The hooks are at the interface and parsing level.
When rendering the markdown, streamdown saves the code blocks as null-delimited chunks in the configurable /tmp/sd/savebrace. This lets xargs, fzf, or the whole suite of Unix tools manipulate them in sophisticated chains.
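To sketch what that enables (assuming, as a simplification, that savebrace is a single file of NUL-separated blocks — the real layout may differ; a stand-in temp file is used here instead of /tmp/sd/savebrace):

```shell
# Stand-in for streamdown's savebrace: two saved code blocks, NUL-delimited.
BRACE=$(mktemp)
printf 'echo first\0echo second\0' > "$BRACE"

# Count the saved blocks by counting NUL delimiters:
tr -dc '\0' < "$BRACE" | wc -c

# Hand each block to a command as one argument (xargs -0 splits on NUL):
xargs -0 -n1 echo < "$BRACE"

# Or pick one interactively (fzf reads NUL-delimited input with --read0):
# fzf --read0 < "$BRACE"
```

Because the delimiter is NUL rather than newline, multi-line code blocks survive the pipeline intact.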
Again, it's not a package, it's an open architecture.
I know I don't have a slick pitch site but it's intentionally dispersive like Unix is supposed to be.
It's ready to go, just ask me. Everyone I've shown in person has followed up with things like "This has changed my life".
I'm trying to make llm workflow components. The WIMP of the LLM era. Things that are flexible, primitive in a good way, and also very easy to use.
Bug reports, contributions, and even opinionated designers are highly encouraged!
by arkasan on 4/28/25, 1:43 AM
by protocolture on 4/28/25, 12:02 AM
But then I realise that I do enough sensitive stuff in the terminal that I don't really want this unless I have a model running locally.
Then I worry about all the times I've seen a junior run a command from the internet and brick a production server.
by malux85 on 4/27/25, 5:43 PM
Get rid of this bit, so the user asks a question and gets a command.
Make it so the user can ask a follow-up question if they want, but this is just noise taking up valuable terminal space.
by amelius on 4/28/25, 11:41 AM
Do you want to execute this command? [Y]es/No/Edit
perhaps also add an "Explain" option, because for some commands it is not immediately obvious what they do (or are supposed to do).
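A minimal sketch of such a confirm-before-run loop (hypothetical, not tmuxai's actual UI — the option set and wording here are made up to illustrate the idea):

```shell
# Prompt before executing a model-suggested command.
# [Y]es runs it, [E]dit lets you rewrite it first, e[X]plain would
# hand it back to the model for an explanation.
confirm_run() {
  cmd=$1
  printf 'Command: %s\n[Y]es / [N]o / [E]dit / e[X]plain? ' "$cmd"
  read -r ans
  case $ans in
    [Yy]|'') eval "$cmd" ;;
    [Ee])    printf 'New command: '; read -r cmd; eval "$cmd" ;;
    [Xx])    printf '(here the model would explain: %s)\n' "$cmd" ;;
    *)       printf 'Skipped.\n' ;;
  esac
}

# Non-interactive demo: answer "y" via a pipe.
printf 'y\n' | confirm_run 'echo hello'
```

The "Edit" branch is the important one: the suggested command becomes a starting point rather than something executed sight unseen.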
by rcarmo on 4/27/25, 10:21 PM
This seems… like an amazing attack vector. Hope it integrates with litellm/ollama without fuss so I can run it locally.
by eranrund on 4/28/25, 6:22 PM
by rpigab on 4/28/25, 12:57 PM
by dimatura on 4/27/25, 6:59 PM
by rawoke083600 on 4/28/25, 9:21 AM
But well done for launching (the following is not hate, but onboarding feedback)
Who else had issues with the API key?
1. What is a TMUXAI_OPENROUTER_API_KEY? (Is it like an OpenAI key?)
2. If it's an API key for TMUXAI, where do I find it? I can't see it on the website. (Probably haven't searched properly, but why make me search?)
3. SUPER simple install instructions, but ZERO (discoverable) instructions on where/how to find and set the API key??
4. When running tmuxai, instead of just telling me I need an API key, how about an actual link to where I can get one?
Again, well done for launching... I'm sure it took hard work and effort.
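For anyone stuck at the same step — assuming the variable works the way its name suggests (a key created on openrouter.ai, not one issued by tmuxai itself):

```shell
# Assumption: tmuxai reads this environment variable at startup.
# Replace the placeholder with a key generated at openrouter.ai.
export TMUXAI_OPENROUTER_API_KEY="your-openrouter-key-here"
```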
by dr_kretyn on 4/27/25, 7:24 PM
by zipping1549 on 4/28/25, 3:40 AM
by jph00 on 4/27/25, 9:24 PM
It was created by one of my colleagues, Nathan Cooper.
https://www.answer.ai/posts/2024-12-05-introducing-shell-sag...
by inciampati on 4/27/25, 6:36 PM
by jmdots on 4/27/25, 11:11 PM
by porcoda on 4/28/25, 12:26 AM
by sepositus on 4/28/25, 4:42 AM
by alvinunreal on 4/27/25, 3:42 PM
Appreciate the feedback as it evolves.
by poulpy123 on 4/28/25, 10:44 AM
by atsaloli on 4/27/25, 10:40 PM
by smallpipe on 4/27/25, 6:29 PM
by neuroelectron on 4/28/25, 11:41 AM
by pjmlp on 4/28/25, 5:56 AM
by mathfailure on 4/27/25, 6:16 PM
by kurtis_reed on 4/27/25, 6:19 PM