from Hacker News

Show HN: Chorus, a Mac app that lets you chat with a bunch of AIs at once

by Charlieholtz on 12/29/24, 9:47 PM with 73 comments

There are so many cool models to try, but they're all in different places. In Chorus you can chat with a bunch of models all at once and add your own system prompts.

Like 4o with a CBT overview, or a succinct Claude. Excited to hear your thoughts!

  • by maroonblazer on 12/30/24, 1:37 AM

    Just tried this with an interpersonal situation I'm going through. The default seems to be Claude 3.5 Sonnet and ChatGPT-4o. I got the results I've come to expect from those two, with the latter better at non-programming kinds of prompts.

    The app presented the option of prompting additional models, including Gemini Flash 2.0, one I'd never used before. It gave the best response and was surprisingly good.

    Curious to know how Chorus is paying for the compute, as I was expecting to have to use my own API keys.

  • by Charlieholtz on 12/30/24, 1:35 PM

    Hi! One of the creators of Chorus here. Really cool to hear how everyone is using it. We made this as an experiment because it felt silly to constantly be switching between the ChatGPT, Claude, and LM Studio desktop apps. It's also nice to be able to run models with custom system prompts in one place (I have a Claude with a summary of how CBT works that I find pretty helpful).

    It's a Tauri 2.0 desktop app (not Electron!), so it uses the Mac's native browser view and a Rust backend. That also keeps the DMG relatively small (~25 MB, but we can get it much smaller once we get rid of some bloat).

    Right now Chorus is proxying API calls through our server, so it's free to use. We didn't add bring-your-own-API-key to this version because it was a bit quicker to ship without it. This was kind of an experimental winter break project, so we didn't think too hard about it. We'll likely have to fix that (add bring-your-own-key? or a paid version?) as more of you use it :)

    Definitely planning on adding support for local models too. Happy to answer any other questions, and any feedback is super helpful (and motivating!) for us.

    UPDATE: Just added the option to bring your own API keys! It should be rolling out over the next hour or so.
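
    For the curious, here's a rough sketch of what a bring-your-own-key fan-out to two providers can look like. This is illustrative only, not our actual code; the model names, helper shape, and error handling are placeholder assumptions.

        // Illustrative sketch: send one prompt to two providers in parallel
        // with user-supplied API keys (TypeScript, Node 18+/browser fetch).

        interface ModelReply {
          provider: string;
          text: string;
        }

        async function askOpenAI(prompt: string, apiKey: string): Promise<ModelReply> {
          const res = await fetch("https://api.openai.com/v1/chat/completions", {
            method: "POST",
            headers: {
              "Authorization": `Bearer ${apiKey}`,
              "Content-Type": "application/json",
            },
            body: JSON.stringify({
              model: "gpt-4o",
              messages: [{ role: "user", content: prompt }],
            }),
          });
          const data = await res.json();
          return { provider: "openai", text: data.choices[0].message.content };
        }

        async function askAnthropic(prompt: string, apiKey: string): Promise<ModelReply> {
          const res = await fetch("https://api.anthropic.com/v1/messages", {
            method: "POST",
            headers: {
              "x-api-key": apiKey,
              "anthropic-version": "2023-06-01",
              "Content-Type": "application/json",
            },
            body: JSON.stringify({
              model: "claude-3-5-sonnet-20241022",
              max_tokens: 1024,
              messages: [{ role: "user", content: prompt }],
            }),
          });
          const data = await res.json();
          return { provider: "anthropic", text: data.content[0].text };
        }

        // Fire both requests at once and keep whatever succeeds, so one
        // provider being down doesn't block the others.
        async function askAll(prompt: string, keys: { openai: string; anthropic: string }) {
          const results = await Promise.allSettled([
            askOpenAI(prompt, keys.openai),
            askAnthropic(prompt, keys.anthropic),
          ]);
          return results
            .filter((r): r is PromiseFulfilledResult<ModelReply> => r.status === "fulfilled")
            .map((r) => r.value);
        }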

  • by dcreater on 12/30/24, 6:04 AM

    Airtrain.ai and msty.app have had this for a while.

    What isn't there and would be useful is the option to swipe between models rather than showing them side by side. When you're using it for code comparisons, even two side by side gets cramped.

  • by solomatov on 12/30/24, 4:45 AM

    I would be much more likely to install this if it were published in the App Store.
  • by mikae1 on 12/30/24, 9:53 AM

    Was hoping this would be a LM Studio alternative (for local LLMs) with a friendlier UI. I think there's a genuine need for that.

    It could make available only the LLMs that your Mac is able to run.

    Many Apple Silicon owners are sitting on very capable hardware without even knowing it.

  • by nomilk on 12/30/24, 1:21 AM

    Love the idea. I frequently use ChatGPT (out of habit) and, while it's generating, copy/paste the same prompt into Claude and Grok. This seems like a good way to save time.
  • by sleno on 12/30/24, 1:35 AM

    Very well designed! How does this work? In the sense that I didn't have to copy/paste any keys, and yet this is offering paid models for free.
  • by kanodiaashu on 12/30/24, 4:16 AM

    This reminds me of the search engine aggregators in the old days that used to somehow install themselves in Internet Explorer, then collect search results from multiple providers and sometimes compare them. I wonder if this time these tools will persist.
  • by sharonbiren on 1/6/25, 1:27 PM

    Is it supposed to support Intel-based Macs in the future? It cannot run on my Mac.
  • by cmiller1 on 12/30/24, 2:13 PM

    Is the name a Star Trek TNG reference? https://memory-alpha.fandom.com/wiki/Riva%27s_chorus
  • by rubymamis on 12/30/24, 7:18 AM

    If you're looking for a fast, native alternative for Windows, Linux (and macOS), you can join my new app waitlist: https://www.get-vox.com
  • by wonderfuly on 1/2/25, 12:24 PM

    ChatHub was the first service to do this; it's been around for almost two years, since before the release of the GPT-3.5 API.
  • by detente18 on 1/4/25, 5:18 PM

    Your changelog is neat - is this custom built or via some embeddable tool?
  • by prmoustache on 12/30/24, 10:28 AM

    Or you can do that in your tmux terminal multiplexer using the synchronize-panes option (exact commands below).

    A number of terminals can also do that natively (kitty comes to mind).
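
    For reference, the exact toggle looks like this (standard tmux, run from a shell inside a session that already has one pane per model; nothing app-specific assumed):

        tmux setw synchronize-panes on    # keystrokes now go to every pane in the window
        tmux setw synchronize-panes off   # back to typing in a single pane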

  • by paul7986 on 12/30/24, 2:27 AM

    Cool, and GPT/Claude think there are only 2 "r"s in strawberry?

    Wow, it's a bit scary how bad a fail that is (I use GPT a lot)!

  • by cryptozeus on 12/30/24, 4:54 AM

    Thanks for the simple landing page and an example so simple anyone can understand it.
  • by ranguna on 12/30/24, 9:03 AM

    lmarena.ai is also pretty good. It's not Mac-exclusive, works from the browser, and has a bunch of different AIs to choose from. It doesn't keep a history when you close the tab, though.
  • by whatever1 on 12/30/24, 7:03 AM

    Isn’t this cheating? What will the AI overlords think about this behavior once they take over things?
  • by sagarpatil on 12/30/24, 4:49 AM

    msty.app does this and much more. It’s open source too.
  • by 486sx33 on 12/30/24, 2:25 AM

    Sweet!