from Hacker News

Ask HN: How to send context when you use LLMs via API for a B2B use-case?

by msnkarthik on 7/29/24, 7:26 AM with 2 comments

In a B2B use-case, we pass data to the LLM via API, get the answers, and display them to users. It's like a wrapper on top of an LLM API. But since API calls are one-off, they don't carry much context from the rest of the chat. How do you send that much context when communicating with GPT via an API rather than directly through chat? Wondering this for a B2B SaaS use-case.
  • by fdarkaou on 7/29/24, 7:16 PM

    Most LLMs today have a large context window, so you can send the history of your chat along with each request.

    I've built multiple chat demo apps (see anotherwrapper.com). What I basically did was store a full copy of the chat history in the DB, then in a config file I specified how many previous messages to include in the chat history when calling the API.
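
    A minimal sketch of that approach, assuming an OpenAI-style chat-completions payload (a list of role/content messages); the names `MAX_HISTORY` and `build_messages` are illustrative, not from the comment:

    ```python
    # Since each API call is stateless, the client resends recent history
    # with every request. MAX_HISTORY stands in for the config-file value
    # the commenter mentions.
    MAX_HISTORY = 4

    def build_messages(system_prompt, history, new_user_message, max_history=MAX_HISTORY):
        """Assemble the payload for a stateless chat-completion call.

        `history` is the full conversation loaded from the DB as a list of
        {"role": ..., "content": ...} dicts; only the last `max_history`
        entries are resent so the request stays within the context window.
        """
        return (
            [{"role": "system", "content": system_prompt}]
            + history[-max_history:]
            + [{"role": "user", "content": new_user_message}]
        )

    # Example: a stored 6-message history trimmed to its last 4 messages.
    history = [
        {"role": "user" if i % 2 == 0 else "assistant", "content": f"msg {i}"}
        for i in range(6)
    ]
    payload = build_messages(
        "You are a helpful B2B assistant.", history, "What did I ask first?"
    )
    ```

    The resulting `payload` would then be passed as the `messages` argument to whichever chat-completion endpoint you use; the DB stays the source of truth, and the config value only controls how much of it travels with each request.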