from Hacker News

Ask HN: How do you manage your AI prompts?

by siddharthgoel88 on 12/5/24, 6:20 AM with 7 comments

Hello folks!

Do you guys use any tools/processes to manage your AI prompts? Or do you still prefer to keep it ad hoc?

  • by runjake on 12/6/24, 10:24 PM

    Depending on the use case and frequency, I either:

    - Save them as a ChatGPT custom GPT or a Claude Project.

    - Create a RayCast AI Command. https://manual.raycast.com/ai

    - Save them as a text snippet in Obsidian notes. https://obsidian.md

  • by tobiasnvdw on 12/7/24, 3:27 PM

    Mostly plain text files saved locally for easy copy-pasting.

    I'll occasionally use prompts from the Anthropic library (https://docs.anthropic.com/en/prompt-library/library) and make minor modifications to them. E.g. I'll adapt the "prose polisher" prompt from the library for refining written text in specific ways.

  • by cloudking on 12/6/24, 9:30 PM

    For ChatGPT I've found this search extension useful to find previously used prompts: https://chromewebstore.google.com/detail/gpt-search-chat-his...

    Source code: https://github.com/polywock/gpt-search

  • by muzani on 12/8/24, 4:34 AM

    I keep it ad hoc - models change so frequently that prompts break all the time. Most of the ones I used last year are no longer relevant.

    "Prompt engineering" may be a thing of the past. These days, you can sketch a vague table on a piece of paper and take a photo of it with a phone, and AI will figure out exactly what you're trying to do.

  • by 97-109-107 on 12/6/24, 12:26 PM

    Maybe I'm hijacking, but I see a generalized problem - how do you keep snippets of text that you use in your browser?

    My current kludge is to edit long fields of text in an external editor via a browser addon, and have the editor save all such edits locally.

  • by wruza on 12/5/24, 4:30 PM

    I’m thinking of making a simple wrapper around APIs, because web-based AIs tend to dump literal tons of text due to monetary incentives. For now I prepend a standard pseudo-system stub to all my chats, works fine in my case.
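    A wrapper like that can stay very small. Here's a minimal sketch in Python, assuming an OpenAI-style chat message format; the stub wording and function name are placeholders, not the commenter's actual setup:

    ```python
    # Minimal sketch: prepend a standard pseudo-system stub to every chat
    # so the model answers tersely instead of dumping walls of text.
    # SYSTEM_STUB is an invented example, not the commenter's real prompt.

    SYSTEM_STUB = "Be terse. Answer directly, with no preamble and no filler."

    def build_messages(user_prompt, history=None):
        """Return a chat message list with the standard stub prepended."""
        messages = [{"role": "system", "content": SYSTEM_STUB}]
        messages.extend(history or [])
        messages.append({"role": "user", "content": user_prompt})
        return messages

    # The resulting list can be passed to whichever chat-completions
    # API the wrapper targets.
    msgs = build_messages("What does errno 32 mean?")
    ```

    Keeping the stub in one place means every chat gets it for free, which is exactly what prepending it by hand achieves today.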
  • by cyberhunter on 12/5/24, 6:06 PM


    Tired of OpenAI account deletions and Gemini template hiccups? Frustrated with manually typing or copy-pasting prompts every time you switch between LLM clients? If you're like me and want a smoother way to manage your prompts, I built a tool that might be just what you need.

    *The Problem:*

    * OpenAI accounts can be deleted unexpectedly.

    * Gemini templates sometimes fail to work.

    * Re-typing or copy-pasting prompts across multiple clients is tedious.

    *The Solution: DryPrompt*

    DryPrompt lets you create reusable prompt templates with variable fields. You set up the framework once, and then simply fill in the variables to generate the full prompt.

    *How It Works:*

    1. *Go to:* dryprompt.go123.live

    2. *Sign up:* It's free and allows you to sync your prompts across devices.

    3. *Create a template:* Define your prompt structure and mark the parts you want to change with variables.

    4. *Use it:* Copy the template, replace the variables with your specific content, and you've got your ready-to-use prompt!

    *Example:*

    Let's say you need to internationalize multiple code files. With DryPrompt, you can create a template that includes the file code as a variable. Each time, just copy the template, paste in the new file's code, and you'll instantly get the internationalization prompt. No more tedious copying and manual concatenation!
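    The template-with-variables idea maps directly onto plain string substitution. A rough sketch of the internationalization example using Python's stdlib `string.Template` (the template text and variable names here are invented for illustration, not DryPrompt's actual format):

    ```python
    from string import Template

    # Hypothetical prompt template in the spirit of DryPrompt: the fixed
    # framework is written once, and ${variables} are filled in per use.
    I18N_TEMPLATE = Template(
        "Extract all user-facing strings from the following ${language} "
        "file into i18n keys, and emit the updated file.\n\n"
        "---\n${file_code}\n---"
    )

    # Each time, paste in the new file's code; no manual concatenation.
    prompt = I18N_TEMPLATE.substitute(
        language="TypeScript",
        file_code='const greeting = "Hello, world";',
    )
    print(prompt)
    ```

    `substitute` raises `KeyError` if a variable is left unfilled, which is a useful guard against pasting an incomplete prompt into a client.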

    *Give it a try and make your LLM workflow more efficient:* dryprompt.go123.live