from Hacker News

Ask HN: Best platform for self-hosting LLM Models?

by Kalpeshbhalekar on 5/1/24, 3:23 AM with 3 comments

  • by verdverm on 5/1/24, 4:17 AM

    This generally means running a GPU all the time. My personal preference is to use my preferred cloud (GCP).

    FWIW, I'm using the VertexAI API rather than running an LLM all the time. They have data privacy in the ToS, so I'm not worried about them training on my data. It's far cheaper and better than running a lower-quality model myself. When I get around to some fine-tuning, they have options, but you can get pretty far with prompts, RAG, and agents.
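
    (For illustration only, a minimal sketch of what calling Vertex AI from Python can look like with the google-cloud-aiplatform SDK; the project ID, region, model name, and prompt below are placeholders, not the commenter's actual setup:)

      # Minimal Vertex AI text-generation call via the google-cloud-aiplatform SDK.
      # Project ID, region, and model name are placeholders.
      import vertexai
      from vertexai.generative_models import GenerativeModel

      vertexai.init(project="my-gcp-project", location="us-central1")

      model = GenerativeModel("gemini-1.0-pro")
      response = model.generate_content("Summarize the trade-offs of self-hosting an LLM.")
      print(response.text)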

  • by runjake on 5/3/24, 1:32 AM

    The best platform would be hellishly expensive cloud compute or a pricey PC with a beefy Nvidia GPU, all of which generates a lot of heat.

    I personally get a lot of mileage out of an M3 Max with 36 GB of memory.
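
    (For illustration only, a rough sketch of running a quantized model locally on Apple Silicon with llama-cpp-python; the commenter doesn't say what they run, so the library, model file, and parameters here are assumptions:)

      # Load a quantized GGUF model with llama-cpp-python and offload all
      # layers to the GPU (Metal on Apple Silicon). Model path is a placeholder.
      from llama_cpp import Llama

      llm = Llama(
          model_path="./models/mistral-7b-instruct.Q4_K_M.gguf",  # placeholder GGUF file
          n_gpu_layers=-1,  # offload every layer to the GPU
          n_ctx=4096,       # context window size
      )

      out = llm("Q: What fits in 36 GB of unified memory? A:", max_tokens=128)
      print(out["choices"][0]["text"])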

  • by fragmede on 5/3/24, 1:36 AM

    The three choices are the cloud, a gaming rig, or a Mac. How much money do you have right now, how much can you spend on upkeep, how much time do you have for it, and how much do you pay for power? It's a broad question!

    Best is hard to define, so it depends on you and your needs.