from Hacker News

Ask HN: Where to Host Llama 2?

by retrovrv on 8/14/23, 3:58 AM with 2 comments

There are ollama/ggml etc. for local setup, but other than Replicate, what are the other options for hosting Llama 2?
  • by brucethemoose2 on 8/14/23, 4:06 AM

    vast.ai is a popular and economical option.

    A single 3090 will host a 70B model reasonably well; two will fit it completely in VRAM.
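
    A rough back-of-envelope check of that sizing (assuming, on my part, a ~4-bit quantized build; the figures below are estimates, not from the comment):

      # Rough VRAM estimate for a 4-bit-quantized Llama 2 70B (assumed figures).
      params = 70e9                 # Llama 2 70B parameter count
      bits_per_weight = 4.5         # assumed effective bits/weight for a q4 quant
      weights_gb = params * bits_per_weight / 8 / 1e9
      overhead_gb = 4               # assumed KV cache + runtime overhead
      total_gb = weights_gb + overhead_gb
      print(f"~{total_gb:.0f} GB vs 24 GB (one 3090) or 48 GB (two 3090s)")
      # ~43 GB: a single 3090 needs partial CPU offload; two 3090s hold it in VRAM.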

    Another option I suggest is hosting on AI Horde with koboldcpp, if the UI/API works for you and the finetune is appropriate for public use. You get priority access to your own host, and fulfilling other people's prompts in its spare time earns you kudos, which you can spend to try other models people are hosting or to get more burst throughput (a minimal request sketch follows the link below).

    https://lite.koboldai.net/#
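
    For the koboldcpp route, here's a minimal sketch of querying a locally hosted instance; the port (5001) and the KoboldAI-compatible /api/v1/generate route are assumptions on my part, so check them against your own setup:

      # Send a prompt to a running koboldcpp server (assumed port/endpoint).
      import json
      import urllib.request

      payload = {"prompt": "Q: What is Llama 2?\nA:", "max_length": 80}
      req = urllib.request.Request(
          "http://localhost:5001/api/v1/generate",
          data=json.dumps(payload).encode("utf-8"),
          headers={"Content-Type": "application/json"},
      )
      with urllib.request.urlopen(req) as resp:
          out = json.loads(resp.read())
      print(out["results"][0]["text"])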