by null4bl3 on 11/25/23, 9:01 AM with 2 comments
The speed of answers and computation is not really an issue, and I know that most self-hosted solutions are obviously in no way fully on par with services like ChatGPT or Stable Diffusion.
I do have somewhat modest resources:
16 GB RAM and an NVIDIA GPU with 4 GB of VRAM.
Are there any options that would let me run it self-hosted?
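For what it's worth, a rough back-of-envelope check of whether a quantized model fits in that hardware (the 1.2x overhead factor for KV cache and runtime buffers is my own assumption, not a measured figure):

```python
def quantized_model_size_gb(n_params_billion, bits_per_weight, overhead=1.2):
    """Estimate memory footprint of a quantized model in GB.

    n_params_billion: parameter count in billions (e.g. 7 for a 7B model)
    bits_per_weight:  quantization level (e.g. 4 for 4-bit)
    overhead:         rough multiplier for KV cache / buffers (assumption)
    """
    weight_bytes = n_params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes / 1e9 * overhead

# A 4-bit 7B model: ~3.5 GB of weights, ~4.2 GB with overhead,
# so it fits in 16 GB system RAM (CPU inference), though not
# entirely in 4 GB of VRAM.
print(round(quantized_model_size_gb(7, 4), 1))
```

By this estimate, CPU inference on a 4-bit 7B model is comfortably within 16 GB of RAM, and part of the model could still be offloaded to the 4 GB GPU to speed things up.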
by firebaze on 11/25/23, 9:18 AM