by andrea81 on 3/11/24, 4:49 PM with 4 comments
Actually I want to see the ones compatible with running locally on my current laptop (I'm NOT going to spend money to run a model).
Is there a website where I can enter my laptop specs and get a list of AI models that will run on it?
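No such site is named in the thread, but the underlying check is simple arithmetic: a model's quantized weights take roughly params × bits/8 bytes, plus some runtime overhead. A minimal Python sketch of the idea (the model list, sizes, and 20% overhead factor are illustrative assumptions, not a real database):

    # Rough sketch: filter models by whether their quantized weights fit
    # in available RAM/VRAM. Model sizes are approximate and illustrative.

    MODELS_B_PARAMS = {        # billions of parameters (approximate)
        "Llama-2-7B": 7,
        "Mistral-7B": 7,
        "Llama-2-13B": 13,
        "Mixtral-8x7B": 47,
        "Llama-2-70B": 70,
    }

    def fits(params_b, mem_gb, bits=4, overhead=1.2):
        """True if the quantized weights (plus ~20% for KV cache and
        runtime overhead -- an assumed fudge factor) fit in mem_gb."""
        weight_gb = params_b * bits / 8  # 1e9 params * bits/8 bytes ~ GB
        return weight_gb * overhead <= mem_gb

    mem_gb = 16  # your laptop's RAM (or VRAM, if running on a GPU)
    for name, size_b in MODELS_B_PARAMS.items():
        if fits(size_b, mem_gb):
            print(name)

With 16 GB this prints the 7B and 13B models: a 7B model at 4-bit is about 3.5 GB of weights, while 70B at 4-bit needs ~35 GB before overhead.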
by declaredapple on 3/11/24, 5:34 PM
Most quantized models start falling apart rapidly below 4-bit, but you can go lower...
If you can fit it in GPU memory it'll likely be usable; if you can't, be prepared for single-digit t/s or less. The main exception is Apple M1/M2/M3 machines, whose unified memory gives the CPU/GPU far more bandwidth than typical laptop RAM.
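That speed cliff falls out of memory bandwidth: generating each token reads roughly the whole model, so tokens/sec ≈ bandwidth ÷ model size. A back-of-envelope sketch (the bandwidth figures are ballpark assumptions, not measurements):

    # Generation is memory-bandwidth bound, so a crude upper bound is
    # tokens/sec ~= bandwidth / bytes read per token (~= model size).

    def tokens_per_sec(params_b, bits, bandwidth_gb_s):
        model_gb = params_b * bits / 8
        return bandwidth_gb_s / model_gb

    # 7B model at 4-bit (~3.5 GB of weights):
    print(tokens_per_sec(7, 4, 50))   # ~14 t/s: dual-channel DDR4, ~50 GB/s
    print(tokens_per_sec(7, 4, 400))  # ~114 t/s: M2 Max unified memory, ~400 GB/s

Real throughput lands below these numbers, but the ratio shows why spilling out of fast memory (or running on ordinary laptop DRAM) hurts so much, and why the M-series chips are the exception.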
by PaulHoule on 3/11/24, 4:59 PM