from Hacker News

Ask HN: Good GPUs for Model Inference

by agencies on 11/24/21, 7:29 PM with 0 comments

For a local machine to load and serve, say, 10 different machine learning models (assume each is 500MB to 1GB in size) at the same time, which GPUs are good enough to run classification inference, without worrying about training?
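One way to frame the question is a back-of-envelope VRAM estimate: all 10 models must fit in GPU memory at once. A minimal sketch, assuming a per-model overhead factor of ~20% for activations and runtime context (the overhead number is an assumption, not from the post):

```python
def estimate_vram_gb(model_sizes_gb, overhead_factor=1.2):
    """Rough total GPU memory needed to keep all models resident.

    overhead_factor is a hypothetical fudge for activations,
    workspace buffers, and framework/runtime context.
    """
    return sum(size * overhead_factor for size in model_sizes_gb)

# Worst case from the question: 10 models at 1 GB each -> roughly 12 GB.
worst_case = estimate_vram_gb([1.0] * 10)

# Best case: 10 models at 0.5 GB each -> roughly 6 GB.
best_case = estimate_vram_gb([0.5] * 10)
```

Under these assumptions, a card with 12GB or more of VRAM would cover the worst case; with 6GB or less, models would need to be swapped in and out rather than kept resident.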