by spark_chicken on 7/7/24, 5:00 PM with 2 comments
These days when I do literature reviews of trending research, I find that most of the useful models need insane amounts of GPU to train. As a student in a small lab, we lack GPU power. Is it still meaningful for me to stay in this lab to learn deep learning? I feel like all my work is just a toy... feeling lost, any advice?
by talldayo on 7/7/24, 5:10 PM
Thankfully, you can get pretty good results finetuning on a much smaller and cheaper GPU like the 3060ti. Going from GPU-poor to GPU-rich is easier than you might think.
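For what it's worth, a parameter-efficient method like LoRA is what makes this practical on ~8 GB of VRAM: you only train small low-rank adapter weights, so gradient and optimizer memory stays tiny. Here's a minimal sketch using the Hugging Face peft library; the base model, target modules, and hyperparameters are my own illustrative assumptions, not a specific recipe from this thread:

    # Minimal LoRA finetuning setup that fits on a consumer GPU (e.g. a 3060 Ti).
    # Model choice and hyperparameters are illustrative only.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import LoraConfig, get_peft_model

    model_name = "gpt2"  # small base model, picked just for illustration
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16)

    # LoRA injects low-rank adapters and freezes the original weights,
    # which is what keeps the VRAM requirement low.
    lora_config = LoraConfig(
        r=8,                        # rank of the adapter matrices
        lora_alpha=16,
        target_modules=["c_attn"],  # GPT-2 attention projection; varies by architecture
        lora_dropout=0.05,
        task_type="CAUSAL_LM",
    )
    model = get_peft_model(model, lora_config)
    model.print_trainable_parameters()  # typically well under 1% of total parameters

From there you train as usual with your favorite Trainer or training loop. If memory is still tight, 4-bit quantization of the frozen base model (QLoRA via bitsandbytes) pushes the requirement down further.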
by psyklic on 7/7/24, 7:40 PM