by bebrws on 2/24/23, 11:53 PM with 2 comments
With PyTorch now supporting M1 MacBooks, I was able to run EleutherAI/gpt-j-6B locally. I have an M1 with 64GB of RAM, but I believe it only required around 25GB, and with the environment variable I have documented it should let you use the CPU instead of CUDA.
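
For anyone curious what this looks like in practice, here is a minimal sketch of loading GPT-J-6B on an M1 Mac with Hugging Face transformers. This is not the author's exact setup: the use of PYTORCH_ENABLE_MPS_FALLBACK (a real PyTorch variable that lets unsupported MPS ops fall back to CPU) is an assumption about which environment variable is meant, and the device selection and dtype choice are illustrative.

    # Sketch: run GPT-J-6B on an M1 Mac. Assumes the PyTorch MPS backend;
    # PYTORCH_ENABLE_MPS_FALLBACK lets ops without MPS support fall back to CPU.
    import os
    os.environ.setdefault("PYTORCH_ENABLE_MPS_FALLBACK", "1")

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Prefer Apple's Metal (MPS) backend when available, otherwise use the CPU.
    device = "mps" if torch.backends.mps.is_available() else "cpu"

    tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
    # float16 roughly halves memory use compared to float32 weights.
    model = AutoModelForCausalLM.from_pretrained(
        "EleutherAI/gpt-j-6B", torch_dtype=torch.float16
    ).to(device)

    inputs = tokenizer("Hello, my name is", return_tensors="pt").to(device)
    with torch.no_grad():
        output = model.generate(**inputs, max_new_tokens=40)
    print(tokenizer.decode(output[0], skip_special_tokens=True))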