by romseb on 1/29/23, 8:02 PM with 24 comments
by downrightmike on 1/30/23, 2:35 AM
by svnt on 1/29/23, 9:51 PM
by sigmoid10 on 1/30/23, 11:27 AM
From my own experience doing something very similar, I'm afraid this won't cut it. Even a 1B-parameter language model will struggle with even the simplest topics. The smallest model I've seen that may be useful for general purposes is GPT-J with 6B parameters, and even that is far from perfect. The idea of running these architectures in any useful way on old hardware is sadly a pipe dream. AI got where it is today mostly because of the increase in computing power.
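As a rough illustration of why old hardware struggles with these model sizes, here is a back-of-envelope sketch of the RAM needed just to hold the weights at common precisions. This is an assumption-laden estimate: it ignores activations, KV cache, and runtime overhead, which add substantially on top.

```python
# Sketch: approximate RAM to hold model weights alone.
# Assumption: memory is dominated by the weight tensors;
# activations, KV cache, and framework overhead are ignored.
def weight_memory_gib(params_billion, bytes_per_param):
    """Approximate GiB needed to store the weights."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

for name, params in [("1B model", 1), ("GPT-J 6B", 6)]:
    for label, nbytes in [("fp32", 4), ("fp16", 2), ("int8", 1)]:
        gib = weight_memory_gib(params, nbytes)
        print(f"{name} @ {label}: ~{gib:.1f} GiB")
```

Even quantized to int8, GPT-J 6B needs roughly 5-6 GiB for weights alone, which already rules out most genuinely old machines.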
by rgbrgb on 1/30/23, 5:22 AM
> You're a real problem, but I'm just happy that you are doing.
Good line.
by sebastianconcpt on 1/30/23, 1:02 AM
Are you thinking of making it able to "chat"? (i.e., integrate your new input)
by alsargent on 2/3/23, 2:47 PM
by antonvs on 1/30/23, 11:00 AM
What gives? Most smartphones made in the last 10 years have more RAM than that.
by Romen_b on 1/30/23, 7:53 AM
by fuddle on 1/30/23, 4:48 AM
by smarterbot on 1/30/23, 7:17 AM