by panqueca on 4/24/24, 4:44 AM with 13 comments
by Roshni1990r on 5/2/24, 2:25 PM
OpenELM offers models with 270M to 3B parameters, pre-trained and instruction-tuned, with good results across various benchmarks.
My Feedback:
First Phi 3, now OpenELM. It's great to see these small models improving. I know they're not ready for production in all cases, but they're really great for specific tasks.
I see small open-source models as the future because they offer better speed, require less compute, and use fewer resources, making them more accessible and practical for a wider range of applications.
What do you think about this? Do you consider using small open-source models? If yes, what are you thinking of making?
I am going to use it on my smartphone.
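For anyone who wants to poke at the smaller variants before committing to on-device deployment, here is a minimal sketch of loading one of the instruction-tuned checkpoints with Hugging Face transformers. Assumptions: the weights are published under apple/OpenELM-* and require trust_remote_code, and the LLaMA-2 tokenizer is reused since OpenELM does not ship its own; adjust the model and tokenizer ids to whatever you actually have access to.

```python
# Sketch only -- model/tokenizer ids are assumptions, not verified against your access.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "apple/OpenELM-270M-Instruct"   # smallest instruction-tuned variant (assumed id)
tokenizer_id = "meta-llama/Llama-2-7b-hf"  # OpenELM reportedly reuses the LLaMA-2 tokenizer

tokenizer = AutoTokenizer.from_pretrained(tokenizer_id)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "Summarize the benefits of small on-device language models:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

At 270M parameters the checkpoint is small enough to run on CPU, which is what makes the phone/edge use case plausible in the first place.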