by tzm on 7/18/23, 4:58 PM with 3 comments
by russellbeattie on 7/18/23, 6:13 PM
I bet next up will be at the end of the year when Apple surely jumps into the ring with their version of on-device Siri, powered by their ML chips.
by yumraj on 7/18/23, 6:31 PM
Are they expecting Llama 2 to still be relevant in 2024? Shouldn't it have been LLM or Llama (without the version)?
by brucethemoose2 on 7/18/23, 7:49 PM
MLC-LLM already runs LlamaV1 (and probably V2?) performantly on Android and iOS. But again, no one seems to care because it's a relatively barebones demo.
Qualcomm needs to build an extensive feature set and probably a UI, otherwise no one is going to care about this implementation either.