from Hacker News
FlashAttention-2: Making Transformers 800% faster AND exact
by kken on 8/5/23, 6:11 PM with 0 comments