by vlan121 on 4/17/25, 7:46 PM with 19 comments
by mertleee on 4/20/25, 6:36 PM
It's part of why they love agents and tools like Cursor -> it turns a problem that could've been one prompt and a few hundred tokens into dozens of prompts and thousands of tokens ;)
by ivape on 4/20/25, 6:34 PM
I see this as essentially a reasoning loop. It's the approach I use to quickly code up pseudo-reasoning loops on local projects. Someone asked in another thread "how can I get the LLM to generate a whole book?" Well, exactly like this: if it can keep prompting itself with "what would chapter N be?" until it outputs "THE END", you get your book.
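For reference, a minimal sketch of that loop, assuming a hypothetical call_llm(prompt) helper that stands in for whatever local model or API client you're using:

    def call_llm(prompt: str) -> str:
        # Placeholder: wire this up to your local model or API client.
        raise NotImplementedError

    def generate_book(premise: str, max_chapters: int = 50) -> str:
        chapters = []
        for n in range(1, max_chapters + 1):
            # Feed the premise plus everything written so far back in as context.
            context = premise + "\n\n" + "\n\n".join(chapters)
            prompt = (
                f"{context}\n\n"
                f"Write chapter {n} of this book. "
                "If the story is complete, end the chapter with the words THE END."
            )
            chapter = call_llm(prompt)
            chapters.append(chapter)
            if "THE END" in chapter:
                break  # the model signalled the book is finished
        return "\n\n".join(chapters)

Whether you stop on a sentinel string like "THE END" or on the max_chapters cap is a judgment call; without the cap, a model that never emits the sentinel will loop forever.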
by danielbln on 4/20/25, 8:23 PM
by kordlessagain on 4/20/25, 11:47 PM
by K0balt on 4/21/25, 4:43 AM
Still clever.
by James_K on 4/20/25, 9:35 PM
by mentalgear on 4/22/25, 8:40 AM
by seeknotfind on 4/20/25, 6:26 PM
by NooneAtAll3 on 4/20/25, 8:05 PM