@shaharl6000.bsky.social
Our results indicate that processing multiple documents is a separate challenge from handling long contexts.
Joint work with @nirmazor.bsky.social, @lihishalmon.bsky.social, Michael Hassid and @gabistanovsky.bsky.social
3/3
@nlphuji.bsky.social
March 11, 2025 at 2:34 PM
Previous work showed that adding more documents can hurt LLM performance, but did not isolate the impact of document quantity under a fixed context length.
Our findings reveal that increasing the number of retrieved documents in RAG settings, while holding context length fixed, significantly challenges LLMs.🤯 (A minimal sketch of this setup follows below.)
2/3
March 11, 2025 at 2:33 PM
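To make the controlled setup concrete, here is a minimal, hypothetical Python sketch of how one could vary the number of retrieved documents while keeping total context length constant by padding with filler text. All names (build_fixed_length_context, count_tokens) are illustrative assumptions, not the authors' actual code.

```python
# Sketch: vary the number of retrieved documents while holding total
# context length (in tokens) fixed by padding the remainder with filler.
# Names and the whitespace "tokenizer" are illustrative assumptions only.

from typing import List


def count_tokens(text: str) -> int:
    # Crude whitespace token count as a stand-in for the model's tokenizer.
    return len(text.split())


def build_fixed_length_context(
    documents: List[str],
    num_docs: int,
    target_tokens: int,
    filler_token: str = "lorem",
) -> str:
    """Concatenate `num_docs` documents, then pad with filler so the total
    context length stays at `target_tokens` regardless of document count."""
    context = "\n\n".join(documents[:num_docs])
    used = count_tokens(context)
    if used > target_tokens:
        raise ValueError("Documents alone exceed the target context length.")
    padding = " ".join([filler_token] * (target_tokens - used))
    return context + "\n\n" + padding


if __name__ == "__main__":
    docs = [f"Document {i}: some retrieved passage." for i in range(20)]
    # Same total length, different document counts -> isolates quantity.
    for k in (2, 5, 10):
        prompt_context = build_fixed_length_context(docs, k, target_tokens=200)
        print(k, count_tokens(prompt_context))
```

With this kind of construction, any change in model performance across k can be attributed to the number of documents rather than to a longer context.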