Considering the insanity of the AI arms race going on now, and the incredible sums of money being thrown at any slight advantage, is there any reason to believe that any meaningful AI breakthrough would be openly published for anyone to leverage?
Superficially, it sounds like this could push things a bit more toward doing compaction on a continuous basis, or compacting in batches once you hit the context limit, rather than starting fresh with a summary and a system prompt (rough sketch below).
Feels like high-fidelity, fast compaction could be a path to “solving” long context.
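For what it's worth, here's a minimal sketch of what "compacting in batches once you hit the context limit" might look like at the application level. Everything here is hypothetical: `count_tokens`, `compact`, and the thresholds are stand-ins, not anything from the actual paper or any particular API.

```python
def count_tokens(messages):
    # Placeholder: real code would use the provider's tokenizer.
    return sum(len(m["content"].split()) for m in messages)

def compact(messages):
    # Placeholder: stands in for whatever high-fidelity compactor is available,
    # e.g. a dense summary or a compressed KV-cache segment.
    joined = " ".join(m["content"] for m in messages)
    return {"role": "system", "content": f"[compacted history] {joined[:200]}"}

def append_with_compaction(history, new_message, context_limit=8000, keep_recent=10):
    """Instead of wiping the context and restarting from a summary,
    compact the oldest messages in a batch once the limit is hit."""
    history.append(new_message)
    if len(history) > keep_recent and count_tokens(history) > context_limit:
        old, recent = history[:-keep_recent], history[-keep_recent:]
        history[:] = [compact(old)] + recent
    return history
```

Continuous compaction would just run the same step on every turn (or every N turns) instead of waiting for the limit.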
None of the compaction accuracies look impressive.
I think matching or exceeding the original cache at a compacted size of 20% is fairly impressive.
This is big for long-horizon tasks.