1 points | by SomaticPirate 10 hours ago ago
1 comments
A useful reminder that just because an LLM "appears" to be thinking and reasoning doesn't mean it actually is. If it hasn't seen something similar, or "in distribution", before, it typically doesn't perform well.