
I've been trying to figure out how the ChatGPT UI manages to keep the conversation context when the conversation grows beyond what the model can ingest at once.

I even tried asking ChatGPT :D



Its context window is quite large: 8192 tokens, where a token is roughly 4 characters (so on the order of 32,000 characters of text). But it's quite possible they're using GPT itself to summarize the older parts of the conversation, so they can fit more in by keeping only the important bits.
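
Here's a rough sketch of what that kind of rolling summarization might look like with the OpenAI Python SDK. The function names, thresholds, and prompt are just illustrative guesses on my part, not how ChatGPT actually implements it:

    # Rolling summarization sketch: keep a chat history under the model's
    # context window by compressing older turns into a summary.
    # Thresholds and prompt wording are illustrative guesses.
    from openai import OpenAI

    client = OpenAI()
    MAX_CONTEXT_TOKENS = 8192   # the limit mentioned above
    CHARS_PER_TOKEN = 4         # rough rule of thumb

    def estimate_tokens(messages):
        # Crude estimate: ~4 characters per token.
        return sum(len(m["content"]) for m in messages) // CHARS_PER_TOKEN

    def summarize(messages):
        # Ask the model itself to compress the older turns.
        text = "\n".join(f'{m["role"]}: {m["content"]}' for m in messages)
        resp = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user",
                       "content": "Summarize this conversation, keeping only "
                                  "the facts needed to continue it:\n" + text}],
        )
        return resp.choices[0].message.content

    def fit_to_context(history, keep_recent=6):
        # If the history is too long, replace everything except the most
        # recent turns with a single summary message.
        if estimate_tokens(history) <= MAX_CONTEXT_TOKENS:
            return history
        older, recent = history[:-keep_recent], history[-keep_recent:]
        summary = {"role": "system",
                   "content": "Summary of earlier conversation: " + summarize(older)}
        return [summary] + recent

You'd call fit_to_context(history) before each new request, so the model always sees the recent turns verbatim plus a compressed memory of everything older.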


I was thinking something like that, bearing in mind that humans can't remember every single detail of a conversation either.



