Hacker News
jb_s on Feb 2, 2023 | on: ChatGPT Plus
I've been trying to figure out how the ChatGPT UI manages to keep the conversation context when it exceeds the limit of what the model can ingest.
I even tried asking ChatGPT :D
gamegoblin on Feb 2, 2023
Its context window is quite large -- 8192 tokens, where a token is roughly 4 characters. But it's quite possible they use GPT itself to summarize the older parts of the conversation, keeping only the important bits so more fits in the window.
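The approach described above (summarize old turns to stay under a token budget) can be sketched roughly like this. This is a minimal illustration, not ChatGPT's actual implementation: the `summarize` function is a hypothetical stand-in for a second model call, and the 4-characters-per-token estimate is just the rule of thumb from the comment.

```python
MAX_TOKENS = 8192  # context window size mentioned in the comment

def estimate_tokens(text: str) -> int:
    # Rough heuristic: one token is about 4 characters.
    return max(1, len(text) // 4)

def summarize(turns: list[str]) -> str:
    # Hypothetical placeholder: in practice this would be another model
    # call asking it to compress the old turns into a short summary.
    return "Summary of %d earlier turns." % len(turns)

def fit_context(turns: list[str], budget: int = MAX_TOKENS) -> list[str]:
    """Replace the oldest turns with a summary until the total fits."""
    turns = list(turns)  # don't mutate the caller's list
    total = sum(estimate_tokens(t) for t in turns)
    dropped = []
    while total > budget and len(turns) > 1:
        oldest = turns.pop(0)
        dropped.append(oldest)
        total -= estimate_tokens(oldest)
    if dropped:
        turns.insert(0, summarize(dropped))
    return turns
```

The oldest turns are dropped first because, as the reply below notes, the most recent context usually matters most; the summary preserves a compressed trace of what was lost.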
jb_s on Feb 2, 2023
I was thinking something like that, bearing in mind that humans can't remember every single detail about a conversation either.