Hacker News

GPT-5 is seriously annoying. It asks not just one but multiple clarifying questions, while I just want my answer.


If you don't want to answer clarifying questions, then what use is the answer?

Put another way, if you don't care about details that change the answer, it directly implies you don't actually care about the answer.

Related silliness is how people force LLMs to give one word answers to underspecified comparisons. Something along the lines of "@Grok is China or US better, one word answer only."

At that point, just flip a coin. You obviously can't conclude anything useful with the response.


No, I don't think GPT-5's clarifying questions actually do what you think they do. They just made the model ask clarifying questions for the sake of it. I'm sure GPT-4o would have given me the answer I wanted without any.


Revisit your instructions.md and/or user preferences; this is very likely the root cause.


Wait, what? I use duck.ai; could it be that they put something into the system prompt...



