"Solved" in AI/game theory has a very strict definition. It means you have formally proven that one of the players can guarantee an outcome from the very beginning of the game.
The less-strict definition being thrown around here in the comments is more like "This AI can always beat this human because it is much stronger."
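For a concrete example of the strict sense: tic-tac-toe is solved, and you can verify the result by exhaustively searching the game tree with minimax. A rough sketch (the structure and names here are my own, not from any particular library):

```python
from functools import lru_cache

# The eight winning lines on a 3x3 board, indexed 0..8.
LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
         (0, 3, 6), (1, 4, 7), (2, 5, 8),
         (0, 4, 8), (2, 4, 6)]

def winner(board):
    """Return 'X' or 'O' if that player has three in a row, else None."""
    for a, b, c in LINES:
        if board[a] != '.' and board[a] == board[b] == board[c]:
            return board[a]
    return None

@lru_cache(maxsize=None)
def value(board, player):
    """Game-theoretic value from X's perspective:
    +1 = X can force a win, 0 = draw, -1 = O can force a win."""
    w = winner(board)
    if w == 'X':
        return 1
    if w == 'O':
        return -1
    if '.' not in board:
        return 0  # board full, no winner: draw
    # Try every legal move for the player to move.
    children = [board[:i] + player + board[i + 1:]
                for i, s in enumerate(board) if s == '.']
    results = [value(c, 'O' if player == 'X' else 'X') for c in children]
    # X maximizes the value, O minimizes it.
    return max(results) if player == 'X' else min(results)

print(value('.' * 9, 'X'))  # 0: with perfect play, tic-tac-toe is a draw
```

That exhaustive search is exactly the kind of proof "solved" refers to: it establishes the outcome both players can guarantee, regardless of how strong any particular opponent is. For games like chess or go, the tree is far too large for this, which is why they remain unsolved even though engines crush humans.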
I think most people discussing this mean the latter, less pedantic option. I mean, that's the spirit of AI: can we make it think like a human, or even better? We are the yardstick.
That is a silly misuse of the term, and pointing that out is not being pedantic. A problem isn't solved just because you beat the existing solution (i.e. human players). As long as there is the potential for a better solution that can beat your solution, there is work to be done.