Unlikely, unless you send them to an iPhone registered with a "child" account.
Apple uses two different approaches:
1. Some way to detect _known_ child pornographic material. It's fuzzy matching, so there is no guarantee it won't make mistakes like flagging a flower pot as child porn. The chance of your photos getting misdetected as _known_ child pornographic material shouldn't be too high, BUT given how many parents have iPhones, it's basically guaranteed to happen from time to time!
2. Some AI-based child porn detection on child accounts, which may well mislabel such innocent photos as child porn.
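To see why point 1 is fuzzy: known-material detection relies on perceptual hashing, where similar images yield similar hashes and a "match" means the Hamming distance is under some threshold. The sketch below is not Apple's NeuralHash, just a toy difference hash (dHash) over a raw grayscale grid to illustrate how two distinct images can land within matching distance:

```python
def dhash(pixels):
    """Toy perceptual hash: for each 9-pixel row of a grayscale
    image, emit one bit per adjacent pair recording whether the
    left pixel is brighter than its right neighbor (8 rows x 8
    bits = 64-bit hash)."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Two near-identical 9x8 "images" (plain lists, no image library needed).
img = [[(x * y) % 256 for x in range(9)] for y in range(8)]
img2 = [row[:] for row in img]
img2[0][0] += 10  # tiny perturbation, e.g. compression noise

h1, h2 = dhash(img), dhash(img2)
# The hashes differ, but only by 1 bit, so a threshold like
# "match if distance <= 10" would treat them as the same image.
print(hamming(h1, h2))
```

The flip side of that tolerance is exactly the false-positive risk mentioned above: an unrelated image that happens to hash close to a known entry also "matches".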
Even in the child-account case it's not sent to Apple - it alerts parent accounts in the family. It also covers nudity generally, more akin to garden-variety parental-control content filtering.
The child account iMessage thing is really entirely separate from the CSAM related iCloud announcement. It's unfortunate people keep confusing them.