e.g. Google's Gemini had some rather embarrassing biases; to the point where asking it to "draw a 1943 German soldier" resulted in images of women and black soldiers. https://www.nytimes.com/2024/02/22/technology/google-gemini-...
I wouldn't put that on the same level as "refusing to talk about a massacre of civilians"; but I wouldn't put it at the level of "free and unbiased" either.
I'm not sure it's avoiding biases so much as trying to have the currently favoured bias. Obviously it got it a bit wrong with the Nazi thing. It's tricky for humans too to know what you are supposed to say sometimes.