For sure, hands are generally getting better, but they are still a persistent problem. Mostly you need a prompter who isn't lazy and is actually looking at the outputs.
And next year, the AI will be the one checking us, since it'll be embedded in our minds. Can't have any rogue humans
The first thing I check in AI-generated images is the hands. The hands in this ad are nightmare fuel. I can't believe they still went ahead and published this, lol.
Those awful hands aside... has anyone seen that shirt? The buttons suddenly stop, and the middle split just disappears past the waist 😳
It got confused by the glitter thingy, which ended up placed right over that line, so it stopped continuing it. The ML models literally have an object permanence problem, except defined over geometric patterns instead of over time.
The neural network was told to make Asian people but it really wanted to make white people.
Nonpolitically.
Sarcasm aside, I really hate it when computer touchers go off into the weeds, throwing dictionaries all the while, about how their favorite treat dispensers can't have political bias, no matter what biases were programmed into them, what biases were in the data fed into them, or what the treat receivers ask of the output.
The buttons of that white shirt go almost all the way down. Sort of like a polo shirt, but with 10x more buttons.
Whoever owns that raised forearm top left must have constant trouble grazing their knuckles on the ground.
Besides all the arm and hand comments, I noticed the center guy doesn't have buttons that run all the way down. I'm not sure if this is accurate clothing or not. It just seemed unusual.
cost cutting at its finest.
For a "Fine Arts College" too. Couldn't they have let some students do it?
Although ... Maybe they were worried about the consequences of rejecting a painting.
I know it's not unusual; all I was saying is that the picture doesn't represent that diversity. I had read somewhere that AI seems biased towards white people.
It's bias towards the sample set. A Russian researcher made an algorithm that upscaled pixelated images and trained it on a Russian database. It turned a pixelated image of Obama into a white man.
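A toy sketch of that mechanism (hypothetical numbers, nothing to do with the actual incident): treat "upscaling" as picking the training faces most similar to a blurred input and voting on their label. If 95% of the training faces come from one group, an ambiguous input sitting between the two groups still gets reconstructed as the majority group.

```python
from collections import Counter

# 1-D stand-in for a face embedding: 0.0 = darkest tone, 1.0 = lightest.
# 95 "light" faces densely covering 0.55..0.90, only 5 "dark" faces at 0.10..0.35.
training = [("light", 0.55 + 0.35 * i / 94) for i in range(95)] \
         + [("dark", 0.10 + 0.25 * i / 4) for i in range(5)]

def reconstruct(blurred, k=7):
    """Majority label among the k training faces closest to the blurred input."""
    nearest = sorted(training, key=lambda t: abs(t[1] - blurred))[:k]
    return Counter(label for label, _ in nearest).most_common(1)[0][0]

# An input halfway between the two groups' ranges: the sparse minority group
# contributes only one close neighbour, so the dense majority wins the vote.
print(reconstruct(0.45))  # → light
```

The point isn't the specific algorithm, just that "most similar training example" is only as representative as the training set itself.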
I'm curious if this Indian ad's AI is trained off a western database or an Indian one. Colorism is an issue in India, and only one face looks distinctly European to me.
The clothes - except for the weird kurta x shirt hybrid - look fine, so the AI probably trained on photos of Indians. Also, who looks distinctly European to you? All of these figures look passably Indian to me.
The shadows are wrong too. The people at the front seem to have a strong light source to their left (though not all the angles are consistent); the ones at the back seem lit from in front.
Three of the four arms sticking upwards don't look like they're attached to anybody. The one on the far left is obviously attached to the guy there, but whose hand is he holding? The lady's next to him? The arm is twisted around. And the two arms on the right side look disembodied, like props in a group photo. Weird.