Meta’s AI image generator is coming under fire for its apparent struggles to create images of couples or friends from different racial backgrounds.
No, that’s not a real problem either. Model search techniques are very mature; the first automated tools for this were released in the ’90s, and they’ve only gotten better since.
AI can’t ‘train itself’; there is no training required for an optimization problem. A system that queries the value of the objective function (“how good is this solution?”), tweaks the parameters (the traffic light timings) according to the optimization algorithm, and then queries the objective function again isn’t training itself and isn’t learning. It is centuries-old mathematics.
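To make that concrete, here is a minimal sketch of that loop: plain hill climbing over a vector of green-phase durations. The `average_delay` objective is a hypothetical stand-in (in reality it would be a traffic simulator or field measurements), and nothing in the loop is learned or trained; it just asks “how good is this solution?” over and over.

```python
import random

def average_delay(timings):
    """Hypothetical objective: average vehicle delay for a set of
    green-phase durations. A stand-in quadratic bowl; in practice this
    would be a traffic simulator or measured data."""
    ideal = [30, 45, 25, 40]
    return sum((t - i) ** 2 for t, i in zip(timings, ideal))

def hill_climb(objective, start, step=1.0, iterations=1000):
    """Classic local search: perturb the parameters, keep the change only
    if the objective improves. No model, no training data, no learning:
    just repeated queries of the objective function."""
    best = list(start)
    best_score = objective(best)
    for _ in range(iterations):
        candidate = [t + random.uniform(-step, step) for t in best]
        score = objective(candidate)
        if score < best_score:  # lower delay is better
            best, best_score = candidate, score
    return best, best_score

if __name__ == "__main__":
    timings, delay = hill_climb(average_delay, start=[20, 20, 20, 20])
    print(f"best timings: {[round(t, 1) for t in timings]}, delay score: {delay:.2f}")
```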
There’s a lot of intentional and unintentional misinformation around what “AI” is, what it can do, and what it can do that is actually novel. Beyond Generative AI, the new craze, most of what is packaged as AI is a mature algorithm applied to an old problem in a stagnant field, then repackaged as a corporate press release.
Take drug discovery. No, “AI” didn’t just make 50 new antibiotics; they hired a chemist who graduated in the last decade, who understands commercial retrosynthetic search tools, and who asked the biopharma guy which functional groups they thought would work.