Google, ever eager to lean into generative AI, is launching a new shopping feature that shows clothes on a lineup of real-life fashion models.
Part of a broader set of updates to Google Shopping rolling out in the coming weeks, Google's virtual try-on tool for apparel takes an image of a garment and attempts to predict how it would drape, fold, cling, stretch and form wrinkles and shadows on a set of real models in different poses.
Virtual try-on is powered by a new diffusion-based model Google developed internally. Diffusion models — which include the text-to-art generators Stable Diffusion and DALL-E 2 — learn to gradually subtract noise from a starting image made entirely of noise, moving it closer step by step to a target.
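The denoising idea behind diffusion models can be sketched in a few lines. The example below is a toy illustration, not Google's model: a real system uses a trained neural network to predict the noise at each step, whereas here a stand-in computation (derived from a known target image) plays that role so the loop is runnable.

```python
import numpy as np

def denoise_step(x, predicted_noise, t, num_steps):
    """One reverse-diffusion step: remove a fraction of the predicted noise.
    The fraction grows over the schedule (a toy schedule, for illustration)."""
    fraction = t / num_steps
    return x - fraction * predicted_noise

def sample(target, num_steps=50, seed=0):
    """Start from an image of pure noise and move it, step by step,
    toward the target. In a real diffusion model, `predicted_noise`
    comes from a trained network; here we cheat and derive it from
    the known target purely to make the loop self-contained."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(target.shape)   # starting image: pure noise
    for t in range(1, num_steps + 1):
        predicted_noise = x - target        # stand-in for the network's prediction
        x = denoise_step(x, predicted_noise, t, num_steps)
    return x

# A flat gray 4x4 "image" as the target; the loop recovers it from noise.
target = np.full((4, 4), 0.5)
result = sample(target)
print(np.abs(result - target).max() < 1e-2)  # prints True
```

Each pass through the loop subtracts a little more of the estimated noise, which is the core intuition: generation is iterative refinement from randomness toward a coherent image.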
Google trained the model using many pairs of images, each including a person wearing a garment in two unique poses — for insta...