Google Shopping AI feature could make changing rooms obsolete

Google has announced a new feature that lets US shoppers virtually try on women’s clothes from brands including Anthropologie, Everlane, H&M, and LOFT. From now on, you can tap the “Try On” badge and see how a garment looks on a model who resembles you.

The search giant expects to expand the Try On feature to more brands over time, and to men’s tops later this year. Explaining how the feature leverages generative AI, Google said:

“Our new generative AI model can take just one clothing image and accurately reflect how it would drape, fold, cling, stretch and form wrinkles and shadows on a diverse set of real models in various poses. We selected people ranging in sizes XXS-4XL representing different skin tones (using the Monk Skin Tone Scale as a guide), body shapes, ethnicities and hair types.”

Aside from letting you try clothes on models who look like you, the new shopping feature also improves product discovery. Using machine learning and visual matching algorithms, it lets you find products from multiple retailers based on their colour, style, and pattern.
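
Google hasn’t published the details of its matching pipeline, but visual search of this kind typically works by encoding each product image as a feature vector and ranking catalogue items by similarity to the shopper’s reference image. The snippet below is only a minimal illustrative sketch, not Google’s implementation: it stands in a simple colour histogram for the learned image encoder and uses cosine similarity for ranking; the function names and the synthetic images are hypothetical.

```python
import numpy as np

def embed(image: np.ndarray, bins: int = 8) -> np.ndarray:
    """Toy 'embedding': a normalised per-channel colour histogram.
    Real systems would use a learned image encoder instead."""
    channels = [np.histogram(image[..., c], bins=bins, range=(0, 255))[0]
                for c in range(3)]
    vec = np.concatenate(channels).astype(float)
    return vec / (np.linalg.norm(vec) + 1e-9)

def rank_by_similarity(query_vec: np.ndarray,
                       catalogue: dict[str, np.ndarray]) -> list[tuple[str, float]]:
    """Rank catalogue items by cosine similarity to the query embedding."""
    scores = {name: float(query_vec @ vec) for name, vec in catalogue.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical product images from different retailers (random RGB stand-ins).
    catalogue_images = {f"retailer_{i}_dress": rng.integers(0, 256, (64, 64, 3))
                        for i in range(5)}
    catalogue = {name: embed(img) for name, img in catalogue_images.items()}

    # The shopper's reference image, also a random stand-in here.
    query = embed(rng.integers(0, 256, (64, 64, 3)))
    for name, score in rank_by_similarity(query, catalogue):
        print(f"{name}: similarity {score:.3f}")
```

In a real system the embedding would capture style and pattern as well as colour, but the ranking step works the same way regardless of how the feature vectors are produced.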

Because products are pulled in from multiple retailers, shoppers also stand to save money on new clothes, and competition between stores could increase. If you live outside the United States, there is unfortunately no word yet on when the feature will arrive internationally.

Hopefully, this feature spares you those annoying trips to the changing room in clothing stores. For those of you who prefer to order online, it should give you a better idea of whether a product will fit.

If you’re interested in learning a bit more about the new feature, Google has written a blog post that explains how it works at a high level without going into really nerdy technical details.
