

Time: 2024-11-22 02:20:51 | Source: compiled from the web | Editor: Entertainment


Cher's virtual closet in the 1995 film classic Clueless is one of the most coveted futuristic tech suggestions of the pre-Y2K era. It's also the closest comparison Google has for its latest virtual try-on tool, which uses a new AI model to offer an online shopping experience that's nearly as customizable as the one owned by the Beverly Hills fashionista.

Launched today, Google's virtual try-on (VTO) seeks to make the online shopping experience more like the process of buying apparel in-store, with realistic image try-ons that provide more size- and skin-tone-inclusive options for online customers, as well as apparel choices that mimic the service a customer would get with a personal sales associate, Google explains.

"42 percent of online shoppers don’t feel represented by images of models and 59 percent feel dissatisfied with an item they shopped for online because it looked different on them than expected," the company wrote in its announcement. "Our new guided refinements can help U.S. shoppers fine-tune products until you find the perfect piece. Thanks to machine learning and new visual matching algorithms, you can refine using inputs like color, style, and pattern. And unlike shopping in a store, you’re not limited to one retailer: You’ll see options from stores across the web. You can find this feature, available for tops to start, right within product listings."

SEE ALSO: Apple avoids the AI trap at WWDC

The new tech is marketed as the most advanced version of what we've come to know as Augmented Reality (AR) try-on options, like the Metaverse makeup experiences released last year, the many beauty filters introduced regularly on TikTok, or brand-based marketing gags like Gucci's entirely virtual shoe offering. Amazon's even tried out its own AR tool for virtually trying on shoes and eyeglasses.


Google's new tech uses a different process that combines its Shopping Graph, the company's worldwide database of shopping information, with a technique known as "diffusion." In diffusion, Google explains, extra pixels (known as "noise") are gradually added to an image and then removed, leaving behind an imprint or reconstruction of the original image. When that process is applied by an AI model trained on a library of images of human models wearing garments in different poses (rather than on a collection of text data, like a large language model, or LLM), shoppers can generate a new, more accurate try-on image.

A diagram showing the process of adding and removing pixel noise to two separate images. Credit: Google
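The add-noise-then-remove-it idea can be illustrated with a minimal numerical sketch. This is not Google's actual model: the blending coefficient (`alpha_bar`) is an assumed stand-in for a real noise schedule, and here the noise is handed back perfectly instead of being predicted by a trained network, which is the part the AI actually learns.

```python
import numpy as np

rng = np.random.default_rng(0)

def add_noise(image, alpha_bar):
    """Forward diffusion step: blend the image with Gaussian noise.

    alpha_bar near 1 leaves the image mostly intact; near 0 it is
    almost pure noise.
    """
    noise = rng.standard_normal(image.shape)
    noisy = np.sqrt(alpha_bar) * image + np.sqrt(1.0 - alpha_bar) * noise
    return noisy, noise

def denoise(noisy, predicted_noise, alpha_bar):
    """Reverse step: reconstruct the image from a noise estimate.

    In a real diffusion model, predicted_noise comes from a trained
    neural network rather than being known exactly.
    """
    return (noisy - np.sqrt(1.0 - alpha_bar) * predicted_noise) / np.sqrt(alpha_bar)

image = rng.random((8, 8))  # stand-in for a garment photo
noisy, noise = add_noise(image, alpha_bar=0.5)
recovered = denoise(noisy, noise, alpha_bar=0.5)
print(np.allclose(recovered, image))  # a perfect noise estimate recovers the image
```

In training, the model sees many noisy versions of garment-and-model photos and learns to estimate the noise; at generation time, reversing that estimate step by step is what produces the new try-on image.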

Google also says its virtual try-on goes beyond other forms of AR options by offering a wider selection of diverse — although still limited — human models and increasing the ability of shoppers to select a model that looks like them to make more informed shopping choices.

Eight models of different body types and races all wearing the same green shirt. Credit: Google

"Our new generative AI model can take just one clothing image and accurately reflect how it would drape, fold, cling, stretch and form wrinkles and shadows on a diverse set of real models in various poses. We selected people ranging in sizes XXS-4XL representing different skin tones (using the Monk Skin Tone Scale as a guide), body shapes, ethnicities and hair types," the company wrote.



Unlike the AI passes of brands like Levi's, which opted to forgo human models for AI-generated diversity, Google is attempting to combine real-life human variability with the complexity of AI. Other applications of AI and AR have proven to be limited in their training, with a lack of diversity in their creation that leads to real-life consequences for people of color using them — making Google's explicit effort to expand its data input for the virtual try-on AI model notable.

SEE ALSO: TikTok beauty filters can be super realistic—unless you're a person of color

But the question remains whether the new tool will actually have a lasting, broad impact on building AI with diversity in mind. At the very least, the new virtual try-on tool takes a significant step beyond other offerings on the market, acknowledging shoppers with a range of body shapes and appearances (and their potential purchasing might) in a suggestion that AI could be used to promote fashion inclusivity. Guess it's time for a summer closet upgrade.

Topics: Artificial Intelligence, Social Good