September 20, 2019

The use of AI in retail continues to rise. We see it in chatbots, ecommerce apps, customer analytics, and more. A perfect example is AI-powered visual search, which is quickly becoming an imperative for retailers looking to enhance and streamline the consumer shopping experience and boost sales.

In a study of more than 1,000 Gen Z and millennial consumers by ViSenze, an intelligent image recognition firm, 62% of respondents said they wanted visual search capabilities more than any other new technology. The basic process is simple: a consumer sees something – in real life, in a magazine, online, etc. – takes a picture with their smartphone, and uploads it to an app that uses AI-powered image recognition to help them identify, find, and buy the exact item, similar items, or related items. Retailers are adopting this and other technologies to improve customer experiences and gain insight into – and influence over – the customer journey.
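The matching step behind that flow is essentially nearest-neighbor search over image feature vectors. Here's a minimal sketch in Python, assuming the photos have already been encoded into vectors by an image model (the catalog names and numbers are purely illustrative):

```python
import math

# Hypothetical pre-computed feature vectors for a small product catalog,
# e.g. the output of an image encoder (names and numbers are illustrative).
CATALOG = {
    "denim-jacket":   [0.9, 0.1, 0.3],
    "leather-jacket": [0.7, 0.3, 0.5],
    "sneaker":        [0.1, 0.9, 0.2],
}

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def top_matches(query_vec, catalog, k=2):
    """Rank catalog items by visual similarity to the shopper's photo."""
    ranked = sorted(catalog,
                    key=lambda name: cosine(query_vec, catalog[name]),
                    reverse=True)
    return ranked[:k]

# A shopper's photo, already encoded into the same vector space.
query = [0.85, 0.15, 0.35]
print(top_matches(query, CATALOG))  # most visually similar items first
```

Production systems use high-dimensional embeddings and approximate nearest-neighbor indexes rather than a brute-force scan, but the ranking principle is the same.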

Visual search creates a new waypoint along the customer journey, but it also bypasses other waypoints – further abstracting shoppers from the traditional person-to-person brand interactions that used to be the hallmark of personalized service. Now, consumers are hyper-personalizing their own journeys – for better or worse.

Retailers are increasingly reliant on data, analytics, mobile apps, chatbots, ecommerce and, importantly, the underlying algorithms that power AI and machine learning to deliver the insights and recommendations that influence consumers to buy. The true winners of visual search won’t be those who deploy it first, but those who deploy it best. Visual search provides an opportunity to sell not only a product, but also an experience that includes multiple products and services spanning multiple brands.

So instead of just helping you buy a jacket like one worn by a celebrity on the red carpet, a visual search app could help you assemble an entire outfit that incorporates the jacket plus other garments and accessories from multiple brands. A less advanced underlying algorithm would lack the intelligence to make such recommendations where a human sales associate could – resulting in missed sales opportunities. Furthermore, will the visual search app know enough about you to recommend levels of quality and price that match your persona?

Glasses USA implemented visual search with a tool called “Pic and Pair,” which lets customers upload pictures of eyewear and then presents similar-looking products available on its e-commerce site. According to a recent Forbes article, Glasses USA soon discovered that customers who use the new search function spend 6X longer on the site and are 5X more likely to buy than those who don’t.

Looking at the chart below from BI Intelligence, visual search has already gained momentum in apparel. Target and Pinterest were early adopters. Target has said that visual and voice searches could make up 50% of all searches by 2020. Neiman Marcus has also rolled out a visual search tool called Snap.Find.Shop. Nordstrom, Amazon and Google are also using the technology. And L’Oreal now allows customers to book live-streaming appointments for digital make-up sessions.

American Eagle has AI-powered fitting rooms that let shoppers scan items to see what other sizes are available and receive related product recommendations. Store associates receive notifications so that they can bring items directly to the customer. All of this behavioral data – including what customers buy and what they don’t – is collected so analytics can enhance the experience and drive further recommendations.

Other industries are following. CarStory is an AI-driven search service that helps people find and buy cars they like: they take pictures of cars they see, and the app uses machine learning to recommend cars they may also like, along with pricing information and where to buy them. But the real magic lies in what the app also knows about you.

Let’s consider an example. The new BMW 8-series looks a LOT like a new Ford Mustang (sorry you BMW guys, but it does). However, the profile of the person who buys each car is extremely different. The base price of a new ‘Stang is only $26,670, while the BMW starts at more than three times that ($84,900)! Will visual search apps have the intelligence to know which is best for me?
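One plausible answer is to combine visual similarity with what the retailer already knows about the shopper – for instance, filtering look-alike candidates by budget before recommending anything. A minimal sketch, with entirely hypothetical data:

```python
# Hypothetical candidates returned by a visual similarity search: two cars can
# look alike while the buyer profile for each is very different.
CANDIDATES = [
    {"model": "Ford Mustang", "base_price": 26_670},
    {"model": "BMW 8 Series", "base_price": 84_900},
]

def within_budget(candidates, budget):
    """Keep only the visually similar candidates the shopper can realistically buy."""
    return [c["model"] for c in candidates if c["base_price"] <= budget]

print(within_budget(CANDIDATES, budget=40_000))  # -> ['Ford Mustang']
```

A real system would weigh many more persona signals (purchase history, brand affinity, financing profile) than a single price cutoff, but the principle – re-ranking visual matches with customer data – is the same.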

Furthermore, how much brand credibility could be lost when AI-driven visual search fails and completely misidentifies a product, and it becomes the next internet sensation (anyone know where to buy a delicious Chihuahua?)

Inevitably, and quickly, AI-driven visual search will become table stakes for mid-sized and large retailers. But today’s consumers aren’t just seeking better shopping experiences, they are shopping for experiences. For retailers to stand out against the competition, it requires multi-dimensional thinking (and AI) about how to sell new forms of value through more holistic, connected experiences. The goal is to make shoppers buy things they didn’t even know they wanted – and love the experience. But pulling it off requires that the underlying analytics and algorithms tap into a much greater variety of data sources, including all digital and physical customer interactions with brands.

This calls for more robust analytics platforms and apps. They must be tuned for the data sets that retailers actually need, such as information from store visits, shopping carts, loyalty programs, PoS, transaction history, mobile app activities, e-commerce sites, GIS-enabled apps, streaming IoT and more. The visual search technology is here, but it requires differentiated AI and machine learning algorithms to make better recommendations and self-improving models. Starting with a future-proof data foundation is critical; otherwise many organizations will need to re-tool before fully realizing the potential of visual search. This requires a foundation that:

  • Handles vast amounts of data with a highly scalable architecture;
  • Ingests and processes data from a wide variety of sources, including customer interactions from physical stores, websites, apps, IoT, and streaming/real-time sources;
  • Includes advanced analytics capabilities such as AI and machine learning, with predictive and prescriptive insights, plus the management and security required to operationalize use cases;
  • Connects easily to third-party systems and partners to enable curated experiences.
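As a toy illustration of the second point, here is a minimal sketch (with hypothetical event schemas) of merging per-channel interaction feeds into a single time-ordered customer journey that downstream analytics can consume:

```python
from datetime import datetime

# Hypothetical per-channel event feeds; real feeds would come from PoS,
# e-commerce, mobile app, and streaming IoT pipelines.
store_visits = [{"ts": "2019-09-01T10:05", "source": "store", "event": "fitting-room scan"}]
web_events   = [{"ts": "2019-09-01T09:40", "source": "web",   "event": "visual search"}]
app_events   = [{"ts": "2019-09-01T10:20", "source": "app",   "event": "add to cart"}]

def unified_journey(*feeds):
    """Merge per-channel event feeds into one time-ordered customer timeline."""
    events = [e for feed in feeds for e in feed]
    return sorted(events, key=lambda e: datetime.fromisoformat(e["ts"]))

for e in unified_journey(store_visits, web_events, app_events):
    print(e["ts"], e["source"], e["event"])
```

Stitching channels into one timeline like this is what lets the recommendation models see a shopper’s full digital-plus-physical context rather than one channel at a time.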

Without a doubt, AI-powered visual search technology is impressive. But the underlying algorithms, data models, and real-time analytics are where the real magic lies. Without them, consumers might be better off pocketing their smartphone and asking for the nearest customer service representative.


Jeff is part of the Digital Software & Solutions group of Tata Consultancy Services, as a lead evangelist for its IoT analytics platform solutions for smart cities, smart retail, smart banking, smart communications, and other areas. Prior to TCS, Jeff was part of EMC’s Global Services division, helping customers understand how to identify and take advantage of opportunities in Big Data, IoT, and digital transformation. Jeff helped build and promote a cloud-based ecosystem for CA Technologies that combined an online community, cloud development platform, and e-commerce site for cloud services, and spent several years within CA’s Thought Leadership group, developing and promoting content and programs around disruptive trends in IT. Prior to this, Jeff spent 3 years in product marketing at EMC, as well as a tenure at Citrix, and numerous hi-tech marketing firms – one of which he founded with 2 former colleagues in 1999. Jeff lives in Sudbury, MA, with his wife, 2 boys, and dog. Jeff enjoys skiing, backpacking, photography, and classic cars.