We’ve all been there: you’re out shopping and you spot someone wearing what would be the perfect shirt for the family photos you have scheduled next month. Sure, you could be direct and ask said person where they got this mystery shirt, but instead you reach for your phone and attempt to describe it as best you can, hoping Google will return the exact ‘blue button-down shirt with white buttons’ you’re looking for. Not very likely.
After an hour of dead-end queries, you accept defeat and the shirt remains nothing but a distant memory. If only there were a better option.
Enter visual search.
What is visual search?
For years, online shoppers have relied on search engines like Google to assist them in finding products to purchase through short text descriptions.
The problem with this method is that the words we use are often too vague, and the pool of possible matches is simply too big, resulting in less-than-optimal results. Not to mention how time-consuming it can be.
With visual search, you upload an image of the product itself in place of text, giving search engines a far more accurate way to find related product information.
How does visual search work?
Visual recognition is a huge part of our daily lives and has been for a very long time. Shortly after we are born, we are able to recognize close members of our family, our environment and objects we see often. What’s even more incredible is that we do all of this instinctively, with remarkable ease and speed.
For instance, let’s look at the image below. We recognize immediately that it’s of a young boy wearing a hat and backpack, holding a camera. We can also see that he is standing in a field while the sun is setting. Easy, right?
When we look at something, our eyes take in visual information through the retina and send it to the occipital cortex, where it is interpreted and stored in memory. A computer imitates this process by ‘seeing’ an image as a grid of pixels, each with one or more numbers associated with it.
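To make that concrete, here is a toy sketch of what a computer actually receives: just a grid of numbers. This tiny hand-made example assumes a grayscale image where 0 is black and 255 is white; a color image would instead carry three numbers per pixel (red, green and blue).

```python
# A tiny 4x4 grayscale "image": each pixel is a single number,
# where 0 = black and 255 = white. A color image would store
# three numbers per pixel (red, green, blue) instead of one.
image = [
    [  0,   0, 255, 255],
    [  0,   0, 255, 255],
    [255, 255,   0,   0],
    [255, 255,   0,   0],
]

# To the computer, "seeing" the image just means reading this grid.
top_left_pixel = image[0][0]   # 0, a black pixel
top_right_pixel = image[0][3]  # 255, a white pixel
print(top_left_pixel, top_right_pixel)
```

Everything a visual search system does downstream starts from grids of numbers like this one, only much larger.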
After the computer receives the input image data, it performs classification tasks using a convolutional neural network (CNN). Very simply put, a CNN is a type of machine learning model that analyzes an image through pattern recognition and makes sense of the information it receives.
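As a rough sketch of the pattern recognition a CNN performs, the code below slides a small filter (a "kernel") across an image and computes a weighted sum at each position, which is the core convolution operation. This is a simplified illustration, not a real CNN: actual networks learn thousands of such filters from training data (typically using libraries like TensorFlow or PyTorch), whereas this kernel is hand-made to respond to vertical edges.

```python
# Minimal convolution sketch: slide a small kernel over an image and
# compute a weighted sum of the pixels under it at each position.
# Real CNNs learn many such kernels; this one is hand-crafted to
# detect vertical edges (a dark-to-bright change from left to right).

def convolve(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    output = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            total = 0
            for ki in range(kh):
                for kj in range(kw):
                    total += image[i + ki][j + kj] * kernel[ki][kj]
            row.append(total)
        output.append(row)
    return output

# 4x4 grayscale image with a vertical edge between columns 1 and 2
image = [
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [0, 0, 255, 255],
]

# Kernel that responds strongly where brightness jumps left-to-right
kernel = [
    [-1, 1],
    [-1, 1],
]

result = convolve(image, kernel)
# The large values in the middle column of the result mark where
# the filter "found" the vertical edge in the image.
```

A CNN stacks many layers of operations like this, so early layers pick out edges and textures while deeper layers recognize whole objects such as a shirt or a backpack.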
CNNs can be very sophisticated, and many are responsible for visual search technologies we use every day, such as Facebook’s facial recognition, Amazon’s product recommendations and Pinterest’s related pins.
Why is visual search important?
If I told you that Google was responsible for almost 70% of web searches in 2018, would you be surprised? Probably not. But what if I told you that Google Images was responsible for 21% of web searches, taking the #2 spot in US search market share?
Humans are extremely visual. It’s often claimed that our brains process visual information 60,000 times faster than text, and that 90% of the information transmitted to the brain is visual.
With this information, it shouldn’t come as a surprise that images are becoming more and more important in our digital lives.
Visual Search Statistics
1. 62% of Millennials and Gen Z consumers want visual search capabilities
In 2019, millennials (people born between 1982 and 1996) made 60% of their purchases online. That’s a notable jump from the 47% it was back in 2017. It’s safe to assume this number will continue to grow in the coming years, as the newest technology-loving generation, Gen Z (those born between 1997 and 2012), is now old enough to shop online.
2. The Global Image Recognition Market is predicted to reach $39.87 billion by 2025
Image recognition was already responsible for $16.85 billion in 2018, but with the advancement of technology, it is predicted to reach $39.87 billion by 2025.
3. 36% of Consumers have used visual search
According to a study conducted by Intent Lab in 2018, not only have 36% of consumers used visual search, but 59% also think that visual information is more important than textual information.
Among those surveyed, the categories that led in visual importance were clothing at 86% and furniture at 85%.
4. Early adopter brands that optimize their website to support visual search will increase revenue by 30%
As consumers shift towards the regular use of visual search, early adopter brands that design their websites accordingly will benefit the most, through increased conversions and revenue, higher customer satisfaction ratings and stronger brand loyalty.
5. 21% of digital advertisers believe that visual search will have the biggest impact on their business
According to Marin Software’s 2019 annual report, “State of Digital Advertising”, digital advertisers believe that visual search will be one of the biggest trends to impact their business, making it a bigger focus than it has been in the past.
Brands using visual search
The number of brands that use visual search technology is expected to grow rapidly over the next couple of years. Let’s dive into some of the brands that are currently using it to provide better customer experiences.
1. Pinterest Lens
When Pinterest launched its visual search tool, Pinterest Lens, back in 2017, there was no doubt that this powerful feature would take the business beyond being just a social network.
Pinterest Lens allows users to receive suggested pins based on images they have or find on another pin, making it easier than ever to find the perfect home decor, outfit or recipe.
On September 17, 2019, Pinterest Lens received a valuable update with its shoppable Product Pins. Users can now not only use Lens to find the product they’re looking for, from a library of 2.5 billion available options, but also purchase it directly online.
Pinterest’s own research also found that 80% of Pinterest users started their online shopping with visual search, compared to 58% of non-Pinterest users.
2. Google Lens
Like Pinterest Lens, Google Lens was also released in 2017 and does many of the same things. One big difference, however, is that Pinterest Lens is only available through the Pinterest app, whereas Google Lens can be accessed through Google Assistant, the Google Photos app, Google Camera, and a standalone app.
Another difference between the two is Pinterest Lens shows results based on what is on Pinterest, while Google Lens pulls from all websites — similar to Google Images.
In May of 2019, Google announced that it would be adding smart ‘overlays’ to Google Lens that add extra information to search results. For example, you can point the camera at a restaurant menu and Lens will show you the most popular items ordered. The feature can also translate text between languages.
3. Amazon StyleSnap
Amazon released its visual search tool, StyleSnap, back in June 2019. Unlike Pinterest Lens and Google Lens, StyleSnap is focused mainly on fashion. Users can upload an image of an outfit and StyleSnap will recommend similar items that can be purchased on Amazon, keeping factors like price, reviews and brand in mind.
StyleSnap not only helps consumers find fashion recommendations, but it also helps fashion influencers enrolled in their Amazon Influencer Program to receive commissions on purchases they inspire.
Visual search technology has advanced significantly over the past few years and has the potential to really change the way consumers shop online.
Right now, only 8% of retailers are utilizing it, but as awareness builds and the need for a competitive advantage increases, more brands are going to do what it takes to ensure they don’t get left behind.