LBB’s Addison Capper on a search function technology that’s evolving at the speed of light
For years the process of finding information and products online has been hampered by our limited vocabularies. If you saw a pair of sneakers you fancied, you’d have to figure out how to describe them in such a way that Google (or your search engine of choice) would understand what the hell you were talking about. But all that lexical clumsiness may be a thing of the past thanks to developments in AI that allow search engines to more accurately recognise, categorise and match images. Point. Click. Search. Enter visual search.
Searching ‘visually’ is not a new thing. Early Apple and Microsoft operating systems allowed for visual filtering, numerous ecommerce sites use visual search as part of their navigation and Google’s ‘reverse image search’ has been a feature of the search engine since 2011. But visual search tools have been relatively basic - until now.
Artificial intelligence, combined with improvements in smartphone capabilities, processing power and camera functionality, is revolutionising the technology and opening door after door for consumers and the brands targeting them.
According to a 2017 eMarketer study, around three quarters of US internet users regularly or always search for visual content prior to making a purchase, and only 3% never do. The proliferation of digital devices is naturally also playing its part. We’re speaking to more people more of the time but using fewer words in doing so; emojis and memes are the prose of the times. And so it seems that this advance in visual search couldn’t come at a more fitting time.
David Towers, Senior Partner, Head of Search at MEC EMEA, and his colleague Jessica Chapplow, Ecommerce & Emerging Platforms Manager, say that 2014 marked a breakthrough year for the technology and its commercialisation. “Amazon debuted its Firefly app which had the ability to read barcodes and, using a phone’s camera, identify more than 100 million items - the search results were often not great though,” they say. In the same year other retailers, like Target and Macy’s, also began implementing visual recognition technology into their own apps and websites.
Since then things have continued to evolve and show no signs of slowing down. One platform making waves in the space right now is Pinterest, which is unsurprising given its graphic nature. In February it announced three visual discovery tools: Lens beta, a tool that allows users to point their camera at an object and discover similar things; Shop the Look, which means people can find and buy products within fashion and home decor Pins; and Instant Ideas, which speeds up the process of finding similar objects on the platform. “Now we’re using the same visual discovery technology that powers these innovative tools to make Promoted Pins even more effective,” Pinterest’s Creative Strategy Lead for East Coast Raashi Rosenberger tells us. “We are using visual discovery technology to connect consumers and advertisers in the most personalised and relevant way to date.”
Another player investing heavily in visual search is Microsoft. Its search engine Bing has admittedly fallen down the pecking order behind Google, but Microsoft is looking to win users back with Bing Visual Search. The new toolset, which sits within its existing image search tools, lets users search for specific items within a larger image. So, for example, if you see a picture of a large flowerbed but there’s one particular flower that takes your fancy, you can zone in on that section of the image and Bing will identify it. Tests by ‘Search Engine Land’ show mostly positive results. Microsoft has also recently launched ‘Seeing AI’, a free app designed for the low-vision community that narrates the world around us (but more on that later).
And then there’s Google. In 2010 the company launched Goggles, an app built on reverse image search technology. Due to a lack of updates it has fallen by the wayside. But that wasn’t the end of Google’s visual search adventure and in May 2017 Google Lens was revealed.
Google Lens will be able to understand what’s going on in a photo, video or live feed. So, to keep with the botanical theme, if you point your camera at a flower, it will tell you what type it is. But, in addition to that, you could aim your camera at a restaurant and information and reviews will pop up. Another neat touch is that if you point your phone at a wireless router’s information sticker, it will automatically connect to the Wi-Fi. Lens is set to launch later this year and will first be integrated into Google Photos and Assistant. That’s when things will really begin to get exciting, believe David and Jessica from MEC. “We are still in the infancy of seeing visual search impact the advertising industry… We think that when Google Lens is launched globally and adopted by consumers then visual search will come into its own and really start being a key way to reach consumers.”
For Chris Polychronopoulos, Executive Creative Director at KBS, the opportunities for advertisers in the future are myriad. “Advertisers can run media with products and when customers go to their sites to search and they see the products, it’s an instant visual confirmation that they are in the right place. It’s ultimately going to create higher conversions because customers love seeing the products they want to buy throughout the entire shopping path, and always having a visual cue emotionally connects customers to the product even more than just reading the name of it.”
Chris also notes the possibilities that motion content creates, as gifs or animation could be used as brands’ searchable visuals. These would naturally create more visual intrigue for customers but, having worked with these types of visuals in the past, Chris warns that the key is to keep it simple. “If the visual is competing with the task the customer wants to do then it can be frustrating and cause the opposite effect. With motion design especially, less is more. If movement or animation is too erratic while customers are in a critical scanning and searching mindset, they might get overwhelmed and abandon.”
In other areas, visual search can aid in the showrooming of products, enable brands to serve smart banner ads that adapt to what a site visitor sees in real-time and, according to David and Jessica, “improve digital marketing campaign ROI with hyper-relevant ads”.
On Pinterest specifically, Raashi highlights the state of mind that the platform’s users tend to be in when browsing. “On Pinterest, we're giving people access to new ideas when they're still in the early stages of consideration. That motivation by users is a huge opportunity for brands.”
There are 175 million people on Pinterest every month, using the service to plan their lives, curating inspiration to help them choose clothes and plan meals and tackle much bigger tasks like planning a wedding or home decoration. “The combination of consumer planning behaviour and their open mindset to new ideas provides marketers with a unique intent signal that they can’t find anywhere else,” claims Raashi.
Looking forward, the general consensus is that visual search will just keep evolving. If brands and their agencies can avoid filling the market with annoying gimmicks and use the technology to solve real problems, it’s unlikely to become a fad. David and Jessica from MEC predict that major players such as Google, Alibaba, Bing and Amazon will implement increasingly sophisticated technology and, as an agency, the visual search advertising opportunities that they are handling at the moment tend to be linked to Google, Pinterest, Bing, Amazon and Snapchat.
Raashi highlights that the tools they launched in February are very much in their infancy and will only improve over time. “We're still in the early stages of our business and our product team believes that there is still work to be done to iterate on the product we ultimately want to build, so we're at the tip of the iceberg of what visual discovery will look like in the future.”
Outside of the ad industry some recent developments also prove exciting. As mentioned earlier, Microsoft has launched Seeing AI. This free app harnesses the power of artificial intelligence to describe surroundings to people with visual impairments. It can read short text as soon as it appears in front of the camera, provide audio guidance for longer documents, scan barcodes to identify products, recognise friends’ faces and emotions, describe scenes and, in the near future, will be able to identify currency bills to help users pay with cash.
Elsewhere, visual search is enabling healthcare professionals to detect abnormalities in medical images such as X-rays and mammograms, and is now used in three out of four mammograms read in US clinics. “It’s easy to see the underlying potential with visual detection and AI in the healthcare field,” say Jessica and David.
Given the rate of progress in artificial intelligence, particularly in object recognition and categorisation, it won’t be long until visual search is a mainstream behaviour. So watch this space, people. Watch this space through the lens of your always-learning, always-watching smartphone camera.