Posted by Johan Vandersteen on 18 Jan, 2018
Our Lead Strategist, Olivier Legris, shared insight with ITProPortal on visual search tools and how they could considerably enhance future m-commerce transactions for retailers.
While retailers always have to keep up with the ever-changing requirements of m-commerce to be successful, the rewards for doing so are plain to see. Retail e-commerce smartphone sales are set to be worth more than £16 billion in 2018 according to eMarketer, a figure that could rise as high as £58 billion by 2021.
UK consumers are increasingly comfortable using their mobile devices to make retail purchases, often bypassing a computer entirely, which means that retailers failing to adapt to this behaviour will be left behind. We think visual search will be a big part of this future, but there are some important points for retailers to consider if they’re to get it right.
Visual search, powered by computer vision, is the capacity for machines to translate pictures and video into descriptive data that can then be processed by other systems. Retailers are already beginning to invest in this technology, with 45% planning to utilise AI-driven solutions to enhance the customer experience in the next three years (BRP Consulting’s 2017 Customer Experience/Unified Commerce Survey).
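As a rough illustration of that image-to-data idea, the sketch below reduces a tiny mock "photo" (a handful of RGB pixels) to a single descriptive tag. A real visual search system would use a trained neural network over full images; the function and data here are invented purely for illustration.

```python
# Toy sketch of the "image -> descriptive data" step behind visual search.
# A tiny "image" of RGB pixel tuples is reduced to a crude descriptive tag
# (its dominant colour). Illustrative only, not any retailer's actual API.

def dominant_colour(image):
    """Return a crude descriptive label for a list of (r, g, b) pixels."""
    averages = [sum(channel) / len(image) for channel in zip(*image)]
    labels = {0: "red", 1: "green", 2: "blue"}
    return labels[averages.index(max(averages))]

# A mostly-blue "product photo" made of four RGB pixels
photo = [(10, 20, 200), (15, 25, 180), (30, 10, 220), (5, 5, 190)]
print(dominant_colour(photo))  # -> blue
```

In a production system this descriptive output would be far richer (category, pattern, style, brand), but the principle is the same: pixels in, structured attributes out.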
The problem with traditional search on mobile
The balancing act of giving the customer enough choice while providing them with smarter, quicker and more accurate results is becoming increasingly difficult to achieve in today’s mobile-first world. In the context of a traditional desktop-based search, it’s straightforward enough for a customer who knows what they want to input a search and give the retailer a quick and satisfying sale.
But it’s a different story on mobile, where a simple keyword search like “leggings” might return far too many results and choices for a user to comfortably look through without frustration. This is not a good user experience, and given that mobile shoppers are often on the move and using a smaller screen, an overload of options can become a complete turn-off and a barrier to conversion.
Capitalising on the Pinterest effect
Perhaps the biggest influence on retail and fashion in the past five years has come through online inspiration, via platforms like Pinterest and Instagram. These platforms feed users’ visual stimulation, with accounts and pages that showcase real-life products both in-the-hand and in-the-moment, raising the bar for retail expectations.
Yet, the nature of both platforms makes it hard to link to as many products as they can showcase, making it difficult for customers to learn more. Customers are currently left inspired but frustrated by not being able to connect their inspiration to a purchase, and this gap in the journey looks like a fantastic opportunity to us. Current solutions don’t hit the mark, so retailers capable of better aligning themselves with customers at the point of inspiration will be the most successful.
Imagery as an input
Current technology is limited when it comes to extracting detailed information from images, meaning few companies are using imagery-based input. Even those who do are using a simplistic level of categorisation, and the systems on offer generally require non-blurry, high-quality studio images.
But with the right approach and more of a focus on improving inventory data sets, a smarter system is within reach. Retailers that focus on solutions that can comprehend phone-generated images will do much better at connecting that inspiration to a purchase.
Making the move into visual search
A handful of companies are already experimenting with visual search, including Pinterest, ASOS, and a range of retail fashion startups. In the home and goods spaces, Wayfair and eBay have both introduced photo search features, while Made.com is using a discovery engine from Hullabalook. Fashion and furniture are both industries in which visual search makes sense, where products are often purchased based on looks over function.
With visual search poised to play an important role in enhancing the customer experience of the future, retailers should be asking themselves whether they can facilitate the technology internally, or whether it’s time to look outside the business to a third party. But before entering the visual search space, retailers need to get image-ready.
For AI and computer vision to understand an item, you must show it hundreds of different images of the product and assign labels to help identify them. This way, the technology will understand an image when it ‘sees’ it. For this to work effectively, you need large datasets, careful data input and clean labelling. Talent in this field is rare, and it’s where an upfront investment in a third-party agency can achieve a significant result.
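The labelling-and-matching process described above can be sketched in miniature: each catalogue item gets a labelled feature vector (standing in for the hundreds of labelled photos a real system would learn from), and a new shopper photo is matched to its nearest neighbour. The catalogue entries and feature values below are invented for illustration only.

```python
# Hedged sketch of matching a photo against a labelled catalogue.
# Toy 3-value feature vectors stand in for what a trained computer-vision
# model would extract from many labelled product images.
import math

catalogue = {
    "leggings": [0.9, 0.1, 0.3],
    "jeans":    [0.2, 0.8, 0.5],
    "sofa":     [0.1, 0.2, 0.9],
}

def nearest_item(features):
    """Return the catalogue label whose features are closest (Euclidean)."""
    return min(
        catalogue,
        key=lambda label: math.dist(features, catalogue[label]),
    )

# A "shopper's photo" whose extracted features resemble the leggings entry
print(nearest_item([0.85, 0.15, 0.25]))  # -> leggings
```

Real systems replace the hand-written vectors with embeddings learned from those large, cleanly labelled datasets, which is exactly why the data preparation work matters so much.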
Retailers with extensive catalogues will need to bring in expertise that can work with them to implement technologies that help turn what seems like the impossible into the possible. Those that find a path to homing in on personalised choices, capitalising on inspiration, and tantalising with imagery will lead the charge and take the lion’s share of the increased revenues that mobile has to offer.
Future Platforms has over two decades’ worth of experience in the e-and-m-commerce space, providing award-winning solutions for Domino’s Pizza, Wembley Stadium, First Bus and more. If you’re a retailer, vendor, or service provider looking to get ahead with visual search and m-commerce, why not get in touch with us to see how we could help?
You can read the full version of Olivier’s article for ITProPortal right here.