Augmented Experiences and the Future of Retail at #NRF17
Now that we’ve had a week to recover from the hustle of NRF, there’s been time to process some of the lessons from this year’s Big Show. The show followed a trend similar to SXSW’s: three years ago no one was talking about machine cognition in retail, yet today seemingly every vendor is leveraging AI in some form or fashion, whether to power recommendation engines, create a better search experience, or launch semi-intelligent chatbots trained to interact with customers.
I found myself gravitating toward the Innovation Hub in the River Pavilion throughout the event, where mini-presentations on emerging technology took place among the vendor booths. At one session on robotics and AI, I saw a visual cart analyzer running on a tablet. Imagine attaching this to a grocery cart and taking video inventory: content generated by your shoppers as they push carts around the aisles picking up groceries. A second camera pointing inside the cart performs visual recognition, conducting basket analysis in real time and then serving up recommendations for related products. Very cool stuff.
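To make the idea concrete, here is a minimal sketch of the basket-analysis step that would sit behind such a cart camera. It assumes a vision model has already turned each camera frame into product labels; the co-purchase table, product names, and function names are all invented for illustration.

```python
# A toy sketch of real-time basket analysis: recognized products accumulate
# in a running basket, and related items not yet in the cart are suggested.
from collections import Counter

# Hypothetical co-purchase data: product -> frequently bought together.
RELATED = {
    "pasta": ["tomato sauce", "parmesan"],
    "tortilla chips": ["salsa", "guacamole"],
    "coffee": ["filters", "creamer"],
}

def update_basket(basket: Counter, detected_labels: list) -> Counter:
    """Add products recognized in the latest camera frame to the basket."""
    basket.update(detected_labels)
    return basket

def recommend(basket: Counter, limit: int = 3) -> list:
    """Suggest related products that are not already in the cart."""
    suggestions = Counter()
    for product in basket:
        for related in RELATED.get(product, []):
            if related not in basket:
                suggestions[related] += basket[product]
    return [item for item, _ in suggestions.most_common(limit)]

basket = Counter()
update_basket(basket, ["pasta", "coffee"])  # labels from the vision model
print(recommend(basket))  # ['tomato sauce', 'parmesan', 'filters']
```

In a real deployment the `RELATED` lookup would of course be replaced by a learned model over transaction data; the point is only the shape of the loop from recognition to recommendation.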
Many vendors there have explored machine learning to make better product-to-product comparisons and generate recommendations accordingly. One vendor, built on the legacy model that came out of Amazon, now boasts 125 different algorithms and has re-branded itself as a machine learning company. But there also seems to be a trend of smaller companies going deep in a single computer science technique: mathematical models and machine learning, semantics and natural language processing, or computer vision.
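The product-to-product comparison approach descended from Amazon is usually framed as item-to-item similarity: products are compared by which customers bought them. A toy illustration, with an invented purchase matrix and product names:

```python
# Item-to-item comparison sketch: each product is a vector of customer
# purchases, and products with similar buyer profiles are deemed related.
import math

# Rows: products; columns: customers (1 = purchased). All data invented.
PURCHASES = {
    "running shoes": [1, 1, 0, 1],
    "trail shoes":   [1, 1, 0, 0],
    "dress shirt":   [0, 0, 1, 1],
}

def cosine(a, b):
    """Cosine similarity between two purchase vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

def most_similar(product):
    """Return the product whose buyer profile is closest to `product`'s."""
    others = [(p, cosine(PURCHASES[product], v))
              for p, v in PURCHASES.items() if p != product]
    return max(others, key=lambda pair: pair[1])[0]

print(most_similar("running shoes"))  # trail shoes
```

Production systems layer many refinements (and, per the vendor above, many algorithms) on top of this, but the math-only core is exactly what the next paragraph argues is insufficient on its own.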
The most promising solutions out there right now are the ones that manage to combine all three of those core techniques (machine learning, natural language processing, and computer vision) under the umbrella of AI, and make them quick and easy for retailers to use. This matters because style is an extension of our personalities: our style is expressive, it is visual, and when confronted with so many options, we expect brands to adapt to our needs. Math alone can’t solve personalization by doing better product-to-product comparisons with machine learning algorithms. A truly customer-centric view beyond segmentation will require retailers to process all sorts of unstructured content, from text to images to user reviews to social posts, and to contextualize that data. This is what we mean by machine cognition, as opposed to leveraging just one AI technique as a point solution.
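One simple way to picture combining techniques rather than relying on a point solution: fuse per-technique relevance scores into a single ranking. The scores, weights, and product names below are all hypothetical; real systems would learn the fusion rather than hand-weight it.

```python
# Sketch of fusing signals from three techniques (text/NLP, image/vision,
# and purchase-history ML) into one personalization score per product.
def fuse_scores(text_score, image_score, history_score,
                weights=(0.4, 0.3, 0.3)):
    """Weighted blend of per-technique relevance scores, each in [0, 1]."""
    w_text, w_image, w_history = weights
    return w_text * text_score + w_image * image_score + w_history * history_score

# Hypothetical per-product scores for one shopper.
candidates = {
    "floral dress": fuse_scores(0.9, 0.8, 0.2),
    "plain tee":    fuse_scores(0.3, 0.4, 0.9),
}
best = max(candidates, key=candidates.get)
print(best)  # floral dress
```

A purchase-history model alone would have ranked the plain tee first; adding the text and visual signals flips the answer, which is the gap the single-technique point solutions leave open.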
Some of the more tech-savvy brands are adopting AI strategies for their businesses, especially given the industry-wide decline in store earnings and increase in digital revenue. Cost, cost, cost appears to be top of mind for many retailers, who are working hard to find cost-takeout opportunities through increased efficiency and process automation.
These will be the most interesting areas to watch unfold over 2017 as machine intelligence spreads throughout retail, both behind the scenes and in the user experience. Top-line revenue and bottom-line savings are certainly not mutually exclusive, and from the topics discussed at NRF, it appears that machine intelligence will become essential in helping retailers leapfrog beyond where they are today.
Experience is everything, though: a user must genuinely enjoy interacting with whatever AI-powered capability a retailer chooses, whether that is a chatbot, a digital shopping assistant, or a dashboard that enhances call-center efficiency. Some companies are starting to experiment with augmented reality, like Microsoft’s HoloLens, to enhance human capabilities and provide futuristic, engaging experiences for shoppers.
At CognitiveScale we have been tackling both sides of the same coin for the past few years. We have one product that improves backend business processes, and another that enriches product data and powers personalized experiences. We provide tools for employees and great experiences for shoppers. One of the recent projects to come out of our innovation lab was a HoloLens shopping assistant that learned your preferences through speech recognition as you shopped. After you interacted with products as you would in a physical store, the HoloLens let you step back and see green holograms above the clothing that matched your individual profile. While the retail world may not be quite ready for augmented reality headsets, in the meantime we’ll keep focusing on how we can help retailers win with machine cognition.
Related Post: Top Retail Tech Trends to Look for at #NRF17