Last week the biggest show in search marketing opened with a keynote speech from Shelby Reed, Regional Vice President of Sales for Bing Ads. The talk focused on marketing intelligence and AI, introducing the audience to emerging technologies and their applications.
What if we think about the A in AI in terms of how we can use it to “amplify” what we do as marketers — Microsoft’s Shelby Reed #SMX pic.twitter.com/J1Xtk3j64f
— Ginny Marvin (@GinnyMarvin) March 13, 2018
Microsoft unpacked AI applications currently available to marketers during its SMX West keynote speech last Tuesday morning. Microsoft sees AI as a means of driving customer relationships and explained how, as marketers, we're moving from the information age to the experience age. Predictive analytics gives marketers the opportunity to anticipate consumer needs and determine actionable outcomes. This is an effective way forward in reaching consumers, because consumers are driven by meaningful experiences and connections.
On the edges of experience marketing are augmented and virtual reality – a new frontier that allows marketers to build fully immersive experiences for customers. Indeed, new possibilities for sense-based marketing experiences are emerging rapidly, and today's marketing toolkit is becoming more colorful, more real, and more intelligent. AI is a huge part of this new marketing frontier, allowing us to tackle new challenges and create convincing experiences.
The 5 Senses of Artificial Intelligence in Marketing
Microsoft is using artificial intelligence to connect cognitive capabilities (our senses) and marketing. They claimed, and demonstrated, the ways that their AI currently sees, hears, speaks, understands, and feels.
Understanding and Speaking
When Shelby said that AI can speak and understand, she was talking about Cortana. The AI built by Microsoft is commonly thought of as just a voice search assistant, but she has much more potential than that. Cortana has learned a lot, and she gets smarter every day from the 1 in every 5 searches that happen on Bing. Ongoing innovations to Cortana's AI have given her situational and contextual awareness, such that she can actually predict situations and curate content.
Cortana can predict situations to help you leave by a certain time, she can curate content like checking your flight status the morning you're traveling, and she can help you get somewhere with one-click Uber suggestions. Microsoft explained that Cortana tries to understand someone's intention using many different signals, without the person having to state it explicitly. Signals come from demographic data and online and offline data, as well as individual user patterns and habits.
AI hears more and more queries and new questions every day, as voice search becomes more prevalent than ever. For example, Answer the Public is a site that produces all the queries it knows related to a topic entered into its question engine. Tools like this can help inform both paid and organic strategies, showing data about the distribution of question types around any topic. While this data is interesting for marketers, it's also the kind of material that constantly teaches voice search assistants like Cortana how to listen.
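To make the "distribution of question types" idea concrete, here is a minimal sketch of bucketing queries by their leading question word. The query list, question-word set, and function names are illustrative, not part of any tool mentioned above.

```python
from collections import Counter

# Illustrative set of question words; real tools use richer taxonomies.
QUESTION_WORDS = ("who", "what", "when", "where", "why", "how", "can", "will")

def question_type(query: str) -> str:
    """Return the query's leading question word, or 'other' if it has none."""
    words = query.lower().split()
    first = words[0] if words else ""
    return first if first in QUESTION_WORDS else "other"

def question_distribution(queries):
    """Count how many queries fall into each question-type bucket."""
    return Counter(question_type(q) for q in queries)

# Hypothetical queries around the topic "flights":
queries = [
    "how to book a cheap flight",
    "what is the cheapest airline",
    "when do flight prices drop",
    "how early should I arrive",
    "flight status DL123",
]
dist = question_distribution(queries)
# 'how' leads twice in this sample; 'flight status DL123' lands in 'other'.
```

A marketer could run a pass like this over exported query reports to see whether a topic skews toward quick facts ("what", "when") or how-to content.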
Voice recognition is now so effective that it has reached the same error rate as a human transcriber. Research indicates that within two years, 50% of searches will come from voice, and voice queries tend to be longer. Right now the most common voice searches are quick fact queries; more complex queries are still difficult to do with voice. Microsoft sees chatbots bridging this gap, moving from question and answer to question and action. As brands and brand marketers, we have to ask ourselves – are we ready to answer these questions?
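The shift from "question and answer" to "question and action" can be sketched as a bot that routes recognized intents to handlers that do something, rather than just replying. This is a toy illustration: intent matching here is naive keyword lookup, and the handlers are placeholders; a production bot would use a trained language-understanding model and real service calls.

```python
# Placeholder action handlers; a real bot would call external services here.
def book_ride(query: str) -> str:
    return "Ride requested."

def check_flight(query: str) -> str:
    return "Flight DL123 is on time."  # hypothetical lookup result

# Map a keyword "intent" to the action that fulfils it.
ACTIONS = {
    "ride": book_ride,
    "flight": check_flight,
}

def handle(query: str) -> str:
    """Dispatch a query to an action, falling back to a plain answer."""
    lowered = query.lower()
    for keyword, action in ACTIONS.items():
        if keyword in lowered:
            return action(query)
    return "Sorry, I can only answer that, not act on it yet."

handle("get me a ride to the airport")  # -> "Ride requested."
```

The design point is the dispatch table: the bot's value comes from which actions it can complete, not just which questions it can answer.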
Seeing and Feeling
AI sees images and videos, and as consumers we interact with marketing images in search, on billboards, and even in virtual reality. Image recognition is getting better every day and is another important component of AI. So far, voice search assistants haven't relied heavily on this technology, but it is available to marketers in the suite of senses Microsoft is building.
AI has the power to bring emotion to life, and images can play a role in that experience. Captionbot was given as an example: anyone can upload a photo and Captionbot will suggest a caption describing it. Captionbot recognized Anna Kendrick in the photo Microsoft used, demonstrating its image recognition capabilities (it also identified all the other images correctly during the presentation).
Bringing AI in Marketing to Life
Microsoft hopes to bring artificial senses to life by creating rich, intelligent, and meaningful experiences. Chatbots will bridge the gap between voice search and brand experiences. Microsoft now has a suite of cognitive services available through APIs for marketers to bring to life in their own products and services. That means marketers and brands can literally buy Cortana's vision or hearing ability – which is revolutionary. Bing is also running a pilot in search, in which a bot surfaces on the SERP to help answer your query. Microsoft is also running a pilot partner program with Home Advisor, which has so far been a huge success for all stakeholders.
The future of voice search is now!
Feature Image: Unsplash/Daniil Kuželev