Are digital assistants ready for serious enterprise use yet?

For now, the use cases still seem limited to basic productivity.

At Citrix Synergy 2018, Citrix VP of Product for Cloud and IoT Steve Wilson announced that Citrix Endpoint Management (formerly XenMobile) would include management capabilities for Alexa for Business. While we don’t yet know what this entails, it got us wondering: what exactly is the state of digital assistants in 2018, nearly seven years after Siri debuted?

Apple introduced us to Siri in late 2011, and once Siri had time to settle in and develop and we could see the potential, it was off to the races for development of further digital assistants. Alexa and Cortana followed in 2014, Google released Google Assistant in 2016, and Samsung brought up the rear with Bixby in 2017. So, where are we?

Well, there’s good news and bad news.

The good news is that businesses (alongside consumers) are embracing digital assistants in the workplace. Anecdotally, we see people dictating text messages, setting reminders, checking their schedules, playing music, and so on. Surveys bear this out: Spiceworks polled 500 IT professionals across the U.S. and Europe and found that about 24 percent of large businesses already use Siri, Alexa, or another digital assistant in some capacity, with adoption projected to exceed 40 percent by 2019. Small and midsize businesses lag behind at 16 and 15 percent adoption, respectively, but are projected to reach nearly 30 percent each within the same timeframe.

Adoption rates are nice, but the key question is how businesses are using digital assistants in 2018. We're curious whether they're going beyond the basics of messaging and calendaring to real enterprise app use cases.

Unfortunately, businesses aren't using digital assistants for much more than what they did when the technology first arrived in the office. The overwhelming majority of organizations primarily use digital assistants to improve employee productivity with voice-to-text dictation, a capability that predates Siri. (The survey results didn't specify exactly what content is getting dictated, but we're guessing mostly emails and text messages.)

Source: Spiceworks

The chart paints a pretty clear picture that digital assistants remain limited in use. They might be able to speed up combing through unread email or provide common information to customers, but is that really all they can do after seven years? In the same survey, 50 percent of IT professionals at companies without digital assistants cited limited use cases as what's preventing adoption.

I dug deeper to find enterprise use cases that go beyond the basic tasks digital assistants have handled for years now. I found a couple of use cases, but overall there wasn't much.

One example I found comes from a retailer called Liberty. The company uses voice-controlled software from Voiteq for warehouse picking, integrated with Liberty's ERP system. Warehouse employees wear headsets and microphones to receive instructions on which products to pick, and respond by voice as each item is located and picked.
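We don't know the details of Voiteq's software, but the basic voice-picking loop is easy to picture. The sketch below is purely illustrative; the pick-list fields, the check-digit confirmation step, and the function name are our assumptions, not Voiteq's or Liberty's actual system, and plain text stands in for speech recognition and synthesis.

```python
# Illustrative sketch of a voice-directed picking dialog (not Voiteq's API).
# A worker is read each pick instruction and confirms it by speaking a
# check digit posted at the bin; 'responses' simulates recognized speech.

def picking_dialog(pick_list, responses):
    """Walk a worker through a pick list, confirming each pick.

    Returns the transcript of prompts and confirmations, with text
    standing in for the spoken exchange.
    """
    transcript = []
    answers = iter(responses)
    for item in pick_list:
        # The assistant "speaks" the next instruction.
        transcript.append(
            f"Go to bin {item['bin']}, pick {item['qty']} of {item['sku']}"
        )
        # The worker "speaks" the bin's check digit to confirm the pick.
        heard = next(answers)
        if heard == item["check_digit"]:
            transcript.append("Confirmed")
        else:
            transcript.append(f"Wrong check digit, re-read bin {item['bin']}")
    return transcript
```

The appeal of this pattern in a warehouse is that it keeps the worker's hands and eyes free; the ERP integration the article mentions would simply feed `pick_list` from open orders and post each confirmation back.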

But really, employee-facing enterprise app case studies are hard to find, and most of the use cases that do come up are consumer or customer focused. This banking example is typical: India's Tata Capital used machine learning chatbots to help customers fill out online personal loan applications.

Even the Alexa for Business webpage doesn't promise businesses much. It focuses on how Alexa improves productivity for office workers by making it easier to schedule and attend meetings, manage calendars, and handle other mundane office tasks. The customer testimonials don't go beyond how Alexa for Business helped companies “design smarter meetings,” collaborate better, or have Alexa read out easy-to-access information. Again, this is basic stuff that we've been doing for years. That said, there is a little glimmer of realized potential in Capital One's use case: they built a skill that lets teams easily check system status and get updates on high-severity events.
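To make the Capital One example concrete, here is a rough sketch of what the core of such a skill's back end might look like. To be clear, this is not Capital One's implementation, and the request shape is a simplification of the real Alexa Skills Kit JSON (the intent name, slot names, and status table are all invented for illustration).

```python
# Illustrative sketch of an internal "system status" voice-skill handler.
# Everything here (intent name, slots, statuses) is hypothetical.

SYSTEM_STATUS = {
    "payments api": "operational",
    "auth service": "degraded: elevated latency since 09:40",
}

def handle_request(request):
    """Route a simplified voice request to a spoken status reply."""
    if request.get("intent") == "GetSystemStatus":
        # The NLP layer has already extracted which system was asked about.
        system = request.get("slots", {}).get("system", "").lower()
        status = SYSTEM_STATUS.get(system)
        if status:
            speech = f"{system} is {status}"
        else:
            speech = f"I don't have status for {system}"
    else:
        speech = "Sorry, I can only report system status."
    return {"speech": speech}
```

In a real deployment, `SYSTEM_STATUS` would be replaced by a call into a monitoring or incident-management system; the voice layer is mostly a thin front end over that integration, which is exactly the pattern the comments below describe.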

Is this the best use for digital assistants today? Is this really it? It seems so, for the moment anyway.

Organizations eager to use digital assistants acknowledge the current limitations. The Museum of Modern Art in New York City has used Alexa and the Amazon Echo to query its collections database, but wishes it could do more. Employees have come up with several use cases they'd like to one day see become reality: the MoMA wants digital assistants to help museum staff learn about artwork and exhibits, assist museum-goers in common areas with information on events and exhibits, and provide accessibility options.

But none of that is really all that far off from current digital assistant capabilities. Maybe companies aren't interested in what digital assistants might do in the future; they're just focused on simplifying some of the more mundane aspects of office work, glad that something can do it.

One thing to point out: use cases involving digital assistants and line-of-business applications will likely follow the same curve as employee-facing mobile apps. Even several years into the modern mobility craze, people were still asking where all the enterprise line-of-business mobile apps were, noting that many enterprise mobility use cases still centered on basics like messaging and document editing. Now a vibrant ecosystem has emerged. And just as we got used to the idea that mobile UI is very different from desktop UI and that we have to think of apps differently, we're still getting our minds around the best way to do a voice-based UI.

So for enterprise use cases, it's early days, but it's a good time to become aware of the management, security, and integration challenges. Enterprise digital assistants will have their time to shine; they just need more polishing and use beyond simple tasks. Employees at enterprise-level organizations might love having Alexa find them an open meeting room, but many of us are dreaming of so much more. Having Siri tell me the weather is nice, but I'd rather have a digital assistant integrate with our CMS to speed up editing published content.

Join the conversation

2 comments


When thinking about digital assistants (in general, not just in the enterprise), it's important to separate the concept into three core technologies: (1) the voice recognition and speech synthesis, (2) the natural language processing (NLP) to understand what the human is saying, and (3) the connections into all the various business systems, which give the assistant things to be intelligent about.

Item #1 is pretty much solved today. The only reason I even mention it is because I've heard people say things like, "I don't want to talk to a speaker, I want to use an app," or, "How will an Alexa handle security?" But the challenge is not the raw speech recognition and synthesis; it's the NLP to know what the human is asking. And that could work as easily via text as via speech. So for each person who says, "I don't want to engage with business stuff through a speaker," fine, but I bet that same person uses Messenger or Slack to talk to a colleague: "What time are we meeting today?" "Did you rent a car or should I?" "Can you send me the latest deck?" "Who are we meeting with again?" And so on.

All of that could be handled by a chatbot that uses text rather than a voice interface. And people are already way comfortable with that UI.

So that brings us to (3), the connectors into all the back-end business systems (and I guess 3A, the AI to know how to answer things). That's going to be the hardest problem to solve. But as these EUC companies work on overarching "workplace" solutions that can reach into and connect with everything, plus ML-based analytics and so on, I think we'll see some meaningful steps in the coming years.

At that point, pumping that into a voice interface like Siri or Alexa will just be a form factor change. :)
Easier said than done. 
