NetApp Inc.

07/21/2021 | Press release | Distributed by Public on 07/22/2021 01:56

Eight approaches to processing natural-language-enabled AI

In today's world, natural-language-enabled AI is not just a "nice to have"; it's a necessity. According to Gartner, "by 2024, up to 80% of branded digital experiences will be delivered to consumers via virtual people."

Conversational AI goes far beyond chatbots. AI systems that process natural language engage in humanlike dialog, understand context, and offer intelligent responses in milliseconds. Put simply, natural language processing (NLP) is what enables a computer program to understand human language as it is spoken or written. Natural language processing is changing all the time: new developments in technology and ever-changing strategies continually refine how AI processes language.

Gartner listed eight approaches to processing natural language in the Gartner 2021 Strategic Roadmap for Enterprise AI: Natural Language Architecture, Anthony Mullen, Magnus Revang, Stephen Emmott, Erick Brethenoux, Bern Elliot, Jessica Ekholm, 15 December 2020:

#1 Generalized language models or transformer models
In 2020, many vendors incorporated this disruptive new approach to language processing. The models are used by insight engines, text analytics, natural language generation (NLG), and conversational AI across all areas of the NL technology workflow. These generalized language models are often used with transfer learning, leveraging prebuilt deep learning models (with billions of parameters) to create custom models for industries and organizations. Examples of these models include BERT, Meena, GPT-2, and GPT-3.
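The transfer-learning pattern described here can be sketched in a few lines. This is a minimal, illustrative example, not any vendor's actual model: a frozen "pretrained" encoder supplies fixed text representations, and only a small task-specific head is trained on domain data (the tiny vocabulary, random embeddings, and intent labels are all assumptions for illustration).

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a large pretrained language model: maps tokens to fixed vectors.
# In practice this would be a transformer like BERT; here it is frozen random data.
VOCAB = {"refund": 0, "invoice": 1, "hello": 2, "thanks": 3}
PRETRAINED = rng.normal(size=(len(VOCAB), 8))  # frozen "pretrained" embeddings

def encode(text: str) -> np.ndarray:
    """Average the frozen pretrained vectors of the known tokens in a text."""
    vecs = [PRETRAINED[VOCAB[t]] for t in text.lower().split() if t in VOCAB]
    return np.mean(vecs, axis=0)

# Trainable task head: logistic regression for "billing vs. small talk".
w, b = np.zeros(8), 0.0
data = [("refund invoice", 1), ("hello thanks", 0),
        ("invoice refund", 1), ("thanks hello", 0)]

for _ in range(200):  # gradient steps update the head only; encoder stays frozen
    for text, label in data:
        x = encode(text)
        p = 1.0 / (1.0 + np.exp(-(w @ x + b)))
        w -= 0.5 * (p - label) * x
        b -= 0.5 * (p - label)

def predict(text: str) -> int:
    """1 = billing intent, 0 = small talk."""
    return int(1.0 / (1.0 + np.exp(-(w @ encode(text) + b))) > 0.5)
```

Only the eight weights of the head are learned; that asymmetry, applied at the scale of billion-parameter transformers, is what makes custom industry models affordable.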

#2 Conversational middleware
This middleware lets you use a flexible combination of speech and conversational engines, decoupling the underlying engine from its training data and its dialog design and integration. Typical engines used by these middleware vendors are Amazon Lex, Microsoft Bot Framework, Google Dialogflow, IBM Watson, and Rasa. (See Using Conversational AI Middleware to Build Chatbots and Virtual Assistants.)
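The decoupling idea can be sketched as an adapter interface: dialog design and training data live in the middleware, while the underlying NLU engine sits behind a narrow interface, so an engine such as Lex, Dialogflow, or Rasa could be swapped without redesigning dialogs. The class and method names below are illustrative assumptions, not any vendor's API.

```python
from abc import ABC, abstractmethod

class NLUEngine(ABC):
    """Adapter interface that every underlying engine must implement."""
    @abstractmethod
    def detect_intent(self, utterance: str) -> str: ...

class KeywordEngine(NLUEngine):
    """Stand-in engine: trivial keyword matching over shared training data."""
    def __init__(self, training_data):
        self.training_data = training_data  # engine consumes middleware-owned data
    def detect_intent(self, utterance: str) -> str:
        words = set(utterance.lower().split())
        for intent, phrases in self.training_data.items():
            if words & set(phrases):
                return intent
        return "fallback"

class ConversationalMiddleware:
    """Owns engine-agnostic training data and dialog design; engine is pluggable."""
    def __init__(self, engine_cls):
        self.training_data = {"greet": ["hello", "hi"],
                              "order_status": ["order", "shipping"]}
        self.dialog = {"greet": "Hi! How can I help?",
                       "order_status": "Let me look up your order.",
                       "fallback": "Sorry, I didn't catch that."}
        self.engine = engine_cls(self.training_data)  # swap engines here
    def respond(self, utterance: str) -> str:
        return self.dialog[self.engine.detect_intent(utterance)]

bot = ConversationalMiddleware(KeywordEngine)
```

Replacing `KeywordEngine` with an adapter around a real engine leaves the training data and dialog design untouched, which is the portability benefit the middleware vendors sell.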

#3 Expanding search and conversation to computational queries
Most business intelligence (BI) tools have offered some sort of natural language interface to what they do, but most of those approaches aren't fully conversational. Computational queries are improving fast, however. The latest promising tool is Google TAPAS, which uses generalized language models to access tabular data. See more on the evolution of BI and natural language in this report: Worlds Collide as Augmented Analytics Draws Analytics, BI and Data Science Together.
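To make "computational queries" concrete, here is a toy illustration of the kind of question such tools answer: an aggregation over rows of a table selected by terms in a natural-language question. This is not TAPAS itself (TAPAS is a BERT-based model trained on tables); the table, columns, and keyword parsing are assumptions for illustration.

```python
# Toy sales table; a real tool would query live BI data.
TABLE = [
    {"region": "EMEA", "year": 2020, "sales": 120},
    {"region": "EMEA", "year": 2021, "sales": 150},
    {"region": "APAC", "year": 2020, "sales": 90},
]

def answer(question: str) -> float:
    """Aggregate the 'sales' column over rows matching terms in the question."""
    q = question.lower()
    rows = TABLE
    # Each year or region mentioned in the question narrows the row selection.
    for year in {r["year"] for r in TABLE}:
        if str(year) in q:
            rows = [r for r in rows if r["year"] == year]
    for region in {r["region"] for r in TABLE}:
        if region.lower() in q:
            rows = [r for r in rows if r["region"] == region]
    values = [r["sales"] for r in rows]
    if "average" in q or "mean" in q:
        return sum(values) / len(values)
    return float(sum(values))  # default aggregation: total
```

A model like TAPAS learns the row selection and the aggregation operator jointly from the question and table, rather than relying on hand-written keyword rules like these.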

#4 Modular deep learning stacks and modules from NL vendors
Large vendors like Alibaba, Oracle, IBM, and Microsoft are developing document, speech, translation, and conversational capabilities from a shared stack of NL models and common components.

#5 Democratizing citizen development of NL experiences
Whether it is for search or conversation, many vendors have improved the WYSIWYG interfaces of their citizen developer tools and made them more accessible to nontechnical users. Conversational dialog is just one of many things that citizen developers can now design without technical knowledge: today, low-code and no-code platforms often offer not only dialog design, but also RPA and search application development, along with more standard Web 2.0 design elements.

#6 Vendors providing multimodal offerings extending language models to include computer vision and translation
Language services are evolving to be more multimodal, which allows for rich and natural communication between people and machines. For example, Openstream lets you simultaneously talk to and interact with mapping applications. And Baidu Translate takes cues not just from spoken words, but also from visual objects in a video scene.

#7 Evolution of data access, metadata management, and graph-powered systems
Today, the state of the art in distributed data management and enrichment is the data fabric. A data fabric can provide reusable data services, pipelines, semantic tiers, and APIs through a combination of data integration approaches. You can further improve data fabrics by adding dynamic schema recognition or even cost-based optimization (see Demystifying the Data Fabric).
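The "reusable data services composed into pipelines" idea can be sketched as a small service registry. This is a minimal sketch of the pattern, not a specific data fabric product; the registry, decorator, and step names are assumptions for illustration.

```python
SERVICES = {}

def data_service(fn):
    """Register a reusable transformation under its function name."""
    SERVICES[fn.__name__] = fn
    return fn

@data_service
def normalize_names(records):
    """Reusable service: tidy the 'name' field."""
    return [{**r, "name": r["name"].strip().title()} for r in records]

@data_service
def drop_inactive(records):
    """Reusable service: keep only active records."""
    return [r for r in records if r.get("active", True)]

def run_pipeline(step_names, records):
    """Compose registered services into a pipeline by name."""
    for name in step_names:
        records = SERVICES[name](records)
    return records

customers = [{"name": "  ada lovelace ", "active": True},
             {"name": "grace hopper", "active": False}]
clean = run_pipeline(["normalize_names", "drop_inactive"], customers)
```

Because each service is registered once and referenced by name, the same transformations can back many pipelines and APIs, which is the reuse a data fabric promises at enterprise scale.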

#8 An emerging pipeline of tools and services for NL projects
Data labeling and annotation firms increasingly support text, speech, and document annotation services along with computer vision. Translation firms are managing workloads by using translation hubs of both models and people.

Learn more