Natural Language Generation: Analytics Replacement or Additional Layer?
Whether you realize it or not, you have almost certainly consumed content created by a computer running natural language generation software. Have you ever wondered how some articles or reports appear so quickly after a sporting, government, or financial event?
Natural Language Generation (or 'NLG') is a subfield of artificial intelligence that produces natural language as output from data input. It is not a new concept; what is new is the recent increase in the adoption of NLG in business intelligence (BI) and analytics.
The software, and more specifically the algorithms, have been getting better and faster over the last 12 to 24 months as demand has increased. With more data available from many more sources (e.g., big data and the Internet of Things), the need to derive meaning from it with more than just human minds has also grown.
NLG has two key components:
- First, the analysis. If I gave you a pile of data and asked you to find trends, patterns, outliers, and value in it, you would likely start slicing and dicing it through various combinations and permutations of dimensions and measures, using some form of visual analytics software. This is where the data discovery software market is now - analysts visually exploring subsets of data for meaning. The big problem is the human element, whether a limitation of skills, of time, or both. NLG software can now apply advanced algorithms to data sets to automatically identify patterns, trends, and anomalies.
- The second component is the language generation: how to transform those data-oriented findings into consumable intelligence. NLG software is now advanced enough to create readable text in any language and in a variety of voices and tones, whether chatty or formal, and on-brand. NLG vendors claim that "a narrative can be created that is indistinguishable from a human-written one."
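The two components above can be illustrated with a deliberately minimal sketch: a toy statistical rule stands in for the analysis step, and simple template text stands in for the generation step. Real NLG products use far more sophisticated algorithms and language models; the function names, the two-standard-deviation outlier rule, and the sample data here are all hypothetical.

```python
import statistics

def analyze(values):
    """Component 1 (analysis): find a simple trend and any outliers."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    trend = "rising" if values[-1] > values[0] else "falling"
    # Toy rule: flag values more than two standard deviations from the mean.
    outliers = [v for v in values if abs(v - mean) > 2 * stdev]
    return {"mean": mean, "trend": trend, "outliers": outliers}

def narrate(metric, findings):
    """Component 2 (language generation): turn findings into readable text."""
    text = f"{metric} averaged {findings['mean']:.1f} and is {findings['trend']} overall."
    if findings["outliers"]:
        joined = ", ".join(f"{v:g}" for v in findings["outliers"])
        text += f" Unusual values were observed: {joined}."
    return text

# Hypothetical monthly sales figures with one obvious anomaly.
sales = [102, 98, 105, 110, 240, 115, 121]
print(narrate("Monthly sales", analyze(sales)))
```

Even this crude version shows why the pairing is powerful: the analysis runs without a human slicing the data, and the narrative is immediately consumable by someone who would never open a BI tool.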
So if NLG software can perform the analysis, and then offer a narrative-oriented deliverable, does this spell the end for human-based analytics and decision-making?
I think the answer is certainly not yet, as humans tend to have a natural distrust (and likely disdain) of being automated or replaced by a machine. However, if the technology is applied in a specialist or assistive way, it may eventually get there. NLG is still relatively new to enterprises, and it may take two to three years of application, experimentation, and acceptance before it displaces any analyst roles.
So for now, I would recommend that organizations create an NLG initiative to explore and evaluate the use cases and cost benefits for both internal and external usage.
At Information Builders we leverage NLG as an additional analytics layer in the form of WebFOCUS Narrative Charts. We have partnered with industry-leading Natural Language Generation provider Yseop and have integrated Yseop Savvy directly into WebFOCUS.
Savvy is integrated as a charting option that provides an NLG analytics delivery capability to a wide variety of WebFOCUS stakeholders via several deployment options, including self-service InfoAssist+, In-Document Analytics, and InfoApps. Savvy automatically translates charts into narrative, explaining in everyday English what is happening in the data.
The Narrative Chart technology is highly configurable, enabling companies to set the appropriate level of analysis and detail based upon individual user personas and data requirements. For example, experienced self-service BI users can create their own WebFOCUS charts and determine the corresponding narrative accompaniment, while less technical knowledge workers can have this functionality defined for them and presented via an intuitive InfoApp. We currently see this technology as a complementary option to the wide range of existing analytics capabilities available from the WebFOCUS platform.
We shall see exactly where NLG takes us, but it is an exciting technology and well worth considering for your own internal and external analytical needs.