
Speaking of Analytics: Reengineering Man-Machine Interactions

May 21, 2018

Part 2: The Immersive Analytics (IA) Solution Schematic

When we last left off, we discussed what constitutes a compelling user experience (UX). On that note, have you heard of Google Wave? Probably not.

When the internet search giant announced the launch of its collaboration platform at the 2009 I/O conference, it received much fanfare and was widely touted as a breakthrough product. Just a couple of years down the line, Wave was quietly taken off the shelf, and most people wouldn't have noticed its disappearance had the company not formally announced the decision. It's a pity Google forgot that poor interaction design has a devastating effect on user experience.

Today, businesses place heavy emphasis on usability when choosing enterprise software. For the end user, it's all about accessing the right data, in the right format, within the right context, through the right channel, and at the right time – especially when it comes to Business Intelligence (BI) and analytics applications. Breaking through the usability glass ceiling requires designers to move beyond primitive Immersive Analytics (IA) tools such as the keyboard and mouse, and to explore modalities that feel inherently natural – from touch to voice and hand gestures.

Making Interactions Organic

For human operators, speaking is the most intuitive form of communication, even when interacting with machines. This is more than evident from the burgeoning popularity of Siri, Alexa, and Cortana. So why can't we pull up last quarter's revenue report simply by asking our BI suite?

To build such a next-gen IA interface for analytics tools, the topmost input layer should support multiple Natural Language (NL) modalities – text, voice, and gestures – on their own or in combination with Virtual Reality (VR) and Mixed Reality (MR). Since the end user is unlikely to stick to a single IA modality at a time, a multi-modal interaction layer takes care of orchestrating the commands interpreted from the user's inputs.

Once the user's inputs have been captured, the underlying preprocessing layer parses the captured information and prepares it for the next processing layer, where it is translated into executable commands and tasks for the analytics application. As soon as you ask the IA-enabled interface for the revenue report, the output layer transforms the report generated by the BI suite into a suitable format for you to review. The interface chooses the output format automatically, on the assumption that your input modality is your preferred medium of communication – in this case, your Q1 earnings read out to you by a Siri-like voice.
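The layered flow described above – capture, preprocess, translate to a command, then mirror the input modality in the output – can be sketched roughly as follows. This is a minimal illustration only; all class and function names here are hypothetical, not part of any real IA product:

```python
from dataclasses import dataclass

@dataclass
class UserInput:
    modality: str   # e.g. "voice", "text", "gesture"
    payload: str    # raw captured input

def preprocess(raw: UserInput) -> str:
    """Preprocessing layer: normalize the captured input for parsing."""
    return raw.payload.strip().lower()

def to_command(utterance: str) -> dict:
    """Processing layer: translate parsed input into an executable BI command."""
    if "revenue" in utterance:
        return {"action": "fetch_report", "report": "revenue", "period": "last_quarter"}
    return {"action": "unknown"}

def format_output(result: str, input_modality: str) -> str:
    """Output layer: respond in the same modality the user chose."""
    if input_modality == "voice":
        return f"[spoken] {result}"
    return result

# A voice query flows through all the layers in order.
query = UserInput(modality="voice", payload="Pull up last quarter's revenue report")
command = to_command(preprocess(query))
response = format_output(
    f"Running {command['report']} report for {command['period']}", query.modality
)
print(response)  # [spoken] Running revenue report for last_quarter
```

In a real system, each stand-in function would be a full subsystem (speech recognition, NL parsing, a BI query engine), but the layering and hand-offs remain the same.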

The Intelligent Interface

For the user's experience to be seamless, the interface needs to be agile enough to carry the conversation. At the very least, it should be capable of quickly switching between the four major natural language modalities. The IA's task execution framework is critical in this regard. For an NL interface, tasks can be dialog-based (a two-way exchange of information), information-based (oriented towards retrieving canned answers and reports), or action-based (generating insights from very specific user queries).
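The three task types above imply that the execution framework must first classify each incoming query before routing it. A toy sketch, with a deliberately naive keyword classifier standing in for real NL understanding (the categories come from the text; the keywords and names are illustrative assumptions):

```python
from enum import Enum, auto

class TaskType(Enum):
    DIALOG = auto()       # two-way exchange, e.g. clarifying questions
    INFORMATION = auto()  # canned answers and reports
    ACTION = auto()       # insight generation for a specific query

def classify(query: str) -> TaskType:
    """Naive keyword-based stand-in for a real NL task classifier."""
    q = query.lower()
    if "compare" in q or "why" in q:
        return TaskType.ACTION
    if "report" in q or "show" in q:
        return TaskType.INFORMATION
    return TaskType.DIALOG

print(classify("Show me the Q1 revenue report").name)  # INFORMATION
print(classify("Compare churn across regions").name)   # ACTION
print(classify("What do you mean by churn?").name)     # DIALOG
```

A production framework would use a trained intent model rather than keywords, but the routing decision it feeds is the same three-way split.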

In the case of gesture-based interactions, infrared sensors and cameras capture the user's input or query and translate it into typical actions for the analytics application, such as cursor movements, clicks, and so on. For analytics reports to be immersive, the output can be presented in VR/MR, where the user can also virtually interact with the visualized data. In line with prevailing user expectations, the next-gen IA interface for analytics is likely to come with embedded platform intelligence. By leveraging machine learning algorithms trained using both supervised and unsupervised techniques, the system can profile users based on queries, search strings, and sentiment analysis – continuously improving its capabilities.
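The gesture-to-action translation described above is, at its core, a mapping from recognized gestures to conventional UI actions. A hypothetical sketch (the gesture vocabulary and action names are assumptions, not a real sensor API):

```python
# Illustrative mapping from sensor-detected gestures to analytics UI actions.
GESTURE_ACTIONS = {
    "swipe_left":  "next_dashboard_page",
    "swipe_right": "previous_dashboard_page",
    "pinch_in":    "zoom_out_chart",
    "pinch_out":   "zoom_in_chart",
    "air_tap":     "click",
}

def translate_gesture(gesture: str) -> str:
    """Translate a recognized gesture into a UI action; ignore unknown gestures."""
    return GESTURE_ACTIONS.get(gesture, "ignore")

print(translate_gesture("pinch_out"))  # zoom_in_chart
print(translate_gesture("wave"))       # ignore
```

The hard part in practice is the recognition step feeding this table – classifying raw infrared and camera data into discrete gestures – not the mapping itself.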

The IA interface, however, needs to be considered a value-add to the core analytics application rather than a separate tool in itself. Since BI tools can access critical enterprise data, each IA layer should be tightly controlled through well-articulated security and management policies covering authentication, entitlements, and audit trail management. Data encryption standards must also be adhered to, along with defined storage, disposal, dissemination, and retrieval policies. All of this must be backed by a strong governance framework that provides for regular audits and calls for strong usage record-keeping policies.
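Two of the controls named above – entitlement checks and audit trail management – can be wrapped around every IA-layer call. A minimal sketch, assuming a simple user-to-resource entitlement table; the policy model, names, and report function are all hypothetical:

```python
import functools
from datetime import datetime, timezone

AUDIT_LOG = []  # every access attempt, allowed or not, is recorded here
ENTITLEMENTS = {"alice": {"revenue_report"}, "bob": set()}  # hypothetical policy

def governed(resource):
    """Decorator enforcing entitlements and writing an audit-trail entry."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(user, *args, **kwargs):
            allowed = resource in ENTITLEMENTS.get(user, set())
            AUDIT_LOG.append({
                "user": user,
                "resource": resource,
                "allowed": allowed,
                "at": datetime.now(timezone.utc).isoformat(),
            })
            if not allowed:
                raise PermissionError(f"{user} is not entitled to {resource}")
            return fn(user, *args, **kwargs)
        return inner
    return wrap

@governed("revenue_report")
def fetch_revenue_report(user):
    return "Q1 revenue: ..."

print(fetch_revenue_report("alice"))   # Q1 revenue: ...
try:
    fetch_revenue_report("bob")
except PermissionError as e:
    print(e)                           # bob is not entitled to revenue_report
```

Denied attempts are logged as well as granted ones, which is what makes the trail useful for the regular audits the governance framework calls for.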

These are just the basic nuts and bolts under the hood of a truly futuristic IA interface for complex business tools. As we delve a little deeper into the architecture of its engine in our next segment, we will discuss why natural language processing matters to IA. In the meantime, you can help us understand a little more about what you would consider the most innovative and intuitive modality for interacting with BI tools.

Mahesh has about 24 years of experience across various technologies. He has played a key role in addressing ever-changing business demands by architecting state-of-the-art IT solutions. As a technology leader, he has successfully incubated several high-impact, high-revenue-potential IT service lines and technology centers of excellence (CoE). His expertise spans various technology and architecture domains, and while he has been associated with many industry verticals, he has worked most extensively in Banking and Financial Services.