Technology has quietly transformed enterprise operations. Traditional operations technology has evolved into what we call hyper operations: AI-powered autonomous planning, operations, maintenance, and sustenance at hyper-scale. With hyper operations, traditional IT extends into new areas of enterprise business. Consider asset inspection in the transmission network of a utility. Going beyond mere control and monitoring of assets, a hyper operations system also acts as an operations and maintenance platform that creates situational awareness by collecting, managing, analysing, and interpreting heterogeneous data using AI.
At the heart of all this is observability, the foundation for hyper operations. Observability is the ability of a system to sense, perceive, and reason about every aspect of a phygital system. In practice, it reduces the effort needed to detect and identify entities or objects of interest, determine an entity's attributes, understand its relationships with other entities, and track its behaviour. It also offers insight into the constraints on the object of interest - something that is critical for modelling the system that governs those attributes and interactions.
To understand why observability matters, consider asset inspection. The goal is to detect defects and act on them. Key entities include assets like transmission lines, whose attributes, such as thickness or temperature, must be monitored. Their relationships, for instance, with tower infrastructure, and their behaviour over time (like line sag) reveal deeper system conditions. Constraints such as structural integrity further shape how these assets perform. Together, these elements form a conceptual model for systems that blend structural and dynamic components to enable reasoning, simulation, and action. Observability is what makes this knowledge possible. However, enterprises usually end up with “partial observability.”
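The conceptual model above - entities with attributes, relationships, behaviour, and constraints - can be sketched in code. The following is a minimal illustration only: the class names, fields, and the sag limit are hypothetical assumptions for this sketch, not a real asset-management schema.

```python
from dataclasses import dataclass, field

@dataclass
class Tower:
    """A related entity: the infrastructure a line attaches to."""
    tower_id: str

@dataclass
class TransmissionLine:
    """An asset entity with monitored attributes, relationships,
    observed behaviour, and a structural-integrity constraint."""
    line_id: str
    temperature_c: float                         # monitored attribute
    sag_m: float                                 # behaviour observed over time
    towers: list = field(default_factory=list)   # relationship to towers
    max_sag_m: float = 5.0                       # illustrative constraint

    def violates_constraint(self) -> bool:
        """True when observed sag exceeds the structural limit."""
        return self.sag_m > self.max_sag_m

line = TransmissionLine("L-042", temperature_c=61.0, sag_m=5.6,
                        towers=[Tower("T-17"), Tower("T-18")])
print(line.violates_constraint())  # True: observed sag exceeds the 5.0 m limit
```

With full observability, each of these fields is populated by live sensor data; with partial observability, some stay stale or empty, and the constraint check above can no longer be trusted.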
The ability to generate relevant business insights using technology fundamentally presupposes access to timely information from various phygital systems. Observability workflows powered by AI and machine learning assume that data and information are continuously available from the sensors, which is rarely the case in practice. There are several reasons for this:
More often than not, every sensor is an independent system by design. Think of sensors deployed to monitor utility assets: a drone-captured infrared image of a transmission line maps its thermal profile, while usage patterns captured by smart meters give insights into demand. Efficient operation, however, requires multiple sensors and systems to come together so that optimal operations and maintenance decisions can be made. In most cases, sensor information must be fused to yield enhanced insights and derive benefits. But with each siloed system narrowly focused on generating insights specific to its own target environment, the enterprise-wide macro view is missing.
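As a sketch of what such fusion might look like, the snippet below combines a hypothetical thermal-hotspot feed with hypothetical smart-meter load data to rank lines by risk. The field names, the 90 °C normalising constant, and the alert threshold are all illustrative assumptions; the point is only that the combined signal says more than either silo alone.

```python
def fuse_readings(thermal_hotspots, meter_load):
    """Combine drone thermal data with smart-meter demand to rank risk.

    thermal_hotspots: {line_id: max surface temperature in C}
    meter_load:       {line_id: load as a fraction of rated capacity}
    """
    alerts = []
    for line_id, temp_c in thermal_hotspots.items():
        load = meter_load.get(line_id, 0.0)
        # A hot spot matters far more on a heavily loaded line.
        risk = (temp_c / 90.0) * (0.5 + load)
        if risk > 1.0:
            alerts.append((line_id, round(risk, 2)))
    return sorted(alerts, key=lambda a: a[1], reverse=True)

print(fuse_readings({"L-042": 85.0, "L-007": 60.0},
                    {"L-042": 0.9, "L-007": 0.4}))
# [('L-042', 1.32)] - hot AND heavily loaded; L-007 alone raises no alert
```

Neither feed flags L-042 on its own; only the fused view does, which is exactly the macro view that siloed systems lack.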
When every business unit in an enterprise pursues AI and digitisation differently, data becomes scattered, and decisions slow down. This fragmentation creates partial observability - an environment where neither AI systems nor enterprises can act with confidence.
So how do we enable situational awareness and full observability? Through intelligent multimodal perception: capabilities such as pattern discovery, prediction of future states, and root cause analysis, achieved by combining human expertise with machine perception to understand operations. The current use case-based approach, by contrast, limits the application of AI in operations technology.
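As a toy illustration of machine perception over a single sensor stream, the sketch below flags readings that deviate sharply from recent behaviour using a rolling z-score. The window size, threshold, and temperature series are illustrative assumptions; a real perception system would fuse many such streams across modalities.

```python
import statistics

def anomalies(readings, window=5, z_threshold=3.0):
    """Return indices of readings that lie more than z_threshold
    standard deviations from the mean of the preceding window."""
    flagged = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mu = statistics.mean(history)
        sigma = statistics.pstdev(history) or 1e-9  # avoid divide-by-zero
        if abs(readings[i] - mu) / sigma > z_threshold:
            flagged.append(i)
    return flagged

temps = [55, 56, 55, 57, 56, 55, 56, 78, 56, 55]  # one sudden spike
print(anomalies(temps))  # [7] - the spike stands out from recent behaviour
```

Pattern discovery at scale generalises this idea: instead of one hand-tuned threshold per signal, the system learns what "normal behaviour" looks like across thousands of heterogeneous streams.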
An intelligent, multimodal perception system helps integrate data from multiple sources, making it easier to generate consolidated insights across complex environments and addressing the fragmentation described above.
For enterprises pursuing hyper operations and greater operational excellence, embracing an intelligent multimodal perception system is the logical next step. By effectively addressing partial observability, it equips enterprise systems for pattern discovery at scale and helps overcome the limits of existing monitoring techniques with agent-based perception machines that proactively seek out, holistically understand, and adeptly respond to dynamic environments.
When infused with multimodal perception, systems can deliver on the promise of situational awareness and give decision-makers an edge with valuable insights. Human-augmented AI ecosystems and autonomous AI systems won’t seem so far off then.