
Why should a bank do batch when they can do real time?

Bikram Das

When it comes to real-time processing versus batch processing, most system architects and transformation leads would opt for real-time transactions, because the relevance of information decays with time. So why do outdated batch systems still run across the banking IT landscape? And what can be done to fast-track them into real-time systems?

Typically, batch processes involve updates to multiple entities at the database level that must be performed as a group. For instance, account posting and settlement operations require posting an entire group, followed by a reconciliation process to ensure the postings are complete and accurate. Another use case falls under reporting and analytical platforms, where large numbers of database records are processed for analytics or report generation. However, it is possible to perform incremental aggregations on individual transactions or triggered events rather than a traditional periodic bulk operation that erodes the value of instantly available data. Let us look at some specific examples.
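To illustrate the idea of incremental aggregation, here is a minimal sketch (in Python, with illustrative names such as `Ledger` and `post` that are not tied to any specific banking product): the balance is updated on every transaction event, yet ends up identical to what an end-of-day batch aggregation would produce.

```python
# Hypothetical sketch: maintaining an account balance incrementally per
# transaction event, instead of recomputing it in an end-of-day batch run.
from collections import defaultdict

class Ledger:
    def __init__(self):
        # running balance per account, updated on every event
        self.balances = defaultdict(float)

    def post(self, account: str, amount: float) -> float:
        """Apply a single credit (positive) or debit (negative) and
        return the instantly available balance."""
        self.balances[account] += amount
        return self.balances[account]

def batch_balance(events):
    """The traditional batch equivalent: aggregate all events at day end."""
    totals = defaultdict(float)
    for account, amount in events:
        totals[account] += amount
    return dict(totals)

events = [("ACC-1", 100.0), ("ACC-1", -30.0), ("ACC-2", 50.0)]
ledger = Ledger()
for account, amount in events:
    ledger.post(account, amount)

# Both approaches agree; the incremental one just never waits for day end.
assert dict(ledger.balances) == batch_balance(events)
```

The point of the sketch is that the aggregation logic itself is unchanged; only the trigger moves from a scheduled bulk run to the arrival of each event.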

Why does real-time processing matter?

Today, most payment platforms continue to operate on legacy technology that lacks the flexibility to adapt to changing requirements and response times. A simple acknowledgement of a payment takes time because the account update is often a day-end operation, though it need not be in all scenarios. These systems can be made more responsive using streaming technology and an associated real-time architecture.

Payments and core accounting platforms can explore eliminating a complex set of operational steps, given that the core activity is the debit or credit of the customer and other associated accounts as per the accounting practices followed in the bank. The complexity, however, lies in the business rules used to compute the debit or credit amounts, which are typically batch-oriented processes. The other main functionality of such platforms is interfacing with source and target applications to receive transactional data and expose account information. Evidently, performance bottlenecks stem from the calculation engine and the interfaces. If you, as a retail shopper, buy a product and send a payment instruction from your mobile device, you want to see an immediate update to your account balance. Banking enterprises therefore need to focus on the bottlenecks in the traditional application architecture layers and on the real-time technology enablers available. This is where real-time data stores such as NoSQL databases come into play.

NoSQL databases such as RavenDB, Couchbase, Redis, and MongoDB lift the constraints imposed by relational structures, with their referential integrity requirements and frozen data models. Moreover, updates in formats such as JSON are as easy as retrievals. Integration between systems also improves with products like Dell Boomi or Confluent, which provide cloud-based solutions and a multitude of connectors, ensuring scalability and flexibility. The performance of business rule engines can be improved by defining rules at an atomic level and decomposing them into small, independent units. And as operational steps are simplified out of core accounting and payment platforms, in-memory visualization tools can bring the resulting data to users' fingertips through rich interfaces.
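The idea of defining rules at an atomic level can be sketched as follows. This is a hypothetical example, not an actual rule engine: each pricing rule (the names `base_fee`, `volume_discount`, and `fx_surcharge` are invented for illustration) is a small pure function contributing independently to a per-transaction result, so rules can be added, removed, or evaluated in parallel without touching the others.

```python
# Hypothetical sketch of atomic business rules: each rule is a small pure
# function evaluated per transaction, rather than inside a bulk batch job.

def base_fee(txn):
    # flat fee applied to every transaction
    return 0.50

def volume_discount(txn):
    # waive the base fee for small transactions
    return -0.50 if txn["amount"] < 10 else 0.0

def fx_surcharge(txn):
    # extra charge for non-USD transactions
    return 2.00 if txn["currency"] != "USD" else 0.0

ATOMIC_RULES = [base_fee, volume_discount, fx_surcharge]

def compute_fee(txn):
    # each atomic rule contributes independently to the total
    return sum(rule(txn) for rule in ATOMIC_RULES)

fee = compute_fee({"amount": 250.0, "currency": "EUR"})
# base fee 0.50 + FX surcharge 2.00 = 2.50
```

Because no rule depends on another's output, the engine can evaluate them per event as transactions stream in, rather than deferring the calculation to a batch window.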

Streaming versus real-time processing

When transforming traditional batch processing functions into real-time systems, it is imperative to understand the difference between streaming and real-time processing. Quite often, a system that guarantees a reaction within tight deadlines is described as ‘real-time,’ where the time taken can be minutes, seconds, or even milliseconds. For instance, in a stock market, a quote arriving within 15 minutes of placing an order demonstrates a real-time system that guarantees a reaction within a defined interval. Streaming, on the other hand, is continuous computation as data flows through a system. Streaming imposes no time limits beyond the business tolerance for latency and the technology solution used. However, it has two requisites: output rates must at least match input rates in the long term, and there must be sufficient memory to store the queued inputs needed for the computation.
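The two streaming requisites above can be made concrete with a minimal sketch, assuming a simple running-average computation (the capacity value is illustrative): a bounded buffer makes the memory requisite explicit, and consuming each event as it arrives keeps the output rate matched to the input rate, producing a result per event rather than per batch.

```python
# Minimal sketch of the two streaming requisites: the consumer keeps pace
# with the producer, and queued inputs fit within a fixed memory bound.
from collections import deque

QUEUE_CAPACITY = 1000  # illustrative memory bound for queued inputs

def stream_average(events, capacity=QUEUE_CAPACITY):
    """Continuously compute a running average as events flow through."""
    queue = deque(maxlen=capacity)  # bounded buffer for queued inputs
    count, total = 0, 0.0
    for value in events:
        queue.append(value)
        # consume immediately: output rate matches input rate here
        value = queue.popleft()
        count += 1
        total += value
        yield total / count  # a result per event, not per batch window

averages = list(stream_average([10.0, 20.0, 30.0]))
# running averages after each event: [10.0, 15.0, 20.0]
```

If the consumer inside the loop were slower than the producer, the bounded queue would eventually fill, which is exactly the failure mode the two requisites guard against.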

Considering the basic performance parameters of a system, achieving real-time or streaming behavior may require a fundamental redesign of applications, with the underlying design principle of atomic computation and feedback adopted in every component. A microservices architecture, coupled with digital solutions, can address this principle adequately.

The change is here – how do banks adapt?

Adapting to this fundamental shift comes with its fair share of challenges for enterprises mired in a complex web of legacy and modern technologies. They need to take the leap, because a slow and easy change is not an option: a real-time change must itself be executed quickly. Enterprises must therefore invest in building a new platform on real-time principles that, for an interim period, coexists with the existing one, chipping away its core functionalities within a well-communicated plan. Cloud-based platforms, along with containerization and DevOps, are good enablers for embracing such a change.

The change is fundamental, and therefore not feasible to implement through modifications to an existing legacy setup. A new platform will cost less and be managed better in terms of impact on end users. Those who have embraced architectural changes based on real-time or streaming requirements are better positioned to earn greater customer satisfaction and remain competitive in the market.

About the author

Bikram Das
Bikram Das is Chief Data Architect for Chief Data Officer initiatives within TCS' Banking, Financial Services, and Insurance (BFSI) business unit. With over 25 years of experience in the data management and consulting space, he has managed several data initiatives for global clients across industry verticals. Das specializes in metadata management, data quality, and data governance, including the evolving big data space. He holds a bachelor's degree in chemical engineering from the Indian Institute of Technology, Kharagpur, India.