Giles Elliott, Head of Business Development, Capital Markets, TCS BaNCS

Asset servicing has the highest cost per transactional event of any area in the securities ecosystem: sizeable operations, uncapped levels of operating and investment risk, and extensive re-work and task duplication are all common.

So much focus and cost is applied to managing operating risk in the middle and back office that we often overlook the needs of our investor clients, and the tools that would help them make optimal, well-founded investment decisions around corporate events.

So, is a highly automated model, with largely invisible operations, where investment risk is the primary focus and operational and back-office challenges are out of sight and out of mind, a viable and necessary goal for the industry to actively pursue? And if it is, how should we adjust our focus to make it a reality?

On viability, the high-level answer is clearly yes: the technologies and solutions to provide such an environment already exist; we simply have not found a way to assemble them in a manner that delivers the desired outcome. On necessity, the sheer operating costs, risk levels and assumed losses from sub-optimal investment decision-making suggest that most in the industry would agree this is a necessity, and that the business case is clear for a transformational rather than tactical approach to addressing it.

Mindful that a blog can only address a certain depth, I have tried to lay out some of the key areas that I feel need to underpin a transformational agenda.

The first key area is to systematically address the lack of digitisation across the lifecycle, with a particular emphasis on issuer announcements, investor instructions and tax documentation. The most significant step forward in issuer announcements was probably driven by the European SRD II regulation, which mandated digital event announcements for issuing companies across the 27 EU member states plus Iceland, Liechtenstein and Norway. But we need global solutions that give issuers access to sophisticated data entry and validation technologies, alongside incentives for issuers and market infrastructure and regulatory support to drive this forward.

This clearly needs a different level of industry collaboration and will never be achieved by firms acting individually. Collaboration is critical for effective market advocacy, but also for collective agreement on incentivising issuers and market infrastructure to provide high-quality “golden” digital source records. And I feel there is a role for technology firms in making solutions economically accessible to local markets and issuers, restraining some of the pass-through of these costs to their existing clients.
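To make the idea of sophisticated data entry and validation at the point of issuance concrete, here is a minimal sketch in Python. The event structure and field names are my own illustration, not any SRD II or ISO 20022 message format; only the ISIN check digit rule (ISO 6166, a Luhn checksum over letter-expanded digits) is a real standard. The point is that a golden source record can be rejected at entry, before it ever propagates down the chain:

```python
from dataclasses import dataclass
from datetime import date

def isin_is_valid(isin: str) -> bool:
    """Check length, alphabetic country prefix and the ISO 6166 Luhn check digit."""
    if len(isin) != 12 or not isin[:2].isalpha() or not isin[2:].isalnum():
        return False
    # Expand letters to two-digit numbers (A=10 ... Z=35), then run Luhn.
    digits = "".join(str(int(c, 36)) for c in isin.upper())
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:  # double every second digit from the right
            d = d * 2 - 9 if d * 2 > 9 else d * 2
        total += d
    return total % 10 == 0

@dataclass
class EventAnnouncement:
    """Illustrative announcement record -- not a real message standard."""
    isin: str
    event_type: str          # e.g. "CASH_DIVIDEND"
    ex_date: date
    record_date: date
    payment_date: date

    def validation_errors(self) -> list[str]:
        """Checks an issuer platform could run before publishing."""
        errors = []
        if not isin_is_valid(self.isin):
            errors.append(f"invalid ISIN: {self.isin}")
        if not (self.ex_date <= self.record_date <= self.payment_date):
            errors.append("dates out of sequence (ex <= record <= payment)")
        return errors
```

An announcement that fails these checks never becomes a downstream reconciliation problem, which is precisely the economics the collaboration argument rests on.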

In terms of investor instructions, we suffer from an extensive lack of integration between middle- and back-office systems and front-office decision-making platforms, where firms need to supplement data to drive their risk assessment of investment decisions. The provision of manual instruction templates does little to reduce the risk of mis-processed instructions. We need a shift towards integrating with investment decision-making platforms, bringing tools that automate the bulk of decisions and, in turn, the submission of perfectly formatted instructions.
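A sketch of what “perfectly formatted instructions” could mean in practice: an election is validated against the event’s own terms at the point of capture, so a malformed instruction can never be submitted. The event shape, option codes and output fields below are hypothetical, chosen purely to illustrate the pattern:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ElectiveEvent:
    """Illustrative elective corporate event."""
    event_id: str
    options: dict[str, str]   # option code -> description, e.g. "CASH", "SECU"
    deadline: datetime

def build_instruction(event: ElectiveEvent, account: str, option: str,
                      quantity: int, entitlement: int, now: datetime) -> dict:
    """Validate an election against the event terms and return a fully
    formatted instruction, or raise ValueError with the reason it would
    otherwise have been rejected upstream."""
    if now > event.deadline:
        raise ValueError(f"deadline {event.deadline:%Y-%m-%d %H:%M} has passed")
    if option not in event.options:
        raise ValueError(f"unknown option {option!r}; valid: {sorted(event.options)}")
    if not 0 < quantity <= entitlement:
        raise ValueError(f"quantity {quantity} outside entitlement {entitlement}")
    return {
        "event_id": event.event_id,
        "account": account,
        "option": option,
        "quantity": quantity,
        "timestamp": now.isoformat(),
    }
```

Embedding checks like these in the decision-making platform itself, rather than in a manual template, is what removes the mis-processing risk at source.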

And finally, under digitisation, we need to eliminate physical tax documents, which by their very nature allow firms across the ecosystem to hold differing statuses and interpretations, leading to extensive reversals and re-processing. We should use our collaborative advocacy agenda to promote a master digital record held by the company’s local tax authority, accessed through APIs rather than held in a static, unreconciled state.

The second core problem area is the consumption of data, where the total lack of transparency on data lineage, including the underlying sources and the levels of prior validation undertaken, causes incredible duplication of work and delay downstream. Most players today lack systems to capture such depth of insight, and yet this is a key part of removing duplicated streams of effort.

Most data vendors, intermediaries or even issuers, would be hard pressed to state that they openly accept full liability for their event announcements, in the absence of a regulatory overhang. It is therefore incumbent on us to have sophisticated tools for validating data integrity, leveraging historical and trend analysis. And maybe, if information regarding the level of validation undertaken by upstream professionals was made available to clients, we would feel less compelled to repeat these processes downstream. But an ongoing model of acquiring multiple feeds and comparing these to ascertain which one is correct seems ill-founded if the goal is a digital black box. 
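One concrete form such validation tooling can take is a trend check against the security’s own history. The sketch below is a deliberately simple illustration (a z-score threshold is one technique among many, and the threshold value is arbitrary): a flag here means “route to an analyst”, not “reject”, since special dividends are legitimately anomalous:

```python
from statistics import mean, stdev

def looks_anomalous(history: list[float], announced: float,
                    z_threshold: float = 3.0) -> bool:
    """Flag an announced value (e.g. a dividend rate) that sits far
    outside this security's own historical distribution."""
    if len(history) < 2:
        return False                      # not enough history to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return announced != mu            # history is constant; any change is notable
    return abs(announced - mu) / sigma > z_threshold
```

A check like this catches the classic decimal-point slip (2.20 announced against a history around 0.22) without any second feed to compare against, which is exactly the alternative to the multi-feed comparison model.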

My third problem area points to the diversity of systems and platforms used across market players, each with differing data models and configuration, where even the heavy focus on standards does not overcome the challenges of processing events across lengthy chains of intermediaries and end issuers and investors. Without a model that brings convergence of systems and configuration, we will continue to see ongoing challenges in managing end-to-end event lifecycles.

We have seen a move away from in-house developed systems over the past 10-15 years, with firms generally accepting that neither the economics nor business differentiation comes from top-to-bottom custom development. But we need to find ways to drive far stronger convergence, either through a collaborative master platform that gradually becomes the primary processing engine, or through a shift towards utilities that accelerate such alignment. Naturally, some will view these arguments as a case for DLT, but most of the problems and solutions can be addressed using either existing database models or a shared distributed ledger model.

My fourth and final theme is to transform the model of connectivity. We need to move away globally from a transaction-message-centric model to more of an open data model, where data is available for update and validation instantly between approved counterparts. I can’t see us collapsing the complex chains of intermediaries any time soon, so our focus must be on aggregating and sharing data and insights far more objectively. And while Unique Transaction Identifiers (UTIs) are clearly aimed at the challenges of the trade settlement domain, I am of the view that they will carry even stronger benefits for the asset servicing ecosystem and drive forward how firms share data.
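The practical value of a shared identifier is that every intermediary’s view of the same event can be joined and compared directly, rather than inferred from chains of messages. A minimal sketch, using UTI-style keys and invented record fields purely for illustration:

```python
from collections import defaultdict

def discrepancies_by_uti(views: list[dict]) -> dict[str, dict]:
    """Group event records from different intermediaries by their shared
    identifier and report every field on which the views disagree."""
    grouped = defaultdict(list)
    for view in views:
        grouped[view["uti"]].append(view)

    report = {}
    for uti, records in grouped.items():
        # Compare every data field except the key and the contributor label.
        fields = {k for r in records for k in r if k not in ("uti", "source")}
        diffs = {f: sorted({str(r.get(f)) for r in records})
                 for f in fields
                 if len({str(r.get(f)) for r in records}) > 1}
        if diffs:
            report[uti] = diffs
    return report
```

Without the common key, this comparison requires fragile message matching; with it, surfacing a pay-date disagreement between a custodian and a CSD becomes a one-pass join.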

There is a core message underpinning each of these themes: collaboration is essential. We are seeing great progress from some of the leading industry groups, including ISSA, but we need to align around a common vision of the future and apply our collective weight towards it.

As a footnote to this blog - you will note that I have not mentioned AI and ML extensively as a primary pillar of the transformation priorities. These are tools that will be critical to any transformation, and will be inherent in more sophisticated techniques around data validation, decision analysis and operating risk management. But I do not feel we should approach such a transformation agenda thinking that AI and ML can solve the fundamental problems with the existing models, and hence I have viewed them as facilitators and accelerators rather than core foundations of transformation.

Disclaimer: Views or opinions represented in this blog are based on the author’s own research and do not represent TCS BaNCS.