DevOps – a major change in the way software is developed and delivered – is making inroads into corporate IT strategies. In an environment where speed is of the essence and quality must be top-notch, DevOps seems to be the only way forward. This is also the key takeaway from TCS' latest DevOps research, ‘Winning in the Digital Marketplace: Assuring Software Quality in a Fast-Moving DevOps World’.
DevOps is important for the Banking and Financial Services (BFS) sector as well. Despite the sector's regulatory rigor, compliance-heavy processes, legacy infrastructure, and high cost of systems failure – all viewed as DevOps roadblocks by most software pundits – BFS is gearing up for DevOps adoption. Based on my experience with global BFS IT projects, let me put forth a few situations that highlight how these roadblocks can translate into opportunities, and how BFS is, in fact, well suited to embrace DevOps.
Let's start with compliance. Software development teams at banks are aware of the high cost of failure associated with banking systems. Because production stability has always been critical, developers of BFS applications are sensitized to the need and responsibility for delivering high-quality, robust, and reliable production systems, rather than just writing code. To keep their products on the right side of the regulatory line, they collaborate with other teams, including Quality Assurance (QA) and IT operations, to ensure that their envisaged systems meet compliance requirements. This practice is, in reality, DevOps at work. With such collaborative work environments already in place, BFS businesses can adopt DevOps with much greater maturity than other industries.
Evolving customer and market requirements call for rigorous release planning, which in turn impacts release frequency. Despite agile methodologies and approaches such as Test-Driven Development (TDD) and Behavior-Driven Development (BDD), frequent release changes may disrupt the overall software development lifecycle (SDLC). But this challenge should also be viewed as an opportunity: effective release planning, with QA checks for performance, security, privacy, stability, and ease of use applied not just during testing but right from the requirements phase, helps detect defects early and prevents expensive rework in subsequent phases.
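To make the TDD idea concrete, here is a minimal sketch in Python. The business rule, limit, and names are hypothetical illustrations, not drawn from any real banking system; the point is that the checks are written first and drive the implementation.

```python
# Hypothetical business rule: interbank transfers above a limit need manual approval.
APPROVAL_LIMIT = 10_000.00

def requires_approval(amount: float) -> bool:
    """Implementation written after the checks below were in place (TDD)."""
    return amount > APPROVAL_LIMIT

# In TDD these assertions exist first, fail, and then drive the implementation.
assert not requires_approval(500.00)     # small transfer: straight through
assert requires_approval(25_000.00)      # large transfer: needs approval
```

Because such checks exist from the requirements phase onward, a defect in the rule surfaces immediately rather than in late-stage testing.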
With shrinking IT budgets, BFS CIOs need to optimize the total cost of ownership (TCO) of sizeable legacy IT infrastructure, comprising high-maintenance mainframe systems that cannot be discarded overnight. Such legacy infrastructure keeps the regression test automation quotient low – a mere 20-30%. Most BFS applications, even today, rely mostly on manual functional testing, with regression test automation only in pockets. Further, manually provisioned test environments are not aligned with production realities. Manual test data generation is time-consuming and effort-intensive, requiring business analysts' support. With strict regulatory guidelines on customer confidentiality and privacy, merely masking production data for testing is not an option for banks. These manual testing methods are a major obstacle to DevOps, and can increase QA costs by up to 30%. Again, these challenges are opportunities for initiating legacy modernization programs and moving to modern approaches such as Service-Oriented Architecture (SOA), Responsive Web Design (RWD), Adaptive Web Design (AWD), and mobility. Besides improving the test automation quotient and release quality, these initiatives also improve return on investment and optimize TCO, enabling CIOs to better manage their reduced IT budgets.
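One way around the masking restriction is to generate fully synthetic test data, so no production record is ever exposed. The Python sketch below illustrates the idea with a hypothetical customer record; the field names and value ranges are illustrative assumptions, not any bank's actual schema.

```python
import random
import string

def synthetic_account(seed: int) -> dict:
    """Generate one synthetic customer record; no production data involved."""
    rng = random.Random(seed)  # deterministic per seed, so test runs are repeatable
    return {
        "customer_id": "CUST" + "".join(rng.choices(string.digits, k=8)),
        "account_type": rng.choice(["SAVINGS", "CHECKING", "LOAN"]),
        "balance": round(rng.uniform(0, 100_000), 2),
    }

# Provision a small, repeatable test data set automatically.
test_data = [synthetic_account(i) for i in range(100)]
```

Seeding each record makes the data set reproducible across environments, which is exactly what automated test environment and data provisioning needs.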
So the challenges may appear perplexing at first sight, but with the right set of QA interventions, they will prove to be blessings in disguise. Most BFS CTO teams are adopting a cautious approach – introducing DevOps pilot projects to determine acceptance and success rates, and then taking the deep dive. By getting involved at the outset and being part of each step, QA can capture lessons learned, deploy best practices, and facilitate the success of both the pilot and the deep dive.
For QA, DevOps is both a learning and a role-progression opportunity. Traditional test execution roles are making way for holistic QA roles with domain and business expertise, coupled with development skills. Starting with identifying automation opportunities across the SDLC, QA must automate as much as possible, through a combination of commercial and open-source tools. In a DevOps-ready world, automated builds, automated code deployment, service virtualization, automation script development, automated test environment and data provisioning, cloud adoption, test data reuse, automated test execution, and end-to-end automation all fall under the QA realm. DevOps also requires an increased level of API-driven test automation, not just automated UI testing. To support frequent releases, sometimes even daily or hourly builds, adequate test harnesses, frameworks, and tools have to be built. Quite evidently, the tester's role must evolve from test case executor to test developer.
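As a rough illustration of API-driven test automation, the sketch below exercises a hypothetical balance endpoint at the API layer rather than through the UI. The endpoint path, client, and payload are assumptions for illustration; the transport is mocked so the check runs without any live service, which is also the essence of service virtualization.

```python
import json
from unittest import mock

# Hypothetical client call; a real suite would go through the bank's API gateway.
def get_balance(client, account_id: str) -> float:
    """Fetch an account balance via an (assumed) REST endpoint."""
    resp = client.get(f"/accounts/{account_id}/balance")
    return json.loads(resp)["balance"]

# API-level check with a mocked transport, so it runs in any build pipeline.
client = mock.Mock()
client.get.return_value = json.dumps({"balance": 1250.75})
assert get_balance(client, "ACC123") == 1250.75
client.get.assert_called_once_with("/accounts/ACC123/balance")
```

Because such a test needs no browser and no deployed environment, it can run on every build – daily or hourly – which UI-only automation cannot sustain.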
The DevOps change is inevitable, and well worth it. In the words of American singer and actress Demi Lovato, “Change is never painful. Resistance to change is painful.” It will be interesting to see how the BFS industry approaches and embraces the integration of DevOps into its IT landscape.