How Can Banks Take Control of Their Payments Transformation in 2024?

30 January 2024

In the first blog of our two-part predictions series, we explored the key technological and regulatory trends shaping the payments landscape in 2024.

In this second blog, we explore the practical strategies and approaches that can enable banks to realise the significant opportunities that these new trends present, getting a handle on costs, risks and development as they move to accelerate their payments transformation.

 

Rationalise the payments value chain – Esther Groen, Head of Payments Centre of Excellence

The challenge for banks in responding to all the new developments in payments is that their traditional payments processing flows are still based on the concept of specific payment types as individual products, such as SEPA Instant or Bacs. This model slows change and inhibits the creation and delivery of value-added services.

A more service-centric model is required, one that focuses on, for example, speed, cost or security, regardless of the underlying payment product. In 2024, banks will therefore need to make a strategic choice about where and how they want to play in the payments value chain: which areas of the payments processing chain are essential to serving their customers, and which are commodities that could be candidates for new business models or outsourcing? This can be achieved through a strategy known as ‘decoupling’, which involves separating activities in the chain to identify those that hold the most value-add potential.

Execution, clearing and settlement, for example, are not typically areas where a bank can achieve differentiation. With the adoption of the legislative proposal for instant payments, we expect more Tier 2 and Tier 3 banks to outsource these parts of the chain to Tier 1 banks or non-bank payment processors, freeing up resources to invest in areas that add real value for their customers, such as order management or pre-processing. In turn, Tier 1 banks need to be ready to meet this emerging demand, investing not only in connectivity and operability but also in order management, so they can monetise the increased volumes of payments data.

If implemented strategically as a single layer, order management achieves the required transformation: it simplifies payments channels, supports the development of value-added services, and makes it easier to add new payment rails or decommission old ones.
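To make the ‘single layer’ idea more concrete, below is a minimal, hypothetical sketch in TypeScript of an order-management layer that sits between customer channels and individual payment rails. The names used (PaymentOrder, PaymentRail, OrderManager, SepaInstantRail, BacsRail) are illustrative assumptions, not a description of any specific product.

```typescript
// Hypothetical sketch: a single order-management layer decoupled from the
// underlying payment rails. Rail adapters implement one interface, so rails
// can be added or decommissioned without touching channel-facing logic.

interface PaymentOrder {
  debtorIban: string;
  creditorIban: string;
  amount: number;
  currency: string;
  urgency: "instant" | "standard";
}

interface PaymentRail {
  name: string;
  canHandle(order: PaymentOrder): boolean;
  submit(order: PaymentOrder): Promise<string>; // returns a rail-specific reference
}

class SepaInstantRail implements PaymentRail {
  name = "SEPA Instant";
  canHandle(order: PaymentOrder): boolean {
    return order.currency === "EUR" && order.urgency === "instant";
  }
  async submit(order: PaymentOrder): Promise<string> {
    // In practice this would hand off to the bank's own (or an outsourced)
    // clearing and settlement connection.
    return `sct-inst-${Date.now()}`;
  }
}

class BacsRail implements PaymentRail {
  name = "Bacs";
  canHandle(order: PaymentOrder): boolean {
    return order.currency === "GBP" && order.urgency === "standard";
  }
  async submit(order: PaymentOrder): Promise<string> {
    return `bacs-${Date.now()}`;
  }
}

// Channels submit orders to the order-management layer, never to a rail directly.
class OrderManager {
  constructor(private rails: PaymentRail[]) {}

  async route(order: PaymentOrder): Promise<string> {
    const rail = this.rails.find((r) => r.canHandle(order));
    if (!rail) {
      throw new Error("No payment rail available for this order");
    }
    return rail.submit(order);
  }
}

const orderManager = new OrderManager([new SepaInstantRail(), new BacsRail()]);
```

Under a model like this, decommissioning Bacs or adding a new instant rail becomes a matter of changing the list of rail adapters passed to the order-management layer, rather than reworking every channel that initiates payments.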

 

Address the developer capacity crunch – Liam Jeffs, Sales Director

So many of the conversations we had this past year with Tier 1 banks in Europe and North America centred around the idea of control, and we fully expect this to continue in 2024.

When it comes to developing new capabilities, most Tier 1 banks have always prioritised build over buy. Research conducted with Celent shows that just 8% of large banks would consider buying a vendor package today, having previously been burnt by vendor packages that locked them in and were slow to deliver.

The problem is that banks are not software houses. It’s hard to find the right developers. The same research shows banks have missed out on an average of four revenue-generating enhancements over the past two years due to developer resource constraints. That amounts to an opportunity cost of 5% of annual payment revenues. In today’s environment, banks just can’t afford to leave that kind of revenue on the table.

Unlike a software provider, which is constantly releasing updates and new functionality, banks must find their own ways to keep investing in updates and enhancements as the payments landscape changes. In 2024, we believe that addressing the developer capacity crunch should therefore be a key priority.

 

Consider how low code can help accelerate innovation – Donal Fleming, Chief Technology Officer

For many banks, the preference for building payments processing software in-house also arises from the desire to maintain control over a strategically important part of their corporate product offering. Retaining the ability to deliver competitive differentiation, and the lack of credible off-the-shelf options are other crucial deciding factors.

As banks look to reconcile their developer capacity limitations with the desire to build rather than buy, I suggest that low code might be the answer.

Low code involves the use of standardised, pre-built blocks that can be employed to rapidly create new code or modify existing software. This allows banks to maintain control of their payments processing while providing an effective way to address resource limitations and accelerate time to market.
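As a rough illustration of the idea (and not a depiction of any particular low-code platform), the sketch below assembles a payment flow from standardised, pre-built blocks in TypeScript. The block names (validateIban, screenSanctions, applyProcessingFee) and the composition helper are hypothetical.

```typescript
// Hypothetical sketch: assembling a payment flow from standardised, pre-built
// blocks. Each block is a small, tested unit; a new flow is configured by
// composing blocks rather than written from scratch.

type Payment = { amount: number; currency: string; creditorIban: string };
type Block = (payment: Payment) => Payment;

// Illustrative pre-built blocks a platform might supply.
const validateIban: Block = (p) => {
  if (!/^[A-Z]{2}\d{2}[A-Z0-9]{1,30}$/.test(p.creditorIban)) {
    throw new Error("Invalid IBAN");
  }
  return p;
};

const screenSanctions: Block = (p) => {
  // Placeholder for a call to a sanctions-screening service.
  return p;
};

const applyProcessingFee: Block = (p) => ({ ...p, amount: p.amount + 0.2 });

// Composing blocks into a flow is configuration, not bespoke development.
const buildFlow = (...blocks: Block[]): Block =>
  (payment) => blocks.reduce((p, block) => block(p), payment);

const instantPaymentFlow = buildFlow(validateIban, screenSanctions, applyProcessingFee);

console.log(
  instantPaymentFlow({ amount: 100, currency: "EUR", creditorIban: "DE89370400440532013000" })
);
```

The point is that developers configure and combine tested blocks rather than writing each flow from scratch, which is where the capacity gain comes from.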

Of course, there is no magic wand to fix everything in one fell swoop, but there is a way for banks to dramatically improve their development capacity, stay in control, and accelerate innovation. It comes down to simplifying a hugely complex problem, and that is exactly what low code is all about.

 

Low code still requires you to do your homework – Toine van Beusekom, Strategy Director

As noted above, for large banks that prefer in-house development for payments processing software, low code offers a compelling approach to overcome resource limitations and bridge the gap between product and technology groups.

Yet low code also has important strategic implications that go beyond technology integration. The inherent complexity of payments means there is no single blueprint for leveraging low code to deliver enhanced, revenue-generating services. Each bank will need to think differently and assess how the advent of low code influences their entire approach to both software development and procurement.

This demands a clear strategy and an understanding of the underlying architectural requirements. Banks should also assess whether prospective partners have the capabilities to support their low code aims. By doing so, they can realise the full transformative potential of low code.

Mady Dyson
