Treasury & Capital Markets / Viewpoint
Breaking down BCBS 239
January’s Basel Committee on Banking Supervision report on banks’ progress towards BCBS 239 compliance threw up a telling contradiction, says Neill Vanlint, managing director, global sales and client operations, GoldenSource
Neill Vanlint 12 Mar 2015
 
   
January’s Basel Committee on Banking Supervision report on banks’ progress towards BCBS 239 compliance threw up a telling contradiction. While global systemically important banks (G-SIBs) “are increasingly aware of the importance” of the BCBS 239 project, their sense of preparedness has decreased. In 2013, 10 of the 31 eligible banks reported they would be unable to comply fully by the 2016 deadline. This year, that number rose to 14. It is understandable that there is more work to be done, but how is it that the G-SIBs are moving backwards?
 
The reason lies in the deceptively simple language of the 28-page directive. Although short by the standards of financial services regulation, its guidelines call for sweeping changes in how banks handle data governance, lineage and architecture. As banks dig deeper into their current practices, they keep discovering just how far they are from compliance. The real challenge is that BCBS 239 requires an enterprise-wide approach, yet financial institutions often approach regulatory compliance tactically. Take the legal entity identifier (LEI), for example. Many financial institutions use LEIs only where mandated: they do all the work, but unlock only a fraction of the benefit.
 
Compartmentalized compliance isn’t possible with regulation as far-reaching as BCBS 239. It’s vital that banks approach the principles strategically, starting with an all-encompassing assessment of existing data management processes.
 
Setting the foundations
 
BCBS 239 starts with fundamentals. Reviewing the basics of data management best practice lays the groundwork on which the principles can be enacted.
 
BCBS 239 addresses how banks should manage the data used in risk analysis, and the most logical place to start is to push for clarity across the organisation by setting and agreeing definitions of terms. Questions such as ‘which data elements matter most?’, ‘what do we call them?’ and ‘are these meanings consistent across the enterprise?’ all need to be on the table. Central to this is a common data dictionary, which maps different attribute names to a single underlying definition.
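That mapping idea can be sketched in a few lines. This is a minimal, hypothetical illustration only; the attribute names, aliases and definitions below are assumptions, not anything prescribed by BCBS 239 or used at any particular bank.

```python
# A toy common data dictionary: each system's local attribute name
# resolves to one canonical term with a single agreed definition.

CANONICAL_TERMS = {
    "counterparty_id": "Unique identifier of the legal entity traded with",
    "notional": "Face value of the contract in reporting currency",
}

# Local attribute name -> canonical term (illustrative mappings only)
ALIASES = {
    "cpty": "counterparty_id",
    "counterparty": "counterparty_id",
    "face_value": "notional",
    "nominal_amt": "notional",
}

def canonical(attribute: str) -> str:
    """Resolve a system-specific attribute name to its canonical term."""
    name = attribute.strip().lower()
    if name in CANONICAL_TERMS:
        return name
    if name in ALIASES:
        return ALIASES[name]
    raise KeyError(f"'{attribute}' is not in the data dictionary")

print(canonical("cpty"))        # counterparty_id
print(canonical("face_value"))  # notional
```

The point is that every consumer of the data goes through one lookup, so two desks calling the same field ‘cpty’ and ‘counterparty’ still mean the same thing.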
 
Proper control of the data supply chain is paramount, too. Without good governance, banks risk big misunderstandings and big mistakes. And it goes beyond the confines of IT: BCBS 239 specifically calls out business, operations, auditors and risk managers as playing a central role in risk data aggregation. Governance starts with establishing general principles for oversight of data sets and extends to identifying how people relate to those data sets.
 
The importance of a strategic approach is typified by the demands the principles place on data architecture. The approach of ‘doing the bare minimum necessary to get by’ might work for issues like mandated LEI adoption, but BCBS 239 changes that. The principles require enterprise-wide data harmonization without cutting corners.
 
The final fundamental is somewhat obvious: risk data aggregation mandates completeness. Risk data spans several broad categories including entities, securities, and transactions and positions. Banks might have already achieved their BCBS 239 target for some of these, but failing to address all of them collectively is akin to doing nothing at all. Effective analysis of risk cannot work with core pieces of information missing.
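The completeness requirement is simple enough to state as a check. The category names below are taken loosely from the paragraph above and are assumptions for illustration, not an official taxonomy.

```python
# Illustrative completeness check: risk data aggregation only works
# if every core category is addressed, not just some of them.
REQUIRED_CATEGORIES = {"entities", "securities", "transactions", "positions"}

def missing_categories(available: set) -> set:
    """Return the risk data categories still unaddressed."""
    return REQUIRED_CATEGORIES - available

# A bank that has only tackled entities and securities still fails:
print(missing_categories({"entities", "securities"}))
```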
 
Developing the capabilities
 
With the core fundamentals in place, banks can focus on honing the more specific capabilities required for compliance. Effective management of data quality ensures the accuracy of risk data aggregation. Quality measurement is an important first step and needs to happen at multiple points in the data supply chain. However, measurement alone is not enough: banks need to actively manage quality as well. That requires efficient workflows for error detection, research and resolution, and a framework for root cause analysis.
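In practice that means running rule-based checks and routing the resulting exceptions into a resolution workflow. The sketch below is a simplified assumption of what such checks might look like at one point in the supply chain; real implementations would run the same rules at each hop (source, hub, report).

```python
# Hypothetical data-quality checks; each returns an issue string or None.

def check_completeness(rec):
    return "missing notional" if rec.get("notional") is None else None

def check_validity(rec):
    n = rec.get("notional")
    return "negative notional" if n is not None and n < 0 else None

CHECKS = [check_completeness, check_validity]

def measure_quality(records):
    """Return (error rate, exception list) for downstream resolution."""
    exceptions = []
    for i, rec in enumerate(records):
        for check in CHECKS:
            issue = check(rec)
            if issue:
                exceptions.append({"record": i, "issue": issue})
    rate = len(exceptions) / max(len(records), 1)
    return rate, exceptions

rate, errs = measure_quality(
    [{"notional": 100}, {"notional": None}, {"notional": -5}]
)
```

The exception list, tagged by record and issue, is what feeds the research, resolution and root-cause steps the principles call for.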
 
After the effective management of data quality, banks must prove the timeliness of their reporting capabilities. Extensive manual ‘data massaging’ is going to hinder compliance with BCBS 239, where time is of the essence. However, banks that have already taken steps to define their terms, implement sound governance, and establish a single, authoritative source for each data set will have an easier time responding to fire alarms.
 
A future crisis could arise from any number of sources. BCBS 239 demands a flexible infrastructure that can aggregate risk data across multiple dimensions. Hard coding simply won’t work and neither will inflexible legacy architectures. Instead, banks need to standardise, link and classify information so they can call upon it quickly and make timely decisions.
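One way to picture the difference from hard coding: if every exposure carries standardised, linked attributes, any dimension can be rolled up on demand. The records and field names below are invented for illustration.

```python
# Dimension-agnostic roll-up: no report logic is hard-coded to one view.
from collections import defaultdict

EXPOSURES = [
    {"counterparty": "Bank A", "country": "DE", "desk": "rates",  "amount": 10.0},
    {"counterparty": "Bank A", "country": "DE", "desk": "credit", "amount": 5.0},
    {"counterparty": "Bank B", "country": "US", "desk": "rates",  "amount": 7.0},
]

def aggregate(exposures, dimension):
    """Sum exposure amounts along any tagged dimension."""
    totals = defaultdict(float)
    for e in exposures:
        totals[e[dimension]] += e["amount"]
    return dict(totals)

print(aggregate(EXPOSURES, "counterparty"))  # {'Bank A': 15.0, 'Bank B': 7.0}
print(aggregate(EXPOSURES, "country"))       # {'DE': 15.0, 'US': 7.0}
```

The same data answers a counterparty question, a country question or a desk question without any new plumbing, which is exactly the flexibility a fast-moving crisis demands.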
 
Finally, there is traceability. Any BCBS 239-ready data management operation must be able to trace critical data forward from source to use, and vice versa. Several elements drive this, including ‘4-eyes controls’ to ensure accuracy when critical data elements are subject to manual intervention. Banks also need audit trails for every transformation as well as bi-temporal history for reconstructions as needed.
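A bare-bones lineage mechanism might look like the following. This is an assumption-laden sketch: the class and step names are hypothetical, and a production system would also store bi-temporal validity dates alongside the recorded-at timestamps shown here.

```python
# Illustrative lineage log: every transformation of a critical data
# element is recorded, so a reported figure can be traced back to its
# source and forward again to its use.
from datetime import datetime, timezone

class TracedValue:
    def __init__(self, value, source):
        self.value = value
        self.trail = [(source, value, datetime.now(timezone.utc))]

    def transform(self, func, step_name):
        """Apply a transformation and append it to the audit trail."""
        self.value = func(self.value)
        self.trail.append((step_name, self.value, datetime.now(timezone.utc)))
        return self

    def lineage(self):
        """Return the (step, value) history, source first."""
        return [(step, val) for step, val, _ in self.trail]

v = TracedValue(100.0, "source:trade_system")
v.transform(lambda x: x * 1.1, "fx_conversion")
print(v.lineage())
```

Walking the trail forwards reconstructs how a reported number was produced; walking it backwards answers the auditor's question of where it came from.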
 
Reaping the benefits
 
BCBS 239’s challenge is its immense scope. Because of this, there is no quick shortcut to compliance.
 
But the hard work of establishing fundamentals and putting core capabilities in place has its merits. The clarity that effective risk data aggregation provides will help banks streamline their business.
 
The competitive advantage of excellent risk data aggregation can positively affect a bank’s bottom line and allow it to make better judgements through more accurate risk analysis. Moreover, by aggregating information across the business, banks will be able to on-board customers more quickly and cross-sell through existing relationships, all while providing more comprehensive support and services to existing customers.
 
However, perhaps the most important benefit is stability. At the height of the last crisis, some firms took over a month to work out their exposure to distressed counterparties. BCBS 239 compliance puts banks in a far stronger position to cope in the future – that’s because they’ll be able to identify and roll up exposures involving multiple bank subsidiaries and multiple counterparties.
 
Compliance with BCBS 239 is unavoidable and banks are feeling the strain. However, in some ways this is all about common sense in enterprise data management. By adopting a strategic, enterprise-wide approach, banks can create a robust foundation to achieve compliance and, ultimately, a significant competitive edge.
 

Neill Vanlint is the managing director, global sales and client operations, GoldenSource 
