
The Regulatory Web of Complexity

At the centre of financial services risk management and regulatory compliance is the convergence of data from a huge range of information silos, departments, product lines, customers and risk categories. Risk and finance are at the end of an extensive chain, examining the consolidated threads, uncovering issues and spotting patterns.

This is where many organisations run into difficulty, unable to resolve the enormous data integration task. Without good data management, obtaining the right information to provide to regulators is like trawling cloudy waters; there will always be uncertainty as to the resulting catch and its quality.


Harmonies across regulatory data requirements

The banking sector remains in the midst of reform, with investment in risk governance and integration continuing to rise. The volume of regulatory alerts has risen sharply in recent years: in 2008 there were 8,704 regulatory alerts; by 2015 that figure had jumped to over 43,000. This equates to a piece of news on new regulation, a standards update, a quantitative impact study (QIS), a policy document or a consultation every 12 minutes.
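
That frequency is easy to sanity-check. As a rough back-of-the-envelope calculation (a Python sketch using the figures above):

```python
# Sanity check: 43,000 alerts spread over one year.
minutes_per_year = 365 * 24 * 60          # 525,600 minutes
alerts_2015 = 43_000
print(f"one alert every {minutes_per_year / alerts_2015:.1f} minutes")
# -> one alert every 12.2 minutes
```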

The biggest growth in jobs has been in risk professionals hired to deal with successive waves of regulatory change. Across these changes there are common themes: differing product taxonomies and client classifications, unambiguous identification, additional data context, links between related elements and, generally, requirements on audit and lineage.
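
One way to see how much these themes overlap is to sketch them as a single record structure. The following is purely illustrative, with invented field names rather than any particular standard:

```python
from dataclasses import dataclass, field

@dataclass
class InstrumentRecord:
    """Illustrative record covering the recurring regulatory data themes."""
    identifiers: dict[str, str]                # unambiguous identification, e.g. {"ISIN": "..."}
    product_class: str                         # product taxonomy
    client_class: str                          # client classification
    context: dict[str, str] = field(default_factory=dict)  # additional data context
    related: list[str] = field(default_factory=list)       # links between related elements
    lineage: list[str] = field(default_factory=list)       # audit trail of sources and changes
```

However the fields are named, the point stands: the same handful of data elements recurs from one mandate to the next.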

Regulators are examining the quality of risk information chains under BCBS 239, stress testing programmes are encouraging a reconsideration of processes across silos, and failure to meet the requirements of FRTB can subject banks to increased capital charges.

 

Sourcing and integrating data

A zero-tolerance regulatory approach towards poor data management means that banks need to source and integrate market data efficiently, derive and track risk factor histories, and manage data quality proactively. The back end, however, is only part of the story. Equally important is managing the consumption and standardisation of source data, and controlling the process and distribution, with transparent dashboards for quality metrics and ease of reporting.

The importance of a consistent data model to anchor the processes and business rules also cannot be overestimated. Sourcing clean market data remains a crucial challenge for risk departments, and valuable quant time is easily wasted on data formatting and cleaning. Regulators want to see good data management and provenance so they can trace where a result comes from, which makes a more structural approach to market data sourcing, quality management and market data operations vital.
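
To illustrate the provenance point, every derived value can carry a record of its source, the rule applied and its inputs. A minimal sketch, with invented names rather than any specific product or standard:

```python
from datetime import datetime, timezone

def with_lineage(value, source, rule, inputs):
    """Attach the provenance a regulator would ask for to a derived value."""
    return {
        "value": value,
        "source": source,      # originating feed or system
        "rule": rule,          # derivation or business rule applied
        "inputs": inputs,      # upstream data points used
        "as_of": datetime.now(timezone.utc).isoformat(),
    }

# e.g. a mid price derived from a vendor's bid/ask quote:
mid = with_lineage(100.25, "vendor_feed_A", "mid = (bid + ask) / 2",
                   {"bid": 100.20, "ask": 100.30})
```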

 

Developing a common data understanding

There may be many complications around regulatory change management, but there is also a range of tools available to assist, from developing a cross-referencing and mapping strategy to adopting smart sourcing practices. Banks and their suppliers need to overcome the inherent difficulties and collaborate closely, using aligned data dictionaries.

Internal agreement on terminology can often be challenging enough, with a lack of standardisation of acronyms, or between desks, a frequent source of data inconsistency. Semantics are critical to creating a single version of the truth, and this is why data governance has become an industry of its own.
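
A shared data dictionary is the usual antidote: map each desk-level variant onto one canonical term. As a minimal sketch (the variants below are invented examples of the kind of drift that occurs in practice):

```python
# Canonical terms keyed by the variants seen across desks; the variants
# shown are invented examples, not any particular bank's vocabulary.
DATA_DICTIONARY = {
    "govvy": "government_bond",
    "govt bond": "government_bond",
    "irs": "interest_rate_swap",
    "swap (ir)": "interest_rate_swap",
}

def canonical(term: str) -> str:
    """Normalise a desk-level term to its data dictionary entry."""
    key = term.strip().lower()
    return DATA_DICTIONARY.get(key, key)

assert canonical("Govvy") == canonical("govt bond")   # one version of the truth
```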

Regulators are becoming entrenched in the data detail. Evidence of this, aside from FRTB, includes the rule-making under Dodd-Frank and the technical standards of MiFID II. This brings us to the importance of risk and finance in the whole process, as the largest stakeholders in data governance and data quality. Positioned at the information convergence point, they ensure each strand is accurate in itself and allocated correctly. As such, they must lead the way when it comes to good data practices.

A bank’s ability to live up to the expectations of the risk data aggregation and reporting principles within BCBS 239 plays a pivotal role, through the provision of transparency and integration capabilities. The specific reports, metrics and regulatory destinations may differ, but much of the input is the same. Good infrastructure establishes common ground for regulatory mandates (whether BCBS 239, FRTB or MiFID II), and as a result, it becomes clear what external data is required as a net sum of regulation.
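
That net sum can be made quite literal: treat each mandate's external data needs as a set and take the union. A sketch with illustrative, deliberately incomplete requirement lists:

```python
# Illustrative (and deliberately incomplete) external data needs per mandate.
REQUIREMENTS = {
    "BCBS 239": {"identifiers", "taxonomy", "lineage"},
    "FRTB":     {"identifiers", "risk_factor_histories", "traded_prices"},
    "MiFID II": {"identifiers", "taxonomy", "venue_reference_data"},
}

# The "net sum of regulation": what the infrastructure must source once
# and then distribute to every consuming report.
net_sum = set.union(*REQUIREMENTS.values())
print(sorted(net_sum))
```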

 

The need for strong data infrastructure

As organisations, whether third-party vendors or financial services firms, scramble to adjust their practices to the latest regulatory requirements, an overriding theme is emerging: data infrastructure. From a bank's perspective, creating robust data infrastructure is the best place to start when preparing for regulation. A solid foundation for compliance with the new market risk framework of FRTB, covering prices, traded prices, quotes, risk factors and quality assessments, is a precondition of effective change management. This is relevant for adjacent regulations such as PRIIPs and MiFID II, but also for the business as a whole.

First and foremost, banks should implement best practices in the collection, cross-referencing and integration of data before moving on to data quality workflows, such as controlling and tracking proxies. Joined-up data may be the most significant shared aim of regulatory regimes, but it is yet to be adequately addressed.

Clarifying the need for data cohesion includes the requirement for a common data model for product terms and conditions, prices and risk factors, as well as business rules for the required lookup and classification work. At the same time, banks need the ability to configure distributions to risk and valuation systems, and to use dashboards both to monitor data volumes and suspect items and to track the use of proxies or other actions taken on the data. This completes the information supply chain before the data hits consuming systems.
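
To make the monitoring point concrete, the following sketch shows the kind of summary that could feed such a dashboard; the thresholds and field names are assumptions, not a reference to any particular system:

```python
def quality_summary(points, stale_after_days=5):
    """Summarise a batch of data points for a quality dashboard:
    total volume, suspect items flagged for review, proxies in use."""
    summary = {"volume": len(points), "suspects": 0, "proxies": 0}
    for p in points:
        if p.get("age_days", 0) > stale_after_days:
            summary["suspects"] += 1   # stale values are routed to review
        if p.get("is_proxy"):
            summary["proxies"] += 1    # proxied values must stay visible
    return summary

batch = [
    {"value": 101.2, "age_days": 1},
    {"value": 99.8,  "age_days": 9},                    # stale -> suspect
    {"value": 100.4, "age_days": 0, "is_proxy": True},  # tracked proxy
]
print(quality_summary(batch))  # {'volume': 3, 'suspects': 1, 'proxies': 1}
```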

 

Conclusion

Ultimately, the most effective way to manage regulatory change is through the accurate collection, controlled sourcing, cross-referencing and integration of data as a foundation. This addresses common regulatory demands around taxonomies, classifications, unambiguous identification, additional data context, links between related elements and general requirements on audit and lineage.

Compliance with modern financial services regulation cannot be treated as a box-ticking exercise. To prevent regulatory change management from taking over other projects, firms need to get their data management capabilities in order first, and taking the risk out of their risk data is a critical step in that process.


Martijn Groot, VP Product Management, Asset Control
