BCBS 239 2.0 resurgence: Strengthening risk management and decision making

The Basel Committee on Banking Supervision (BCBS) issued its standard number 239 (BCBS 239) nearly a dozen years ago, in 2013, with the aim of strengthening banks’ risk management through improved risk data aggregation and internal risk reporting. Its binding compliance deadline for global systemically important banks (G-SIBs) was nearly nine years ago, in January 2016. For domestic systemically important banks (D-SIBs), compliance was expected within three years of their designation as such.

However, full compliance remains elusive for many institutions; meanwhile, regulators are renewing their attention and applying an increasingly forceful approach. Regulatory scrutiny is broadening to cover more institutions, including Tier 2 and Tier 3 banks. Assessments are also deepening in their application and level of detail across areas of policy, capability, and reporting. In Europe, they take the form of on-site inspections (OSIs), targeted reviews of priority areas, and assessments of data quality related to supervisory reporting. These actions often lead to significant supervisory consequences, including findings communicated in European Central Bank (ECB) letters, Pillar 2 requirement (P2R) add-ons, restrictions on business activity, and fines. In the United States, assessments involve examinations of banks’ data management practices, evaluations of related areas such as regulatory reporting and resolution and recovery planning, and examinations of specific reports (for example, the Complex Institution Liquidity Monitoring Report, or FR 2052a). These assessments can result in matters requiring immediate attention (MRIAs) and matters requiring attention (MRAs); in the most severe situations, they may lead to consent orders. In both Europe and the United States, beyond the direct penalties, there are cascading indirect financial consequences, such as conservatism add-ons in risk modeling (for example, margins of conservatism, or MOC, for internal ratings-based, or IRB, models).

A renewed call to action

According to the Bank for International Settlements, only two of 31 G-SIBs have fully complied with the standard; moreover, several formerly compliant banks have been downgraded. A series of progress reports—seven between 2013 and 2023—issued additional regulatory guidance. The sixth report, which in April 2020 called for enforcement to transition to local regulators, was followed by a pause of approximately three years. This pause, however, concealed growing pressure on banks to meet the expectations of local regulators. In Europe, this includes ECB letters with findings, P2R add-ons, and fines. In the United States, banks face scrutiny from the Office of the Comptroller of the Currency and the Federal Reserve Board, including MRIAs, MRAs, and, in severe cases, consent orders.

This pressure was ratcheted up considerably by the latest report in November 2023, which highlighted a lack of meaningful progress and set out heightened expectations for banks and their supervisors. The report noted that BCBS 239 programs have been underfunded and lacking attention from senior leadership, with insufficient recognition of the standard’s importance for capability improvement. It also pointed out a failure to embed the standard in relevant urgent programs, such as Basel IV/3.1. Contributing to the lack of progress, the report suggested, is a “boil the ocean” approach taken by some banks, with insufficient prioritization of requirements and missteps in the scope of implementation. Technical factors, including fragmented IT ecosystems hampered by legacy systems, add to the struggle.

In addition to the BCBS 239 progress reports, regulatory bodies have called attention to related problems. ECB Banking Supervision identified risk data aggregation and risk reporting (RDARR) deficiencies in its December 2023 report on supervisory priorities for 2024–26. Likewise, its May 2024 Guide on effective risk data aggregation and risk reporting (the Guide) conveys a range of guidance: it highlights the importance of basic data governance hygiene to ensure confidence in the numbers and reports issued by financial institutions, calls for clearly defining what constitutes critical risk and finance information across various dimensions, prioritizes end-to-end automated lineage, and stresses the active involvement of top management. The Guide also provides, for the first time, practical guidance on essential requirements across seven areas, leaving no room for neglect.

Guiding principles for success

We are familiar with the obstacles banks encounter in trying to manage risk-related data effectively. In line with the latest BCBS 239 progress report, we’ve identified a number of key challenges that need to be addressed. These include getting organizations with differing priorities and perspectives to work together, conducting thorough root-cause analysis to identify data issues in a context where data are pervasive throughout the bank, and aligning existing incentive structures to promote a strong data management culture. We have five core beliefs, along with ten key lessons (see sidebar, “A blueprint for success”), about how banking organizations should orient their mindset when it comes to BCBS 239. By finding the right disposition toward the standard, financial institutions can position themselves well to undertake meaningful action. Consider these five guiding principles the foundation for an effective strategy blueprint—and as part of that blueprint, aim to create visibility for the board and senior management with frequent progress reports.

1. Make it a business impact story from the start

It’s crucial—and truly beneficial—to approach the BCBS 239 journey as a business impact story right from the beginning. This means the chief financial officer (CFO), chief information officer (CIO), and chief risk officer (CRO) should be proactive in bringing business leaders on board and linking the effort to specific business objectives that go beyond regulatory compliance. Leaders should highlight the opportunities that arise from more timely data and streamlined calculation processes in prioritized areas. Improved master and transactional data can unlock new commercialization opportunities. Additionally, improved model explainability can mitigate the impact of regulatory reviews. Leaders should develop a perspective on how initiatives can be linked and integrated with existing business-related efforts and programs.

Our experience suggests that practical implementation of such an approach entails interviewing business leaders at the outset to identify major data-related pain points and prioritize their remediation. This could include initiatives such as shedding excessive hedges and capital buffers currently in place because of insufficient timeliness of risk metrics, or removing margins of conservatism while staying within the boundaries established by risk modeling.

2. Take a risk reduction approach from the outset

Leaders should identify and prioritize critical information, addressing these areas first to immediately mitigate the most significant risks. With the scope prioritized at the beginning, it can then be expanded in terms of both breadth and depth. For example, banks might begin with an initial prioritized scope in the form of select key regulatory reports and management metrics, focusing on data quality controls and the reduction of manual interventions in high-risk areas of the aggregation processes. Then, in time, the prioritized scope can expand to include a broader set of reports and metrics, with data quality controls across more points in the aggregation processes. In essence, this approach entails breaking the scope into manageable pieces while enabling the measurement of risk reduction in critical outputs.

Our experience suggests that the above can be achieved by ensuring risk and finance collaboration from the beginning of the program and tasking the respective areas with identifying the information most critical to them. This can then be conveyed in terms of common dimensions such as metrics, critical data elements (CDEs), and reports, based on central guidance on what constitutes criticality. There should also be a focus on sharing and reusing CDEs across metrics so that the CDE population does not keep growing unnecessarily.
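To make the idea of CDE reuse concrete, the sketch below shows a minimal, hypothetical inventory that records which metrics and reports consume each CDE; the element names, report names, and ranking logic are illustrative assumptions, not a prescribed model.

```python
from collections import defaultdict

# Illustrative only: a minimal critical-data-element (CDE) inventory that records
# which metrics and reports consume each element, so reuse is visible and the
# CDE population does not grow unnecessarily. Names below are hypothetical.
cde_usage = [
    # (CDE name, consuming metric or report)
    ("counterparty_id", "LCR"),
    ("counterparty_id", "FR 2052a"),
    ("exposure_at_default", "IRB capital"),
    ("exposure_at_default", "Large exposures report"),
    ("collateral_value", "LCR"),
]

consumers_by_cde = defaultdict(set)
for cde, consumer in cde_usage:
    consumers_by_cde[cde].add(consumer)

# Rank CDEs by breadth of reuse; widely shared elements are natural priorities
# for sourcing from an authoritative source and for data quality controls.
for cde, consumers in sorted(consumers_by_cde.items(), key=lambda kv: -len(kv[1])):
    print(f"{cde}: reused by {len(consumers)} metrics/reports -> {sorted(consumers)}")
```

In practice, such an inventory would live in the bank's data catalog rather than in a script, but the same principle applies: reuse should be visible before a new CDE is added.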

3. Look for opportunities to accelerate execution

Leaders should look for opportunities to accelerate the execution of the approach described in principle 2. The use of generative AI (gen AI) tools can significantly accelerate data compliance and development efforts. In fact, leading organizations are deploying gen AI at scale to fix data quality issues and go beyond rule-based vendor products, enabling significant value through higher productivity.

We have observed that, as a starting point, banks can benefit from tools that help automate data lineage and transparency efforts to ensure base levels of compliance. This approach will also provide banks with a clear view of the gaps and issues in their data. With this in place, banks can take directed actions to remediate data issues. Next, banks should think through the entire data development life cycle to understand what types of tools and interventions are needed. Gen AI tools, for example, can help integrate data privacy and protection solutions during the data governance stage. Banks should consider experimenting with a suite of tools to build deployable data quality workflows—focusing not only on which ones can best support their development needs but also on those that can do so at scale.
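As an illustration of what a deployable data quality workflow might start from, the sketch below shows a simple rule-based quality step of the kind such tools automate. The column names and rules are hypothetical, and model-assisted (for example, gen-AI-suggested) checks would extend a baseline like this rather than replace it.

```python
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> pd.DataFrame:
    """Return one row per failed check for downstream triage and remediation.

    Illustrative sketch only; column names and rules are assumptions.
    """
    issues = []

    # Completeness: critical identifiers must be populated.
    missing = df[df["counterparty_id"].isna()]
    for idx in missing.index:
        issues.append({"row": idx, "check": "completeness", "column": "counterparty_id"})

    # Validity: exposures should not be negative.
    invalid = df[df["exposure_amount"] < 0]
    for idx in invalid.index:
        issues.append({"row": idx, "check": "validity", "column": "exposure_amount"})

    return pd.DataFrame(issues, columns=["row", "check", "column"])

# Hypothetical sample data to show the workflow end to end.
sample = pd.DataFrame(
    {
        "counterparty_id": ["C001", None, "C003"],
        "exposure_amount": [1_000_000, 250_000, -5_000],
    }
)
print(run_quality_checks(sample))
```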

4. Remediate at the source with a target architecture and operating model to guide the process

Banks should aim to remediate data as far upstream in the data life cycle as possible, ideally at the point of origination. They should move toward a target data architecture that relies on a limited number of authorized provisioning points (APPs) or authoritative data sources (ADSs). Implementing a robust set of data controls, preferably automated and preventative, early in the data process is crucial to ensure quality for downstream consumers. It is important to rigorously enforce the use of APPs and ADSs so that high-quality data are sourced from a minimal set of systems. If existing data sources fail to meet consumer requirements, they should be upgraded rather than supplemented with new, redundant sources, which would require additional controls and governance to maintain data quality.

Experience tells us that most data quality issues originate in upstream source systems and surface at the point of consumption. To address this, banks can, as part of their data operating model, map the data lineage from its point of origin to its consumption point. This enables them to evaluate the effectiveness of existing data controls and gather feedback on pain points from data consumers throughout the lineage, thereby identifying where additional data controls or upgrades are necessary. Banks should consider implementing a comprehensive framework that outlines minimum preventative, detective, and corrective data quality control requirements for critical data along the end-to-end data lineage.
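To illustrate how such a framework might be operationalized, the sketch below represents the lineage of a single critical data element as an ordered list of systems, each tagged with the control types documented there. The systems, tags, and coverage checks are assumptions for illustration only.

```python
# Illustrative sketch only: a toy view of end-to-end lineage for one critical data
# element, with the controls documented at each hop tagged as preventative,
# detective, or corrective. System and control labels are hypothetical.
lineage = [
    # (system, control types documented at this point in the flow)
    ("loan_origination", {"preventative"}),           # input validation at the source
    ("core_banking", {"preventative", "detective"}),  # schema checks, reconciliations
    ("risk_data_warehouse", set()),                   # no controls documented yet
    ("regulatory_reporting", {"detective"}),          # report-level reasonableness checks
]

required_types = {"preventative", "detective", "corrective"}

# Check whether each required control type is covered somewhere along the lineage.
covered = set().union(*(controls for _, controls in lineage))
print("control types missing along the lineage:", sorted(required_types - covered))

# Flag hops with no documented controls; these are candidates for remediation as
# far upstream as possible rather than patches at the consumption point.
for system, controls in lineage:
    if not controls:
        print(f"{system}: no controls documented")
```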

5. Be transparent and comprehensive in regulatory dialogue

Banks should be proactive and transparent with regulators, ensuring that supervisors perceive the bank and its BCBS 239 program as models of openness. To convey a strong sense of control and oversight, it is important to communicate in a highly structured manner, providing regular progress reports that offer comprehensive information on both the current status and upcoming initiatives. Insofar as possible, banks should implement initiatives of their own accord rather than waiting for a regulatory push. This approach enables the bank to set its own pace.

In our experience, this entails communicating early on the scope of the program, as well as the vision, ambition level, and execution approach (for example, deciding whether to first fix all foundational aspects horizontally or to proceed with a sprint-based method). Thereafter, it involves building and leveraging structured templates to communicate the current state (for example, gaps in critical metrics) and upcoming initiatives, along with regular reporting on progress and bottlenecks. Where possible, banks should inform the regulator in an integrated way, such as by communicating BCBS 239-related initiatives and commitments as part of Basel IV/3.1 programs.

Banks across Europe and the United States are at varied stages of maturity

European and US banks vary widely in terms of where they stand on their BCBS 239 journeys. Some are just beginning, while others are refreshing their efforts or accelerating their progress. Those furthest along have been dedicated to compliance for several years, closely monitoring key risk metrics and reports, with business and IT functions deeply involved. Nevertheless, they face regulatory scrutiny, because BCBS 239 demands continual enhancement, such as the removal of manual processes and the widening of scope across reports, models, risk indicators, and critical data elements, with the ultimate aim of covering all of the bank’s critical data.

Banks in the middle of their BCBS 239 compliance journey typically have well-documented frameworks, such as data governance structures and clearly defined scopes, and have begun exploring new tools. However, they often struggle to make swift, measurable progress and to engage the business. Some of those just starting out have previous failed attempts behind them. The problem typically lies with execution: despite ambitious plans, practical implementation has proved elusive, and limitations in tooling are sometimes used as an excuse.


The rewards are worth the effort. Banks are at an important moment in their regulatory journeys. With BCBS 239 receiving renewed attention and expectations rising rapidly, the pressure is on to make meaningful progress toward full compliance. By establishing a business impact mindset across the organization, banks can turn these requirements into an opportunity for competitive advantage, with a host of indirect financial benefits, including enhanced digitization initiatives, improved risk management, and stronger, trust-based relationships with regulators.