In the January 2015 BIS progress report on BCBS 239 adoption, it was stated that “compliance with Principle 2 (data architecture/IT infrastructure) was rated lowest.” BCBS 239 marks the first time that the Enterprise IT Architecture profession has been subject to the kind of regulatory scrutiny long applied to its construction and transportation industry forebears.
This has become necessary because distributed Enterprise IT Architectures have now hit the same engineering maturity inflection point: many partial subsystem failures and, in some cases, public and catastrophic supply chain failures.
Enterprise Architects must shift focus from solely functional analysis models to address the sustained volume, velocity and variety of data that they now aggregate and process to produce meaningful measures of Financial Sector business risks.
Let’s remind ourselves what the banks are being asked to do when it comes to data architecture/IT infrastructure:
“A bank should design, build and maintain data architecture and IT infrastructure
which fully supports its risk data aggregation capabilities and risk reporting
practices not only in normal times but also during times of stress or crisis, while
still meeting the other Principles.”
Perhaps the reason for the lack of compliance is a shortage of concrete guidance on what this actually means in practice. I didn’t find much help from a quick visit to Wikipedia, which describes the risk data architecture as comprising three “Pillars:”
So I drew on my three decades of CTO experience, as well as the thoughts of regulatory experts, to offer some clarity by asking you to focus on the following questions:
BIS offers a few clues:
Given the scale of most Financial Institutions’ IT estates, comprising hundreds of applications deployed over thousands of physical servers, this can only be addressed by automation.
At the moment banks have only very crude Application Portfolio systems — which use an in-house, arbitrary classification of the business functions their applications perform, with occasional manual audit/regulatory firedrill processes providing some form of information connectivity catalogue.
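To make the contrast concrete, here is a minimal sketch of what a more rigorous, automated portfolio entry might look like. All class, field and host names are hypothetical, not drawn from any real bank’s tooling:

```python
from dataclasses import dataclass, field

@dataclass
class ApplicationRecord:
    """One entry in a (hypothetical) automated application portfolio catalogue."""
    app_id: str             # stable identifier, not a free-text name
    business_function: str  # drawn from a controlled taxonomy, not ad hoc labels
    discovered_hosts: set = field(default_factory=set)  # filled by network discovery
    declared_hosts: set = field(default_factory=set)    # what the owning team claims

    def conformance_gaps(self):
        """Hosts seen on the wire but never declared -- candidates for audit follow-up."""
        return self.discovered_hosts - self.declared_hosts

app = ApplicationRecord("RISK-042", "market-risk-aggregation",
                        discovered_hosts={"srv-101", "srv-102", "srv-900"},
                        declared_hosts={"srv-101", "srv-102"})
print(app.conformance_gaps())  # -> {'srv-900'}
```

The point is that the `discovered_hosts` set is populated by automated discovery rather than by polling application owners, so the conformance gap is a measured fact rather than an opinion.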
Ironically, this current lack of data analysis rigor leads to banks being repeatedly charged for unauthorized use of fee-liable data during audit cycles — charges which often run to many millions of £/$.
Timeliness, Repeatability and Frequency of Risk Calculations are also key factors in the BIS Principles – let’s now apply the same macro Data Architecture pillars to this section of their requirements.
Most major Financial Institutions have good implementations of time synchronization infrastructure across their estates – for BCBS 239 compliance this does not need to be at the same degree of precision required by Low Latency/Algorithmic platforms.
Conversely, the same institutions have largely failed to maintain single or consistently federated scheduling toolsets across their application portfolios. This is due to a combination of under-investment and weak technical leadership, coupled with disinterest from Enterprise Architecture functions that have failed to document the core dimension of time in their taxonomies.
The well-publicized failure of RBS’s core mainframe batch processing platform, and its knock-on effects across the UK banking system for several weeks, should have been a wake-up call to the industry to invest in understanding and strategically optimising this key enabling asset.
Sir Christopher Wren’s tomb in St Paul’s Cathedral carries the inscription “SI MONUMENTUM REQUIRIS CIRCUMSPICE” – “If you seek his monument, look around you.” Every architecture should have a purpose that is self-evident. BIS hints at this too with the statement “Risk data should be reconciled with bank’s sources, including accounting data where appropriate, to ensure that the risk data is accurate.”
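The reconciliation requirement in that BIS quote can be sketched very simply. The sketch below assumes both the risk and accounting views can be aggregated to the same book-level totals in a single reporting currency; the function, book names and tolerance are illustrative only:

```python
def reconcile(risk_totals, accounting_totals, tolerance=0.01):
    """Return breaks where the risk and accounting views of the same book
    disagree beyond a tolerance (all amounts in one reporting currency)."""
    breaks = {}
    for book in set(risk_totals) | set(accounting_totals):
        r = risk_totals.get(book, 0.0)
        a = accounting_totals.get(book, 0.0)
        if abs(r - a) > tolerance:
            breaks[book] = r - a  # signed difference: risk minus accounting
    return breaks

print(reconcile({"rates": 100.0, "fx": 55.5},
                {"rates": 100.0, "fx": 57.0}))  # -> {'fx': -1.5}
```

Real reconciliations are far messier (timing differences, FX translation, netting rules), but the principle is the same: the breaks should fall out of an automated comparison, not a manual trawl.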
Again we can apply the 3 Pillars to clarify these requirements as follows:
Currently the reporting of KPIs is often massaged by a set of human aggregation processes into monthly “Status Decks.” Cultural change needs to occur to ensure that real-world dashboard data is automatically embedded into the reports, avoiding ambiguity and political emphasis.
As with the other pieces of this jigsaw, BIS gives few tangible clues: “The owners (business and IT functions), in partnership with risk managers, should ensure there are adequate controls throughout the lifecycle of the data and for all aspects of the technology infrastructure”
Let’s apply our 3-tiers approach again to try to decode this sentence and determine the “Whos.”
You have to systematically blend the document-based approach of operating committees and scorecards with the live operational dashboards of process/technical monitoring technologies – which, admittedly, is very difficult to do currently. This gives rise to the commonly used “interpreted” RAG scorecard that is often manually assembled and “massaged” to manage “bad news” and/or accentuate positive results. With the advent of document-based databases and semantics, this area is ripe for automation and simplification.
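A minimal illustration of the automation suggested above: deriving a RAG status mechanically from a live metric, leaving no room for manual “massaging” of the monthly deck. The metric and thresholds are assumed inputs for the sketch, not anything BIS prescribes:

```python
def rag_status(metric_value, amber_threshold, red_threshold):
    """Map a live operational metric onto a Red/Amber/Green status
    using fixed, pre-agreed thresholds (higher value = worse)."""
    if metric_value >= red_threshold:
        return "RED"
    if metric_value >= amber_threshold:
        return "AMBER"
    return "GREEN"

# e.g. count of failed overnight batch jobs feeding the risk aggregation
print(rag_status(0, amber_threshold=1, red_threshold=5))  # -> GREEN
print(rag_status(3, amber_threshold=1, red_threshold=5))  # -> AMBER
```

The thresholds are still a human judgment, but they are agreed once and versioned, rather than re-argued every month when the deck is assembled.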
The notions of Geography and “City Planning” are much more comfortable spaces for architects to describe and operate in – and indeed some of the concepts are very mature in large corporations so applying the 3 pillar approach would appear to be straightforward.
The application portfolio of a Financial Institution should be a first-class entity within its Enterprise Reference Data platform – the CTO/Enterprise Architecture function has responsibility for its maintenance, which must be largely automated with validation/conformance checking processes against the physical infrastructure.
NOTE: An application will often span multiple logical production/dev/test environments, and can now be instantiated dynamically either on premise or external to an institution, so the data model and maintenance functions must be able to handle these short-lived instances.
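The NOTE above can be sketched as a data model. The class and field names are hypothetical, but they show how recording a time-to-live lets the catalogue expire short-lived instances automatically instead of accumulating stale entries:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional

@dataclass(frozen=True)
class AppInstance:
    """One running instance of a portfolio application. The same app_id may
    appear many times across environments, on premise or in a cloud region."""
    app_id: str
    environment: str         # "prod", "dev", "test", ...
    location: str            # "on-prem" or a cloud region
    started: datetime
    ttl: Optional[timedelta]  # None = long-lived; otherwise short-lived

def live_instances(instances: List[AppInstance], now: datetime) -> List[AppInstance]:
    """Keep long-lived instances and any short-lived instance still inside its TTL."""
    return [i for i in instances if i.ttl is None or i.started + i.ttl > now]

now = datetime(2015, 6, 1, 12, 0)
fleet = [
    AppInstance("RISK-042", "prod", "on-prem", datetime(2015, 1, 1), None),
    AppInstance("RISK-042", "test", "cloud-eu-west",
                now - timedelta(hours=2), timedelta(hours=1)),  # already expired
]
print([i.environment for i in live_instances(fleet, now)])  # -> ['prod']
```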
Finally we get to the most detailed set of architectural artefacts specified by BIS: “A bank should establish integrated data taxonomies and architecture across the banking group, which includes information on the characteristics of the data (metadata), as well as use of single identifiers and/or unified naming conventions for data including legal entities, counterparties, customers and accounts”.
It is interesting to note that BIS focuses on the notion of standardized identifiers and naming conventions, which are quite basic hygiene concepts; in reality there are some much more important components of the architecture that need to be defined first.
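Even that “basic hygiene” needs tangible machinery: a cross-reference from each system’s local code to one canonical group-wide identifier. The mapping, system names and identifier formats below are entirely illustrative:

```python
# Hypothetical cross-reference of per-system counterparty codes to one canonical ID.
XREF = {
    ("trading", "CPTY-77"): "GB-CP-000123",
    ("ledger",  "C0077"):   "GB-CP-000123",
}

def canonical_id(system, local_code):
    """Resolve a system-local code to the single group-wide identifier.
    Failing loudly on an unmapped code surfaces the taxonomy gap
    instead of silently double-counting the counterparty."""
    key = (system, local_code)
    if key not in XREF:
        raise KeyError(f"no canonical mapping for {system}:{local_code}")
    return XREF[key]

# The same counterparty resolves identically from either system:
assert canonical_id("trading", "CPTY-77") == canonical_id("ledger", "C0077")
```

Without such a cross-reference, aggregating exposure across trading and ledger systems silently treats one counterparty as two.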
As noted above, in many large Financial Institutions the Integration, Storage, Reporting, Delivery and Command+Control systems have become fragmented even as they continue to grow, so achieving effective Risk Data Aggregation and Reporting compliance requires a single, integrated toolset applied along the supply chain.
Being compliant with the principles stated by BIS in BCBS239 requires real IT work with tangible “Design”+“Build” assets and sustainable “Maintenance” processes linking multiple transaction and reference data systems with real artefacts owned by both Line and Enterprise Architecture Teams.
The Enterprise Application Portfolio and supporting functional taxonomy must become concrete reference data entities.
Linkage of dataflows to each of the Application/Environment instances must be achieved through significant mechanisation and automated physical discovery toolsets – manual firedrill collections of user opinions are not an option.
NB Excel, PowerPoint and Word artefacts are only ancillary to the core solution.
And finally… The data produced by this exercise should be used for the strategic optimisation of the organisation – not just for appeasing a set of faceless regulatory bureaucrats. “You can only manage what you measure” is a very old business maxim.