PREPARE TO MEET THY DATA
Silvano Stagni, group head of marketing & research at IT consultancy Hatstand, provides an insight into BCBS 239 and argues that we should all be planning to de-clutter our databases and check our risk monitoring processes.
Companies in the financial business sector tend to grow by merger and acquisition. Merging two businesses implies merging corporate cultures, business processes, client bases, etc. The whole process is neither cheap nor without its pitfalls. It is usually a stressful and exhausting time, in which legal and financial professionals cross the crossable and dot the dottable, but afterwards there is often very little energy, drive or indeed money left to look at technology.
The newly merged entity starts with a data architecture and system architecture joined by ‘spit and sellotape’, conversion tables and (possibly) external spreadsheets. Things are left to ‘when we have time and a budget’, but all too commonly that ‘time and budget’ never comes.
While this is understandable, the risk is a disjointed data architecture and potential disruption of straight-through processing, while any remedial action remains the umpteenth entry on the long to-do list for ‘when we have time’. And that time never comes, because there are always more pressing and exciting things to do, things for which getting a budget is easier.
But now, here comes BCBS 239, a set of principles published by the Basel Committee on Banking Supervision and sponsored by the Financial Stability Board. BCBS 239 aims to make sure that the aggregation of your data enables you to monitor risk according to the criteria set out in the corporate risk appetite statement. Although its focus is mainly on risk, seven of its fourteen principles concern data. Principles 3 to 9 address the following characteristics of data:
- Principle 3 – Accuracy and Integrity
- Principle 4 – Completeness
- Principle 5 – Timeliness
- Principle 6 – Adaptability
- Principle 7 – Accuracy
- Principle 8 – Comprehensiveness
- Principle 9 – Clarity and usefulness
Technically, BCBS 239 only affects the large global banks that were classified as having Global Systemic Importance (G-SIFI)1 by the Financial Stability Board (FSB) in 2013. The FSB updates that list every November, and it has also invited national regulators to prepare lists of Domestic Systemically Important Financial Institutions (D-SIFI). Banks on the 2013 list have until January 2016 to comply.
However, there is perhaps a more powerful argument for looking at BCBS 239. No matter what the final purpose is, everybody should seriously aim for a data architecture that meets these seven principles, because it is simply good practice.
Whatever their classification, banks should still be looking at BCBS 239 because it provides good criteria for assessing their data. They may also want to do so sooner rather than later, because a data architecture that has grown unchecked can, over time, become too complex to change quickly, efficiently and at a reasonable cost.
In a recent survey (Clean Databases – Reality or Fantasy?)2, a manageable level of complexity in a data architecture is defined by the following criteria:
- There is a golden source of data, or ‘master copy’, that is constantly kept accurate and up to date. Other databases are updated from that master copy in a timely fashion (a minimal sketch of such a check appears after this list).
- There is a single point of responsibility (and accountability) for the integrity of the golden source and the processes to update the various databases.
- There is a corporate commitment to regular reviews, especially after a merger, acquisition or any other event that could have a disruptive consequence to the existing structure of data.
- There is a constant review of actual usage of real time feeds.
- There is a regular review of conversion tables. They serve a purpose, but they are too often used to paper over the cracks between two systems that are eventually intended to talk to each other directly; time, budgetary and/or resource constraints can extend their life indefinitely.
- There is a constant review of the usage of flat file databases and anything that is used by more than one person is ‘brought into the fold’ of the overall data architecture.
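To make the first of these criteria concrete, here is a minimal, purely illustrative sketch in Python. The golden source, the downstream system names, the ISINs and the one-day tolerance are all hypothetical assumptions, not taken from the survey; the point is simply that timeliness and completeness checks against a master copy can be automated rather than left to chance:

```python
from datetime import datetime, timedelta

# Hypothetical golden source: one master record per instrument,
# each stamped with the time of its last verified update.
golden_source = {
    "GB00B03MLX29": {"name": "Royal Dutch Shell A", "updated": datetime(2015, 5, 1)},
    "GB0007980591": {"name": "BP plc", "updated": datetime(2015, 5, 3)},
}

# Downstream copies (e.g. a risk system and a settlement system)
# hold their own snapshots, refreshed from the golden source.
downstream = {
    "risk_db": {"GB00B03MLX29": datetime(2015, 4, 20)},
    "settlement_db": {"GB00B03MLX29": datetime(2015, 5, 1),
                      "GB0007980591": datetime(2015, 5, 3)},
}

MAX_LAG = timedelta(days=1)  # assumed tolerance for "timely" propagation

def stale_copies():
    """Report downstream records that lag the golden source by more
    than MAX_LAG (timeliness) or are missing entirely (completeness)."""
    for system, snapshot in downstream.items():
        for isin, master in golden_source.items():
            copied = snapshot.get(isin)
            if copied is None:
                yield system, isin, "missing"
            elif master["updated"] - copied > MAX_LAG:
                yield system, isin, "stale"

for system, isin, problem in stale_copies():
    print(f"{system}: {isin} is {problem}")
```

Run daily, a check like this gives the single point of responsibility mentioned above something concrete to act on, instead of waiting for a failed trade to reveal the gap.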
From a pragmatic point of view, meeting the criteria for a manageable level of complexity is as close as most organisations can get to a ‘clean’ database. MiFID II/MiFIR will bring a very high level of change to data structures. Changes to data will be driven by new processes, new trading venues and, most of all, the obligation to trade on exchange those non-equity financial instruments that meet certain liquidity criteria. The cleaner the database, and the more detailed the knowledge of your data architecture, the easier it will be to implement those changes.
Looking at BCBS 239 can help in this process, since any effort towards meeting the seven data-related principles will represent a move in the right direction towards achieving the manageable complexity mentioned earlier, and make your life considerably easier in the next couple of years.
BCBS 239 is not prescriptive, so you are free to decide how to meet those principles; even more so if you choose to implement it when you do not have to.
On the other hand, if you have to meet the BCBS 239 principles, you had better keep a log to demonstrate that you have gone through a risk policy review and a revision of your data, and that you have implemented any remedial actions that the gap analysis following those reviews identified as necessary. Over time, you will have to prove that you have periodically reviewed data and risk monitoring policies, and that evidence should also be included in the log.
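By way of illustration only (BCBS 239 does not prescribe any format for such a log, and the file name and fields below are assumptions), the log could be as simple as an append-only record of each review and the remedial actions it triggered:

```python
import json
from datetime import date

# Hypothetical append-only compliance log: every review and every
# remedial action gets its own timestamped entry.
LOG_FILE = "bcbs239_review_log.jsonl"

def log_entry(activity, outcome, actions):
    """Append one review record. 'actions' lists the remedial steps
    that the gap analysis identified as necessary."""
    entry = {
        "date": date.today().isoformat(),
        "activity": activity,          # e.g. "risk policy review"
        "outcome": outcome,            # what the review found
        "remedial_actions": actions,   # follow-up items, with owners
    }
    with open(LOG_FILE, "a") as f:
        f.write(json.dumps(entry) + "\n")

log_entry(
    "annual data architecture review",
    "conversion table still in use two years after merger",
    [{"action": "retire conversion table", "owner": "data governance"}],
)
```

The append-only, one-entry-per-event structure matters more than the technology: it is what lets you show a supervisor, years later, that the reviews actually happened and when.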
References:
1. The Financial Stability Board published that list in November. The 2013 list can be found at www.financialstabilityboard.org/wp-content/uploads/r_131111.pdf; the list published in November 2014 can be found at https://www.financialstabilityboard.org/wp-content/uploads/r_141106b.pdf
2. Clean Databases – Reality or Fantasy? can be downloaded from www.hatstand.com/insights
©BestExecution 2015