By Peter Waters, GlobalTrading
On 11th March a wide range of industry participants gathered in London to discuss the latest developments in data management and visualisation, and how compliance and regulation functions need to embrace the latest technologies to keep pace with change and deliver a positive benefit to the firm as a whole.
An initial area of discussion was the challenge of striking a balance between maintaining expertise in house and being able to judge which solutions are, or are not, fit for purpose.
One way to work towards this is through collaboration across a firm; everyone has to address this issue. The ongoing regulatory push is to bring wider transparency across asset classes and markets, from OTC derivatives to swaps. The rules are the same, as are the problems and challenges, but it can be hard for people to change workflows. Collaboration across market participants and vendors is also increasingly needed: investment banks need to agree on priorities, namely where they are drawing data from and how to manage risk.
Next the panel moved on to discuss wider trends in risk and compliance. To date, many compliance investment decisions have been made on the back foot. It was suggested, though not all agreed, that firms need to reduce excessive staffing in compliance and invest instead in infrastructure and systems.
Discussion centred on the fact that the quality and quantity of data are better than they have ever been. That said, firms need to better identify and clarify which particular aspects of the data they need. Data needs to be cleverer to offer greater value, but a lack of discipline means individuals cannot articulate what they want out of it. There is also a seeming shortage of analytical skills in financial services, which is driving a trend towards managed analytical services; external managed services may be part of the solution.
The room soon moved on to discuss what compliance officers can do themselves, and how other members of the broader firm can assist. It was determined that firms should also be looking at infrastructure and capturing information properly and efficiently. They have to adjust systems and processes much more rapidly, embracing a model of more fluid adaptation.
Because of limited resources, both human and technological, and because there is so much data from so many sources, firms are trying to figure out how best to exploit it. Is there a roadmap that can be agreed and followed?
One difficulty highlighted is that analytics means different things to different people, and technology isn't necessarily the problem. People need to clearly define what they want to get from the data, and then share that definition across the firm. The problems now emerging in Europe around cross-border trading have been evident in Asia for some time.
One positive developing trend is that many banks are standardising their underlying systems. Alignment is happening, meaning that data increasingly comes from one centralised source within a firm, but this development is not going to happen overnight. The changing regulatory burden, and how it continues to shift from the regulator to market participants, is equally evident.
There is, however, a disconnect between the financial services industry and other industries in terms of how the user experience develops, and a need to get dashboard designers involved. It is very easy to deliver a terrible interface because there is little incentive to invest, and firms are choosing not to spend unless it is deemed an absolute necessity. Industries such as healthcare and some engineering sciences offer users not only access to large amounts of data but also more intuitive interpretation of it. In financial services, for example, highlighting a position that breaches an accepted number of standard deviations may be more valuable for identifying immediate risk to the firm than concentrating on the overall trading picture.
It was generally agreed that the scope of products (alerts, flags and other user interface points) has to get smarter. The ongoing battle is one of accuracy versus precision: when data quality drops, there isn't the capacity to adapt to that drop.
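To make the standard-deviation example above concrete, here is a minimal sketch of how such an alert could work. It assumes a simple book of signed position sizes and a z-score threshold; the function name, data shape and threshold are illustrative assumptions rather than any particular firm's methodology.

```python
import statistics

def flag_breaches(positions, k=2.0):
    """Flag positions whose size deviates from the book average
    by more than k standard deviations.

    positions: mapping of instrument -> signed position size.
    Both the data shape and the default threshold k are
    illustrative assumptions, not a prescribed standard.
    """
    sizes = list(positions.values())
    if len(sizes) < 2:
        return []  # too few positions to estimate dispersion
    mean = statistics.mean(sizes)
    stdev = statistics.stdev(sizes)
    if stdev == 0:
        return []  # identical positions, nothing stands out
    # Report each breaching position together with its z-score.
    return [
        (name, (size - mean) / stdev)
        for name, size in positions.items()
        if abs(size - mean) > k * stdev
    ]

# Hypothetical book: nine ordinary positions and one outsized one.
book = {
    "POS01": 950, "POS02": 1_000, "POS03": 1_050, "POS04": 980,
    "POS05": 1_020, "POS06": 990, "POS07": 1_010, "POS08": 1_005,
    "POS09": 995, "POS10": 25_000,
}
for name, z in flag_breaches(book):
    print(f"ALERT: {name} is {z:+.1f} standard deviations from the book mean")
```

Run against this book, only POS10 is surfaced, which is exactly the panel's point: one targeted flag can say more about immediate risk than a view of the whole trading picture.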
The debate moved on to the role of cloud computing: how cloud services could help collaboration between regulators and market participants, and how data feeds could be shifted to a public cloud, which offers a viable gateway for routing data to a firm's own private cloud. A key concern, however, centres on privacy for internal data. Regulation tends to be nationalistic in approach, which is particularly challenging in both Asian and European markets.
The elephant in the room, as identified by the panellists, is that the regulator has outpaced those involved in compliance and technological advancement. End users are not making the most of the products they have access to, and people don't have the skill-set necessary to build their own technology.
A balance of skills is needed within a firm's team, so that individuals understand what they are talking about and decision makers are properly informed. There is currently a disconnect between IT and everyone else: IT builds what it thinks the firm needs, but often it has not been told what the technology is actually for. Technologists often fail to deliver because they are not practitioners; a balance is clearly required.
The broad conclusion was that people's skills need to be updated so that collaboration between developers and practitioners becomes much easier. The skills should flow in both directions, which would help reduce unnecessary expenditure. More effective data visualisation and processing would support this, helping compliance and risk officers exploit the data they have to the full.