With reduced settlement times on the horizon, mounting regulatory hurdles, and growing volumes of unstructured data, there is an urgent need to boost the use of technology and automation in capital markets. That’s according to a new report from Coalition Greenwich, titled Data Automation: The Workflow Efficiency Game-Changer. Josh Monroe, chief revenue officer at Xceptor, spoke exclusively to BEST EXECUTION about how contemporary data differs in both volume and kind, and what firms are looking for in a third-party data processing provider.
As the 2024 T+1 implementation deadline approaches, the report outlines rising concerns about the readiness of market participants, especially smaller institutions, with fears that the increasing complexity of data, legacy technology, and multiple systems could lead to a scramble to meet the deadline.
Monroe said: “Previously, data volumes were lower, and the data itself was often simple, structured data. Today, volumes are growing exponentially, particularly unstructured data, which can be more difficult to ingest, normalize, cleanse, and validate. New and changing regulations also impact firms’ data processing needs.”
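To make the ingest-normalise-cleanse-validate sequence Monroe describes concrete, the sketch below shows a minimal, hypothetical Python pipeline for semi-structured trade records. The field names, alias mappings, and validation rules are illustrative assumptions for this article, not details drawn from the report or from Xceptor’s platform.

```python
# Minimal, illustrative sketch of an ingest -> normalise -> cleanse -> validate
# pipeline. All field names and rules are hypothetical; production platforms
# handle far messier inputs at far larger scale.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Trade:
    trade_id: str
    isin: str
    quantity: int
    trade_date: datetime

def ingest(raw_records: list[dict]) -> list[dict]:
    """Accept records from heterogeneous sources; drop empty payloads."""
    return [r for r in raw_records if r]

def normalise(record: dict) -> dict:
    """Map source-specific keys onto one canonical schema."""
    aliases = {"TradeID": "trade_id", "id": "trade_id",
               "ISIN": "isin", "Qty": "quantity", "date": "trade_date"}
    return {aliases.get(k, k): v for k, v in record.items()}

def cleanse(record: dict) -> dict:
    """Coerce types and strip formatting noise."""
    record["isin"] = str(record["isin"]).strip().upper()
    record["quantity"] = int(str(record["quantity"]).replace(",", ""))
    record["trade_date"] = datetime.fromisoformat(record["trade_date"])
    return record

def validate(record: dict) -> Trade:
    """Reject records that would break downstream settlement workflows."""
    if len(record["isin"]) != 12:
        raise ValueError(f"bad ISIN: {record['isin']}")
    if record["quantity"] <= 0:
        raise ValueError("non-positive quantity")
    return Trade(**record)

raw = [{"TradeID": "T1", "ISIN": " us0378331005 ", "Qty": "1,200",
        "date": "2023-11-02"}]
trades = [validate(cleanse(normalise(r))) for r in ingest(raw)]
print(trades)
```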
“Many financial institutions have previously built internal systems and processes to manage their data and workflows and prefer to continue using or adapting these systems. However, legacy technology is unlikely to meet the challenge of more data and more complexity, making it less and less able to meet regulator, client, or other business needs,” Monroe said.
“Financial institutions are more likely to use a third-party solution when they are confident the solution can meet their needs, not just in data processing but also in maintaining the same level of data protection, data privacy, and data governance in a transparent and auditable manner. Ease of use and configurability are also very important,” Monroe added.
“Offline reconciliations are usually less efficient and more likely to be subject to errors. Proprietary systems, built by firms using desktop tools, do not have the specific automation and data management processes that are required for accurate, timely reconciliations at scale,” Monroe told BEST EXECUTION.
“Similarly, manual data cleansing is less efficient, prone to errors and inaccuracies, and is often resource intensive. This is particularly challenging when the data is required for complex workflows or to meet regulatory or client requirements within tight timelines,” Monroe said.
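As an illustration of the reconciliation work Monroe is describing, the toy sketch below matches internal trade records against a counterparty feed and flags breaks for exception handling. It is a hypothetical example using pandas under assumed column names; a production system would add tolerance rules, audit trails, and workflow routing.

```python
# Hypothetical two-way reconciliation sketch: match internal records against a
# counterparty feed on trade_id, then flag one-sided trades and value breaks.
import pandas as pd

internal = pd.DataFrame({
    "trade_id": ["T1", "T2", "T3"],
    "quantity": [1200, 500, 300],
    "price": [101.5, 99.2, 100.0],
})
counterparty = pd.DataFrame({
    "trade_id": ["T1", "T2", "T4"],
    "quantity": [1200, 450, 700],
    "price": [101.5, 99.2, 98.7],
})

# A full outer join keeps trades missing on either side visible.
merged = internal.merge(counterparty, on="trade_id", how="outer",
                        suffixes=("_int", "_cpty"), indicator=True)

unmatched = merged[merged["_merge"] != "both"]
matched = merged[merged["_merge"] == "both"]
breaks = matched[
    (matched["quantity_int"] != matched["quantity_cpty"])
    | ((matched["price_int"] - matched["price_cpty"]).abs() > 0.01)
]

print("One-sided trades:\n", unmatched[["trade_id", "_merge"]])
print("Breaks:\n", breaks[["trade_id", "quantity_int", "quantity_cpty"]])
```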
The Coalition Greenwich report, which draws on feedback from more than 60 C-suite and senior leaders at capital markets firms in North America, the UK and Europe, underscores the challenges financial institutions face in finding effective solutions. While 60% of the market uses at least one third-party system, a mixed approach of third-party and proprietary systems leads to significant inefficiencies, requiring additional manpower for reconciliation.
The report found that the percentage of offline reconciliation is highest among users of proprietary solutions. Users of third-party solutions, or of a combination of third-party and proprietary systems, see lower levels of offline reconciliation. Proprietary solutions also see high rates of manual data cleansing.
Additionally, third-party solutions are viewed as highly capable of standardising data, with users benefiting from the workflow efficiencies that result. Almost 80% of firms using a third-party data cleansing solution rely on these services for all of their data transformation.
However, more than half of study respondents said they do not outsource data cleansing, choosing instead to use proprietary systems and internal staff, which can be labour-intensive, particularly for data linked to nascent products such as digital assets.
About 36% of respondents use manual processes to cleanse 10% or less of their data, while nearly a third still use manual processes for data cleansing more than 50% of the time.
Audrey Blater, senior analyst for market structure and technology at Coalition Greenwich and author of the report, said improving efficiency and accuracy across the trade lifecycle is essential for firms to reduce costs, lower the need for extensive resources and minimise risks.
“Many financial institutions are attempting to solve this with a combination of proprietary systems, internal staff, third-party providers and manual processes, with mixed results. To accommodate market changes, evolving regulation and ever-expanding data needs, robust, scalable technology for data automation and processing is absolutely crucial,” Blater added.
© Markets Media Europe 2023