Reference Data Utility: Pipeline of the Future or Pipedream of the Past?

John Mason, SmartStream - 13 April 2010

A centralised processing utility providing clean, consistent reference data has once again come to the fore as the industry wrestles with its current challenges.

It is a question that financial company executives continue to ask of their data management staff: why does reference data still cause so many issues?

It is not a new problem, of course. The industry has looked at many ways to resolve the issues at hand, including enterprise data management (EDM) solutions, business process outsourcing (BPO) and managed services, none of which has quite delivered on its initial promise. So the question remains, despite all of these efforts: is it a problem that can be solved, or is it one of those issues that gets consigned to the ‘too hard’ category, with what we have today accepted as simply a necessary cost of doing business?

There is surely an argument, however, that subscribing to the latter view is what has produced the current situation: a patchwork of identifiers, descriptions and codes that is constantly managed, re-worked and added to, because the underlying problem seems too big to address at a grass-roots level.

Increasingly, the industry is evaluating whether there is a place for an industry utility from which all reference data can be sourced once, for the whole market, ensuring that all participants are using the same clean, consistent data.

For many this remains a pipedream, an unachievable goal, but the recent turmoil in the financial markets has led many to question whether such a utility is now not just desirable but necessary. Recent initiatives by organisations such as the European Central Bank (ECB) and the National Institute of Finance in the US have advocated a move to a single utility approach for securities and legal entity data, in order to provide the risk mitigation and transparency that regulators are now demanding.

So is the time right for just such a utility? Are the market forces from within organisations to lower the cost of trade processing, as well as the external forces from regulators, conspiring to create the market conditions that would allow for just such a utility to be created?

Change of Approach

The industry needs to address the issue of reference data in the market and its impact on the transaction lifecycle as a whole, not just within a single financial institution - there is a subtle but distinct difference. In the past, the solutions put into the marketplace have looked to lower the data management costs of the individual organisation they serve, and that alone. Be it EDM, BPO or managed services, all have offered a reduction in data management costs, but none has addressed the fundamental issue: several parties are involved in any transaction, and all of them need to be using the same data for the trade not to break. Solving the problem unilaterally does not address this.

It needs to be recognised that the provision of clean, consistent reference data is a means to an end rather than an end in itself. For many, this should be achieved through the standardisation of data across the industry and the implementation of a single code. While laudable in sentiment, it is questionable how practical this really is, given how deeply embedded certain codes are today.

The utility offers a migratory path towards this; however, the solution to the industry’s needs should not be framed as either standards or a utility. There seems to be a feeling that it is one or the other, that somehow they are mutually exclusive. This is not the case.

Today, a utility can offer the kind of cross-referencing service that allows organisations to navigate across symbols. Although this approach does nothing to reduce the number of symbols in play, it does offer a meta-layer that provides greater transparency across those symbols while preserving the provenance of the data, so that underlying processing systems can continue to function. Over time, as systems are replaced or refreshed, the codes used by those systems and data models can be migrated towards a standard. As a consequence, the utility’s purpose shifts from maintaining the cross-referencing across codes to maintaining the standards themselves.
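To make the cross-referencing idea concrete, the sketch below (in Python, purely illustrative; the class names, the internal instrument key and the sample identifiers are assumptions for the example, not any actual utility’s design) shows how such a meta-layer might hold multiple identifier schemes for one instrument while recording the provenance of each code:

```python
from dataclasses import dataclass, field

@dataclass
class IdentifierRecord:
    """One identifier for an instrument, plus the source it came from."""
    scheme: str   # e.g. "ISIN", "CUSIP", "SEDOL" or a vendor symbology
    value: str
    source: str   # provenance: which feed or vendor supplied the code

@dataclass
class CrossReference:
    """Meta-layer linking all known identifiers for a single instrument."""
    instrument_id: str                                   # internal utility key (hypothetical)
    identifiers: list[IdentifierRecord] = field(default_factory=list)

    def add(self, scheme: str, value: str, source: str) -> None:
        self.identifiers.append(IdentifierRecord(scheme, value, source))

    def translate(self, to_scheme: str) -> list[IdentifierRecord]:
        """Navigate across symbols: return identifiers in the target scheme,
        keeping provenance so downstream systems can keep functioning."""
        return [rec for rec in self.identifiers if rec.scheme == to_scheme]

# Example usage: link vendor-supplied codes to one instrument, then translate.
xref = CrossReference(instrument_id="UTIL-000123")
xref.add("ISIN", "US0378331005", source="vendor_a")
xref.add("CUSIP", "037833100", source="vendor_b")
xref.add("SEDOL", "2046251", source="vendor_c")

print([r.value for r in xref.translate("ISIN")])   # ['US0378331005']
```

The point of the sketch is that no participant is forced to abandon its existing codes: each system keeps trading on the symbology it already understands, while the cross-reference layer supplies the translation and the audit trail of where each code originated.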

There is no doubt that the financial crisis focused attention on all aspects of reference data - from symbology to legal entity management. Many saw Lehman as ‘too big to fail’, so its ultimate demise sent shockwaves through the industry as organisations struggled to understand their exposure to particular companies or instruments. Combined with regulators’ demands for greater transparency, this means the management of reference data cannot continue as if nothing had happened. To many, a totally new approach must be taken.

Against this backdrop, the approach of an industry-wide utility has real appeal. By providing greater transparency in the short term without the need for massive infrastructure change and cost, while offering the nirvana of standards in the long term through a phased approach, the utility would seem to many the only practicable way forward.
