Despite significant technology advances throughout the settlement cycle, a major pain point continues to challenge the timely resolution of transactions: collateral management and transfer.
While billed as a reliable, secure and transparent solution when introduced in 2017, tokenisation has not materialised on a commercial scale. However, firms throughout the transaction supply chain continue to invest in and experiment with the technology. And it continues to earn headlines with small wins. Perhaps with some scale, the technology will take off.
Comparing delivery of mail by the post office with sending an email offers an oversimplified window into the appeal of relying on tokens in the settlement process.
Currently, the settlement process of transferring assets and collateral often takes multiple days to complete, because of the number of “mail carriers” needed to handle the funds.
The tokenisation of assets refers to the process of issuing a digital token that represents ownership in a real asset. Once digitalised, that ownership becomes far more mobile. Using distributed ledger technology (DLT), the token is issued on a shared ledger, and each transfer of ownership is recorded on that same ledger.
Once an asset is tokenised, “atomic” settlement becomes possible (i.e. simultaneous and instantaneous), akin to clicking send on an email rather than the current multi-day process. That would free up liquidity, because less collateral would sit tied up for days, and it would likely reduce or potentially eliminate counterparty risk.
Tokens can also be made “smart” by programming the logical steps of a process into the token itself, automating the transfer and adding further efficiency.
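To make the mechanics concrete, the toy sketch below models a token whose transfer conditions are programmed into the token itself, plus a delivery-versus-payment style swap that settles both legs atomically: either both transfers happen or neither does. The names (`Token`, `Ledger`, `atomic_swap`) and the structure are hypothetical illustrations, not the API of any real tokenisation platform.

```python
# Illustrative sketch only: a toy in-memory "ledger" showing how a tokenised
# asset, a pre-programmed ("smart") transfer condition and atomic,
# all-or-nothing settlement fit together. Hypothetical names and structure.
from dataclasses import dataclass, field


@dataclass
class Token:
    asset_id: str                 # the real asset this token represents
    owner: str                    # current owner recorded on the ledger
    eligible_owners: set = field(default_factory=set)  # a "smart" condition


@dataclass
class Ledger:
    tokens: dict = field(default_factory=dict)    # asset_id -> Token
    history: list = field(default_factory=list)   # append-only audit trail

    def transfer(self, asset_id: str, seller: str, buyer: str) -> None:
        token = self.tokens[asset_id]
        if token.owner != seller:
            raise ValueError(f"{seller} does not own {asset_id}")
        if token.eligible_owners and buyer not in token.eligible_owners:
            raise ValueError(f"{buyer} fails the token's programmed checks")
        token.owner = buyer
        self.history.append((asset_id, seller, buyer))


def atomic_swap(ledger: Ledger, asset_a: str, asset_b: str,
                party_a: str, party_b: str) -> None:
    """Settle both legs together: either both transfers happen or neither."""
    owners = {k: t.owner for k, t in ledger.tokens.items()}
    recorded = len(ledger.history)
    try:
        ledger.transfer(asset_a, party_a, party_b)   # e.g. the security leg
        ledger.transfer(asset_b, party_b, party_a)   # e.g. the collateral leg
    except Exception:
        # roll back so no one-sided (principal-risk) position is left behind
        for asset_id, owner in owners.items():
            ledger.tokens[asset_id].owner = owner
        del ledger.history[recorded:]
        raise


ledger = Ledger(tokens={
    "MMF-001": Token("MMF-001", owner="FundCo"),
    "UST-123": Token("UST-123", owner="BankCo"),
})
atomic_swap(ledger, "MMF-001", "UST-123", "FundCo", "BankCo")
assert ledger.tokens["MMF-001"].owner == "BankCo"
assert ledger.tokens["UST-123"].owner == "FundCo"
```

On a real DLT platform the all-or-nothing guarantee would be enforced by the ledger protocol itself rather than by application-level rollback, but the property the industry is pursuing is the same.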
Anyone who has collaborated (or tried to) on a document at the office knows the pains of version control. Perhaps a colleague downloaded a copy to their desktop to work offline. Maybe someone attached a document instead of linking to it. Reconciling those versions often leads to missed edits or outdated information.
Just as cloud services sync document edits in real time at the office, DLT offers a decentralised environment for recording asset transactions. Often referred to as blockchain technology, the transparent process allows all parties to the transaction to contribute concurrently to defining the parameters of the deal. It also lets every party see who uses and/or modifies the ledger and ensures each party has access to the current version.
Throughout the process, it creates an immutable database – the all-important permanent record.
Each of these characteristics builds a high level of trust among the parties involved and nearly eliminates the opportunity for fraudulent activity in the ledger.
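As a rough illustration of why such a ledger is treated as an immutable, shared record, the sketch below chains each entry to the previous one with a hash, so any attempt to rewrite history is immediately visible to anyone who verifies the chain. It is a deliberate simplification, assuming a single in-memory log; real DLT platforms add consensus, digital signatures and replication across nodes, none of which is modelled here.

```python
# Illustrative sketch only: a toy hash-chained log showing why a shared,
# append-only ledger is tamper-evident.
import hashlib
import json


def _hash_entry(prev_hash: str, record: dict) -> str:
    payload = json.dumps({"prev": prev_hash, "record": record}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()


class ChainedLog:
    def __init__(self):
        self.entries = []          # list of (record, entry_hash) pairs

    def append(self, record: dict) -> None:
        prev_hash = self.entries[-1][1] if self.entries else "genesis"
        self.entries.append((record, _hash_entry(prev_hash, record)))

    def verify(self) -> bool:
        """Recompute every hash; editing an earlier record breaks the chain."""
        prev_hash = "genesis"
        for record, entry_hash in self.entries:
            if _hash_entry(prev_hash, record) != entry_hash:
                return False
            prev_hash = entry_hash
        return True


log = ChainedLog()
log.append({"asset": "MMF-001", "from": "FundCo", "to": "BankCo"})
log.append({"asset": "UST-123", "from": "BankCo", "to": "FundCo"})
assert log.verify()                        # every party can check the same record
log.entries[0][0]["to"] = "SomeoneElse"    # an attempt to rewrite history...
assert not log.verify()                    # ...is immediately detectable
```

In this toy version every party holds the same log; on a real network, consensus among nodes plays the role of `verify()` running everywhere at once.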
“In allowing for the simultaneous exchange of two assets in real-time and enabling the exchange of information and value to happen in a single step, this can help eliminate settlement risk, duplicative reconciliation, and increase the efficiency of transaction processing,” said Leong Sing Chiong, deputy managing director of markets and development at the Monetary Authority of Singapore, at the Layer One Summit on 4 November 2024.
“With a programmable platform that allows for pre-determined conditions to be encoded with the tokenised assets, this can also facilitate greater straight-through processing in capital market transactions, and greater efficiency in asset servicing.”
Leong cited a use case that combined tokenising money market funds and foreign exchange. “A solution developed by Citi and Fidelity International combined the properties of two distinct asset classes – tokenised money market funds and FX swaps. This solution seamlessly combined yield generation of tokenised MMF tokens with real-time digital currency risk hedging.”
Challenges exist, however, around interoperability: in the absence of standardised protocols, compatibility issues can arise between different DLT platforms.
Simon Millington, head of business development at CloudMargin, a technology vendor that provides collateral management solutions, highlighted this in a discussion at the 2024 FIA forum in Frankfurt.
“We’re looking for a single source of truth and transparency, not just across the industry, but within firms as well,” Millington said. “We all know in financial services that you can have multiple versions of the same thing in multiple different systems. Where it’s difficult is getting a single record, a single truth. Then, fundamentally, you don’t have reconciliation breaks in your own platforms as well.”
As with any new technology project, sandboxes offer a secure, isolated environment to test and analyse code and applications without affecting the rest of a system.
The largest post-trade organisation in the world, the Depository Trust & Clearing Corporation, announced its sandbox, DTCC Digital Launchpad, in October 2024.
“We’re launching an industry sandbox to try to get the entire industry to start to collaborate and cooperate, where we can build a common infrastructure so we can accelerate the velocity of adoption in this space,” Nadine Chakar, global head of DTCC Digital Assets, said in November during a panel discussion at FIA Expo. “We’re trying to collapse a lot of the infrastructure that propagated over time. We believe that with common infrastructure, common rules and a common mission and purpose, we could accelerate the adoption of tokenised assets within the US market.”
JP Morgan has developed its own privately operated blockchain, Onyx, which includes a JP Morgan coin, digital finance applications and the bank’s tokenised collateral network. In comments made at the FIA forum in Frankfurt, Katie Emerson, EMEA head of agency lending and collateral management sales, said the bank has seen several benefits from the platform.
She explained that the buy side saw the benefits when the bank went live with tokenised money market funds: assets that had been trapped and rarely used as collateral gained additional utility.
“The UK LDI (liability-driven investment) crisis and mini budget gave us real focus and emphasis on where clients were struggling to redeem out of money market funds, move the cash, the settlement timeframes and the cycle around that. So, we felt that money market funds were a good use case as the first asset class to tokenise,” Emerson said.
The next steps for JP Morgan? More buy-in. “As we’ve tokenised and successfully mobilised money market funds as collateral for non-cleared derivatives, we’re looking to grow the network effect. We need to grow the participation on our tokenised collateral network. It only really works if you have more people joining it, on both the collateral provider and collateral receiver side.”
Having developed interest on the buy side, Emerson believes JP Morgan’s efforts to tokenise other asset classes, particularly US Treasurys, will entice the sell side.
Thinking about it from a client perspective, Christoph Hock, head of tokenisation and digital assets at Union Investment, put it simply during the panel discussion at the FIA forum in Frankfurt: “Looking at tokenisation of traditional assets, it’s about higher speed, it’s about lower cost, it’s about lower risk, and it’s creating more efficiencies.”
“Tokenisation of funds is a key theme in our industry, with tokenised money market funds that might in the future settle on a T+0 basis and be used as collateral in crisis situations,” he added.
Similarly, Chakar laid out three opportunities she sees for tokenisation in the cleared derivatives markets. Notably, she mentioned increasing the velocity of collateral and margin and the ability to expand that inventory. With tokens and smart contracts, fully automating the life cycle of the trade, with data compliance encapsulated within it, would also make the environment more transparent and less risky. As she put it at Expo, “By animating the entire life cycle, you demystify, if you will, the opacity that’s behind derivatives.”
Efthimia Kefalea, senior vice president for clearing design at Eurex Clearing AG, spoke at the FIA forum in Frankfurt about the efficiencies of tokenisation, from operational and cost efficiencies to reduced settlement times. She referenced the significant resources invested in past efforts to reach T+0 overnight settlement and how tokenisation could achieve that faster and far more cheaply.
As with any new technology, questions remain about the industry’s ability to innovate without new regulations. And, if regulators must act, how can the industry help steer that movement toward uniformity and cohesion?
“Tokenisation is just a wrapper. The collateral is the collateral. We should be able to use tokenisation tools – it’s just technology – there should be no need for rule changes [in the US],” Tom Sullivan, managing director and head of business development for digital assets at Societe Generale, said during the tokenisation discussion at FIA Expo. “Market participants should be able to use existing policies and practices, etc., to identify and assess the risks and manage them with respect to DLT, as they would with any other technology they use within their firms.”
He added, “We expect regulators to be tech neutral, that they should not be regulating technology. They should be regulating the space that they’re in.”
Mike Reed, senior vice president and head of digital asset partnership development at Franklin Templeton Investments, suggested working together toward a shared outcome. Speaking alongside Sullivan and Chakar, he described how his firm has worked with regulators in the US to issue a tokenised money market fund.
“It’s much better to work with the regulators in a collaborative way than it is to battle with them,” he said. “So, we’ve done things like hosted webinars and provided educational material.”
Another advantage of using DLT is that the transactions are visible to regulators in real time – a key theme in Franklin Templeton’s discussions with regulators. “Having collaborative discussion back and forth with [regulators], and then also finding a hook where it’s beneficial for them as well, helps,” Reed added.
Eurex looks to work with regulators on tokenisation. “We need regulation to adapt and allow CCPs also to be an active player in digital ecosystems. We expect that there will be explicit guidance, because today, the CCP regulation does not really mention anything about new technologies. And of course, this has to change,” Kefalea added.
At FIA’s 2024 Asia conference, Tuang Lee Lim, assistant managing director for capital markets at MAS, acknowledged the need for clarity.
“Specifically for tokenisation, we recognise that there needs to be more legal and regulatory certainty on how certain activities are treated in the blockchain and whether they have the force of law. MAS has commenced a review of existing rules and regulations pertaining to the tokenisation of capital markets products, with a view to removing regulatory impediments and providing further guidance to the industry where necessary.”
Optimism reigns among tokenisation advocates. At the same time, they agree more work lies ahead.
“There’s a few challenges that come up when I’m speaking to clients,” JP Morgan’s Emerson said. “We need to persist. We’re going to get there, but it’s taking longer than we thought. We have certain clients waiting until they see more critical mass. Legal clarity and legal complexity also comes up.”
To reach the scale needed, Leong laid out the pieces of the puzzle as MAS views it. “We think there are four jigsaw puzzle pieces that need to come together to support industry-wide deployment of tokenised assets: 1) liquidity, 2) foundational infrastructure, 3) standardised frameworks and protocols, and 4) common settlement assets.”
From a CCP perspective, Kefalea spoke about the challenge of some clients wanting tokenisation and blockchain while others stay on the current path. “As a financial market infrastructure, we have to somehow integrate both worlds. We will have a long period of time where we have to integrate the legacy systems together with the new technology systems, and to do that integration in a seamless way for our clients. This will definitely increase costs, at least for that period of time. And it may take decades.”
Perhaps the clearest outlook comes from Hock, who likened adoption to a marathon. “We are probably at kilometre 10 of the marathon. There’s still a way to go, but I think tokenisation, in the long term, will become the standard.”