Central banks need not fear the widespread tokenization of money and securities, according to recent findings from the Federal Reserve Bank of New York and the Bank for International Settlements.
On Wednesday, the two organizations released the findings of a joint research effort dubbed Project Pine.
The report concluded that, if properly outfitted, central banks could continue to transmit monetary policy within their jurisdictions, and could even benefit from the shift.
Tokenization is the process of creating a digital representation of something else for use on a programmable platform. In this case, the underlying assets being considered were central bank reserves, commercial bank deposits, securities and other unsecuritized assets.
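To make the concept concrete, tokenization can be thought of as keeping digital records of claims on underlying assets on a shared, programmable ledger. The sketch below is purely illustrative; the class names, fields, and methods are assumptions for exposition, not code from the report:

```python
from dataclasses import dataclass, field

@dataclass
class Token:
    """A digital representation of a claim on an underlying asset."""
    asset: str      # e.g., "central bank reserves" or "commercial bank deposit"
    holder: str     # current owner of the token
    amount: float   # face value the token represents

@dataclass
class Ledger:
    """A simplified programmable platform tracking tokenized assets."""
    tokens: list = field(default_factory=list)

    def issue(self, asset: str, holder: str, amount: float) -> Token:
        # Create a new token representing the underlying asset.
        token = Token(asset, holder, amount)
        self.tokens.append(token)
        return token

    def transfer(self, token: Token, new_holder: str) -> None:
        # Settlement is just a record update on the shared platform.
        token.holder = new_holder

ledger = Ledger()
t = ledger.issue("central bank reserves", "Bank A", 1_000_000)
ledger.transfer(t, "Bank B")
```

The point of the sketch is that, once assets live on such a platform, transfers and other operations become record updates that code can perform automatically.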
Conducted by the New York Fed’s New York Innovation Center and the BIS Innovation Hub’s Swiss Centre, Project Pine was carried out in response to the private sector’s growing interest in tokenization.
The groups created a generic toolkit for central banks to deal with these digitized assets using self-executing code, known as smart contracts, for functions such as paying interest on reserves, engaging in swaps and repurchase agreements, and executing asset purchases or sales. They then put these tools through a series of tests to see how they would behave under various conditions and monetary policy scenarios.
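A smart contract in this sense is simply code that executes automatically once its conditions are met. A minimal sketch of one such function, paying interest on reserves, is shown below; the function name, flat-rate logic, and balances are assumptions for illustration, not the report's actual toolkit:

```python
def pay_interest_on_reserves(balances: dict, rate: float) -> dict:
    """Self-executing payout: credits each account with interest
    on its reserve balance at the given per-period rate."""
    for bank in balances:
        balances[bank] += balances[bank] * rate
    return balances

# Hypothetical reserve balances and a 0.01% per-period rate.
reserves = {"Bank A": 1_000_000.0, "Bank B": 500_000.0}
pay_interest_on_reserves(reserves, 0.0001)
```

In a tokenized system, a routine like this could run on the platform itself at each interest-payment date, with no manual settlement step.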
Researchers found that the prototype “successfully” and “instantaneously” carried out its intended functions under all the hypothetical scenarios, each of which drew from historical data on past market activities, as well as consultations with numerous central banks and their advisors.
If central banks adopt the proper policies and technologies, with an eye toward safeguarding privacy and limiting the ramifications of coding errors, a tokenized banking system could enhance their ability to effectuate monetary policy by making it easier to implement new facilities or modify existing ones, the report concludes.
“This could allow future central banks to be nimbler in uncertain conditions and potentially reduce frictions between the time of announcements and offerings,” the report states. “There might also be operational efficiencies from automating collateral management.”
Still, a shift to tokenization is not without tradeoffs and potential complications for central banks. Along with the cost of onboarding new technologies to operate in this new environment, central banks could find themselves needing “privileged access to data” to engage in and facilitate transactions. This would entail establishing “higher standards of privacy and security” than most other blockchain participants.
The underlying technologies themselves also come with potential pitfalls for central banks. These include higher costs that could make tokenized transactions economically nonviable for smaller participants.
The report adds that the term “smart contract” is a misnomer, as the code itself is neither sufficiently dynamic, operating only as commanded, nor legally binding. There is also the “oracle problem,” with which central banks must contend.
“When smart contracts execute on information from outside their programmable platform, they require an external data source — or ‘oracle’ — to provide it,” the report notes. “The oracle problem is that smart contracts, not being smart, will execute on whatever information is provided, regardless of its accuracy.”
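The oracle problem can be illustrated concretely. The hypothetical contract below triggers an asset purchase whenever an external feed reports a rate above a threshold, and it will do so even if the feed is wrong; all names and figures here are assumptions for exposition:

```python
def rate_triggered_purchase(oracle_rate: float, threshold: float,
                            purchase_amount: int) -> tuple:
    """Executes on whatever rate the oracle supplies. The contract
    cannot verify the figure's accuracy; it can only act on it."""
    if oracle_rate > threshold:
        return ("BUY", purchase_amount)
    return ("HOLD", 0)

# A faulty oracle reporting 9% when the true market rate is 4%
# still triggers the purchase.
action = rate_triggered_purchase(oracle_rate=0.09, threshold=0.05,
                                 purchase_amount=10_000)
```

The code has no notion of the “true” rate; any safeguard against bad data has to live outside the contract, in the quality of the oracle itself.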
The report also draws distinctions between a tokenized ecosystem intermediated by a central bank and a truly decentralized finance, or DeFi, environment, which can be prone to greater risks and offer fewer opportunities for addressing issues related to smart contracts and other elements.
Central banks also must consider how to make their systems interoperable with one another and, potentially, how to deal with a wider variety of currencies and underlying assets.
Still, the report concludes that the various issues and risks related to tokenization can be addressed. It does not present Project Pine as a plug-and-play solution for central banks looking to engage in this space, but rather as a starting point from which they can build.
“Outlining requirements for a jurisdiction is a task that each central bank will perform alone,” the report states. “However, Project Pine’s results offer central banks a starting point for better understanding the opportunities, risks, and requirements of adopting tokenization in their respective jurisdictions.”