Blue Economy – Sagarmala Financial Engineering: Yet Another Dig. Skeletal Sketch of an Upcoming Talk in Somnath, Gujarat.


Authorized Share Capital (ASC) in the case of Sagarmala is INR 1,000 crore: the maximum share capital that Sagarmala Development Company Limited (SDCL) is authorized to issue under its articles of incorporation. This ASC is open, in that the share capital is not fully issued, leaving ample room for future issuance of additional stock to raise capital quickly as and when demand arises. SDCL can increase its authorized capital at any time with shareholders' approval and by paying an additional fee to the Registrar of Companies (RoC).

Capital Budgeting: the process by which a business identifies and evaluates potential large expenditures/investments. Capital budgeting is generally a long-term exercise, and is the process SDCL would use (and uses) to identify which capital projects would create the biggest returns relative to the funds invested in them. Ranking projects this way establishes their potential future returns, so that SDCL management can choose where to invest first and most. Let us simply call it the first-and-most principle of budgeting. The Blue Economy, which instantiates itself via Sagarmala in India, has several options to choose from as regards its capital budgeting, viz.

  1. Throughput analysis – This defines the main motive behind a project: all costs are treated as operating costs, and the main emphasis is on maximizing the profit passing through a bottleneck. A speculative example for Sagarmala is the marking of the Western Shipping Corridor for container traffic, which poses a livelihood threat to traditional fishermen. Throughput is an alternative to traditional cost accounting, but it is neither accounting nor costing, since it is focused on cash flows: it does not allocate fixed costs to products and services sold or provided, and it treats direct labour as a fixed expense. Decisions rest on three critical monetary variables: throughput, investment (or inventory) and operating expenses. Mathematically, throughput is defined as revenue minus totally variable expenses, i.e. the cost of raw materials or services incurred to produce the products sold or services delivered: T = R – TVE.
  2. Net Present Value (NPV) – this is the value of all future cash flows, positive or negative, over the entire life of an investment, discounted to the present. NPV forms part of an intrinsic valuation, and is employed for valuing a business, an investment security, a capital project, a new venture, a cost-reduction programme, and almost anything else involving cash flows.

NPV = z₁/(1 + r) + z₂/(1 + r)² – X

where z₁ is the cash flow in period 1, z₂ is the cash flow in period 2, r is the discount rate, and X is the purchase price, or initial investment. NPV accounts for the timing of each cash flow, which can have a large impact on the present value of an investment: it is always better to have cash inflows sooner and cash outflows later. This is one aspect where SDCL might encounter a bottleneck and thereby take recourse to throughput analysis. Importantly, NPV also takes revolving funds into consideration.
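A minimal Python sketch of the two-period formula above; the cash flows, discount rate and purchase price are illustrative assumptions, not Sagarmala figures:

```python
# NPV sketch for the formula above: discount each cash flow z_t by (1 + r)^t
# and subtract the purchase price X. All figures are illustrative assumptions.

def npv(rate, cashflows, purchase_price):
    discounted = sum(z / (1 + rate) ** t for t, z in enumerate(cashflows, start=1))
    return discounted - purchase_price

# X = 100 invested now, z1 = 60 and z2 = 70 received in years 1 and 2, r = 10%.
print(round(npv(0.10, [60, 70], 100), 2))  # 12.4 -> positive NPV, worth pursuing
```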

  3. Internal Rate of Return (IRR) – this is the interest rate at which the NPV of all cash flows becomes zero. IRR qualifies the attractiveness of an investment: if the IRR of a new project exceeds the company's required rate of return, investment in that project is desirable; otherwise the project should be rejected. IRR cannot be derived analytically, and must be found by mathematical trial and error; conveniently, business spreadsheets automate these calculations. Mathematically, IRR solves:

0 = P₀ + P₁/(1 + IRR) + P₂/(1 + IRR)² + … + Pₙ/(1 + IRR)ⁿ

where P₀, P₁, …, Pₙ are the cash flows in periods 0, 1, …, n.

With venture capital and private equity likely to enter Sagarmala, accompanied by multiple cash investments over the life-cycle of the project, IRR could come in handy for an IPO.
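Since IRR has no analytic solution, spreadsheets find it numerically; the same trial-and-error search can be sketched in a few lines of Python (the cash flows below are assumptions):

```python
# IRR by bisection: search for the rate at which the NPV of all cash flows is zero.
# P0 is the (negative) outlay; later entries are inflows. Figures are assumptions.

def npv(rate, cashflows):
    return sum(p / (1 + rate) ** t for t, p in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-8):
    # Repeatedly halve the bracket [lo, hi] until it pins down the root of NPV.
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(lo, cashflows) * npv(mid, cashflows) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

print(round(irr([-100, 60, 70]), 4))  # ~0.1888, i.e. an IRR of about 18.9%
```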

     4. Discounted Cash Flow (DCF) – this calculates the present value of an investment's future cash flows in order to arrive at a current fair-value estimate for the investment. Mathematically,

DCF = CF₁/(1 + r) + CF₂/(1 + r)² + CF₃/(1 + r)³ + … + CFₙ/(1 + r)ⁿ

where CFₙ is the cash flow in period n, and r is the discount rate of return.

DCF accounts for the fact that money received today can be invested today, while money we have to wait for cannot. It thus accounts for the time value of money and provides an estimate of what we should spend today to have an investment worth a certain amount of money at a specific point in the future.
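The DCF sum is the same discounting machinery as NPV without the initial outlay subtracted; a one-function sketch under assumed cash flows:

```python
# DCF sketch: present value of a stream of future cash flows at discount rate r.
# The cash flows and rate are illustrative assumptions.

def dcf(rate, cashflows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, start=1))

# Three years of 50 each, discounted at 8% -> fair-value estimate today.
print(round(dcf(0.08, [50, 50, 50]), 2))  # 128.85
```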

       5. Payback period – mathematically, this is defined as: 

Payback Period = Investment required/Annual Project Cash flow

Payback occurs in the year (plus the number of months) at which cumulative cash flow turns positive. Though seemingly useful, the payback period does not consider the time value of money, and is quite inept at handling projects with uneven cash flows.
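A sketch that returns the "year plus months" point at which cumulative cash flow covers the outlay, including the uneven-cash-flow case the simple formula above cannot handle (figures are assumptions):

```python
# Payback period: first point at which cumulative cash flow recovers the investment.
# Handles uneven flows by interpolating within the recovery year. Figures assumed.

def payback_period(investment, cashflows):
    cumulative = 0.0
    for year, cf in enumerate(cashflows, start=1):
        if cumulative + cf >= investment:
            # Fraction of this year's flow needed to finish recovering the outlay.
            return (year - 1) + (investment - cumulative) / cf
        cumulative += cf
    return None  # never paid back within the horizon

print(payback_period(100, [30, 40, 50]))  # 2.6 -> 2 years and roughly 7 months
```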

As a recap:

Sagarmala is a 3-tier SPV structure:


Private Players/PPPs OR EPCs/Turnkey – the latter are used for projects with high social impact or low IRR. 

Expenses incurred on project development will be treated as part of SDCL's equity contribution; alternatively, where SDCL holds no equity, or where the expenses incurred exceed SDCL's stake, the SPV will reimburse SDCL. Divestment possibilities cannot be ruled out as a means of recouping capital for future projects.

Statistical Arbitrage. Thought of the Day 123.0


In the perfect market paradigm, assets can be bought and sold instantaneously with no transaction costs. For many financial markets, such as listed stocks and futures contracts, the reality of the market comes close to this ideal – at least most of the time. The commission for most stock transactions by an institutional trader is just a few cents a share, and the bid/offer spread is between one and five cents. Also implicit in the perfect market paradigm is a level of liquidity where the act of buying or selling does not affect the price. The market is composed of participants who are so small relative to the market that they can execute their trades, extracting liquidity from the market as they demand, without moving the price.

That’s where the perfect market vision starts to break down. Not only does the demand for liquidity move prices, but it also is the primary driver of the day-by-day movement in prices – and the primary driver of crashes and price bubbles as well. The relationship between liquidity and the prices of related stocks also became the primary driver of one of the most powerful trading models in the past 20 years – statistical arbitrage.

If you spend any time at all on a trading floor, it becomes obvious that something more than information moves prices. Throughout the day, the 10-year bond trader gets orders from the derivatives desk to hedge a swap position, from the mortgage desk to hedge mortgage exposure, from insurance clients who need to sell bonds to meet liabilities, and from bond mutual funds that need to invest the proceeds of new accounts. None of these orders has anything to do with information; each one has everything to do with a need for liquidity. The resulting price changes give the market no signal concerning information; the price changes are only the result of the need for liquidity. And the party on the other side of the trade who provides this liquidity will on average make money for doing so. For the liquidity demander, time is more important than price; he is willing to make a price concession to get his need fulfilled.

Liquidity needs will be manifest in the bond traders’ own activities. If their inventory grows too large and they feel overexposed, they will aggressively hedge or liquidate a portion of the position. And they will do so in a way that respects the liquidity constraints of the market. A trader who needs to sell 2,000 bond futures to reduce exposure does not say, “The market is efficient and competitive, and my actions are not based on any information about prices, so I will just put those contracts in the market and everybody will pay the fair price for them.” If the trader dumps 2,000 contracts into the market, that offer obviously will affect the price even though the trader does not have any new information. Indeed, the trade would affect the market price even if the market knew the selling was not based on an informational edge.

So the principal reason for intraday price movement is the demand for liquidity. This view of the market – a liquidity view rather than an informational view – replaces the conventional academic perspective of the role of the market, in which the market is efficient and exists solely for conveying information. Why the change in roles? For one thing, it’s harder to get an information advantage, what with the globalization of markets and the widespread dissemination of real-time information. At the same time, the growth in the number of market participants means there are more incidents of liquidity demand. They want it, and they want it now.

Investors or traders who are uncomfortable with their level of exposure will be willing to pay up to get someone to take the position. The more uncomfortable the traders are, the more they will pay. And well they should, because someone else is getting saddled with the risk of the position, someone who most likely did not want to take on that position at the existing market price. Thus the demand for liquidity not only is the source of most price movement; it is at the root of most trading strategies. It is this liquidity-oriented, tectonic market shift that has made statistical arbitrage so powerful.

Statistical arbitrage originated in the 1980s from the hedging demand of Morgan Stanley’s equity block-trading desk, which at the time was the center of risk taking on the equity trading floor. Like other broker-dealers, Morgan Stanley continually faced the problem of how to execute large block trades efficiently without suffering a price penalty. Often, major institutions discover they can clear a large block trade only at a large discount to the posted price. The reason is simple: Other traders will not know if there is more stock to follow, and the large size will leave them uncertain about the reason for the trade. It could be that someone knows something they don’t and they will end up on the wrong side of the trade once the news hits the street. The institution can break the block into a number of smaller trades and put them into the market one at a time. Though that’s a step in the right direction, after a while it will become clear that there is persistent demand on one side of the market, and other traders, uncertain who it is and how long it will continue, will hesitate.

The solution to this problem is to execute the trade through a broker-dealer’s block-trading desk. The block-trading desk gives the institution a price for the entire trade, and then acts as an intermediary in executing the trade on the exchange floor. Because the block traders know the client, they have a pretty good idea if the trade is a stand-alone trade or the first trickle of a larger flow. For example, if the institution is a pension fund, it is likely it does not have any special information, but it simply needs to sell the stock to meet some liability or to buy stock to invest a new inflow of funds. The desk adjusts the spread it demands to execute the block accordingly. The block desk has many transactions from many clients, so it is in a good position to mask the trade within its normal business flow. And it also might have clients who would be interested in taking the other side of the transaction.

The block desk could end up having to sit on the stock because there is simply no demand and because throwing the entire position onto the floor will cause prices to run against it. Or some news could suddenly break, causing the market to move against the position held by the desk. Or, in yet a third scenario, another big position could hit the exchange floor that moves prices away from the desk’s position and completely fills existing demand. A strategy evolved at some block desks to reduce this risk by hedging the block with a position in another stock. For example, if the desk received an order to buy 100,000 shares of General Motors, it might immediately go out and buy 10,000 or 20,000 shares of Ford Motor Company against that position. If news moved the stock price prior to the GM block being acquired, Ford would also likely be similarly affected. So if GM rose, making it more expensive to fill the customer’s order, a position in Ford would also likely rise, partially offsetting this increase in cost.
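A rough sketch of the desk's hedge arithmetic: estimate how Ford co-moves with GM and size the offsetting position accordingly. The return series are made up, the regression is a plain least-squares beta, and the sketch ignores the GM/Ford price ratio; the desk in the anecdote hedged only a fraction of the block.

```python
# Sizing a hedge: beta of Ford returns on GM returns via least squares, then
# scale the GM block by beta. All numbers are illustrative assumptions.

gm   = [0.010, -0.006, 0.004, -0.012, 0.008]   # hypothetical daily GM returns
ford = [0.008, -0.005, 0.003, -0.010, 0.007]   # hypothetical daily Ford returns

mean_gm = sum(gm) / len(gm)
mean_fd = sum(ford) / len(ford)
beta = sum((g - mean_gm) * (f - mean_fd) for g, f in zip(gm, ford)) \
     / sum((g - mean_gm) ** 2 for g in gm)

block_shares = 100_000                    # GM block the desk must work
hedge_shares = int(beta * block_shares)   # Ford shares to buy against it
print(round(beta, 3), hedge_shares)       # 0.828 82847
```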

This was the case at Morgan Stanley, which maintained a list of pairs of stocks – stocks that were closely related, especially in the short term, with other stocks – in order to have at the ready a solution for partially hedging positions. By reducing risk, the pairs trade also gave the desk more time to work out of the trade. This helped lessen the liquidity-related movement of a stock's price during a big block trade. As a result, the strategy increased the desk's profit.

The pairs increased profits. Somehow that lightbulb didn’t go on in the world of equity trading, which was largely devoid of principal transactions and systematic risk taking. Instead, the block traders epitomized the image of cigar-chewing gamblers, playing market poker with millions of dollars of capital at a clip while working the phones from one deal to the next, riding in a cloud of trading mayhem. They were too busy to exploit the fact, or it never occurred to them, that the pairs hedging they routinely used held the secret to a revolutionary trading strategy that would dwarf their desk’s operations and make a fortune for a generation of less flamboyant, more analytical traders. Used on a different scale and applied for profit making rather than hedging, their pairwise hedges became the genesis of statistical arbitrage trading. The pairwise stock trades that form the elements of statistical arbitrage trading in the equity market are just one more flavor of spread trades. On an individual basis, they’re not very good spread trades. It is the diversification that comes from holding many pairs that makes this strategy a success. But even then, although its name suggests otherwise, statistical arbitrage is a spread trade, not a true arbitrage trade.
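A minimal sketch of the pairwise spread logic behind statistical arbitrage: watch the spread between two closely related stocks and trade when it stretches beyond a z-score band. The prices and the entry threshold are assumptions; a real statistical-arbitrage book runs this across hundreds of pairs at once.

```python
# Toy pairs-trade signal: z-score of the price spread between two related stocks.
# Series and threshold are illustrative assumptions, not a production model.

import statistics

def pair_signal(prices_a, prices_b, entry_z=2.0):
    spread = [a - b for a, b in zip(prices_a, prices_b)]
    mu = statistics.mean(spread)
    sigma = statistics.stdev(spread)
    z = (spread[-1] - mu) / sigma
    if z > entry_z:
        return "short A / long B"    # spread unusually rich: bet on convergence
    if z < -entry_z:
        return "long A / short B"    # spread unusually cheap: bet on convergence
    return "no trade"

a = [50.0, 50.5, 50.2, 50.8, 51.0, 53.5]   # hypothetical prices, stock A
b = [48.0, 48.4, 48.1, 48.6, 48.9, 49.0]   # hypothetical prices, stock B
print(pair_signal(a, b))  # spread has stretched -> "short A / long B"
```

On any single pair this is a mediocre spread trade; as the text stresses, it is the diversification across many such pairs that turns it into a strategy.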

ε-calculus and Hilbert’s Contentual Number Theory: Proselytizing Intuitionism. Thought of the Day 67.0


Hilbert came to reject Russell’s logicist solution to the consistency problem for arithmetic, mainly for the reason that the axiom of reducibility cannot be accepted as a purely logical axiom. He concluded that the aim of reducing set theory, and with it the usual methods of analysis, to logic, has not been achieved today and maybe cannot be achieved at all. At the same time, Brouwer’s intuitionist mathematics gained currency. In particular, Hilbert’s former student Hermann Weyl converted to intuitionism.

According to Hilbert, there is a privileged part of mathematics, contentual elementary number theory, which relies only on a “purely intuitive basis of concrete signs.” Whereas operating with abstract concepts was considered “inadequate and uncertain,” there is a realm of extra-logical discrete objects, which exist intuitively as immediate experience before all thought. If logical inference is to be certain, then these objects must be capable of being completely surveyed in all their parts, and their presentation, their difference, their succession (like the objects themselves) must exist for us immediately, intuitively, as something which cannot be reduced to something else.

The objects in question are signs, both numerals and the signs that make up formulas and formal proofs. The domain of contentual number theory consists in the finitary numerals, i.e., sequences of strokes. These have no meaning, i.e., they do not stand for abstract objects, but they can be operated on (e.g., concatenated) and compared. Knowledge of their properties and relations is intuitive and unmediated by logical inference. Contentual number theory developed this way is secure, according to Hilbert: no contradictions can arise simply because there is no logical structure in the propositions of contentual number theory. The intuitive-contentual operations with signs form the basis of Hilbert’s meta-mathematics. Just as contentual number theory operates with sequences of strokes, so meta-mathematics operates with sequences of symbols (formulas, proofs). Formulas and proofs can be syntactically manipulated, and the properties and relationships of formulas and proofs are similarly based in a logic-free intuitive capacity which guarantees certainty of knowledge about formulas and proofs arrived at by such syntactic operations. Mathematics itself, however, operates with abstract concepts, e.g., quantifiers, sets, functions, and uses logical inference based on principles such as mathematical induction or the principle of the excluded middle. These “concept-formations” and modes of reasoning had been criticized by Brouwer and others on grounds that they presuppose infinite totalities as given, or that they involve impredicative definitions. Hilbert’s aim was to justify their use. To this end, he pointed out that they can be formalized in axiomatic systems (such as that of Principia or those developed by Hilbert himself), and mathematical propositions and proofs thus turn into formulas and derivations from axioms according to strictly circumscribed rules of derivation. Mathematics, to Hilbert, “becomes an inventory of provable formulas.” In this way the proofs of mathematics are subject to metamathematical, contentual investigation. The goal of Hilbert is then to give a contentual, meta-mathematical proof that there can be no derivation of a contradiction, i.e., no formal derivation of a formula A and of its negation ¬A.

Hilbert and Bernays developed the ε-calculus as their definitive formalism for axiom systems for arithmetic and analysis, and the so-called ε-substitution method as the preferred approach to giving consistency proofs. Briefly, the ε-calculus is a formalism that includes ε as a term-forming operator. If A(x) is a formula, then εxA(x) is a term, which intuitively stands for a witness for A(x). In a logical formalism containing the ε-operator, the quantifiers can be defined by: ∃x A(x) ≡ A(εxA(x)) and ∀x A(x) ≡ A(εx¬A(x)). The only additional axiom necessary is the so-called “transfinite axiom,” A(t) → A(εxA(x)). Based on this idea, Hilbert and his collaborators developed axiomatizations of number theory and analysis. Consistency proofs for these systems were then given using the ε-substitution method. The idea of this method is, roughly, that the ε-terms εxA(x) occurring in a formal proof are replaced by actual numerals, resulting in a quantifier-free proof. Suppose we had a (suitably normalized) derivation of 0 = 1 that contains only one ε-term εxA(x). Replace all occurrences of εxA(x) by 0. The instances of the transfinite axiom then are all of the form A(t) → A(0). Since no other ε-terms occur in the proof, A(t) and A(0) are basic numerical formulas without quantifiers and, we may assume, also without free variables. So they can be evaluated by finitary calculation. If all such instances turn out to be true numerical formulas, we are done. If not, this must be because A(t) is true for some t, and A(0) is false. Then replace εxA(x) instead by n, where n is the numerical value of the term t. The resulting proof is then seen to be a derivation of 0 = 1 from true, purely numerical formulas using only modus ponens, and this is impossible. Indeed, the procedure works with only slight modifications even in the presence of the induction axiom, which in the ε-calculus takes the form of a least number principle: A(t) → εxA(x) ≤ t, which intuitively requires εxA(x) to be the least witness for A(x).
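Collecting the formulas already quoted above in display form (a transcription of what the text states, not an extension of the system):

```latex
% Quantifiers defined from the epsilon operator:
\exists x\, A(x) \;\equiv\; A(\varepsilon x\, A(x)), \qquad
\forall x\, A(x) \;\equiv\; A(\varepsilon x\, \neg A(x)).

% The transfinite axiom -- the epsilon term is a witness if anything is:
A(t) \;\rightarrow\; A(\varepsilon x\, A(x)).

% Induction as the least-number principle of the epsilon-calculus:
A(t) \;\rightarrow\; \varepsilon x\, A(x) \le t.
```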

Data Governance, FinTech, #Blockchain and Audits (Upcoming Bangalore Talk)

This is skeletal and I am febrile, and absolutely nowhere near punctilious. The idea is to ask whether this economic/financial revolution (could it even be called that?) could politically be an Overton window. So, let this be otiose and information-disseminating, for a paper is on its way that forces greater attention to detail and is vastly different from what is here.

Data Governance and Audit Trail

Data Governance specifies the framework for decision rights and accountabilities encouraging desirable behavior in data usage

The main aim of Data Governance is to ensure that data assets are overseen in a cohesive and consistent enterprise-wide manner

Why is there a need for Data Governance?

Evolving regulatory mechanisms and requirements

Can the integrity of the data be trusted?

Centralized versus decentralized documentation as regards use, hermeneutics and meaning of data

Multiplicity of data silos with exponentially rising data

Architecture

Information Owner: approving power over internal + external data transfers + business plans prioritizing data integrity and data governance

Data steward: create/maintain/define data access, data mapping and data aggregation rules

Application steward: maintain the application inventory, validate testing of outbound data and assist master data management

Analytics steward: maintain a solutions inventory, reduce redundant solutions, define rules for use of standard definitions and report documentation guidelines, and define data release processes and guidelines
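The four roles above amount to a small accountability matrix; a hypothetical Python encoding follows (the role names and duties are taken from the list, the schema itself is my assumption, not a standard):

```python
# Hypothetical encoding of the stewardship roles listed above as a
# role -> responsibilities map. Purely illustrative, not a standard schema.

GOVERNANCE_ROLES = {
    "information owner": [
        "approve internal and external data transfers",
        "approve business plans prioritizing data integrity and governance",
    ],
    "data steward": [
        "create, maintain and define data access rules",
        "define data mapping and aggregation rules",
    ],
    "application steward": [
        "maintain the application inventory",
        "validate testing of outbound data",
        "assist master data management",
    ],
    "analytics steward": [
        "maintain a solutions inventory, reduce redundant solutions",
        "define standard-definition and report-documentation guidelines",
        "define data release processes and guidelines",
    ],
}

def responsibilities(role):
    """Look up the duties attached to a governance role."""
    return GOVERNANCE_ROLES.get(role.lower(), [])

print(responsibilities("Data Steward"))
```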

What could an audit be?

It starts as a comprehensive and effective program encompassing people, processes, policies, controls, and technology. Additionally, it involves educating key stakeholders about the benefits at stake and the risks associated with poor data quality, integrity and security.

What should an audit be invested with?

Apart from IT knowledge and a grasp of the operational aspects of the organization, skills in PR, in dealing with data-related risks, and in managing push-back or cultural drift are sine qua non. As we continue to operate in one of the toughest and most uneven economic climates in modern times, the role of auditors in the financial markets is more important than ever before.

While the profession has long recognized the impact of data analysis on enhancing the quality and relevance of the audit, mainstream use of this technique has been hampered by a lack of efficient technology solutions, problems with data capture and concerns about privacy. However, recent technology advancements in big data and analytics are providing an opportunity to rethink the way in which an audit is executed. The transformed audit will expand beyond sample-based testing to include analysis of entire populations of audit-relevant data (transaction activity and master data from key business processes), using intelligent analytics to deliver a higher quality of audit evidence and more relevant business insights. Big data and analytics are enabling auditors to better identify financial-reporting, fraud and operational business risks, and to tailor their approach to deliver a more relevant audit.

While we are making significant progress and are beginning to see the benefits of big data and analytics in the audit, this is only part of a journey. What we really want is intelligent audit appliances that reside within companies’ data centers and stream the results of proprietary analytics to audit teams. But the technology to accomplish this vision is still in its infancy; in the interim, what is transpiring is the delivery of audit analytics by processing large client data sets within a set, systemic environment, integrating analytics into the audit approach and getting companies comfortable with the future of audit. The transition to this future won’t happen overnight: it is a massive leap to go from traditional audit approaches to one that fully integrates big data and analytics in a seamless manner.

Three key areas the audit committee and finance leadership should be thinking about now when it comes to big data and analytics:

External audit: develop a better understanding of how analytics is being used in the audit today. Since data capture is a key barrier, determine the scope of data currently being captured, and the steps being taken by the company’s IT function and its auditor to streamline data capture.

Compliance and risk management: understand how internal audit and compliance functions are using big data and analytics today, and management’s future plans. These techniques can have a significant impact on identifying key risks and automating the monitoring processes.

Competency development: the success of any investments in big data and analytics will be determined by the human element. Focus should not be limited to developing technical competencies, but should extend to creating the analytical mindset within the finance, risk and compliance functions to consume the analytics produced effectively.

What is the India Stack?

A paperless and cashless delivery system; a paradigm that is intended to handle massive data inflows enabling entrepreneurs, citizens and government to interact with each other transparently; an open system to verify businesses, people and services.

This is an open API policy conceived in 2012 to build upon Aadhaar. The word “open” in the policy signifies that other applications can access the data. It is here that the affair starts getting a bit murky, as India Stack gives the data to the concerned individual and lets him/her decide with whom the data can be shared.


So, is this FinTech? FinTech usually applies to the segment of the technology-startup scene that is disrupting sectors such as mobile payments, money transfers, loans, fundraising and even asset management. And what is the guarantee that FinTech would help prevent fraud that traditional banking couldn’t? No technology can completely eradicate fraud and human deceit, but I believe technology can make operations more transparent and systems more accountable. To illustrate this point, let’s look back at the mortgage crisis of 2008.

Traditional banks make loans the old-fashioned way: they take money from people at certain rates (savings deposits) and lend it out to the community at a higher rate. The margin constitutes the bank’s profit. As a bank’s assets grow, so do its loans, enabling it to grow organically.

Large investment banks bundle assets into securities that they can sell on open markets all over the world. Investors trust these securities because they are rated by third party agencies such as Moody’s and Standard & Poor’s. Buyers include pension funds, hedge funds, and many other retail investment instruments.

The rating agencies are paid by the investment banks to rate these securities. Unfortunately, they determine the ratings not so much on the merits of the securities themselves as according to the stipulations of the banks. If a rating fails to meet an investment bank’s expectations, the bank can take its business to another rating agency. And if a security does not perform as rated, the agency bears no liability! How insane is that?

Most surprisingly, investment banks can hedge against the performance of these securities (perhaps because they know that the rating is total BS?) through a complex process that I will not get into here.

Investment banks and giant insurance firms such as AIG were the major dominoes that nearly caused the whole financial system to topple in 2008. Today we face an entirely different lending industry, thanks to FinTech. What is FinTech? FinTech refers to a financial services company (not a technology company) that uses superior technology to bring newer and better financial products to consumers. Many of today’s FinTech companies call themselves technology companies or big-data companies, but I respectfully disagree. To an outsider, a company is defined by its balance sheet, and a FinTech company’s balance sheet will tell you that it makes money from the fees, interest, and service charges on its assets – not by selling or licensing technology.

FinTech is good news not only for investors, borrowers and banks collectively, but also for the financial services industry as a whole, because it ensures greater transparency and accountability while removing risk from the entire system. In the past four to five years a number of FinTech companies have gained prominence for their impact on the industry, and I firmly believe that this trend has just begun. FinTech companies are ushering in new digital business models such as auto-decisioning, models that sweep through thousands of usual and not-so-usual data sources for KYC and credit scoring.

But already a new market of innovative financial products has entered mainstream finance. As their market share grows, these FinTech companies will gradually “de-risk” the system by mitigating the impact of large, traditional, single points of failure. And how will the future look? A small business might take its next business loan from Lending Club, OnDeck, Kabbage, or DealStruck, instead of a traditional bank. Rather than raising funds from a venture capital firm or other traditional investor, small businesses can now look to Kickstarter or CircleUp. Sales transactions can be processed with fewer headaches by Square or Stripe. You can invest your money at Betterment or Wealthfront and not have to pay advisors who have questionable track records of outperforming the market. You can even replace money with bitcoin using Coinbase, Circle, or another digital-currency option. These are the by-products of the FinTech revolution: we are surrounded by a growing ecosystem of highly efficient FinTech companies that deliver next-generation financial products in a simple, hassle-free manner.

Admittedly, today’s emerging FinTech companies have not had to work through a credit cycle or contend with rising interest rates. But those FinTech companies that have technology in their DNA will learn to “pivot” when the time comes and figure it all out. We have just seen the tip of this iceberg. Technically speaking, the FinTech companies aren’t bringing anything revolutionary to the table. Mostly it feels like an “efficiency gain” play and a case of capitalizing on the regulatory arbitrage that non-banks enjoy. Some call themselves big-data companies – but any major bank can look into its data center and make the same claim. Some say that they use 1,000 data points; banks are doing that too, albeit manually and behind closed walls, just as they have done for centuries. FinTechs simplify financial processes, reduce administrative drag, and deliver better customer service. They bring new technology to an old and complacent industry.

Is there anything on the horizon that can truly revolutionize how this industry works? Answering this question brings us back to 2008 as we try to understand what really happened. What if there were a system that did not rely on Moody’s and S&P to rate bonds, corporations, and securities? What if technology could provide this information in an accurate and transparent manner? What if Bitcoin principles were adopted widely in this industry? What if the underlying database protocol, Blockchain, could be used to track all financial transactions all over the globe and tell you the “real” rating of a security?


Blockchain can be defined as a peer-to-peer operated public digital ledger that records all transactions executed for a particular asset (…) “The Blockchain maintains this record across a network of computers, and anyone on the network can access the ledger. Blockchain is ‘decentralised’, meaning people on the network maintain the ledger, requiring no central or third-party intermediary involvement.” “Users known as ‘miners’ use specialised software to look for these time-stamped ‘blocks’, verify their accuracy using a special algorithm, and add the block to the chain. The chain maintains chronological order for all blocks added because of these time-stamps.” The digitalisation of financial services opens room for new opportunities, such as new kinds of consumer experience, the use of new technologies, and improved business-data analysis. The ACPR, the French banking and insurance regulatory authority, has classified the opportunities and risks linked to FinTech: new services for users and better resilience, versus the difficulty of establishing effective supervision, the risk of regulatory dumping, and risks to client-interest protection such as data misuse and security. The French Central Bank is currently studying blockchain in cooperation with two start-ups, “Labo Blockchain” and “Blockchain France”. In that context, blockchain is a true financial-services disruption; according to Piper Alderman, “Blockchain can perform the intermediating function in a cheaper and more secure way, and disrupt the role of Banks.”
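A toy sketch of the “time-stamped blocks chained together” idea in the quoted definition: each block commits to its predecessor’s hash, so any tampering breaks verification downstream. This illustrates the data structure only, not Bitcoin’s actual protocol or mining:

```python
# Toy hash-chained ledger. Each block stores the hash of the previous block,
# so altering any block invalidates every later link. Illustrative only.

import hashlib, json, time

def make_block(transactions, prev_hash):
    block = {"timestamp": time.time(), "transactions": transactions,
             "prev_hash": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def verify_chain(chain):
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["hash"] != digest:
            return False  # block contents were altered after hashing
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False  # link to the predecessor is broken
    return True

genesis = make_block(["issue 100 to A"], prev_hash="0" * 64)
chain = [genesis, make_block(["A pays B 40"], prev_hash=genesis["hash"])]
print(verify_chain(chain))   # True
chain[0]["transactions"] = ["issue 999 to A"]
print(verify_chain(chain))   # False: tampering is detected
```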

Hence, leading banks want to seize this financial-services opportunity. They are currently working on a blockchain project with the financial innovation firm R3 CEV. The objective is for the project to deliver a “more efficient and cost-effective international settlement network and possibly eliminate the need to rely on central bank”. R3 CEV has announced that 40 peer banks, including HSBC, Citigroup, and BNP Paribas, have started an initiative to test new kinds of transactions through blockchain. This consortium is the largest ever organized to test this new technology.

And what of security? According to the experts, “the design of the blockchain means there is the possibility of malware being injected and permanently hosted with no methods currently available to wipe this data. This could affect ‘cyber hygiene’ as well as the sharing of child sexual abuse images where the blockchain could become a safe haven for hosting such data.” Further, according to the research, “it could also enable crime scenarios in the future such as the deployment of modular malware, a reshaping of the distribution of zero-day attacks, as well as the creation of illegal underground marketplaces dealing in private keys which would allow access to this data.” The issue of cyber-security for financial institutions is highly strategic. Firstly, as these institutions rely on customer confidence, they are particularly vulnerable to data loss and fraud. Secondly, banks represent a key sector for national security. Thirdly, they are exposed to credit crises given their role in financing the economy. Lastly, data protection is a key challenge given financial-security legal requirements.

As regards cyber-security risks, one of the core legal challenges will be accountability. As Blockchain is grounded in anonymity, the question is who would be accountable for the actions pursued: the users, the Blockchain owner, or the software engineer? Regulation will have to address the issue of blockchain governance. According to Hubert de Vauplane, “the more the Blockchain is open and public, less the Blockchain is governed”, while in a private Blockchain “the governance is managed by the institution” as regards “access conditions, working, security and legal approval of transactions”. Whereas in the public Blockchain there are no rules other than the Blockchain itself – in other words, “Code is Law”, to quote the US legal expert Lawrence Lessig. First issue: who is the blockchain user? Two situations must be addressed, depending on whether the Blockchain is private or public. Unlike the public blockchain, the private blockchain – even though grounded in a public source code – is protected by intellectual property rights in favour of the organism that manages it, yet it remains exposed to cyber-security risks. Moreover, new contractual documentation provided by financial institutions, and a disclosure duty, could become necessary where consumers may simply not understand the information on how their data may be used through this new technology.

‘Disruption’ has turned into a Silicon Valley cliché, something not only welcomed, but often listed as a primary goal. But disruption in the private sector can have remarkably different effects than in the political system. While capital forces may allow for relatively rapid adaptation in the market, complex political institutions can be slower to react. Moreover, while disruption in an economic market can involve the loss of some jobs and the creation of others, disruption in politics can result in political instability, armed conflict, increased refugee flows and humanitarian crises. It nevertheless is the path undertaken….