a measure of how frequently a token is used and reused in the system (its velocity, V). The value of a single token is therefore M/T, where T is the total number of tokens.

If a given utility protocol does not have a built-in mechanism, such as Ethereum’s GASPRICE, to ensure that the cost of using the network does not arbitrarily and persistently diverge from the underlying cost of the computational resources it consumes, one of three things happens: (a) the token’s price trades to a level at which there is no premium cost to using the network (i.e., there is no economic rent); (b) the chain forks into functionally identical but less rent-seeking chains until any premium usage cost and economic rent on the network decline to a level at which arbitrage is no longer worthwhile; (c) the protocol’s adoption is temporarily limited to the highest-value use cases until (a) or (b) occurs. In all cases, the equilibrium result must be at or near marginal revenue = marginal cost for the mining industry maintaining the blockchain in question, so that the token’s value cannot materially decouple from the underlying cost of computing resources.

PQ, the cost of the computing resources required to maintain a blockchain, is not only low relative to the network values currently being attributed to cryptoassets; it is also inflated by the prevalence of proof-of-work consensus mechanisms, which mean that the vast majority of the computing resource consumed is make-work. To the extent that new scaling technologies such as proof-of-stake, sharding, Segregated Witness, Lightning, Raiden and Plasma become prevalent, the amount of computing resource consumed may become quite small. Note also that, in the context of cryptoassets, V could go very high at equilibrium.
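The equation-of-exchange logic above can be sketched numerically. A minimal illustration in Python, where the input figures (the PQ, V and T values) are hypothetical placeholders rather than numbers from the text:

```python
# Equation-of-exchange sketch: M * V = P * Q, so the monetary base the
# network needs is M = PQ / V, and each of the T tokens is worth M / T.
# All inputs below are hypothetical illustrations.

def utility_token_value(pq: float, velocity: float, supply: float) -> float:
    """Equilibrium value per token given annual resource cost PQ,
    average token velocity V and total token supply T."""
    m = pq / velocity   # aggregate token base required to clear PQ of usage
    return m / supply   # value of a single token, M / T

# Hypothetical inputs: $100m/year of computing resources, V = 7, 1bn tokens.
value = utility_token_value(pq=100e6, velocity=7.0, supply=1e9)
print(f"${value:.4f} per token")  # → $0.0143 per token
```

The sketch makes the essay's point mechanically visible: value per token falls as PQ falls (cheaper computing resources) and as V rises (faster circulation).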
Even if a significant portion of a given cryptoasset has a low velocity because it is being hodl’d by speculators or staked by miners under a proof-of-stake consensus mechanism, the circulating portion of the tokens can circulate at the speed of computer processing and bandwidth, i.e., fast and accelerating. The implication is that average velocities can be, and are likely to be, high, regardless of how many tokens are actively circulating for utility purposes to allocate network resources.3 The combined effect of low and falling PQ and potentially very high V is that the utility value of utility cryptoassets at equilibrium should in fact be relatively low. Clearly, scaling solutions such as proof-of-stake, etc. are bullish for adoption/users but bearish for token value/investors. Even without those technology shifts, the cost of using decentralised protocols is deflationary, since the costs of processing power, storage and bandwidth are deflationary. This, too, is bullish for adoption and users and bearish for token value and investors.4

Whatever scaling solutions are developed, the inherent redundancy of the consensus mechanism means that there may be fewer use cases in which a decentralised solution displaces a centralised one than many decentralised revolutionaries think. Use cases will be limited to dematerialised networks where the value of decentralisation, censorship-resistance and trustlessness is high enough to justify the inherent inefficiency and redundancy of the consensus mechanism. Is it worth the cost for payments? Yes for some, but not for all. Consider Twitter -- what is the added value to the user of a massively redundant, trustless,

3 Chris Burniske’s recent blog post “Cryptoasset Valuations” (https://medium.com/@cburniske/cryptoasset-valuations-ac83479ffca7) estimates an average V of 7, after adjusting for hodlers, stakers, etc. This assumption may be optimistic (meaning it is probably a low value of V to assume at equilibrium, and therefore an optimistic number to use when estimating the potential equilibrium value of a given cryptoasset), but his framework is useful for thinking about the different drivers of V for a given cryptoasset.

4 I have yet to come across an example of a protocol where I have been persuaded that, when all is said and done, the underlying scarce resource being provisioned is something other than computing resources, or at least something that does not boil down to computing resources at competitive equilibrium after competition in mining, forks, etc. Please alert me to any counterexamples you have seen or can think of.
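The point about hodlers and stakers can be made concrete: even when a large fraction of the supply sits idle, a fast-circulating remainder keeps the supply-weighted average velocity high. A toy calculation, using hypothetical fractions and turnover rates that are not taken from the text or from Burniske's post:

```python
def average_velocity(idle_fraction: float, circulating_velocity: float) -> float:
    """Supply-weighted average velocity when idle_fraction of the tokens
    is hodl'd or staked (velocity ~ 0) and the remainder turns over at
    circulating_velocity times per year. Illustrative assumption only."""
    return (1.0 - idle_fraction) * circulating_velocity

# Hypothetical: even with 75% of the supply idle, the remaining 25%
# turning over 40 times a year still yields an average velocity of 10.
v_avg = average_velocity(idle_fraction=0.75, circulating_velocity=40.0)
print(v_avg)  # → 10.0
```

Since equilibrium token value varies inversely with V under the equation of exchange, a high supply-weighted average velocity compresses the utility value per token even when most of the supply is not actively circulating.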
Investor’s Take on Cryptoassets by John Pfeffer