The Future’s Present Past
Gazing into the crystal ball of crypto by looking at the conditions that led to it.
When we talk about what’s next in the industry, a lot of people project extensions of the present, which paints a shortsighted landscape: more things with NFTs, more advanced and/or easier-to-use DeFi, faster transaction times, some kind of privacy element. These projections are often correct on a six-month horizon, but, much like project planning, they become wildly inaccurate at the multi-year scale. So let’s instead think more broadly: look at which elements led to the major pieces of today, and prognosticate what is to come from completely unrelated events in the broader sector.
The Past’s Present Future
To simplify the way we can look at things, we’ll start with the bedrock on which crypto became a part of our reality: Bitcoin. By doing this, we can chart a relatively simple timeline:
1979 - Merkle Trees: Merkle trees introduced an efficient way to prove membership and location in a set, serving as an unforgeable commitment scheme.
1985 - ECC: Elliptic Curve Cryptography is an asymmetric cryptosystem that enabled shorter key sizes with security equivalent to RSA, leveraging the properties of elliptic curves over finite fields.
1992 - ECDSA: The Elliptic Curve Digital Signature Algorithm provided a signature scheme utilizing ECC.
1997 - Hashcash: Hashcash was the first major introduction of an early proof-of-work model, intended as a measured expenditure of computed hashes meeting a difficulty bar, serving as "postage" for emails to disincentivize spam.
1998 - b-Money/bitgold: b-Money and bitgold were some of the first digital cash proposals, the former of which was referenced by Satoshi Nakamoto. Both proposals included a notion of computational expenditure, and b-Money explicitly highlighted Hashcash.
1999 - Proof of Work: Proof of Work served as a broader generalization of the problem space Hashcash was solving for.
2004 - Reusable Proof of Work: Reusable Proof of Work, proposed by Hal Finney, was explicitly for the purpose of tokenized money, utilizing remote attestation to make the token re-spendable.
2008 - Bitcoin Whitepaper: The Bitcoin whitepaper was the first proposal for a fully decentralized currency utilizing a reusable proof-of-work model that did not require remote attestation, instead drawing consensus from miners performing their proof of work on top of a sequenced block of transactions.
2009 - Bitcoin Public Release
This elides related events and innovations, but it paints a nice linear sequence. Of course, once we account for the incremental improvements and innovations along the way, this isn’t a linear sequence at all, but rather a tree of events reaching backwards in time. That becomes especially important when looking at the state of crypto today.
The Present’s Future’s Past
Since Bitcoin, we have had many entrants in the space, adding functionality and incorporating improvements that have granted transaction speed gains, smart contracts and decentralized applications, and more. It is no longer possible to use the simplified linear view, as some networks eschew many of the constructs that Bitcoin and other major cryptocurrencies rely upon.
It becomes far more rational to adopt the tree view, highlighting some of the more prominent elements:
And this doesn’t even cover the externalities that led to these concepts and networks: Proof of Space-Time was based on timelock puzzles, quickly moving towards verifiable delay functions and settling on the Wesolowski VDF; zkEVMs build on the massive amount of research pouring in around zkSNARKs and zkSTARKs; and Ethereum’s future approach to sharding builds on a number of ideas, including KZG commitments, verkle trees, and more.
So when we talk about the future, it feels incredibly silly to look at what we are on the precipice of, and simply say, “faster or private transactions”. There’s a lot more on the horizon.
The Future’s Present Past
Allow me to take an oracular position, draw from some cutting-edge research, and chart a future timeline. This time, I will converge on only one thing: what I believe to be the future.
Efficient Subgroup Hiding via Planted Cliques
The planted clique problem has seen very little research in the space of cryptography; a rarely cited paper by Péter Hudoba is one of the few I could find. What it can offer is the ability to hide relationships within large networks of nodes. This is relevant to decentralized storage models: it becomes computationally difficult to derive which specific nodes correspond to the shards of a given dataset.
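To make the setting concrete, here is a minimal sketch of the planted clique setup: start from a random graph, then add every edge among a small hidden subset of vertices. The function and parameter names are my own, for illustration only; the point is simply that the "shard holders" become a clique buried in noise, which is believed to be computationally hard to recover when the clique is small relative to the graph.

```python
import random
from itertools import combinations

def planted_clique_graph(n, k, p=0.5, seed=None):
    """Build an Erdos-Renyi G(n, p) graph, then plant a k-clique inside it.

    Returns the edge set and the planted vertex set. The planted vertices
    stand in for the nodes holding the shards of one dataset; recovering
    them from the graph alone is believed to be hard when k is small
    relative to n (around sqrt(n)).
    """
    rng = random.Random(seed)
    edges = {frozenset(e) for e in combinations(range(n), 2) if rng.random() < p}

    planted = rng.sample(range(n), k)
    edges |= {frozenset(e) for e in combinations(planted, 2)}
    return edges, set(planted)

edges, shard_holders = planted_clique_graph(n=200, k=14, seed=7)
print(f"{len(edges)} edges; {len(shard_holders)} shard holders hidden among 200 nodes")
```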
Verifiable Delay Functions via IQC
Chia Network has already taken advantage of VDFs over imaginary quadratic class groups; however, they have seen minimal use outside of that context, and are used directly only as a consensus mechanism rather than being vertically integrated with other constructs to provide additional utility. The Wesolowski VDF has equally great utility as a timestamping mechanism and as a randomness beacon.
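As a rough illustration of why the Wesolowski construction is such a useful building block, here is a toy evaluate/prove/verify flow. It substitutes a modulus built from two known Mersenne primes for the class group of an imaginary quadratic field that a real deployment would use, and the helper names are my own; treat it as a sketch of the flow, not a production scheme.

```python
import hashlib
import random

def is_probable_prime(n, rounds=16):
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def hash_to_prime(*values):
    """Fiat-Shamir: derive the challenge prime l from the statement."""
    counter = 0
    while True:
        digest = hashlib.sha256(repr((values, counter)).encode()).hexdigest()
        candidate = int(digest, 16) | 1
        if is_probable_prime(candidate):
            return candidate
        counter += 1

def vdf_eval(x, T, N):
    """The slow, inherently sequential part: y = x^(2^T) mod N."""
    y = x % N
    for _ in range(T):
        y = pow(y, 2, N)
    return y

def vdf_prove(x, y, T, N):
    """Wesolowski proof: pi = x^(2^T // l) for the Fiat-Shamir prime l."""
    l = hash_to_prime(x, y, T)
    return pow(x, pow(2, T) // l, N), l

def vdf_verify(x, y, T, N, proof):
    """Fast check: pi^l * x^(2^T mod l) == y, since 2^T = (2^T // l)*l + (2^T mod l)."""
    pi, l = proof
    if l != hash_to_prime(x, y, T):
        return False
    r = pow(2, T, l)
    return (pow(pi, l, N) * pow(x, r, N)) % N == y

# Toy modulus from two Mersenne primes; a real deployment uses a group of
# unknown order (a class group, or an RSA modulus nobody can factor).
N = (2**107 - 1) * (2**127 - 1)
x, T = 0xC0FFEE, 2000
y = vdf_eval(x, T, N)
assert vdf_verify(x, y, T, N, vdf_prove(x, y, T, N))
```

The verifier performs only a handful of modular exponentiations regardless of T, which is what makes the same slow-to-compute output reusable as a timestamp or a randomness beacon.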
Anonymous Routing via Shuffled Lattices
In parallel with data location being indeterminable to outside parties via planted cliques, the routing of data between nodes must also be indistinguishable from randomness, or else the cliques would eventually fall prey to statistical analysis. Enter RPM (Robust Anonymity at Scale), which leverages shuffled lattices to provide anonymous routing.
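RPM’s actual machinery is well beyond a blog snippet, but the property the routing layer has to deliver is easy to picture: a batch of messages goes in, a uniformly random reordering comes out, and no observer can link input positions to output positions. The toy below shows only that unlinkability property with a single honest shuffler; real schemes achieve it without any one party learning the permutation.

```python
import secrets

def mix_batch(messages):
    """Uniformly shuffle a batch of messages (Fisher-Yates with a crypto RNG).

    A single honest shuffler stands in here for what RPM-style schemes
    achieve distributively: severing the link between submission order
    and delivery order so traffic analysis cannot expose the cliques.
    """
    shuffled = list(messages)
    for i in range(len(shuffled) - 1, 0, -1):
        j = secrets.randbelow(i + 1)
        shuffled[i], shuffled[j] = shuffled[j], shuffled[i]
    return shuffled

print(mix_batch([b"shard-1", b"shard-2", b"shard-3", b"shard-4"]))
```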
Oblivious Hypergraphs via Correlated Oblivious Transfer
Oblivious Transfer has been around for many decades and has been improved and generalized in many ways; many constructions leverage Correlated Oblivious Transfers, conducted in batch extensions from random seed OTs. Concurrently, oblivious data structures such as ORAM provide data access and mutation in varying ways, some relying on cryptographic or statistical constructions; beyond sharing the term “oblivious”, they do not necessarily have any relationship to Oblivious Transfer. Oblivious Hypergraphs, on the other hand, are an oblivious data structure that does leverage correlated oblivious transfers, building on top of the efficiency and security gains of KOS15, FERRET, and others. I am writing the paper on this structure, and will publish more soon.
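Since the paper is still forthcoming, I won’t sketch the hypergraph itself, but the correlation it consumes is simple to show. In a correlated OT, the sender’s two messages always differ by one global offset delta, and the receiver learns exactly one of them per choice bit. The toy dealer below (my own illustrative stand-in, not a secure protocol) just makes that shape visible; KOS15, FERRET, and friends produce the same correlation interactively and at scale from a handful of base OTs.

```python
import secrets

KAPPA = 16  # byte length of each OT string (128 bits, as in IKNP-style extensions)

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def deal_correlated_ots(num, delta=None):
    """Toy trusted dealer for correlated OTs.

    Produces, for each instance i:
      sender view:   (m0_i, m0_i XOR delta)  -- one global delta across all i
      receiver view: (b_i, m_{b_i})          -- a choice bit and the chosen string
    """
    delta = delta if delta is not None else secrets.token_bytes(KAPPA)
    sender_view, receiver_view = [], []
    for _ in range(num):
        m0 = secrets.token_bytes(KAPPA)
        b = secrets.randbelow(2)
        m1 = xor(m0, delta)
        sender_view.append((m0, m1))
        receiver_view.append((b, m0 if b == 0 else m1))
    return delta, sender_view, receiver_view

delta, sender_view, receiver_view = deal_correlated_ots(4)
for (m0, m1), (b, chosen) in zip(sender_view, receiver_view):
    assert chosen == (m0 if b == 0 else xor(m0, delta))  # m_b = m0 XOR (b * delta)
```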
Outside of cryptographic research, other pivotal papers have landed that prove interesting in building a projection of the future:
Mapping Relational Operations onto Hypergraph Model
This paper introduces a way in which RDBMS conventions may be efficiently translated into the hypergraph model. There also exist further papers delineating additional applicability, such as representations of RDF.
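One simplified way to picture the mapping (my own toy rendering, not the paper’s exact construction): every attribute/value pair becomes a vertex, every row becomes a hyperedge over its values, and relational operations turn into incidence queries over those hyperedges.

```python
def relation_to_hypergraph(table_name, rows):
    """Map a relation into a hypergraph: values become vertices, rows become hyperedges."""
    vertices, hyperedges = set(), []
    for row in rows:
        edge = frozenset((attr, value) for attr, value in row.items())
        vertices |= edge
        hyperedges.append((table_name, edge))
    return vertices, hyperedges

vertices, hyperedges = relation_to_hypergraph(
    "accounts",
    [{"id": 1, "owner": "alice", "balance": 10},
     {"id": 2, "owner": "bob", "balance": 25}],
)

# A selection (WHERE owner = 'alice') becomes an incidence query:
matches = [edge for _, edge in hyperedges if ("owner", "alice") in edge]
print(matches)
```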
DBOS – A DBMS-oriented Operating System
This paper rebuilds the notion of an operating system by representing its common elements in terms of a distributed relational database, and demonstrates significant benefits and reductions in complexity compared to the typical operating system model.
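A toy rendering of the idea, with a schema and queries of my own invention rather than DBOS’s actual design: hold the process table in a relational database and let "scheduling" become a query.

```python
import sqlite3

# OS state as relational state: a process table plus a "scheduler" query.
db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE processes (
        pid      INTEGER PRIMARY KEY,
        owner    TEXT NOT NULL,
        state    TEXT NOT NULL,      -- 'ready' | 'running' | 'blocked'
        priority INTEGER NOT NULL
    )
""")
db.executemany(
    "INSERT INTO processes VALUES (?, ?, ?, ?)",
    [(1, "alice", "ready", 5), (2, "bob", "ready", 9), (3, "alice", "blocked", 1)],
)

# Scheduling becomes a query: run the highest-priority ready process.
(pid,) = db.execute(
    "SELECT pid FROM processes WHERE state = 'ready' ORDER BY priority DESC LIMIT 1"
).fetchone()
db.execute("UPDATE processes SET state = 'running' WHERE pid = ?", (pid,))
print("now running pid", pid)
```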
A Calculus of Mobile Processes
This paper introduces the pi-calculus, a model for communicating systems in the lineage of communicating sequential processes, a line of work that has led to concurrent programming languages such as Erlang. The body of papers written around this subject is too large to list, but it merits remarking that it has applicability with respect to both oblivious transfer programs and zero knowledge proof construction.
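The detail that matters for the projection is channel mobility: in the pi-calculus, channels themselves can be sent over channels, so the communication topology can change at runtime. A small Python rendering of that one idea, making no claim to faithfully encode the calculus:

```python
import queue
import threading

def server(handshake):
    # Receive a channel over a channel, then reply on the received channel.
    reply_channel = handshake.get()
    reply_channel.put("hello over the channel you just sent me")

handshake = queue.Queue()
threading.Thread(target=server, args=(handshake,), daemon=True).start()

private = queue.Queue()   # the client mints a fresh name (the pi-calculus "nu")
handshake.put(private)    # ...and sends the channel itself over `handshake`
print(private.get())      # the reply arrives on the mobile channel
```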
Projection
I foresee a future in which these domains combine and a new operating system emerges, offering the basic utilities of web3 (tokenized fungible and non-fungible resources, a global filesystem, a practical language to compose the relationships and rules between these things) together with the practicalities of conventional operating systems (identity, scheduling, inter-process communication), bridging the gaps between classical computing, cloud, and crypto. Instead of these items remaining discrete protocols and networks as they stand today, I believe a singular network will emerge that offers all of this and more, with a unified protocol that vertically integrates the various innovations into a complete system, not subject to the external forces and dynamics of cross-protocol game theory.
But I did lie to a degree with the premise – it is not simply a projection – it is what I am building: Quilibrium.