Executive summary:
Polygon zkEVM is an ambitious project that aims to solve Ethereum's scalability issues by building a Zero Knowledge rollup with EVM equivalence. The project has been in development since mid-2022 and is set to go live on mainnet on March 27th.
The zkEVM project merges technology from the Mir and Hermez protocols to create a powerful Layer 2 technology that promises to bring mass adoption of blockchain technology. The project has benefited significantly from the $1 billion fund allocated to its development by Polygon Labs, which has allowed for the hiring of ZK experts and builders, as well as incentivizing projects and protocols to deploy on the new Layer 2.
By achieving a high degree of EVM equivalence, Polygon zkEVM provides developers and users with an experience similar to using Ethereum mainnet, with added benefits such as privacy, speed, and low transaction costs. This makes zkEVM stand out from its competitors as a highly efficient and user-friendly solution to Ethereum's scalability issues.
In this article, we will explore the technology behind Polygon zkEVM, its unique features, and how it sets itself apart from other Layer 2 solutions. Whether you're a blockchain enthusiast or simply curious about the latest developments in the industry, this article is sure to provide valuable insights into the past, present and future of ZK technology.
Part 1 focuses on the history of scaling, providing the context needed to understand ZK technology.
Skip directly to Part 2 for the Polygon zkEVM presentation.
I. Introduction to scaling, rollups and zkEVM
1. History of Scaling and Polygon Initiatives
Before we dive into the specifics of Polygon’s zkEVM, let’s have a look at how we got to where we are today and how Polygon’s scaling initiatives have evolved over time as new technologies have emerged.
Scaling the throughput of blockchain networks has been a main focus of research and development in the blockchain space for years. It is indisputable that to reach true mass adoption, blockchains need to be able to scale. But what exactly does that mean? Generally, scalability is the ability of a network to process a large amount of transactions quickly and at low cost. This consequently means that as more use cases arise and network adoption accelerates, the performance of the blockchain doesn’t suffer. Based on this definition, Ethereum for example lacks scalability.
With increasing network usage, gas prices on Ethereum have in the past skyrocketed to unsustainably high levels, pricing out many smaller users from interacting with decentralized applications entirely. This gave alternative, more “scalable” L1 blockchains a chance to eat into Ethereum’s market share, but also spurred innovation around increasing the throughput of the Ethereum network.
But can’t Ethereum just use more powerful hardware? Can’t we simply raise the hardware requirements for validator nodes, letting a smaller set of powerful machines verify the chain and hold its state? Well, we could. And it’s actually the approach that many alternative layer 1 chains have chosen to take (e.g. Solana). But the question is, at what cost does this scalability increase come? To understand that, it’s important to be familiar with the blockchain trilemma (visualized in figure 1 below). The concept refers to the idea that a blockchain cannot achieve all three core qualities that any blockchain network should strive for (scalability, security & decentralization) at once.
What that means becomes clear if we think about the aforementioned increase in hardware requirements. An alt L1 chain that scales throughput by adopting a more centralized network structure, where users have to trust a smaller number of validators running high-spec machines, sacrifices decentralization & security for scalability. Additionally, with the need for more powerful hardware, running a node also becomes more expensive (the hardware itself, but also bandwidth & storage). This drastically impairs the decentralization of the network: barriers to entry increase, and fewer people can afford to participate in the network and validate transactions in the first place.
Since decentralization and inclusion are two core values of the Ethereum community and Ethereum is built on a culture of users verifying the chain, it is not very surprising that running the chain on a small set of high-spec nodes is not a suitable path for scaling Ethereum. Even Vitalik Buterin argues that it is “crucial for blockchain decentralization for regular users to be able to run a node”. Hence, other scaling approaches gained traction. The following subsections will explore these technologies in detail.
a. Side-Chains, Plasma & Polygon PoS
The idea behind sidechains is to operate an additional blockchain in conjunction with a primary blockchain (Ethereum). This means the two blockchains can communicate with each other, facilitating the movement of assets between the two chains. A side-chain operates as a distinct blockchain that functions independently from Ethereum and links to Ethereum mainnet through a two-way bridge. Side-chains generally have their own block parameters and consensus algorithms, which are frequently tailored for streamlined transaction processing and increased throughput. However, utilizing a side-chain also means making a trade-off as it does not inherit Ethereum's security features.
Polygon itself has built a network that might be considered a side-chain to Ethereum. The Polygon Proof of Stake network is an Ethereum Virtual Machine (EVM) compatible blockchain network that runs alongside Ethereum and features its own validator set as well as proprietary consensus algorithm optimized for a higher throughput than Ethereum’s layer 1. However, when it comes to the Polygon PoS network, it is worth differentiating it from a “pure” side-chain as it has a lot of extra features that rely on the security of the main Ethereum layer.
Most importantly, staking MATIC tokens happens on the Ethereum main chain, where the collective set of validators is managed as well. If a validator begins to engage in malicious behavior such as double signing or experiences extensive downtime, their stake is subject to slashing. Because staking is carried out via the Ethereum smart contract, the need to trust validators is reduced, as some of Ethereum's security features are inherited in this pivotal process.
In the unlikely event of a collusion scenario where the majority of validators engage in malicious activities, the community can still collaborate to redeploy the contracts on Ethereum and implement a fork that eliminates the malicious validators, enabling the chain to continue operating as intended.
The network also uses a checkpointing technique to increase network security in which a single Merkle root is periodically published to the Ethereum layer 1. This published state is referred to as a checkpoint. Checkpoints are important as they provide finality on the Ethereum chain. The Polygon PoS Chain contract deployed on the Ethereum layer 1 is considered to be the ultimate source of truth, and therefore all validation is done via querying the Ethereum main chain contract. Taking this into consideration, Polygon PoS should probably be referred to as a commit chain rather than a classical side-chain.
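To make the checkpointing idea concrete, here is a minimal sketch of how a single Merkle root can commit to a whole batch of transactions. This is an illustrative construction only (simple pairwise SHA-256 hashing), not Polygon's actual tree format:

```python
import hashlib

def merkle_root(leaves: list[bytes]) -> bytes:
    """Compute a simple Merkle root by pairwise hashing, duplicating the
    last node on odd-sized levels. Illustrative only."""
    if not leaves:
        raise ValueError("no leaves")
    level = [hashlib.sha256(leaf).digest() for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])  # duplicate last node on odd levels
        level = [
            hashlib.sha256(level[i] + level[i + 1]).digest()
            for i in range(0, len(level), 2)
        ]
    return level[0]

# A "checkpoint" is just this single 32-byte root, published to the L1 contract;
# it commits to every transaction in the batch.
checkpoint = merkle_root([b"tx1", b"tx2", b"tx3"])
print(checkpoint.hex())
```

Whatever the internal tree format, the point is the same: arbitrarily many L2 transactions are compressed into one small commitment that Ethereum stores and treats as the source of truth.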
Similarly, plasma chains also utilize a proprietary consensus mechanism to generate blocks. However, unlike side-chains, the "root" of each plasma chain block is broadcast to Ethereum. This is very similar to checkpointing on Polygon PoS but demands more communication with L1. The "root" in this context is essentially a small piece of information that enables users to demonstrate certain aspects of the L2 block's contents.
But while Polygon PoS has gained significant traction as priced-out users fled Ethereum in search of lower transaction fees, the overall adoption of side-chains and plasma chains as a scaling technology has remained limited.
b. State Channels
The same goes for a scaling approach referred to as state channels, that enables off-chain transactions between two or more parties. State channels allow participants to engage in a series of interactions, such as payments or game moves, without requiring each transaction to be recorded on the L1 blockchain.
The process begins with the creation of a smart contract on the Ethereum main chain. This contract includes the rules that govern the interactions and specifies the parties involved. Once the contract is established, the participants can open a state channel, which is an off-chain communication channel for executing transactions.
During the state channel's lifespan, the parties can engage in multiple interactions, updating the state of the contract through signed messages. The state changes are not immediately recorded on the blockchain, but they are verified by the smart contract, which can be enforced at a later time if necessary.
When the participants are finished with the interactions, they can close the state channel and publish the final state of the contract to the Ethereum main chain, which executes the contract and finalizes the transaction. This approach reduces the number of on-chain transactions needed to complete a series of interactions, allowing for faster and more efficient processing of transactions.
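The lifecycle above can be sketched in a few lines. The following toy two-party payment channel uses HMAC as a stand-in for real ECDSA signatures; the `Channel` class, key names and balances are all hypothetical illustration:

```python
import hashlib
import hmac

def sign(secret: bytes, state: bytes) -> bytes:
    # Stand-in for an ECDSA signature: HMAC with the party's secret key.
    return hmac.new(secret, state, hashlib.sha256).digest()

class Channel:
    """Toy two-party payment channel: parties exchange signed state updates
    off-chain; only the final, highest-nonce state is settled on-chain."""
    def __init__(self, keys):
        self.keys = keys      # party -> secret key (stand-in for a wallet key)
        self.latest = None    # (nonce, balances, signatures)

    def update(self, nonce, balances):
        # Both parties sign the new state; higher nonces supersede older ones.
        state = repr((nonce, sorted(balances.items()))).encode()
        sigs = {p: sign(k, state) for p, k in self.keys.items()}
        if self.latest is None or nonce > self.latest[0]:
            self.latest = (nonce, balances, sigs)

    def close(self):
        # The on-chain contract would verify both signatures before paying out.
        nonce, balances, sigs = self.latest
        state = repr((nonce, sorted(balances.items()))).encode()
        assert all(hmac.compare_digest(sign(self.keys[p], state), s)
                   for p, s in sigs.items()), "invalid signature"
        return balances

ch = Channel({"alice": b"key-a", "bob": b"key-b"})
ch.update(0, {"alice": 10, "bob": 0})   # opening state
ch.update(1, {"alice": 7, "bob": 3})    # off-chain payment of 3
ch.update(2, {"alice": 5, "bob": 5})    # off-chain payment of 2
print(ch.close())                        # only this final state settles on L1
```

Note how three logical transactions collapse into a single on-chain settlement, which is exactly where the scaling benefit comes from.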
While state channels initially seemed like a promising solution for scaling blockchain networks like Ethereum, they require a certain level of trust between the participants, and there is a considerable risk of disputes arising if one party fails to follow the rules outlined in the smart contract. Therefore, state channels are primarily well suited for interactions between parties who trust each other, which limits the number of use cases the technology can feasibly support. Consequently, adoption has remained low as other scaling technologies took the spotlight.
c. Homogeneous Execution Sharding
A scaling approach that many alternative L1 blockchains have chosen, and that for quite some time also seemed like the most promising solution to Ethereum’s scalability issues, is what is referred to as homogeneous execution sharding.
Homogeneous execution sharding is a blockchain scaling approach that seeks to increase the throughput and capacity of a blockchain network by splitting its transaction processing workload among multiple, smaller units (validator sub-sets) called shards. Each shard operates independently and concurrently, processing its own set of transactions and maintaining a separate state. The goal is to enable parallel execution of transactions, thus increasing the overall network capacity and speed. Harmony and Ethereum 2.0 (old roadmap only!) are two examples of scaling initiatives that have adopted or at least considered homogeneous execution sharding as part of their scaling strategy.
Harmony is an alternative L1 blockchain platform that aims to provide a scalable, secure, and energy-efficient infrastructure for decentralized applications (dApps). It uses a sharding-based approach in which the network is divided into multiple shards, each with its own set of validators who are responsible for processing transactions and maintaining a local state. Validators are randomly assigned to shards, ensuring a fair and balanced distribution of resources. The network uses a consensus algorithm called Fast Byzantine Fault Tolerance (FBFT), a variant of the Practical Byzantine Fault Tolerance (PBFT) consensus mechanism, to achieve fast and secure transaction validation across shards.
Cross-shard communication is facilitated through a mechanism called "receipts," which allows shards to send information about the state changes resulting from a transaction to other shards. This enables seamless interactions between dApps and smart contracts residing on different shards, without compromising the security and integrity of the network.
Ethereum 2.0 is a planned upgrade to the Ethereum network aiming to address the scalability, security, and sustainability issues faced by the original Proof-of-Work (PoW) based Ethereum. The old Ethereum 2.0 roadmap proposed a multi-phase rollout, transitioning the network to a Proof-of-Stake (PoS) consensus mechanism (which we finally saw happening last fall) and introducing execution sharding to improve scalability (in the old roadmap!). Under this original plan, Ethereum 2.0 would have consisted of a Beacon Chain and 64 shard chains. The Beacon Chain was designed to manage the PoS protocol, validator registration, and cross-shard communication.
The shard chains, on the other hand, were individual chains responsible for processing transactions and maintaining separate states in parallel. Validators would have been assigned to a shard and would rotate periodically to maintain the security and decentralization of the network. The Beacon Chain would have kept track of validator assignments and managed the process of finalizing shard chain data. Cross-shard communication was planned to be facilitated through a mechanism called "crosslinks," which would periodically bundle shard chain data into the Beacon Chain, allowing state changes to be propagated across the network.
However, the Ethereum 2.0 roadmap has since evolved, and execution sharding has been replaced by an approach referred to as data sharding that aims to provide the scalable basis for a more complex scaling technology known as rollups (more on this soon!).
d. Heterogeneous Execution Sharding
Heterogeneous execution sharding is a blockchain scaling approach that connects multiple, independent blockchains with different consensus mechanisms, state models, and functionality into a single, interoperable network. This approach allows each connected blockchain to maintain its unique characteristics while benefiting from the security and scalability of the entire ecosystem. Two prominent examples of projects that employ heterogeneous execution sharding are Polkadot and Cosmos.
Polkadot is a decentralized platform designed to enable cross-chain communication and interoperability among multiple blockchains. Its architecture consists of a central Relay Chain, multiple Parachains, and Bridges.
Relay Chain: The main chain in the Polkadot ecosystem, responsible for providing security, consensus, and cross-chain communication. Validators on the Relay Chain are in charge of validating transactions and producing new blocks.
Parachains: Independent blockchains that connect to the Relay Chain to benefit from its security and consensus mechanisms, as well as enable interoperability with other chains in the network. Each parachain can have its own state model, consensus mechanism, and specialized functionality tailored to specific use cases.
Bridges: Components that link Polkadot to external blockchains (like Ethereum) and enable communication and asset transfers between these networks and the Polkadot ecosystem.
Polkadot uses a hybrid consensus mechanism called Nominated Proof-of-Stake (NPoS) to secure its network. Validators on the Relay Chain are nominated by the community, and they, in turn, validate transactions and produce blocks. Parachains can use different consensus mechanisms, depending on their requirements. What is an important feature of Polkadot’s network architecture is that by design, all parachains share security with the relay chain, hence inheriting the relay chain security guarantees.
Cosmos is another decentralized platform that aims to create an "Internet of Blockchains," facilitating seamless communication and interoperability between different blockchain networks. Its architecture is similar to Polkadot’s and composed of a central Hub, multiple Zones, and Bridges.
Hub: The central blockchain in the Cosmos ecosystem enabling cross-chain communication and soon inter-chain security (shared security similar to Polkadot). Cosmos Hub uses a Proof-of-Stake (PoS) consensus mechanism called Tendermint, which offers fast finality and high throughput. Theoretically, there can be multiple hubs. But especially with ATOM 2.0 and inter-chain security coming up, the Cosmos Hub will likely remain the center of the Cosmos-enabled internet of blockchains.
Zones: Independent blockchains connected to the Hub, each with its own consensus mechanism, state model, functionality and generally also validator set. Zones can communicate with each other through the Hub using a standardized protocol called Inter-Blockchain Communication (IBC).
Bridges: Components that link the Cosmos ecosystem to external blockchains, allowing asset transfers and communication between Cosmos Zones and other networks.
Both Polkadot and Cosmos are examples of heterogeneous execution sharding, as they connect multiple, independent blockchains with diverse functionality, consensus mechanisms, and state models into a single, interoperable ecosystem. This approach allows each connected chain to maintain its unique characteristics while enabling scalability by separating application-specific execution layers from each other that still benefit from the cross-chain communication and security capabilities of the entire network.
e. Scaling Ethereum on Rollups
Rollups take sharding within a shared security paradigm to the next level. It’s a scaling solution in which transactions are processed off-chain in the execution environment of the rollup and, as the name suggests, rolled up into batches. Sequencers collect transaction data on the L2 and submit the data to a smart contract on Ethereum L1 that enforces correct transaction execution on L2 and stores the transaction data on L1, thereby enabling rollups to inherit the security of the battle-tested Ethereum base layer.
So now what were essentially shards in the old Ethereum 2.0 roadmap are completely decoupled from the network and devs have a wide open space to develop their L2 “chain” however they want (similar to Polkadot’s parachains or Cosmos’ zones), while still being able to rely on Ethereum L1’s security by communicating through arbitrary smart contracts developed in a way that is optimized for the specific rollup. Another key advantage, for example compared to side-chains, is that a rollup does not need a validator set and consensus mechanism of its own.
A rollup system only needs to have a set of sequencers (performing the tasks outlined above), with only one sequencer needing to be live at any given time. With weak assumptions like this, rollups can actually run on a small set of high-spec server-grade machines, allowing for great scalability. However, most rollups try to design their systems as decentralized as possible (more on that later). Also, instead of consensus mechanisms, rollups can for example have coordination mechanisms with rotation schedules to rotate sequencers accordingly, thereby increasing security & reducing the time the (high-spec) sequencer nodes need to be online.
Generally, there are two types of rollup systems:
What is referred to as optimistic rollups are characterized by having a sequencer node that collects transaction data on L2, subsequently submitting this data to the Ethereum base layer alongside the new L2 state root. In order to ensure that the new state root submitted to Ethereum L1 is correct, verifier nodes will compare their new state root to the one submitted by the sequencer. If there is a difference, they will begin what’s called a fraud proof process. If the fraud proof’s state root is different from the one submitted by the sequencer, the sequencer’s initial deposit (a.k.a. bond) will be slashed. The state roots from that transaction onward will be erased and the sequencer will have to recompute the lost state roots.
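A minimal sketch of this fraud-proof logic: a verifier re-executes the batch deterministically and rejects (and slashes) any claimed state root that doesn't match. The toy state model and `verify` helper are illustrative assumptions, not any specific rollup's implementation:

```python
import hashlib

def state_root(state: dict) -> str:
    # Stand-in for a real state commitment (a Merkle-Patricia root in practice).
    return hashlib.sha256(repr(sorted(state.items())).encode()).hexdigest()

def apply_batch(state: dict, batch) -> dict:
    # Deterministically re-execute a batch of (account, delta) balance changes.
    new = dict(state)
    for account, delta in batch:
        new[account] = new.get(account, 0) + delta
    return new

def verify(prev_state, batch, claimed_root, sequencer_bond):
    """Verifier node: re-execute the batch and compare state roots. On a
    mismatch the fraud-proof path fires: the root is rejected and the
    sequencer's bond is slashed."""
    honest_root = state_root(apply_batch(prev_state, batch))
    if honest_root != claimed_root:
        return {"accepted": False, "bond": 0}     # slashed; roots recomputed
    return {"accepted": True, "bond": sequencer_bond}

state = {"alice": 10, "bob": 0}
batch = [("alice", -3), ("bob", +3)]
honest = state_root(apply_batch(state, batch))
print(verify(state, batch, honest, 100))     # accepted, bond intact
print(verify(state, batch, "bogus", 100))    # fraud proof fires, bond slashed
```

The key property is that verification requires full re-execution, which is why the dispute window (and withdrawal delay) exists.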
Zero Knowledge rollups or zk-rollups on the other hand rely on validity proofs in the form of Zero Knowledge proofs (e.g. SNARKs or STARKs) instead of fraud proving mechanisms. So basically, similar to the optimistic rollup systems, a sequencer collects transaction data on L2 and is responsible for submitting (and sometimes also generating) the Zero Knowledge proof to L1. The sequencer’s stake can be slashed if they act maliciously, which incentivizes them to post valid blocks (or proofs of batches). The prover (or sequencer if combined in one) generates unforgeable proofs of the execution of transactions, proving that these new states and executions are correct.
The sequencer subsequently submits these proofs to the verifier contract on the Ethereum mainnet. Technically, the responsibilities of sequencers and provers can be combined into one role. However, because proof generation and transaction ordering each requires highly specialized skills to perform well, splitting these responsibilities prevents unnecessary centralization in a rollup’s design. The Zero Knowledge proof the sequencer submits to L1 reports only the changes in L2 state and provides this data to the verifier smart contract on Ethereum mainnet in the form of a verifiable hash.
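The sequencer/prover/verifier flow can be sketched as follows. Real SNARKs/STARKs are publicly verifiable without any secret key; the HMAC below is only a stand-in for an unforgeable attestation, and all names are illustrative:

```python
import hashlib
import hmac

# Stand-in prover key: a real SNARK/STARK needs no secret at all — the
# proof itself is publicly verifiable. HMAC only mimics unforgeability here.
PROVER_KEY = b"illustrative-only"

def commit(data) -> bytes:
    # Succinct commitment to arbitrary data (batch contents, state, ...).
    return hashlib.sha256(repr(data).encode()).digest()

def prove(prev_root: bytes, new_root: bytes, batch) -> bytes:
    # Stand-in for validity-proof generation over the state transition.
    return hmac.new(PROVER_KEY, prev_root + new_root + commit(batch),
                    hashlib.sha256).digest()

def l1_verify(prev_root: bytes, new_root: bytes, batch, proof: bytes) -> bool:
    # The verifier contract checks only the succinct proof; unlike an
    # optimistic rollup, it never re-executes the batched transactions.
    expected = hmac.new(PROVER_KEY, prev_root + new_root + commit(batch),
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, proof)

# Sequencer flow: collect a batch, compute the new state root, attach a proof.
prev_root = commit({"alice": 10})
new_root = commit({"alice": 7, "bob": 3})
batch = [("alice", "bob", 3)]            # alice sends 3 to bob
proof = prove(prev_root, new_root, batch)
print(l1_verify(prev_root, new_root, batch, proof))   # True
```

Contrast this with the optimistic flow: here the verifier's work is a cheap proof check rather than full re-execution, which is what enables instant finality on L1.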
Determining which approach is superior is a challenging task. However, let's briefly explore some key differences. Firstly, because validity proofs can be mathematically proven, the Ethereum network can trustlessly verify the legitimacy of batched transactions. This differs from optimistic rollups, where Ethereum relies on verifier nodes to validate transactions and execute fraud proofs if necessary. Hence, some may argue that zk-rollups are more secure. Furthermore, validity proofs (the zero-knowledge proofs) enable instant confirmation of rollup transactions on the main chain.
Consequently, users can transfer funds seamlessly between the rollup and the base blockchain (as well as other zk-rollups) without experiencing friction or delays. In contrast, optimistic rollups (such as Optimism and Arbitrum) impose a waiting period before users can withdraw funds to L1 (7 days in the case of Optimism & Arbitrum), as the verifiers need to be able to verify the transactions and initiate the fraud proving mechanism if necessary. This limits the efficiency of rollups and reduces the value for users. While there are ways to enable fast withdrawals, it is generally not a native feature.
Rollups play an especially important role in Ethereum 2.0’s rollup-centric roadmap. Along with Proof of Stake consensus, the central feature in the ETH 2.0 design is what is referred to as data sharding. As per Ethereum’s rollup-centric roadmap, instead of actually processing transactions, the shards would simply store data & attest to the availability of ~250 kB sized blobs (small chunks) of data (see figure above). However, data sharding will still take some time until it’s fully implemented and deployed.
That’s where EIP-4844 comes into play. EIP-4844 (a.k.a. proto-danksharding) provides an interim solution by already implementing the transaction format that will be used in data sharding, without actually sharding the data yet. Instead, data from this transaction format (data blobs) is part of the beacon chain (L1) & is fully downloaded by all consensus nodes. However, to prevent state bloat, the blobs are not permanently stored and can be deleted after a short delay.
The new transaction type which is referred to as a blob-carrying transaction, is generally just like a regular transaction, except it also carries these extra pieces of data referred to as blobs. Blobs are rather large (~125 kB) & are much cheaper than similar amounts of call data that rollups would normally have to pay for when posting state roots or validity proofs to L1. Because validators & clients still have to download full blob contents, data bandwidth in proto-danksharding is targeted to 1 MB per slot instead of the full 16 MB that are targeted when data sharding is fully implemented. However, there are still large scalability gains because this data is not competing with the gas usage of other Ethereum L1 transactions.
Hence, L1 execution can be congested & expensive, while blobs remain very cheap (at least in the medium term). Finally, another benefit of the EIP-4844 rollout is that while rollups have to adapt once to switch to EIP-4844, they won’t have to adapt again when full data sharding is finally implemented. It’s the same transaction format, but it will safely allow for additional data.
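A quick back-of-envelope calculation using the figures above (~125 kB blobs, 16 gas per calldata byte, ~1 MB per slot target) shows why blobs are attractive; the numbers are rounded illustrations, not protocol constants:

```python
# Back-of-envelope comparison of calldata vs. blob data bandwidth.
CALLDATA_GAS_PER_BYTE = 16           # non-zero calldata byte cost on L1
BLOB_SIZE_BYTES = 125 * 1024         # ~125 kB per blob (figure from the text)

# Posting one blob's worth of rollup data as calldata instead would cost:
calldata_gas = BLOB_SIZE_BYTES * CALLDATA_GAS_PER_BYTE
print(f"{calldata_gas:,} gas")       # ~2M gas, competing with all other L1 txs

# Data bandwidth targets:
proto_target = 1 * 1024 * 1024       # ~1 MB per slot under proto-danksharding
full_target = 16 * 1024 * 1024       # ~16 MB per slot with full data sharding
blobs_per_slot = proto_target // BLOB_SIZE_BYTES
print(blobs_per_slot, "blobs per slot at the proto-danksharding target")
```

Because blob space has its own fee market, that ~2M-gas-equivalent of data no longer bids against ordinary L1 execution, which is where the fee reduction for rollups comes from.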
Through data sharding in the long term and EIP-4844 in the short term, Ethereum aims to address the data availability problem, which refers to the question of how peers in a blockchain network can be sure that all the data of a newly proposed block is actually available. If data is not available, a block might contain malicious transactions which are being hidden by the block producer. Even if the block contains non-malicious transactions, hiding transactions might compromise the security of the system.
This data availability problem is especially prominent in the context of rollup systems. It is very important that sequencers can make transaction data available, as the rollup needs to know about its state and users’ account balances. Data availability also introduces certain limitations to rollups on Ethereum. Even if the sequencer were an actual supercomputer, the number of transactions per second it can actually compute will be limited by the data throughput of the underlying data availability solution/layer it uses. If the data availability solution/layer used by a rollup is unable to keep up with the amount of data the rollup’s sequencer wants to dump on it, the sequencer (and the rollup) can’t process more transactions even if it wanted to.
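This data-availability bound is easy to quantify. The sketch below assumes purely illustrative figures (a transfer compressing to ~12 bytes of posted data, ~125 kB of DA capacity per 12-second slot):

```python
def max_tps(da_bytes_per_sec: float, bytes_per_tx: float) -> float:
    """Upper bound on rollup throughput imposed purely by data availability:
    no matter how fast the sequencer computes, it cannot post more data
    than the DA layer accepts per second."""
    return da_bytes_per_sec / bytes_per_tx

# Illustrative assumptions: ~12 bytes of posted data per transfer,
# ~125 kB of DA capacity per 12-second slot.
da_rate = 125 * 1024 / 12            # bytes per second
print(round(max_tps(da_rate, 12)), "tx/s ceiling, regardless of compute")
```

Scaling the DA layer (blobs now, data sharding later) raises this ceiling directly, which is why rollup roadmaps are so tightly coupled to Ethereum's data roadmap.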
By transforming the Ethereum base layer into a major data availability / settlement layer for an almost infinite number of highly scalable, rollup-based execution layers, the overall Ethereum network and its rollup ecosystems will enable enormous scale.
f. Polygon zk-Rollup Scaling Initiatives
Polygon has in many ways been a pioneer and a driving force behind the development of Zero Knowledge rollup technology. Since the launch of its PoS network, Polygon has pivoted to becoming an application platform providing all the tools and infrastructure needed to build various kinds of EVM-compatible layer 2 solutions. The vision includes providing developers with an SDK to build various layer 2 scaling solutions, including rollup implementations such as zk-rollups, but also side- or commit-chain solutions like Polygon PoS and state channels. Polygon will also supply PoS validators from its existing network to support L2 infrastructure and build bridges between the various layer 2s, aiming to become an aggregator for scalable Ethereum L2 solutions.
Probably most importantly, Polygon is strongly committed to scaling Ethereum on zk-rollups and plans to continue investing in the development of zero-knowledge technology. Over the years, Polygon has launched a variety of initiatives working towards scalable and secure zk-rollup implementations that allow for generalized computation and EVM equivalence. These initiatives include Polygon Zero, Hermez, Miden, Nightfall and what is now known as the Polygon zkEVM. Let’s have a brief look at what these initiatives aim to achieve and how some of them form the basis of the Polygon zkEVM that stands at the center of this report.
Polygon Zero is a zk-rollup solution designed to reduce the computational cost of generating validity proofs by utilizing the recursive Plonky2 proof system, a proving mechanism originally developed by the team behind Mir Protocol, which Polygon acquired for $400m. Plonky2 is said to be the cheapest and most performant proof system for proof generation / verification on Ethereum. Polygon Zero aims to be compatible with the Ethereum Virtual Machine (EVM) and can theoretically batch up to 3,000 transactions per block.
Polygon Hermez is a decentralized zk-rollup project built on the Ethereum network, with decentralization as its primary focus. It was originally acquired by Polygon for $250m and uses a novel L2 consensus algorithm called Proof of Efficiency (PoE) consisting of Sequencers and Aggregators, who work together to ensure rollup functionality. Polygon Hermez achieves significant throughput by batching up to 2,000 transactions and using SNARK proofs to validate transactions.
Polygon Miden is a general-purpose, STARK-based zk-rollup with EVM compatibility that relies on the Miden Virtual Machine (VM) to execute arbitrary logic and run smart contracts. Developers can compile Solidity or Vyper code into Miden Assembly, with the rollup being capable of processing up to 5,000 transactions per block and achieving over 1,000 TPS at launch.
Polygon Nightfall, an enterprise rollup solution, combines optimistic rollups with zero-knowledge cryptography to offer scalable and private blockchain transactions for large-scale companies. Developed in collaboration with Ernst & Young (EY), the rollup architecture consists of Nightfall Contracts, Block Proposers, Challengers, and Liquidity Providers. Polygon Nightfall supports the secure and private exchange of various token standards and aims to deliver a 100 TPS rate for enterprise clients on Ethereum.
Hermez 2.0, more commonly referred to as Polygon zkEVM, is an upgrade to Polygon Hermez that aims to introduce EVM equivalence (more on this in “Levels of EVM Equivalence”) to the existing zk-rollup tech, thereby enabling the porting of Ethereum-based dApps or launching EVM dApps directly on the rollup. Polygon zkEVM will also leverage Polygon Zero’s cutting-edge Zero Knowledge proof system Plonky2 and Polygon Miden’s STARK proofs (more on this in “ZK Proof System (plonky2)”). This underlines how the zkEVM, undoubtedly a great technological innovation, is the result of research & engineering work across teams and areas spanning all the aforementioned Polygon initiatives.
2. Deep-Dive into zk-Rollups
Now that we have established a basic understanding of zk-rollup technology, let's dive a bit deeper, explore the benefits that are inherent to zk-rollups and investigate some of the limitations and issues that exist for this nascent technology.
a. What makes zk-Rollups so exciting?
i. Scalability
The most obvious reason to be excited about zk-rollups is the enormous scale they provide. By providing an execution environment that is separated from the congested L1 execution layer and not bound by its limitations in terms of block size and transaction throughput, zk-rollups theoretically enable thousands of transactions per second.
But how does it look on the cost side of things? As we know, many users are priced out of the Ethereum L1 economy in times of high network activity as gas prices spike. Let’s have a closer look at rollup fees. As mentioned earlier, rollups need to post data on L1; hence, operating a rollup incurs a cost on Ethereum mainnet as well. Right now, the cheapest option for rollups to post data to the Ethereum base layer is calldata (16 gas per non-zero byte). However, this can still become rather expensive since there is no separation between the fee market for calldata and the fee market for general execution. Consequently, when gas prices are driven up by L1 activity, posting calldata gets more expensive.
But back to rollup fees. Generally, fees on a rollup have three components: fees by the rollup for processing the transaction, batch/verification fees, and fees for posting the transaction data to L1 (as calldata). It is not unreasonable to assume that currently, 60-70% of rollup fees are attributable to fees incurred on L1.
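The three-part fee decomposition can be sketched as a simple calculation. All numbers below are purely illustrative (not measured figures), chosen so the L1-attributable share lands in the ballpark mentioned above:

```python
def rollup_fee(l2_execution_fee, verification_fee_per_batch, batch_size,
               tx_calldata_bytes, l1_gas_price_gwei, gas_per_byte=16):
    """Per-transaction fee (in gwei) = L2 execution fee
    + amortized share of the L1 batch-verification fee
    + cost of posting this transaction's calldata to L1."""
    l1_data_fee = tx_calldata_bytes * gas_per_byte * l1_gas_price_gwei
    return (l2_execution_fee
            + verification_fee_per_batch / batch_size
            + l1_data_fee)

# Purely illustrative numbers:
fee = rollup_fee(l2_execution_fee=40_000,
                 verification_fee_per_batch=50_000_000, batch_size=1_000,
                 tx_calldata_bytes=100, l1_gas_price_gwei=30)
l1_share = (50_000_000 / 1_000 + 100 * 16 * 30) / fee
print(f"total {fee:,.0f} gwei, L1-attributable share {l1_share:.0%}")
```

Note how both L1 components shrink with larger batches and better data compression, which is exactly the lever the optimizations discussed next pull on.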
However, based on data from L2Fees.info, rollups frequently provide transaction fees that are 5-10x lower than fees on L1 already today (see table below). In times of congestion and high gas prices on L1 the difference can be even larger. Moreover, application-specific zk-rollups, which have better data compression have even achieved ~50-100x lower fees than the Ethereum L1. With the rollout of EIP-4844 (see “Scaling Ethereum on Rollups”), rollup fees are expected to be further reduced by up to 100x, as instead of posting call data to L1, rollups post transaction data into temporary data blobs that essentially have a fee market separated from L1 execution exclusively for rollups.
However, the long-term solution to this issue remains data sharding, which will add ~16 MB per block of dedicated data space to the chain that rollups could use to post data. Rollups themselves can reduce cost by improving data compression and reducing the amount of data posted to L1. Some projects also pivot to using off-chain data availability solutions (e.g. the optimistic rollup Metis but also Validiums like Immutable).
Specifically in the context of zk-rollups, it gets even more interesting when we look at what is referred to as layer 3 rollups. WTF is a layer 3 you’re asking yourself? Well, L3 relates to L2 just as L2 relates to L1. L3 can be realized using validity proofs as long as the Zero Knowledge proof system used supports recursiveness (proofs can prove validity of other proofs, see “ZK Proof System (plonky2)”) and the underlying L2 is capable of supporting a verifier smart contract to verify said validity proofs (hence is a generalized computation layer like Polygon zkEVM).
When the L2 also uses validity proofs which are submitted to L1, this becomes an extremely scalable recursive Zero Knowledge proof structure where the compression benefit of L2 proofs is multiplied by the compression benefit of L3 proofs, enabling hyper-scalability. Moreover, on/off-ramping flows between L1 and L2 are notoriously expensive. This is different between L2 and L3, allowing for much cheaper and simpler L2-L3 interoperability.
In addition, independent L3 systems will interoperate with each other via the cheap L2, not the expensive L1, also allowing for very cheap L3-L3 interoperability. Finally, L3 structures provide developers with improved control over the technology stack, meaning that potential L3 solutions could be highly optimized application-specific systems that are customized for better application performance.
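The multiplication of compression benefits can be captured in a back-of-the-envelope calculation (the batch sizes are assumed, illustrative numbers):

```python
def txs_per_l1_verification(l2_batch_size: int, l3_batch_size: int) -> int:
    """One L1 verification covers an L2 proof over `l2_batch_size` items; if
    each item is itself an L3 proof over `l3_batch_size` transactions, the
    amortization benefits multiply."""
    return l2_batch_size * l3_batch_size

# e.g. 1,000 L3 txs per L2 proof and 1,000 L3 proofs per L2 batch means a
# single L1 proof verification amortizes over one million transactions.
covered = txs_per_l1_verification(1_000, 1_000)
```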
ii. Security
Aside from scalability, which is also offered by some alternative layer 1 blockchains like Solana, rollups primarily pride themselves on their security. The main distinction to alt L1 chains is that rollups do not have a proprietary validator set. Instead, thanks to the mathematical verifiability of the proofs the rollups post on L1, they effectively share security with the Ethereum base layer. This in turn means that rollup systems are essentially secured by Ethereum’s highly decentralized validator set, consisting of over 500,000 validators with a total of over USD 20bn in ETH staked.
However, despite their production use, Zero Knowledge proof systems are still relatively new and complex, and they rely on the proper implementation of the polynomial constraints used to check the validity of the Execution Trace. Bugs in this process can expose users to risks (see “Risks”).
Additionally, many zk-rollup implementations currently rely on a centralized sequencer. The sequencer is the only entity that can propose blocks, and a live and trustworthy sequencer is vital to the health of any rollup system. A lack of decentralization can hence pose a security risk, as the sequencer could go offline or censor transactions. However, zk-rollups generally have mechanisms in place to protect users and keep up security guarantees. Where block production is open to anyone, users who experience censorship from the sequencer can propose their own blocks, which would include their transactions (as is the case with Polygon zkEVM).
Additionally, many systems support forced withdrawals: users who are censored by the sequencer can submit their withdrawal requests directly on L1, and the L2 system is then obliged to service them. Once the forced operation is submitted and serviced, it follows the flow of a regular exit (a feature also supported by Polygon zkEVM).
In summary, it is reasonable to assume that a zk-rollup implementation like Polygon zkEVM provides stronger security guarantees than most alternative L1 chains, which rely on security provided by smaller validator sets with less value at stake.
iii. Privacy
Finally, the reliance on zero knowledge proofs also reduces the volume of transaction data that needs to be shared, which gives zk-rollups privacy-preserving properties almost by default. The core concept behind zk-rollups is the utilization of zero knowledge proofs: cryptographic techniques that allow one party (the prover) to prove to another party (the verifier) the validity of a statement without revealing any specific information about it. In the context of zk-rollups, these proofs are used to validate the correctness of a batch of transactions without disclosing the details of individual transactions.
The privacy benefits of zk-rollups stem from their ability to conceal the inputs of transactions while still proving their legitimacy. When users submit transactions to a zk-rollup, the rollup aggregates multiple transactions into a single batch and then generates a Zero Knowledge proof attesting to the validity of the entire batch. This proof is then submitted to the underlying blockchain (e.g., Ethereum) for final verification and storage.
The key advantage of this approach is that the proof itself exposes no transaction details: the zero-knowledge proof provides a succinct and cryptographically secure representation of the batched transactions. Consequently, the details of individual transactions – such as sender and recipient addresses, amounts, and other metadata – are not revealed by the proof. Note, however, that rollups which post full transaction data to L1 for data availability give up part of this privacy benefit, as that posted data remains publicly readable.
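To make the prover/verifier interface concrete, here is a toy sketch. This is emphatically not real zero-knowledge cryptography: the HMAC stands in for a validity proof and the shared "prover key" is an assumption for illustration only. What it does show is the shape of the interface – the verifier checks a transition between two state roots without ever seeing the transactions:

```python
import hashlib
import hmac

# Assumption for illustration: a single trusted prover holding this key.
# A real zk-rollup replaces the HMAC with a SNARK/STARK validity proof.
PROVER_KEY = b"toy-prover-key"

def apply_batch(old_root: bytes, txs: list) -> bytes:
    """Deterministic toy state transition: fold each tx into the root."""
    root = old_root
    for tx in txs:
        root = hashlib.sha256(root + tx).digest()
    return root

def prove(old_root: bytes, txs: list) -> tuple:
    """Prover: execute the batch and attest to the (old, new) root pair."""
    new_root = apply_batch(old_root, txs)
    proof = hmac.new(PROVER_KEY, old_root + new_root, hashlib.sha256).digest()
    return new_root, proof

def verify(old_root: bytes, new_root: bytes, proof: bytes) -> bool:
    """Verifier: its only inputs are the roots and the proof -- no tx data."""
    expected = hmac.new(PROVER_KEY, old_root + new_root, hashlib.sha256).digest()
    return hmac.compare_digest(proof, expected)
```

The point of the sketch is the asymmetry: `prove` sees the transactions, `verify` never does.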
b. What are the current issues & limitations?
But, probably unsurprisingly considering how nascent Zero Knowledge rollup technology is, there are still issues and limitations that need to be addressed. This section will introduce some of the key pain points of zk-rollup-based scaling.
i. Data Availability
Data availability is the primary scaling bottleneck for Ethereum-based rollup systems and a hot topic at the frontier of blockchain scaling. But what is the data availability problem and how is it addressed? Let’s quickly recap.
The data availability problem pertains to the challenge of ensuring that all data within a newly proposed block is accessible to peers within a blockchain network. In the event that some data is unavailable, the block could potentially contain malicious transactions that are deliberately concealed by the block producer. Even if the transactions are non-malicious, concealing them could jeopardize the system's security. This issue is particularly pronounced in the context of rollup systems, where it is crucial for sequencers to have access to transaction data, as they need to be aware of the network's state and account balances. Data availability limitations also impose constraints on rollups in general.
Regardless of the sequencer's computational capabilities, the number of transactions it can process per second is ultimately restricted by the data throughput capacity of the underlying data availability solution. If the solution employed by a rollup is unable to keep pace with the volume of data that the rollup's sequencer intends to process, the sequencer (and the rollup) will be unable to handle more transactions, irrespective of its inherent technological capacity to do so.
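This bottleneck can be expressed as a one-line ceiling (the numbers in the example are assumptions for illustration, not measured figures):

```python
def max_tps(da_bytes_per_sec: float, bytes_per_tx: float) -> float:
    """Transactions per second cannot exceed DA bandwidth divided by data per
    tx, regardless of how much compute the sequencer has."""
    return da_bytes_per_sec / bytes_per_tx

# Example: ~120 kB/s of data-availability headroom and 12 bytes of
# compressed data per transaction caps the rollup at 10,000 TPS.
ceiling = max_tps(120_000, 12)
```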
While the before-mentioned EIP-4844 and data sharding (see ”Scaling Ethereum on Rollups”) aim to address the data availability problem on Ethereum, some projects have pivoted to using off-chain data availability solutions.
A “pure” validium for example uses zero knowledge proofs for transaction validity (like a zk-rollup) but stores transaction data off-chain with a centralized data provider as opposed to on-chain on L1. Validiums offer very low cost per transaction (because they avoid the cost of posting full transaction data on L1); however, security guarantees are comparably weak, as accessing the latest state in a validium requires off-chain data to be available. Hence, the risk that the data provider misbehaves or goes offline does exist. To address these security concerns, most current validium designs utilize a Data Availability Committee (DAC) rather than a single data provider (as in the pure validium). Basically, DAC-based solutions can be thought of as validiums with multiple nodes, where the members of the committee are trusted parties that keep copies of data off-chain and make the data available. An example of a DAC-based validium is Immutable.
Celestiums on the other hand are a novel type of L2 network that is based on Celestia’s specialized data availability layer but uses Ethereum L1 for settlement and dispute resolution. Basically, Celestiums are a form of a permissionless DAC scaling solution with additional economic guarantees on data availability because the decentralized committee can be slashed in case of malicious behavior.
Finally, as explained earlier, “true” rollups directly use the underlying Ethereum base layer as data availability layer, which comes with strong security guarantees, but also high cost and certain limitations. While Polygon zkEVM does rely on Ethereum for data availability, it mitigates the data availability bottleneck by using a highly advanced proving system with cutting-edge data compression capabilities. Additionally, Polygon had also been working on a specialized data availability layer under the name Polygon Avail, which will enable Celestium-style network implementations. However, the project recently decoupled from Polygon and is now building independently under the name Avail Project. Interestingly, Avail is building its chain on Polkadot’s Substrate framework.
ii. Sequencer Decentralization
Many rollups use a single node called a sequencer to generate blocks on L2. This has speed advantages, as blocks can be generated in seconds since they do not need to be handed over to other nodes for verification. However, it also introduces certain centralization concerns, especially when there is no way for permissionless participation and the team basically becomes a centralized operator of the network.
Luckily many rollups also take steps to address these centralization issues. A common approach is to use a Proof of Stake based rotation mechanism in which sequencers are chosen from a permissionless pool of sequencers. However, many argue that randomly assigning block production to single validators does not ensure a sufficient degree of decentralization. To tackle these issues, a novel consensus model called Proof-of-Efficiency (PoE) has been proposed by the Polygon Hermez team (that will also be used in the Polygon zkEVM implementation). This mechanism will be explained in detail in section “Network Architecture”.
But in many ways, the PoE mechanism is similar to the Proposer-Builder Separation (PBS) that is part of the data sharding rollout in the Ethereum 2.0 roadmap and also aims to promote decentralization in the block production / validation process. The data sharding rollout will implement a combination of PBS and what is referred to as crLists. PBS creates a new role in the form of builders, which aggregate all Ethereum L1 transactions as well as raw data from rollups into lists or block candidates and submit them to proposers. While there can be many competing builders, some censorship risks still exist.
What if all builders choose to censor certain transactions? That’s where the crList comes into play. With a crList, proposers can force builders to include transactions. As a result, PBS allows third-party builders to compete for providing the best block of transactions to the next proposer and allows stakers to capture as much of the economic value of their block space as possible, trustlessly and in protocol. The crList component on the other hand is a censorship resistance mechanism that prevents builders from abusing their powers and forces non-censorship of user transactions.
How does PBS work exactly, you ask? On a high level, the proposer observes the mempool and publishes a crList, which is essentially a list of transactions that must be included in the block. Builders then collect transactions from the mempool, order them to maximize MEV (Maximal Extractable Value), and submit block candidates, but any valid candidate must include the transactions on the crList (unless the block is full). In this way, although proposers have no say in how transactions are ordered, they can still make sure transactions coming from the mempool enter the block in a censorship-resistant manner by forcing builders to include them. In conclusion, the proposer-builder separation essentially builds up a firewall and a market between proposers and builders.
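Whichever party publishes the crList, the constraint it enforces can be sketched as a simple validity check: a candidate block satisfies the crList if every listed transaction appears in it, unless the block is already full. This is a deliberately simplified sketch (real designs reason about gas limits, not transaction counts):

```python
def satisfies_crlist(block_txs: list, crlist: list, max_block_txs: int) -> bool:
    """A block candidate is acceptable if it includes every crList entry,
    with an exemption for blocks that are already at capacity."""
    if len(block_txs) >= max_block_txs:
        return True  # full blocks are exempt from the inclusion requirement
    return all(tx in block_txs for tx in crlist)
```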
iii. Prover Decentralization
As we know from the section “Scaling Ethereum on Rollups”, zk-rollups require an off-chain prover to generate a succinct proof for a batch of transactions. However, generating these proofs for complex smart contract transactions can be costly.
To summarize, the typical transaction flow on a zk-rollup is as follows:
Users send transactions to a centralized sequencer (coordinator) on layer 2.
The sequencer executes transactions, packs multiple transactions into a rollup block, and pre-confirms the client once the transaction is included in a block.
A (often centralized) prover generates a succinct proof for the rollup block, which is uploaded to Layer 1 with the minimum required data for verification.
The layer 1 smart contract verifies the proof and updates the state (i.e., root hash).
The fact that many zk-rollups take a rather centralized design approach, with both the prover and sequencer being centralized, leads to several issues that limit their functionality:
Limited computational power: Proof generation time is crucial for zk-rollups, particularly for large-sized circuits (e.g., zkEVM). While customized hardware (ASICs) can significantly reduce proof generation time, the high upfront cost of ASIC design and manufacturing makes it economically impractical for a centralized entity.
Limited community participation: Community members or users cannot join the ecosystem other than by purchasing tokens and waiting for their release from the zk-Rollup company. This makes it difficult to distribute shares fairly in a centralized setting.
Potential attack from MEV and Denial of Transactions: The sequencer orders transactions centrally, enabling frontrunning for profit (e.g., inserting bad transactions). This issue is known as "Miner Extractable Value" in blockchain. The sequencer can even refuse to include some transactions in the Rollup block.
To address these issues, Scroll (zkEVM rollup, see chapter “The zkEVM Landscape”) has proposed a layer 2 proof outsourcing mechanism to decentralize the prover. This involves engaging “miners” to generate proofs and rewarding them according to their completed proving work, which typically correlates positively with circuit size (an important parameter with regards to zk proof systems). This encourages the miner community to contribute computational power to the platform. It is essential to note, however, that this is not Proof of Work (PoW) but rather “volunteer computation” or “verifiable outsourced computation”. To differentiate them from PoW miners, these actors are referred to as rollers.
In Scroll's proposed implementation, one must stake SCR tokens (or however Scroll’s native token will be called) in a smart contract to become a legitimate roller and generate proofs. An initial reputation ratio proportional to the deposit is granted, with a higher deposit yielding a higher reputation ratio. The (in this proposed design) centralized sequencer selects multiple rollers for each block based on their normalized reputation ratio and sends the block to the selected rollers to generate proofs within time limit T.
The sequencer then verifies the proofs received from the rollers. In this scheme, the reputation ratio balances a roller’s stake and computational power. The stake determines the upper bound of a roller’s probability of being chosen, while the reputation ratio reflects the roller’s actual computational power. This mechanism ensures fairness, as not only the fastest roller but every roller has a chance to receive rewards. To maximize profit, rollers will be more willing to generate proofs for different blocks in parallel. All parameters will be dynamically adjusted to the community’s computational power at the time.
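A minimal sketch of such stake-weighted selection, assuming a plain normalized-weight draw without replacement (the parameter names and the collapsing of reputation into stake are simplifying assumptions, not Scroll's actual design):

```python
import random

def select_rollers(stakes: dict, k: int, seed: int = 0) -> list:
    """Pick up to k distinct rollers, each draw proportional to its share of
    the remaining stake (a proxy for the normalized reputation ratio)."""
    rng = random.Random(seed)  # deterministic for the example; a real system
                               # would use an unbiasable randomness source
    rollers = list(stakes)
    weights = [stakes[r] for r in rollers]  # random.choices normalizes these
    chosen = []
    for _ in range(min(k, len(rollers))):
        pick = rng.choices(rollers, weights=weights, k=1)[0]
        i = rollers.index(pick)
        rollers.pop(i)
        weights.pop(i)
        chosen.append(pick)
    return chosen
```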
It should be noted that this proposed design still uses a centralized sequencer, with only proof generation being outsourced to rollers. This maintains the efficiency and speed of pre-confirmation for users, as there is no need for "consensus". This once again shows that there almost always are trade-offs between decentralization and efficiency that need to be considered (see blockchain trilemma in “History of Scaling & Polygon Scaling Initiatives”).
The Polygon zkEVM implementation will initially rely on a centralized prover (the ZKProver) with decentralization plans for the future. This will be covered in more detail in section “Network Architecture”.
c. What the f**k is a zkEVM?
Alright, we know what zk-rollups are, how they work, why they are so cool, and what limitations they are still facing. Now it is time to dive into what the Ethereum Virtual Machine is, why it is important, and how the EVM is finally coming to zk-rollups in the form of zkEVM implementations.
i. Introduction to the EVM
Firstly, let’s have a look at what the Ethereum Virtual Machine (EVM) is. So basically, the EVM is a piece of software that executes smart contracts & computes the state of the Ethereum network after a new block is added to the chain. The EVM sits on top of Ethereum’s hardware & node layer & its main purpose is computing the network’s state & executing smart contracts, which are written in human-readable programming languages & compiled into a machine-readable format called bytecode that the EVM runs.
The EVM hence powers smart contract execution and is one of the core features of Ethereum. Instead of a classic distributed ledger of transactions, the EVM transforms Ethereum into a distributed state machine. Ethereum's state is a large data structure which holds not only all accounts & balances, but also an overall network or machine state which can change from block to block according to a pre-defined set of rules. At any given block in the chain, Ethereum has one & only one canonical state, with the rules for valid state transitions being defined by the EVM.
To gain a clearer understanding, let's examine a simplified version of a smart contract transaction at a high level:
The contract bytecode is retrieved from the Ethereum Virtual Machine's (EVM) storage and executed by peer-to-peer nodes within the EVM. Since all nodes utilize the same transaction inputs, it ensures that each node reaches the same outcome.
EVM opcodes, which are embedded in the bytecode, subsequently interact with various components of the EVM's state (memory, storage, and stack). These opcodes carry out read-write operations, reading values from state storage and writing or sending new values to the EVM's storage.
Lastly, EVM opcodes perform computations on the values acquired from state storage before returning the updated values. This update leads to the EVM transitioning to a new state.
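The read-write interaction between opcodes, stack, and storage described above can be illustrated with a drastically simplified stack machine. This is a toy sketch, not real EVM semantics (no gas, no memory, no 256-bit words, only four opcodes):

```python
def run(bytecode: list, storage: dict) -> dict:
    """Execute a toy opcode sequence against a stack and key-value storage."""
    stack, pc = [], 0
    while pc < len(bytecode):
        op = bytecode[pc]
        if op == "PUSH":          # push the literal that follows the opcode
            pc += 1
            stack.append(bytecode[pc])
        elif op == "ADD":         # pop two values, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "SLOAD":       # read a value from state storage
            stack.append(storage.get(stack.pop(), 0))
        elif op == "SSTORE":      # write a value to state storage
            key, value = stack.pop(), stack.pop()
            storage[key] = value
        pc += 1
    return storage

# e.g. increment the counter held in storage slot 0:
program = ["PUSH", 0, "SLOAD", "PUSH", 1, "ADD", "PUSH", 0, "SSTORE"]
```

Running `program` against a storage holding `{0: 41}` loads the value, adds one, and writes 42 back to slot 0 – the same load/compute/store cycle the three steps above describe, which is exactly what a zkEVM's proving circuit must express as constraints.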
ii. Why EVM on L2 matters
But before we have a closer look at the zkEVM concept and protocols that build it out, let's have a look at why it is important for the overall Ethereum ecosystem to have EVM-enabled layer 2 systems (and zk-rollups in specific).
As outlined in the section “History of Scaling & Polygon Scaling Initiatives”, Ethereum’s current design lacks true scalability. The rollup-centric ETH 2.0 roadmap introduced in chapter “Scaling Ethereum on Rollups” aims to address this by transforming the Ethereum L1 into a major settlement and data availability layer for a multitude of powerful rollup-based execution layers that take computation off-chain, thereby enabling fast and cheap transactions at scale.
But as with any emerging ecosystem, it is important to find adoption among developers and enable a smooth migration for projects built on Ethereum’s L1 as well as simple deployment for new projects. Since Solidity (the EVM language) and the EVM as a whole (with all the development tools) have basically become the de-facto programming standard for dApp development across the crypto space, it is not surprising that many alternative L1s (e.g. Fantom or Harmony) and side-chains (e.g. Polygon) have decided to rely on EVM implementations to attract developers.
While opening up the design space and the number of supported programming languages is an important development focus across the industry, for now EVM support can almost be considered vital to enable a thriving ecosystem of applications in any ecosystem (with a few exceptions). Hence, for the adoption of zk-rollup ecosystems, enabling EVM support is a major step forward for sure. However, innovation will not stop here as in the future we will surely observe the emergence of other VM implementations (also on (zk-)rollups) and the building of applications via WASM or RISC-V using programming languages such as C++, Solidity, Rust, or others rather than just Solidity.
On Ethereum-based (Zero Knowledge) rollups however, the importance of having an EVM environment also comes from an ideological perspective. The ethos of most zk-rollup projects is strongly aligned with the core values & vision of Ethereum as many explicitly define themselves as a piece in the overall Ethereum scaling puzzle. Consequently, many rollup projects strive for maximized Ethereum equivalence, which also means building an EVM implementation that is as close as possible to the original EVM.
But why is specifically the zkEVM so cool? Since it is easier to build EVM implementations on top of optimistic rollup technology rather than zk-tech (due to the complex cryptography involved in computing & verifying proofs), EVM-compatible execution environments on optimistic rollups have already been around for a while, with Arbitrum, Optimism or Metis being examples. The emergence of zkEVMs will now finally enable general purpose zk-rollups as well. So far, we have only seen application-specific zk-rollups & validiums. The importance of the scalable, secure and private execution layers that zkEVM-enabled rollups can provide for entire ecosystems of composable DeFi protocols & even (application-specific) recursive rollup structures on top of zkEVMs (see “Scalability”), can’t be overstated. This is a major innovation for the entire Ethereum ecosystem and an important step in finally bringing true scale to Ethereum.
iii. Meet the zkEVM
Back in July, a trio of announcements from Scroll, zkSync & Polygon kicked off the race to mainnet, a.k.a. the zkEVM wars, as each company implied that it would be the “first” to bring the “best” zkEVM to market. Now, with only 3 days left at the time of publication before Polygon zkEVM’s mainnet launch, let’s dive into how the zkEVM works and what different forms of zkEVM implementations are out there among Polygon and its competitors.
A zkEVM is an EVM-compatible virtual machine that supports zero-knowledge proof computation. Integrating these two components is a complex process, as zk-proofs necessitate a specific format (algebraic circuit) for all computational statements, which can then be compiled into STARKs or SNARKs (more on this in section “ZK Proof System (plonky2)”). Developing zkEVMs allows general-purpose rollups to utilize EVM as their smart contract engine, ensuring compatibility with the prevalent interfaces in the Ethereum ecosystem and facilitating the migration of existing contracts and tooling applications onto the rollup.
zkEVMs can be categorized into three components: an execution environment, a proving circuit, and a verifier contract. Each component plays a role in the zkEVM's program execution, proof generation, and proof verification. The execution environment is where smart contracts run within the zkEVM, functioning similarly to the EVM. The proving circuit is responsible for generating zero-knowledge proofs to validate transactions in the execution environment. Zk-rollups submit validity proofs to a smart contract deployed on the L1 chain (Ethereum) for verification through the verifier contract. The verifier then performs computations on the provided proof and confirms the accuracy of the submitted outputs based on the inputs.
Constructing zkEVMs presents several challenges, such as the inclusion of specialized opcodes, a stack-based architecture, high storage overhead, and considerable proving costs. However, upon successful implementation, zkEVMs are expected to provide quicker finality, enhanced capital efficiency, and more secure scaling. As various zkEVM projects employ different methods for combining EVM execution with zero-knowledge proof computation, we will examine some of the ongoing zkEVM projects in a later section of this report.
Prior to that, we will explore classification approaches for zkEVMs and discuss the distinctions between zkEVM compatibility and equivalence. Equivalence is often referred to as the holy grail of EVM compatibility. Reaching EVM equivalence basically means reaching full bytecode-level compatibility and it is important to understand that EVM compatibility is not the same as EVM equivalence. Settling for mere compatibility means that devs are forced to modify, or even completely reimplement, lower-level code that Ethereum’s supporting infrastructure also relies on (more on this in the next section).
3. Overview of the (zk-)EVM Rollup Space
a. Levels of EVM Equivalence
There are multiple types of EVM compatibility. High-level they can be summarized in three main buckets:
Bytecode level
Language level
Consensus level
The breakdown proposed by Vitalik Buterin is more granular and divides the types of EVM compatibility and equivalence as follows:
Type 1 - Fully Ethereum-equivalent:
The first category of zkEVMs, as defined by Vitalik Buterin, is fully Ethereum-equivalent zkEVMs. These strive to maintain complete compatibility with the Ethereum system, without altering any aspect such as hashes, state trees, transaction trees, or in-consensus logic. This perfect compatibility offers several benefits, including reuse of existing Ethereum infrastructure and, eventually, the potential to help scale the Ethereum L1 itself. However, the complexity of a fully Ethereum-equivalent zkEVM results in extremely high prover times.
Type 2 - Fully EVM-equivalent:
Type 2 zkEVMs aim to be EVM-equivalent but exhibit certain differences in data structures like the block structure and state tree. These zkEVMs enable most Ethereum-native applications to function on the zkEVM rollup, with only a few exceptions. Type 2 zkEVMs offer faster performance than Type 1 zkEVMs but may present incompatibilities for applications verifying Merkle proofs and relying on complex, zk-unfriendly cryptography. A Type 2.5 zkEVM achieves full EVM-equivalence except for gas costs: operations that are expensive to prove – potentially including precompiles, the KECCAK opcode, and specific patterns of contract calls, memory or storage access, or reverts – are assigned higher gas costs.
Type 3 - Almost EVM-equivalent:
Type 3 zkEVMs sacrifice equivalence to enhance prover times and simplify EVM development. Consequently, these zkEVMs may eliminate a few features that are challenging to implement in a zkEVM. While this approach generally improves prover times, it introduces new complexity as some applications may need to be rewritten.
Type 4 - High-level-language equivalent:
The final category of zkEVMs facilitates smart contract code development in a high-level programming language, allowing for significantly faster prover times. However, this approach is accompanied by substantial incompatibilities.
But while prioritizing EVM compatibility has its advantages (see “Why EVM on L2 matters” and “Meet the zkEVM”), highly EVM-compatible zkVMs are often slower and more resource-intensive than zkVMs that go for a lower degree of EVM compatibility, in order to optimize for efficiency and scalability. This tradeoff between performance and EVM equivalence is why zkSync and more prominently StarkNet have chosen a less compatible approach to building their Virtual Machines (see “The zkEVM Landscape”).
b. The zkEVM Landscape
But let’s have a look at where the leading zkEVM projects fit into these classification systems. In terms of type of zkEVM, zkSync 2.0 for example falls into the language-level bucket. Consequently, devs can write smart contracts in Solidity, but behind the scenes zkSync transpiles that code into a language optimized for zk-proofs called Yul. While Matter Labs (the company behind zkSync) claims that this system was engineered to provide the rollup with scalability advantages, primarily in the proving process, by most definitions zkSync’s EVM implementation would likely be described as EVM-compatible rather than EVM-equivalent. Not being equivalent means that zkSync potentially isn’t 1:1 compatible with every single Ethereum tool out there, which might result in additional overhead for devs. However, Matter Labs insists that this shouldn’t be an issue in the long term. This is similar to the approach StarkWare takes with StarkNet & the programming language Cairo.
Starknet is based on the Cairo programming language, which is optimized for zk-proofs. To enable smart contracts & composability, StarkNet takes a language-level compatibility approach & transpiles EVM-friendly languages (e.g. Solidity) down to a STARK-friendly VM (in Cairo).
Scroll & Polygon zkEVM, on the other hand, are both taking the more ambitious bytecode-level approach to their zkEVMs. These approaches rip out the transpiler step completely, meaning they don’t convert Solidity code into a separate language before it gets compiled & interpreted. This generally results in better compatibility with the EVM. But even here, there are distinctions that probably make Scroll more of a “true” zkEVM than Polygon’s implementation. When Polygon announced bringing the first EVM-equivalent zkEVM to market back in July, it was quickly pointed out by many that its zkEVM implementation is likely better described as EVM-compatible rather than EVM-equivalent based on its specifications.
As outlined in an article by Messari, part of the “true EVM” debate centers on whether the EVM bytecode is being executed directly or interpreted first & then executed. In other words, if a solution does not mirror official EVM specs, it cannot be considered a "true zkEVM". Based on this definition, Scroll might be considered a "true zkEVM" vs. the others introduced here, as it aims to execute EVM bytecode directly. According to Messari, Polygon on the other hand uses a new set of assembly codes to express each opcode, the human-readable translation of bytecode, as an intermediary step, which could theoretically allow the behavior of the code to be different on the EVM. Hence, overall Polygon might be a little further from EVM equivalence than its main bytecode-level competitor Scroll but still comes close (see figure below).
Similarly, Taiko aims to achieve type 1 status as well (see “Levels of EVM Equivalence”), striving for complete Ethereum equivalence without introducing modifications to facilitate zero-knowledge proof generation. Taiko's intention is to maintain compatibility with Ethereum at the opcode level, preserving hash functions, precompiled contracts, transaction and state trees, and other in-consensus logic elements. Although Taiko temporarily disables certain Ethereum Improvement Proposals (EIPs), as mentioned in its whitepaper, these limitations are expected to evolve over time.
Additionally, Taiko's compatibility extends further through its use of the Go-Ethereum client, a well-established Ethereum client that offers familiarity and ease of use for participants. This compatibility allows end users to interact with Uniswap on Taiko in the same manner as on the Ethereum mainnet, thus enhancing consistency, accessibility, and user satisfaction.
However, pursuing perfect compatibility presents challenges, particularly for aspiring type-1 zkEVMs like Taiko. The primary obstacle is the slow generation of zero-knowledge proofs, as Ethereum was not originally designed with zero-knowledge proof integration in mind. Consequently, the protocol contains numerous components that necessitate extensive computation for zero-knowledge proof generation.
c. Optimistic EVM Rollups
Outside the “zkEVM wars”, zkEVM rollups will also face tough competition from optimistic rollups that have already had time to establish themselves in the market. There are a number of EVM-enabled optimistic rollup implementations, including Arbitrum, Optimism, Metis and Boba. This section will briefly introduce these protocols and explore to what extent they might compete with zk-rollups like Polygon zkEVM.
Optimism: Is an optimistic rollup protocol. Transactions are sent to the layer 2 and received by sequencers, who are responsible for accurately executing the transactions they receive. Sequencers are rewarded for executing transactions properly and punished if they act maliciously. If someone suspects that a sequencer has acted fraudulently, they may alert the adjudicator contract on the Ethereum mainnet.
This contract can verify the validity of the results produced by the sequencer using the Optimistic Virtual Machine (OVM), an Ethereum Virtual Machine (EVM) compatible execution environment built for L2 systems. If a sequencer’s results turn out to be invalid, the optimistic rollup executes a fraud proof and the sequencer’s funds are slashed. Part of the slashed funds is awarded to the whistleblower who challenged the sequencer’s results during the so-called “challenge period”. This period typically lasts around one week, which results in a one-week delay in moving assets from Optimism back to the Ethereum layer 1.
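The fraud-proof flow described above can be sketched as a toy adjudicator. All names, the bond split, and the reward logic below are illustrative assumptions, not Optimism's actual contract:

```python
CHALLENGE_PERIOD = 7 * 24 * 3600  # ~1 week challenge window, in seconds

class ToyAdjudicator:
    """Illustrative fraud-proof flow: a sequencer posts a state root backed
    by a bond; anyone may challenge it during the challenge period."""

    def __init__(self):
        self.claims = {}  # state_root -> claim record

    def post_state_root(self, sequencer, state_root, bond, now):
        self.claims[state_root] = {
            "sequencer": sequencer, "bond": bond,
            "posted_at": now, "slashed": False,
        }

    def challenge(self, challenger, state_root, fraud_proof_valid, now):
        claim = self.claims[state_root]
        # Challenges are only accepted inside the challenge window.
        if now - claim["posted_at"] > CHALLENGE_PERIOD:
            return "window closed"
        if fraud_proof_valid:
            claim["slashed"] = True
            # Part of the slashed bond goes to the whistleblower.
            reward = claim["bond"] // 2
            return f"slashed, {challenger} rewarded {reward}"
        return "challenge rejected"
```

Once the window closes without a successful challenge, the state root is treated as final, which is exactly why withdrawals to L1 take a week.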
Arbitrum: Is a suite of Ethereum scaling solutions that enables high-throughput, low-cost smart contracts on L2. While Arbitrum also offers alternative scaling solutions in the form of AnyTrust, this section focuses on the Arbitrum rollup, which is currently live as Arbitrum One on mainnet. Similar to OVM, the Arbitrum Virtual Machine (AVM) supports EVM but is optimized for allowing fast progress in the optimistic case while maintaining the ability to efficiently resolve disputes.
Thanks to EVM support, porting contracts from Ethereum to Arbitrum is fast & easy. Additionally, all smart contract languages that work with Ethereum (e.g. Solidity or Vyper) work with Arbitrum. Similarly, all standard Ethereum developer tools (e.g. Truffle, MetaMask, The Graph, ethers.js) are also natively integrated with Arbitrum.
Metis DAO: Metis is a hyper-scalable optimistic rollup L2 built on top of Ethereum. Metis offers a mostly EVM-equivalent virtual machine environment to run Solidity smart contracts. By introducing a peer network and rotating the sequencer role among a network of nodes, Metis innovates in improving optimistic rollup security/decentralization. Its multiple layers of checks and balances also enable Metis to significantly lower withdrawal times. Moreover, Metis leverages MemoLabs as a storage/data availability layer, which significantly reduces transaction costs. Last but not least, Metis offers an innovative framework for Decentralized Autonomous Companies (DAC), essentially taking DAOs to the next level.
Boba Network: Is an optimistic rollup-based scaling solution based on the work done by Optimism, but whose continued development focuses on features like swap-based onramps, fast exits to L1 and cross-chain bridging.
There are also other optimistic rollup projects, like Specular or Base (Coinbase's L2 built on the Optimism stack), that strive to achieve the highest possible degree of EVM equivalence, a focus that is also very prevalent in the optimistic rollup space, especially among the leaders Optimism and Arbitrum. While the technological approach of optimistic rollups differs from that of zk-rollups, the rationale of building EVM-equivalent execution layers that share security with the Ethereum base layer is the same. Hence, zkEVM rollups will certainly compete for users, TVL and developers with their optimistic counterparts. While optimistic rollups have a first-mover advantage, zk-rollups introduce certain technological features that optimistic rollups cannot provide (see “What makes zk-rollups so exciting?”).
II. Presentation of Polygon zkEVM
4. Technology Deep-Dive into Polygon zkEVM
a. Network Architecture
As mentioned earlier, the Polygon zkEVM relies on the novel Proof-of-Efficiency (PoE) consensus mechanism which is an innovative approach designed to address the challenges of sequencer decentralization. This mechanism employs a two-step model that involves two types of participants: Sequencers and Aggregators.
Sequencers are responsible for collecting L2 transactions from users and forming new L2 batches. Operating in a permissionless manner, Sequencers create batch proposals and submit them as Layer 1 (L1) transactions. In order to propose a new batch, Sequencers are required to pay the gas fees for L1 transactions, as well as an additional fee in $MATIC tokens. This additional fee ensures that Sequencers have an incentive to propose valid batches containing legitimate transactions. The batch fee can vary depending on the network load and is determined by a parameter computed automatically by the protocol smart contract.
Permissionless aggregators are the second key actor in the PoE mechanism. Their primary responsibility is to generate (via the ZKProver) and submit validity proofs for the updated L2 states based on the proposed batches from Sequencers. As batches are proposed by Sequencers and recorded as Layer 1 (L1) transactions, Aggregators monitor these proposed batches and strategically decide when to initiate proof generation. Their objective is to be the first to produce a valid proof that successfully updates the L2 state by incorporating one or more proposed batches. The PoE smart contract accepts the first submitted validity proof that updates the L2 state, making the process highly competitive among Aggregators. Those who fail to submit the winning proof may incur the cost of proof generation, but most of the gas fees can be recovered.
By allowing multiple sequencers to propose batches and multiple Aggregators to participate in the proof generation process in a permissionless way, the PoE mechanism also promotes decentralization and prevents the network from being controlled by a single entity. Furthermore, it ensures that the network remains resilient against malicious attacks, as any Aggregator can step in to create and submit validity proofs when needed.
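A minimal sketch of this two-step flow, with hypothetical names and a trivial stand-in for real SNARK verification:

```python
def proof_is_valid(proof):
    # Stand-in for real on-chain SNARK verification.
    return proof == "valid"

class ToyPoEContract:
    """Minimal sketch of the two-step Proof-of-Efficiency flow: sequencers
    propose batches, aggregators race to prove them."""

    def __init__(self):
        self.proposed_batches = []  # batches awaiting a validity proof
        self.l2_state = 0           # index of the last proven batch

    def sequence_batch(self, sequencer, txs, matic_fee):
        # Any sequencer may propose a batch, paying L1 gas plus a MATIC fee.
        assert matic_fee > 0, "batch fee deters spam proposals"
        self.proposed_batches.append({"sequencer": sequencer, "txs": txs})

    def verify_batches(self, aggregator, proof, num_batches):
        # The first valid proof covering one or more pending batches wins.
        if proof_is_valid(proof) and num_batches <= len(self.proposed_batches):
            self.l2_state += num_batches
            del self.proposed_batches[:num_batches]
            return f"{aggregator} proof accepted"
        return "proof rejected"
```

The key property the sketch shows is that proposing and proving are separate, permissionless roles, so no single entity sits on the critical path.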
In the Polygon zkEVM, the transaction validation and verification process is managed by a centralized zero-knowledge proof component referred to as the ZKProver. This component enforces and implements the necessary conditions for a transaction to be deemed valid. The ZKProver executes intricate mathematical calculations involving polynomials and assembly language, which are subsequently verified by a smart contract. These conditions can be regarded as constraints that a transaction must adhere to in order to modify the state tree or exit tree.
Given its complexity, the ZKProver represents the most intricate module within the zkEVM. To implement the required elements, two new programming languages were developed: the Zero-Knowledge Assembly language and the Polynomial Identity Language (neither will be covered in depth in this report).
At a high level, the ZKProver is comprised of four primary components:
The Executor or Main State Machine Executor
The STARK Recursion Component
The CIRCOM Library
The zk-SNARK Prover
In essence, the ZKProver employs these four components to generate verifiable proofs. Consequently, the polynomial constraints or polynomial identities serve as the requisite conditions that each proposed batch must fulfill. All valid batches are required to satisfy specific polynomial constraints. A highly interesting feature of this design is the combined use of SNARKs and STARKs (more on this in section “ZK Proof System (plonky2)”).
Owing to their speed and lack of trusted setup requirements, zk-STARK proofs are used in the above-mentioned verification process. However, their size is considerably larger than that of zk-SNARK proofs. Due to this size issue and the succinctness property (refer to “ZK Proof System (plonky2)”) of zk-SNARKs, the ZKProver uses zk-SNARKs to vouch for the accuracy of the zk-STARK proofs. As a result, zk-SNARKs are published as validity proofs for state changes.
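At a very high level, this proving flow can be pictured as a pipeline. The hashes below are stand-ins for real proofs, and all function names are hypothetical:

```python
import hashlib

def h(x):
    """Short hash used as a stand-in for a real cryptographic proof."""
    return hashlib.sha256(x.encode()).hexdigest()[:16]

def stark_prove(batch):
    # Executor + Main State Machine: a fast but large STARK proof per batch.
    return "STARK:" + h(batch)

def stark_aggregate(proofs):
    # STARK recursion component: one proof vouches for many.
    return "STARK:" + h("|".join(proofs))

def snark_wrap(stark_proof):
    # CIRCOM + zk-SNARK prover: a small final proof, cheap to verify on L1.
    return "SNARK:" + h(stark_proof)

batches = ["batch-1", "batch-2", "batch-3"]
final_proof = snark_wrap(stark_aggregate([stark_prove(b) for b in batches]))
# Only this small final SNARK is published on Ethereum as the validity proof.
```

However many batches enter the pipeline, the artifact posted on L1 stays constant-size, which is the whole point of the SNARK wrapping step.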
With regards to EVM compatibility, Polygon zkEVM adopts a rather ambitious bytecode-level approach in its quest for EVM equivalence. By eliminating the transpiler step, this method retains compatibility with the Ethereum Virtual Machine (EVM) without converting Solidity code into a separate language before compilation and interpretation.
Polygon's zkEVM, however, while highly EVM-compatible, may not be considered fully EVM-equivalent due to its utilization of a new set of assembly codes to express each EVM opcode. This deviation could theoretically result in differences in the behavior of the code when executed on the EVM. Nevertheless, it is one of the most EVM-equivalent zkEVMs out there (its closest rival being Scroll, see “The zkEVM Landscape”).
Also important to mention with regards to the Polygon zkEVM's network architecture is the role that layer 3 systems will play. Recursive L3 rollup structures are a crucial part of how Polygon aims to add throughput to the zkEVM. Polygon plans on providing extensive infrastructure, and there are multiple projects already working on L3 implementations.
b. ZK Proof System (plonky2)
This section will introduce the cutting-edge plonky2 proof system that Polygon zkEVM uses as a basis for its validity proofs. However, let’s first have a closer look at what SNARKs and STARKs are and how they differ from each other.
First of all, what are Zero Knowledge proofs (ZKPs) in general? ZKPs are a cryptographic technique that allows one party to prove to another that a statement is true, without revealing any information beyond the validity of the statement.
In crypto we especially care about non-interactive Zero Knowledge proofs (NIZKPs), a variant of ZKPs that doesn't require interaction between the prover and verifier. Many NIZKP constructions rely on a common reference string (CRS), generated in a publicly verifiable trusted setup ceremony, which serves as the basis for verifying statements without any interaction. However, this reliance on a CRS raises security concerns, since the integrity of the whole system hinges on the trusted setup ceremony.
The trusted setup ceremony involves generating a CRS that consists of cryptographic parameters used to generate the NIZKPs used in the system. One challenge in the design of NIZKPs is finding the balance between security, efficiency, and the size of the CRS. Ideally, the CRS should be small and secure. Techniques like universal hash functions and the Fiat-Shamir transformation have been proposed to achieve this balance.
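To illustrate the Fiat-Shamir transformation mentioned above, here is a toy Schnorr proof of knowledge made non-interactive by replacing the verifier's random challenge with a hash of the transcript. The parameters are deliberately simple and not secure:

```python
import hashlib
import random

# Toy Schnorr proof of knowledge of a secret x with y = g^x mod p.
p = 2**127 - 1  # a Mersenne prime; fine for illustration, not for security
g = 3
q = p - 1       # exponents live modulo the group order

def fiat_shamir_challenge(*parts):
    # The hash of the public transcript replaces the verifier's challenge.
    data = "|".join(str(x) for x in parts).encode()
    return int(hashlib.sha256(data).hexdigest(), 16) % q

def prove(x):
    y = pow(g, x, p)
    r = random.randrange(q)
    t = pow(g, r, p)                    # commitment
    c = fiat_shamir_challenge(g, y, t)  # non-interactive challenge
    s = (r + c * x) % q                 # response
    return y, t, s

def verify(y, t, s):
    c = fiat_shamir_challenge(g, y, t)
    # g^s == t * y^c mod p  <=>  the prover knew x, yet x is never revealed.
    return pow(g, s, p) == (t * pow(y, c, p)) % p
```

Because the challenge is derived from the transcript itself, no round-trip with a verifier is needed: the proof can simply be published and checked by anyone.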
zkSNARKs are the most well-known form of zero-knowledge proofs. The term SNARK was coined in academic work in 2012, and zkSNARKs were famously brought to production by the ZCash protocol in 2016. But what are SNARKs exactly? SNARKs (Succinct Non-interactive ARguments of Knowledge) are a specific form of zero-knowledge proof and a method of verifying the validity of a statement without revealing any information about said statement. The zkSNARK proofs in ZCash, for example, allow the network to verify transactions without compromising user privacy: even the nodes enforcing consensus rules do not need to know the underlying data of each transaction, improving both user anonymity and transaction confidentiality in the network. Zk-rollups have similar privacy benefits, outlined in the section “Privacy”.
But ZCash was not only a pioneer, ZCash’s most recent Halo proof system was also the first Zero Knowledge proof system ever discovered that supported two key features in one system:
No trusted setup
Recursion
As mentioned above, one key feature of Halo is its lack of a trusted setup. This means that the security of the system is not reliant on the integrity of a setup ceremony and thus is not vulnerable to exploitation in the event of a compromise of that ceremony. The absence of a trusted setup also enables greater protocol agility and allows for the development of novel zero-knowledge protocols removing a major complexity / risk factor that was inherent to non-interactive Zero Knowledge proof systems in the past.
The second key property that makes Halo stand out is recursion. The protocol is scalable and capable of proving arbitrarily complex facts, making it a "general purpose" protocol that can be used for a wide range of zero-knowledge applications (zkApps). An important resulting benefit is that proofs can verify other proofs, allowing for recursive rollup structures (e.g. application-specific L3s) on top of the generalized L2 (see chapter “Scalability”). As outlined in the previous section “Network Architecture”, L3s also play an important role in realizing Polygon zkEVM’s full scaling potential.
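A toy picture of recursion, where a parent proof commits to its children so that a single top-level artifact covers a whole tree of statements. In a real system the parent circuit would actually run the child verifiers; here we merely commit to the children to show the shape of the construction:

```python
import hashlib

def toy_proof(statement, children=()):
    # A real recursive SNARK would *verify* each child proof inside its
    # circuit; this toy just binds the parent to its children by hashing.
    payload = statement + "".join(children)
    return hashlib.sha256(payload.encode()).hexdigest()

# Four hypothetical app-specific L3 batches, each with its own proof...
l3_proofs = [toy_proof(f"l3-batch-{i}") for i in range(4)]
# ...folded into one L2 proof, the only artifact that must be checked on L1.
l2_proof = toy_proof("l2-state-update", l3_proofs)
```

The L2 proof stays constant-size no matter how many L3 proofs it folds in, which is what makes recursive rollup structures economical.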
More recently, another proof technology has gained a lot of attention: STARKs (Scalable Transparent ARguments of Knowledge). But what makes STARK proofs interesting? STARKs offer a number of benefits, the main one being the elimination of the need for a trusted setup (a property shared by more advanced SNARK proof systems such as Halo). However, both SNARKs and STARKs come with limitations and trade-offs. While the former require a trusted setup in many cases, it is primarily the large proof sizes and verification overhead that can limit the use of STARKs.
While Polygon zkEVM does not use the Halo proof system or a pure STARK proof system, it combines elements of both in a cutting-edge zk proof system. Just like Halo, the plonky2 proof system that Polygon zkEVM uses to compute SNARK proofs is based on PLONK, a proof system introduced in 2019. Since its release, various improvements and extensions have been proposed, including fflonk, turbo PLONK, ultra PLONK, plonkup and, more recently, plonky2 itself.
PLONK stands for "Permutations over Lagrange-bases for Oecumenical Noninteractive arguments of Knowledge" and is a form of general-purpose zero-knowledge proof scheme. One of the improvements that PLONK introduced is that while it still requires a trusted setup procedure similar to that needed for the SNARKs in ZCash's old (pre-Halo) proof system, it is a "universal and updateable" trusted setup. However, more advanced extensions of PLONK like Polygon’s plonky2 have even managed to remove the need for a trusted setup entirely.
Plonky2, a recursive proof system, offers notable advantages in terms of speed and compatibility with Ethereum. By integrating PLONK with FRI, the polynomial commitment scheme used in STARKs, it combines the strengths of STARKs, such as rapid proving and the absence of a trusted setup, with the benefits of SNARKs, including support for recursion and low verification costs on Ethereum. FRI in particular has the potential to significantly enhance the performance of recursive SNARKs.
With plonky2, proving can be done on CPUs rather than GPUs. Since the main metric Polygon aims to optimize for is cost, and CPUs are much cheaper than GPUs, this is an important benefit: because the prover is so fast, Polygon achieves GPU-like latency on CPUs at a lower cost. Impressively, recursive proof generation takes just 170 milliseconds on a MacBook Pro.
The speed mainly stems from the fact that FRI allows Polygon to use a small 64-bit field (the Goldilocks field), which makes arithmetic much faster. Additionally, Polygon uses recursive SNARK proofs to vouch for the validity of the STARK proofs.
By wrapping a final plonky2 proof in a pairing-based proof, which costs about 180k gas to verify on-chain, the cost of verifying proofs on Ethereum is kept very low. This is especially important as the vast majority of L2 fees are calldata costs (see “Scalability”). The proof cost is amortized across all transactions in a proven batch. Plonky2 is claimed to be the fastest and cheapest system for proof verification on Ethereum.
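A back-of-the-envelope amortization of that verification cost. The ~180k gas figure comes from above, while batch size, gas price and ETH price are hypothetical assumptions:

```python
verify_gas = 180_000    # on-chain cost of verifying the wrapped proof (quoted above)
txs_per_batch = 500     # hypothetical number of transactions in a proven batch
gas_price_gwei = 30     # hypothetical L1 gas price
eth_price_usd = 1_800   # hypothetical ETH price

# Convert gas to USD: gas * gwei-per-gas * 1e-9 ETH-per-gwei * USD-per-ETH.
verify_cost_usd = verify_gas * gas_price_gwei * 1e-9 * eth_price_usd
per_tx_usd = verify_cost_usd / txs_per_batch
print(f"batch verification ≈ ${verify_cost_usd:.2f}, per tx ≈ ${per_tx_usd:.4f}")
```

Even under these assumed prices, the per-transaction share of the proof cost is around two cents; in practice calldata, not proof verification, dominates the fee.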
5. Metrics
When it comes to evaluating the performance and effectiveness of a blockchain network, one of the key areas to focus on is its metrics. These metrics can provide valuable insights into how well the network is operating, how active its user base is, and what level of security it provides.
a. Polygon POS
The number of wallets created on a blockchain network is often considered a key metric for measuring its user base and adoption. While some of these wallets may be created for malicious purposes, such as sybil attacks or artificially inflating the numbers, they still provide valuable insight into the overall health and growth of the network.
With that in mind, it's worth noting that Polygon has seen an impressive 224 million wallets created on its network to date. This number is particularly significant when compared to Ethereum itself, which currently boasts just slightly above 225 million wallets created.
In addition to the number of wallets created, another important metric for assessing the health and adoption of a blockchain network is the total value bridged on the chain. This metric reflects the amount of money invested in the network through various tokens, providing insight into the level of investor confidence and overall demand for the network's services.
For Polygon, the total value bridged on the chain currently stands at $8.3 billion, with the majority of this value held in MATIC tokens, which make up around 60% of the total.
This metric differs from the Total Value Locked (TVL) figure commonly used in DeFi analytics: TVL reflects the total amount of money deposited in DeFi applications on the chain, whereas total value bridged includes all tokens held in wallets, not just those currently deployed in DeFi applications.
This is an important distinction to make, as total value bridged paints a much broader picture of the network's overall adoption and use.
The TVL on the Polygon network is only based on money deposited in DeFi applications, so its TVL is lower than the total value bridged on the chain. Polygon is ranked fifth in TVL by DefiLlama but third in total value bridged, indicating that it attracts investment and usage beyond DeFi.
Transaction volume is another key metric and has remained relatively stable over the past year, in contrast to other scaling solutions such as Optimism and Arbitrum, which have experienced significant growth during the same period.
While stable transaction volume may initially seem like a positive indicator of network health and stability, it also reflects a lack of transaction growth: it can be read either as a sign of network maturity or as stagnation.
As the DeFi ecosystem continues to evolve and new applications and protocols are developed, we may see changes in transaction volume and other key metrics that will provide further insight into the network's overall health and potential for growth.
Revenue generation from fees and transactions is a critical factor in evaluating the performance and sustainability of a blockchain network. In this regard, Polygon has emerged as one of the leaders, generating among the highest fees of the major scaling solutions. Over the past 30 days, Polygon has generated an average of $5 million in fees, and over the past year the network has generated a total of $26 million, securing its position as one of the top revenue-generating chains in the market.
However, revenue generation alone does not necessarily equate to profitability. In fact, when we examine Polygon's financial statements, we can see that the network is investing heavily in various incentives and rewards programs aimed at attracting and retaining users, as well as supporting its validators and stakers.
While these investments may be crucial for the long-term growth and success of the network, they also represent a significant expense that must be carefully managed in order to ensure profitability. As it stands, Polygon is not yet profitable as a blockchain, but its strong revenue generation and ongoing investment in the network suggest that the potential for future profitability is certainly there.
Ultimately, as with any blockchain network, the key to long-term success and profitability lies in finding the right balance between revenue generation and expense management, while also ensuring that the chain is able to attract and retain users and developers in a highly competitive market.
While Polygon may not yet be profitable as a blockchain, the network does have a significant advantage in the form of its substantial treasury, which is currently valued at $2.4 billion. This treasury provides the network with a significant runway and the resources to continue investing in its infrastructure, incentivization programs, and other key initiatives aimed at attracting and retaining users. In the Funds section, we'll take a closer look at how Polygon's treasury is structured and managed
b. zkEVM
As we turn our attention to the zkEVM chain metrics, it's important to note that since the chain is not yet live, our analysis will be based on the results and metrics made available by the testnet.
According to the latest reports from mid-February, the zkEVM testnet has seen significant testing activity from both the community and development teams, with more than 300,000 transactions processed in total at an average cost per TX of $0.00875. This level of testing activity is a positive sign for the future of the zkEVM chain, as it suggests that there is strong interest and engagement from both developers and users in the network.
It remains to be seen how the network will perform in a live environment, and there may be additional challenges and complexities to address as the network scales and grows. Nonetheless, the initial results from the testnet are certainly encouraging and suggest that the zkEVM chain has the potential to be a significant player in the blockchain space in the years to come.
In addition to the strong testing activity on the testnet, we can also see that the total number of contracts deployed on the network is steadily increasing. According to available data, the zkEVM chain has already validated more than 74,000 zk proofs, a significant milestone for a network that is still in its early stages.
This growth in the number of contracts deployed and proofs validated is a positive sign for the future of the zkEVM chain, as it suggests that developers and users are actively engaging with the network and exploring its capabilities. As more developers build on the network and more users begin to leverage its unique features and benefits, we can expect to see continued growth and expansion of the zkEVM ecosystem.
The positive results from the zkEVM testnet have given the development team the confidence to move forward with the full deployment of the network on March 27th, which will allow users to transact with real money on dApps running on the network.
This upcoming launch is an important milestone for the zkEVM chain and represents a significant step forward in its development and evolution. With the full deployment of the network, developers and users will have access to a powerful new platform for building and deploying decentralized applications that leverage the benefits of zero-knowledge proofs and other advanced cryptographic techniques.
There may be challenges and issues to address as the network scales and grows, but the team behind the zkEVM chain has demonstrated a strong commitment to innovation and excellence in their work. This has generated significant excitement and anticipation among developers and users ahead of the full launch of the network, and we can expect to see continued growth and adoption in the months and years to come.
6. Tokenomics
The MATIC token is an essential component of the network's functionality and serves multiple purposes. In this context, let's dive into the most significant aspects of the token.
a. Use cases
Firstly, MATIC token holders possess governance rights and can actively participate in on-chain governance proposals. The network's governance system follows a community-driven approach where token holders can vote on proposals that relate to network upgrades, fee adjustments, and other important matters.
Secondly, the MATIC token is crucial for gas fee transactions on the POS chain, whereas Ether will be utilized for gas fees on the zkEVM. This distinction is crucial to consider while transacting on the network.
Lastly, validators and stakers play an important role in securing the blockchain and receive MATIC yield for their services. Staking MATIC to secure the network incentivizes participants to take an active role in maintaining its integrity.
b. Supply
The total supply of MATIC tokens stands at 10 billion, of which approximately 9 billion are in circulation, with no vested tokens remaining. The rest is reserved for staking rewards.
19% of the tokens were reserved for the Binance Initial Exchange Offering (IEO) that took place in late April 2019. The IEO aimed to raise $5M from investors and users at a market capitalization of $26M at the time. The IEO turned out to be one of the most profitable ones, with an all-time high (ATH) return of 1100x for users who participated in it.
Private investors took part in 2 different rounds, with 3.80% of the total supply shared between the Seed Round and the Early Supporters round. The Seed Round was held at a rate of 1 MATIC = $0.00079, and it successfully raised a total of $165,000. This sale accounted for 2.09% of the total token supply.
The Early Supporters round followed shortly after the Seed Round and was conducted at a rate of 1 MATIC = $0.00263. This round was more successful, raising a total of $450,000 and accounting for 1.71% of the total token supply.
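The quoted sale figures can be cross-checked with a few lines of arithmetic:

```python
total_supply = 10_000_000_000  # total MATIC supply

seed_raise, seed_price = 165_000, 0.00079     # Seed Round
early_raise, early_price = 450_000, 0.00263   # Early Supporters round

seed_tokens = seed_raise / seed_price
early_tokens = early_raise / early_price

print(round(seed_tokens / total_supply * 100, 2))                   # share of supply, Seed
print(round(early_tokens / total_supply * 100, 2))                  # share of supply, Early Supporters
print(round((seed_tokens + early_tokens) / total_supply * 100, 2))  # combined share
```

The computed shares come out to 2.09%, 1.71% and 3.8% of total supply respectively, matching the figures stated above.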
As said earlier, MATIC holders have the opportunity to participate in staking and earn rewards for helping to secure the network. As of March 2023, the annual staking rewards for MATIC ranged up to 6%, depending on the platforms and validators used, and dropped to 3.2% when adjusted for inflation.
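The gap between the nominal and inflation-adjusted yields quoted above implies an annual supply inflation of roughly 2.7%. This is a derived estimate, not an official figure:

```python
nominal_yield = 0.06  # up to ~6% nominal staking APR (March 2023)
real_yield = 0.032    # ~3.2% when adjusted for inflation

# Real yield relates to nominal yield via (1 + real) = (1 + nominal) / (1 + inflation),
# so the implied inflation rate can be backed out as:
implied_inflation = (1 + nominal_yield) / (1 + real_yield) - 1
print(round(implied_inflation * 100, 1))  # implied annual supply inflation, in %
```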
The staking model locks a significant portion of the token supply, with up to 3.9 billion tokens currently staked and an unbonding period of 21 days. This approach enables better control over the token supply and reduces potential sell pressure.
The ecosystem and foundation funds are utilized to back various projects, grants, and initiatives that support the growth and development of the Polygon ecosystem. These initiatives aim to promote innovation, adoption, and community participation, all of which are critical to the long-term success of the network.
c. Current state
The MATIC token has declined roughly 62% from its ATH of $2.92 in December 2021 to $1.12 as of March 2023, but this has not prevented it from remaining one of the most capitalized cryptocurrencies. Despite the decline in value, MATIC currently holds the 8th spot in the industry by market capitalization.
Finally, the fully diluted valuation (FDV) of the MATIC token, which takes into account its total supply of 10 billion tokens, stands at $11.2B.
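The FDV follows directly from price times total supply:

```python
price_usd = 1.12               # MATIC price, March 2023
total_supply = 10_000_000_000  # fully diluted supply

fdv = price_usd * total_supply
print(f"${fdv / 1e9:.1f}B")  # fully diluted valuation in billions
```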
7. Ecosystem
a. The POS ecosystem
The Polygon ecosystem is currently thriving, with a vast array of dApps and protocols deployed on the network. According to the latest reports, there are more than 7,000 dApps running on Polygon, which have attracted more than $1B of TVL, each offering unique features and capabilities to users.
The variety of dApps on Polygon is impressive, ranging from decentralized exchanges (DEX) and lending platforms to gaming and NFT marketplaces. This diversity of use cases shows the versatility of the Polygon network and its ability to support a wide range of dApps and protocols.
While the Polygon network has a thriving ecosystem of dApps and protocols, it is true that the majority of DeFi applications on the network are DEX and lending/borrowing platforms. However, this is not necessarily a drawback, as these are some of the most popular and in-demand DeFi use cases.
That being said, other Layer 2 scaling solutions such as Arbitrum have a more diverse range of DeFi applications and protocols, which could make them more attractive to certain users and developers.
The Polygon team has been actively working to attract new projects and developers to the network, through initiatives such as the Polygon Ecosystem Support Program and the Polygon Grants program. These efforts have already showed results, with many new projects and protocols choosing to build on Polygon and contribute to the growth of its DeFi ecosystem.
These, however, are the projects developed on the PoS chain, which will need to deploy separately on the zkEVM chain. As the chain is EVM-compatible, projects can easily port their smart contracts to the zkEVM.
Polygon has been making headlines in the blockchain space by partnering with and onboarding some of the biggest web2 companies over the past year. While some may argue that these partnerships were primarily focused on the NFT side of the chain, it is undeniable that they represent a major step towards developing the chain and bringing web2 projects into the web3 industry.
Of all its partnerships, the one with Reddit has been the most significant. Through this partnership, Polygon enabled Reddit users to mint over 5 million NFTs, breaking adoption records and showcasing the potential of the blockchain platform. This achievement not only solidified Polygon's position as a leader in the NFT space but also demonstrated its potential to drive the adoption of web3 technology across a wide range of industries.
b. zkEVM ecosystem
The ecosystem surrounding the zkEVM is experiencing rapid growth as a multitude of projects are actively preparing to deploy their applications on the chain. This thriving development is indicative of the excitement and potential that the zkEVM holds for various industries and use cases.
As more projects join the ecosystem, the DeFi network will strengthen, leading to even greater innovation, value bridged and adoption of the chain, pointing to a bright future.
Here are some projects that will be deploying on the chain once live:
On the DEX part:
Kyberswap (@KyberNetwork)
Quickswap (@QuickswapDEX)
Hashflow (@hashflow)
Uniswap (proposal underway) (@Uniswap)
MantisSwap (@MantisSwap)
Curve (@CurveFinance)
Satin (@SatinExchange)
Balancer (@Balancer)
TimeSwap (@TimeswapLabs)
Regarding the lending and borrowing projects we have:
Qi (@QiDaoProtocol)
0vix (@0vixProtocol)
Aave (@AaveAave)
Compound (proposal underway) (@compoundfinance)
The Derivatives sector will also host multiple protocols:
Primex (@primex_official)
Synfutures (@SynFuturesDefi)
Gains (@GainsNetwork_io)
d8x (@d8x_exchange)
Satori (@SatoriFinance)
Quickperps :eyes:
Voltz (@voltz_xyz)
The chain has a diverse range of projects that have been developed and deployed to create a thriving DeFi ecosystem. These projects provide users with numerous opportunities to trade and make full use of the technology. Overall, the new chain offers a dynamic and user-friendly environment that allows for seamless and hassle-free participation in the DeFi ecosystem.
8. Community and marketing
In order to succeed in the Layer 2 market, it's crucial for projects to have effective communication strategies and establish their narrative as early as possible. The crypto market can be irrational at times, which is shown by the disparities among the top 10 projects based on their fundamentals and innovations. Some tokens that don't seem to have strong fundamentals have still managed to make it to the top rankings, highlighting the importance of community and communication.
Although the fundamentals of Polygon zkEVM are robust and contribute significant value to the ecosystem, it's equally important to assess their communication strategy and community-building efforts.
Polygon has one of the most significant social media followings in the blockchain industry, with its Twitter account alone amassing 1.8 million followers. However, when it comes to its zkEVM chain, it currently has 30k followers, which puts it behind some of its competitors. For instance, zkSync has 640k followers, Arbitrum has 588k, Optimism has 438k followers and Scroll 261k followers.
Although the number of followers is a vital metric to gauge community engagement and interest, it is essential to note that a community's quality is more important than its quantity.
A recent survey conducted by Ethereum Daily focused on user attitudes towards various layer 2 solutions. The survey highlighted Arbitrum as the most well-known chain among investors due to its recent surge in popularity. Polygon zkEVM did not appear in the ranking at the time of the study, suggesting a potential lack of awareness.
The survey results highlighted an important issue for the adoption of the zkEVM chain, as it may be challenging to incentivize users to adopt and transact on the chain. The newer users in the crypto space are primarily attracted by financial opportunities, such as airdrops or early investment stages, while the more experienced users are drawn towards the technological advancements of the layer 2 solutions. This disparity between the two groups shows that a complex and multi-faceted marketing strategy is required to appeal to both groups.
In the case of Arbitrum, it succeeded in becoming a major hub for DeFi because users were able to make money on the chain and benefit from its early protocols. As more protocols deploy on the Layer 2 zkEVM chain, similar financial opportunities may emerge for users on the new chain.
To onboard as many users and developers as possible, Polygon is investing heavily in events, partnerships and community building.
They are mainly focusing on India and on onboarding developers, notably through the "Web3: Made in India" tour, in which they travel the country to meet and educate developers. They also organized events like ETHIndia and Polygon Connect India that gathered thousands of developers and community members.
In 2022, the Polygon ecosystem accomplished significant growth and development with the support of its community. The launch of the Guild program saw hundreds of members join more than 70 guilds worldwide to learn about the future of the web, which provided support and education to help communities flourish.
Polygon Advocates played an instrumental role in helping builders worldwide by organizing workshops, training programs, and education through various channels; more than 7,000 developers joined the programs organized by Polygon Advocates in 2022.
It's crucial for Polygon to attract and nurture innovative protocols, and to provide compelling marketing, communication, and financial incentives to draw DeFi users to its zkEVM chain.
It's also important to remember that markets are irrational, and having the best technology does not guarantee a protocol's success in the market. The crypto industry is still an immature market, and users are often more driven by marketing, hype, and financial opportunities than technology.
Therefore, to achieve mass adoption and TVL, the Polygon team will need to focus on both the technology and the marketing aspects of their zkEVM solution.
9. Funds
Having sufficient funds is crucial for any blockchain project as it requires a significant investment of resources to build, operate and maintain a blockchain network.
Blockchains are usually developed for the long term, and it can take years for a project to gain significant adoption and generate substantial revenue. Despite being active since 2019, Polygon is still not profitable, which means that having a long runway of funds is critical to ensure the project's sustainability and continued development.
Having sufficient funds also allows blockchain projects to invest in research and development, hire top talent, and expand their network's infrastructure. Additionally, funds can be used to support partnerships and collaborations with other projects to expand the network's user base and increase its value proposition.
a. Fund raising
To sustain and expand its operations, Polygon has been actively raising funds through a series of fundraising rounds since 2019. Throughout this period, Polygon has conducted eight different rounds, raising a disclosed total of $456.5M, an impressive figure that speaks to the platform's popularity and appeal in the industry.
The most notable raises were:
A venture round (undisclosed series) of $450M led by Sequoia Capital India
A Series A round of $1M in December 2020 from John Lilic
An early contributor round of $450K in April 2019 led by Coinbase Ventures
A seed round of $165K from undisclosed investors
Polygon also benefited from the Initial Exchange Offering of its tokens through the Binance IEO in April 2019, raising $5M. Other notable raises, of undisclosed amounts, were made by Mark Cuban and the Disney Accelerator.
This impressive fundraising has cemented Polygon's position as one of the best-funded chains in the ecosystem, underscoring the platform's immense potential and the trust it has garnered from investors. Furthermore, the $450M venture round closed in 2022 stands out as one of the largest raises accomplished by any blockchain network that year, a remarkable achievement in a year that has seen significant interest and investment in the blockchain industry.
The funds raised by Polygon will enable the platform to continue expanding its capabilities and features, leading to the development of more innovative and decentralized applications.
b. The treasury
At the beginning of 2021, Polygon held an impressive $4.6 billion worth of MATIC in its treasury. Later that year, the project announced its strategic focus on ZK-based scaling solutions and allocated $1 billion in treasury funding towards this effort. This move allowed Polygon to acquire ZK projects and teams, build ZK-based solutions, hire talented individuals, and provide research funding.
The platform acquired the ZK-rollup Hermez Network for a whopping sum of $250 million, highlighting the platform's strong commitment to expanding its capabilities and strengthening its position in the market. This acquisition was soon followed by another on December 9, 2021, when Polygon acquired Mir Protocol (now Polygon Zero) for $400 million.
As a result of these acquisitions, Polygon's treasury has seen a considerable drop and currently sits at $2.4 billion, comprising $250 million in USD and 1.9 billion MATIC tokens, which still demonstrates the platform's financial strength.
One of the significant benefits of having a treasury is that it gives the blockchain network a comfortable runway, without the need to constantly raise more funds across different market phases (bear and bull). This stability is essential in the current fast-paced market, where conditions can change rapidly and unpredictably.
By having a treasury, blockchain networks can also focus on developing new products and services, expanding their offerings, and exploring new partnerships and collaborations. This is especially crucial in such a highly competitive industry, where innovation is the key to staying ahead of the curve and attracting new users.
c. Polygon ventures
Polygon has its own fund and invests/advises more than 60 projects at the moment. The venture capital investment fund is solely focused on fostering the development of new dApps on the Ethereum and Polygon blockchains.
One of the significant advantages of the Polygon Ventures fund is that it provides much-needed financial support to developers who may not have access to traditional funding sources and attract them to the blockchain.
By providing financial support, Polygon Ventures is helping to create a more vibrant and diverse ecosystem of dApps on its blockchain which is critical to the long-term success, as it fosters innovation, attracts new users, and drives the growth of the blockchain industry as a whole.
10. Team
One of the key advantages that sets the Polygon team apart on this project is the wealth of experience they have gained over the years by creating and monitoring a chain that has grown to become one of the largest in the industry.
a. The Polygon Labs team
This experience is a real competitive advantage that enables them to navigate the complexities of the blockchain landscape and deliver functioning products to users.
Founded in 2017, Polygon was the brainchild of four co-founders and has now grown to over 500 talented individuals working on various projects and chains.
Here is a presentation of the main actors of the project:
Jaynti Kanani, CEO and co-founder, is a full-stack developer and blockchain engineer. He contributed to multiple blockchain projects and was previously a data scientist at Housing.com @jdkanani
Sandeep Nailwal, COO and co-founder, has served as CTO of Welspun Group. He also ran his own firm specializing in blockchain services and products, where he developed and deployed decentralized app architectures @sandeepnailwal
Anurag Arjun, CPO and co-founder, has over nine years of experience in product and project management, and has worked with companies such as IRIS Business Services, Dexter Consultancy, Cognizant Tech, and SNL Financial. @anuragarjun
Mihailo Bjelic, co-founder, has a background in information systems engineering @MihailoBjelic
Young Ko, CFO, is an experienced finance executive with 16+ years of global accounting and FP&A experience.
Jennifer Kattula, SVP Marketing, has more than 10 years of executive marketing experience at web2 and web3 companies @jkattula
Ryan Wyatt, President, was previously CEO of the gaming division. He has a 14-year career in the gaming industry and was behind all the web2 partnerships @Fwiz
Unfortunately, Anurag Arjun has recently decided to leave the Polygon executive team, taking with him the Polygon Avail project (a modular blockchain) that will now become independent. This followed the layoff period in February 2023, when Polygon announced a 20% headcount reduction (around 100 people) as part of an organizational restructuring. Today the team still comprises more than 400 people working across Polygon's various chains and projects.
b. zkEVM team
The zkEVM team is slightly different from the global Polygon team and is led by a group of seasoned experts, including:
David Schwartz, co-founder of Polygon Hermez, and now project lead, David has experience as a CTO and web3 developer @davidsrz
Jordi Baylina, co-founder of Polygon Hermez, is a developer and blockchain OG with multiple years of experience in the field. @jbaylina
Brendan Farmer, co-founder of Mir protocol (now Polygon Zero) is working on developing the zkEVM project @_bfarmer
Daniel Lubarov, co-founder at Mir protocol, is behind the cryptography part on zkEVM and has more than 10 years of experience as a software engineer @dlubarov
With their expertise in zero-knowledge proofs and virtual machines, they are well-positioned to take on the challenges of creating a scalable and efficient blockchain ecosystem.
Overall, the Polygon team brings a wealth of experience, talent, and dedication to this project. Their commitment to innovation and excellence is sure to drive the project forward and deliver exceptional results for all stakeholders involved.
11. Pumpamentals
To gain a competitive edge in the market and achieve widespread adoption, zkEVM will need to focus on developing what could be referred to as "pumpamentals". This term refers to the narratives and strategies that a project should employ in order to effectively market and promote itself.
Cooperation with “real-world” companies:
Polygon is a blockchain that is leading the charge in bringing traditional web2 companies into the Web3 industry. By providing a scalable and cost-effective solution for businesses to transition to decentralized technology, Polygon is making it easier than ever for companies to get the benefits of blockchain adoption.
With more and more web2 companies onboarded onto the Polygon network, Polygon is gaining adoption and recognition in the international market. This onboarding will benefit the zkEVM chain, as users will be much more tempted to use zkEVM rather than other chains.
Many L2 scaling solutions are building advanced technology to push Transactions Per Second (TPS) to extremely high levels (20,000+), which is great for future adoption. However, most blocks are currently empty, and the chains use only a fraction of the capacity they are building. In the figure below, chains like Arbitrum and zkSync benefited from the recent airdrop hype, pushing up their max recorded TPS, but on average they process between 1 and 5 TPS.
In the end, current technologies providing 2,000+ TPS can already handle almost any present demand, meaning that the focus should be on user adoption and onboarding rather than on improving potential TPS.
Polygon is currently the chain working the most towards this end by onboarding most of the major web2 brands and companies on their ecosystem, giving an edge on their user adoption and future development.
Cutting-edge zkEVM technology:
A byte-code-level EVM-equivalent development environment will allow for an easy transition for devs and projects, as Solidity contracts can be deployed with almost no modification. Moreover, the plonky2 Zero Knowledge proof system Polygon zkEVM uses is among the most advanced proof systems, and likely the fastest and cheapest for proof generation and verification on Ethereum. It also supports recursion and hence allows for recursive layer 3 rollup implementations. All the while, the Polygon zkEVM rollup inherits Ethereum's security guarantees and uses a PoE consensus mechanism on L2 to decentralize transaction batching and proof generation, similar to how PBS in danksharding aims to decentralize block production on L1. The system allows multiple sequencers to propose batches and multiple aggregators to take part in the proof generation process, allowing for permissionless participation while splitting traditionally centralized tasks between two distinct roles, further improving network decentralization.
Major funding:
In 2022, Polygon successfully raised a whopping $450 million in funding, enabling them to make significant investments in cutting-edge zk technology to improve their network. This has cemented their position as one of the most well-capitalized projects in the blockchain industry, with a treasury of $2.4 billion.
This gives them a tremendous financial flexibility to face any potential bear markets or to invest in expanding their operations and hiring more talent. Their vast financial resources put them in a strong position to continue innovating and growing, ensuring their longevity and success in the competitive blockchain landscape.
Merging and improving instead of building:
Rather than building from the ground up, Polygon has chosen to acquire existing projects to enhance its ecosystem. The acquisitions of Mir and Hermez have been instrumental in Polygon's success, as they will be the first to launch a type 2 zkEVM. As a result, they have gained a significant first-mover advantage in the market, putting them ahead of their competitors in the race to develop zkEVM scaling solutions.
Team proven track record:
Polygon has a highly skilled and experienced team when it comes to blockchain development and management, setting them apart from their competitors. Their team's vast knowledge and expertise in this field have led to their success, evident in their impressive market capitalization and one of the largest TVL among the blockchains. As a result, Polygon's team has become a trusted authority in the industry, helping them to establish a strong presence and earn the respect of their peers.
There are plenty of narratives and growth opportunities for Polygon to explore, as the industry is still in its early stages of development. Surfing on trends and shifts in the market is a crucial skill for any team in this field, and Polygon has demonstrated an ability to do so effectively.
12. Risks
In this report, we have taken a holistic approach to analyze the project, which means examining it from all angles and considering both its strengths and weaknesses. This allows us to provide a comprehensive assessment of the project's potential and highlight areas where further development or refinement may be necessary. Here are some of the risks we identified:
Technology
When interacting with a Zero Knowledge rollup, it is important to understand the risks that are specific to these systems. One aspect that needs to be considered with a zk-based system (whether it's an L1 or an L2) is the issue of upgradability. The primary issue in the protocol design for private L1 protocols is the tight coupling with the chosen proving system, which results in the selection of various components, such as the field, program representation, hash function, and state vector commitment, based on their efficiency for proving. For instance, if Marlin defined over BLS-377 is chosen as the ZKP, the hash function will be a SNARK-friendly hash defined over the scalar field of the curve to ensure efficient proving. Consequently, the programs will be encoded as verification keys for Marlin/BLS-377.
While this poses no problems as long as the proving system remains the same, the rapid evolution of zk-technology may render the zk-L1 at a significant disadvantage when a newer and more advanced proving system emerges, such as plonky2. The challenge arises as plonky2 is defined over a new field, rendering the existing app verification keys incompatible. Additionally, plonky2 proofs cannot efficiently access the blockchain state due to the Marlin/BLS-377-specific hash function being slow in plonky2. Therefore, it becomes crucial to develop a method for migrating application logic from Marlin to plonky2.
While upgradability can be a challenge for zk-rollups as well, it is considerably easier if the protocol is a layer 2. For example, to accommodate future updates to the Polygon zkEVM implementation, whether for adding new features, fixing bugs, or optimization, the following contracts are deployed using a Transparent Upgradeable Proxy (TUP) pattern:
PolygonzkEVM.sol (Consensus Contract)
PolygonzkEVMGlobalExitRoot.sol
PolygonzkEVMBridge.sol
To ensure security and streamline the audit process, the Polygon zkEVM team has opted to use OpenZeppelin's openzeppelin-upgrades library to implement this functionality. As shown in the figure below, OpenZeppelin's TUP pattern separates the protocol implementation from the storage variables using delegated calls and a fallback function, allowing the implementation code to be updated without changing the storage state or the contract's public address.
Upgradability is important since it can be key to addressing issues and bugs in the case of a network security incident. However, upgrade processes can also introduce risk themselves, as funds can be stolen if a contract receives a malicious code upgrade. In the case of Polygon zkEVM, users at least have some time to react, as there is a 7-day delay on code upgrades.
Additionally, Zero Knowledge tech is still a nascent technology and considering its complexity, there certainly is some risk that funds could be stolen if the cryptography is broken or implemented incorrectly. However we do have a lot of trust in the leading zkEVM teams such as the one at Polygon to implement sound proof systems (plonky2) in a secure way.
Many zk-rollup systems also have rather centralized network designs in which centralized sequencers have a lot of control over the network, introducing a certain degree of censorship risk (if the operator refuses to include a user’s transactions). Moreover, in a centralized setup, MEV can be extracted if the sequencer exploits its centralized position and frontruns user transactions. However, in the case of Polygon zkEVM, these risks are addressed. Firstly, Polygon zkEVM uses a Proof of Efficiency (PoE) consensus mechanism on L2, splitting the sequencer responsibilities into two distinct roles within a permissionless system that allows for user participation. Additionally, in the case of sequencer failure or a censorship attempt, the Polygon zkEVM user can submit an L1 withdrawal request and force the sequencer to include it on L2, after which the user exits the system with their funds.
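The escape hatch described above can be sketched as a toy model. The names and queue mechanics here are illustrative assumptions, not Polygon's actual L1 contract interface; the point is only the shape of the guarantee, namely that a transaction submitted directly on L1 must eventually land in a sequenced batch:

```python
class ForcedInclusionSketch:
    """Toy model of a censorship-resistance escape hatch: a user whose
    transaction the L2 sequencer refuses to include submits it via the
    L1 contract, and the protocol obliges the sequencer to drain that
    queue when posting its next batch."""

    def __init__(self):
        self.forced_queue = []   # txs users pushed directly on L1
        self.l2_batches = []     # batches the sequencer has posted

    def force_include(self, tx):
        # User bypasses the L2 sequencer entirely.
        self.forced_queue.append(tx)

    def sequence_batch(self, txs):
        # A compliant sequencer must include queued forced txs
        # before (or alongside) its own selection.
        batch = self.forced_queue + list(txs)
        self.forced_queue = []
        self.l2_batches.append(batch)
        return batch


rollup = ForcedInclusionSketch()
rollup.force_include("withdraw(alice)")          # censored user escalates to L1
batch = rollup.sequence_batch(["tx1", "tx2"])
assert batch[0] == "withdraw(alice)"             # inclusion is forced
```

In the real protocol the obligation is enforced economically and cryptographically on L1, not by trusting the sequencer's code as this sketch does.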
Regulatory for zk privacy
The use of privacy-focused ZK technologies poses a regulatory risk, as evidenced by the delisting of Monero from multiple exchanges and the imprisonment of Alexey Pertsev, the developer of the privacy-centric Tornado Cash protocol. To ensure the continued viability of ZK technologies, collaboration and education with regulatory institutions will be necessary.
Competition
The competition in the L2 market has intensified significantly, with several other players building Zero Knowledge and optimistic rollups and raising substantial funds to gain a foothold in the space. In order to maintain its lead in this competitive landscape, zkEVM must leverage its first-mover advantage and commit all available resources to remain at the forefront of innovation.
Currently, the biggest contender to zkEVM's market share is Arbitrum, which has recently launched its own airdrop and token. Prior to zkEVM's entrance into the market, Arbitrum was the dominant player, leading the market in terms of transactions, users and hype. While Arbitrum wasn’t afraid of some competition in the space, the launch of zkEVM has presented a significant technological challenge, as zkEVM's technology is superior and has attracted higher levels of funding.
In a strategic move to maintain its market share, Arbitrum released its airdrop just four days before zkEVM's mainnet launch. This timing suggests that Arbitrum is aware of the threat that zkEVM poses and is taking proactive measures to defend its position in the market. As the L2 market continues to evolve and competition intensifies, it will be important for zkEVM to face such threats and keep its first mover advantage.
Attract users/devs/projects
As previously mentioned, the competition in the blockchain space is fierce, with numerous projects fighting for user adoption and liquidity. To maintain its advantage, Polygon zkEVM will need to focus on attracting a diverse range of users, developers, and projects to its platform. One disadvantage that zkEVM faces compared to other chains is the lack of airdrop potential. Airdrops have been a popular strategy for other chains to attract users, as they help to create hype around the project and incentivize early adopters. As a result, Polygon will need to double down on its marketing efforts to onboard these stakeholders.
For instance, when Arbitrum announced its airdrop, it resulted in an $8 million bridge inflow to zkSync in terms of TVL. This demonstrates the impact that airdrop potential, marketing, and hype can have on capital flows. To compete with other chains, Polygon zkEVM will need to develop and execute an effective marketing strategy that highlights the chain's unique features and benefits.
Overall, the success of zkEVM will depend on its ability to attract and retain users and developers. By investing in effective marketing strategies and creating a strong value proposition, Polygon can differentiate itself from competitors and establish itself as a leading player in the blockchain space.
These represent some of the most significant risks that zkEVM may encounter in its operation, and it is inevitable that new risks will emerge over time, while others may be resolved and transformed into strengths.
As with any new technology or chains, there are inherent risks that must be identified and addressed in order to ensure its long-term success. While some risks may be anticipated and managed, others may arise unexpectedly, requiring quick and decisive action to mitigate their impact.
However, it is important to note that not all risks are negative. Some risks may present opportunities for innovation and growth, while others may be transformed into strengths with the right approach and strategy. Therefore, it is essential for zkEVM to have a comprehensive risk management plan in place that is regularly reviewed and updated to address emerging threats and capitalize on new opportunities.
By being proactive and adaptable, zkEVM can navigate the challenges of the rapidly evolving blockchain industry and emerge as a strong and sustainable L2 that delivers value to its users and stakeholders.
13. Conclusion
Polygon has made a name for itself in the blockchain industry as a company that produces innovative projects consistently. They have a reputation for being pioneers, pushing the boundaries of what is possible with blockchain technology. Their latest creation, zkEVM, is a testament to this reputation.
zkEVM is a technological breakthrough, delivering a level of innovation that no other blockchain in the market has matched. As the use of zk-Rollups continues to gain momentum, being a first mover in this space is becoming increasingly crucial. With several well-funded projects fighting for market share, it is also essential to have a solid team with a proven track record to gain an advantage.
Fortunately, the Polygon team is well-positioned to lead the project. They have a large and talented team that has demonstrated their ability to elevate Polygon to one of the top blockchains in the industry. They have a proven track record of delivering successful projects and are well-equipped to replicate this success with zkEVM.
The ecosystem surrounding zkEVM is rapidly expanding, with many projects expressing interest in deploying on the chain. As the ecosystem grows, the potential for DeFi adoption and an increased user base will become more evident, driving innovation and providing opportunities for developers to create new and exciting applications that leverage the benefits of zk-Rollups.
The launch of zkEVM promises to be an exciting event to follow. The race for the top spot in the zk L2 market is sure to be a thrilling competition, and it will be fascinating to see which projects emerge as the leaders in this space.
With the potential for widespread adoption of zk-Rollups, the future looks bright for the blockchain industry.
14. Disclaimer - Limits
We possess a small amount of MATIC tokens and tokens from projects based on the Polygon blockchain. However, it's important to note that this article is written objectively and independently, without any affiliation or association with the team.
Although the article wasn't written in cooperation with Polygon, we would like to thank Asif Khan (@0xkhan_), Amanda Tyler and David Silverman for taking the time to answer our questions.
The information presented in this article is intended for educational and informational purposes only, and should not be considered investment advice. As the project is unlaunched and some information is withheld by the team, this article has limits regarding the chain's potential TPS, exact team members, current burn rate, roadmap, and other similarly important information.
There may be some inaccuracies or errors in the information presented, but we welcome any discussion and are open to being corrected.
Don’t trust, verify.
This article has been written by Louround and Expctchaos, we would be delighted to get feedback and support on this initiative.
We also released a thread highlighting the most important parts of this article; feel free to share it and give feedback on it:
You can follow us on Twitter and share this article to create awareness of what is set to be the biggest narrative of the next bull run.
15. More links/Sources:
Twitter: https://twitter.com/0xPolygonZK
Discord: https://discord.com/invite/0xPolygon
Gitbook: https://wiki.Polygon.technology/docs/zkEVM/introduction
Panther explaining zk-Rollups: https://blog.pantherprotocol.io/zk-rollup-projects-inner-workings-importance-analysis/#benefits-drawbacks-of-zk-rollup-projects
Ye Zhang on decentralized rollups: https://hackmd.io/@yezhang/SkmyXzWMY
Galaxy on zkEVMs and The Future of Ethereum Scalability: https://www.galaxy.com/research/whitepapers/zkEVMs-the-future-of-ethereum-scalability/
Ethereum on scaling possibilities: https://ethereum.org/en/developers/docs/scaling/
Channels vs Plasma vs Rollups: https://medium.com/coinmonks/easy-to-understand-ethereum-layer-2-scaling-solutions-channels-vs-plasma-vs-rollups-1dc1d4e9cb52
Is Polygon a L2 or side chain: https://ethereum.stackexchange.com/questions/125024/is-Polygon-MATIC-a-layer-2-or-a-sidechain
Is Polygon a commit chain and not a sidechain: https://fineMATICs.com/Polygon-commit-chain-explained/
The Blockchain Trilemma: https://www.gemini.com/cryptopedia/blockchain-trilemma-decentralization-scalability-definition#section-what-is-scalability
Binance research report on zkEVMs: https://research.binance.com/static/pdf/zkEVM_and_the_Future_of_Ethereum_Scaling_Stefan_Piech.pdf
Deep dive on zkEVMs by Ian Greer: https://www.iangreer.io/zkEVM/
101 Blockchains presentation of Polygon’s products: https://101blockchains.com/Polygon-zk-rollups/
Polygon Hermez presentation: https://docs.hermez.io/zkEVM/Overview/Overview/
Messari on how to influence zkEVM adoption: https://messari.io/report/l2-brief-factors-that-will-influence-zkEVM-adoption?referrer=all-research
Messari on ZK technology: https://messari.io/report/the-zk-everything-report?referrer=all-research
Messari on zkEVM progress: https://messari.io/report/an-update-on-zkEVM-progress-and-development?referrer=all-research
Cointelegraph presents Polygon zkEVM: https://cointelegraph.com/news/Polygon-tests-zero-knowledge-rollups-mainnet-integration-inbound
Alchemy presents Polygon zkEVM: https://www.alchemy.com/overviews/Polygon-zk-rollups
Deep dive on Polygon zkEVM: https://crypticera.com/Polygon-zkEVM/
Deep dive on Polygon zkEVM by Foresight Ventures: https://foresightventures.medium.com/foresight-ventures-all-about-Polygon-zkEVM-and-zkEVM-rollup-cfdfd3bd8160
Polygon Hermez zkEVM - PPT Overview: https://docs.google.com/presentation/d/1wtonJp-OHoRiBEB9ZSGmPV8Z4FTUpwRy/edit#slide=id.p26
L2fees paid per blockchain: https://l2fees.info/
Proof Of Efficiency concept: https://ethresear.ch/t/proof-of-efficiency-a-new-consensus-mechanism-for-zk-rollups/11988
Polygon introducing Plonky2 https://Polygon.technology/blog/introducing-plonky2
zkNode presentation: https://wiki.Polygon.technology/docs/zkEVM/zknode/zknode-overview/
Polygon zkEVM State Management: https://wiki.Polygon.technology/docs/zkEVM/protocol/state-management/
Polygon on sidechains and Plasma: https://wiki.Polygon.technology/docs/home/blockchain-basics/sidechain/#:~:text=Sidechain%20is%20an%20alternate%20blockchain,some%20other%20%E2%80%9Cmain%E2%80%9D%20blockchain
Polygon on its zkProver: https://wiki.Polygon.technology/docs/zkEVM/ZKProver/overview
Taiko presentation: https://taiko.xyz/docs
Taiko zkEVM details: https://mirror.xyz/umede.eth/gDuDNng_xHcGFOLZb9-JogcU3d2Em4l9dp32Om3YHjc
Starkware scalability: https://medium.com/starkware/redefining-scalability-5aa11ffc5880
Decentralized rollups and Espresso Sequencer: https://medium.com/@espressosys/decentralizing-rollups-announcing-the-espresso-sequencer-791adeb747e
Call data cost reduction importance: https://polynya.medium.com/why-calldata-gas-cost-reduction-is-crucial-for-rollups-30e663577d3a
Vitalik on Rollups: https://vitalik.ca/general/2021/01/05/rollup.html
Delphi guide on rollups: https://members.delphidigital.io/reports/the-complete-guide-to-rollups/
MATIC token distribution: https://docs.google.com/spreadsheets/d/1ywYBnlxa5kHO-5dM3OoZh3Ae5AcWw_e_yx55vy33Tys/edit#gid=0
Polygon previous raises: https://cryptorank.io/ico/MATIC-network
Polygon lay-off 20% of its workforce: https://u.today/Polygon-network-MATIC-lays-off-20-of-workforce-consolidates-with-Polygon-labs
Polygon invest $1B in ZK: https://Polygon.technology/blog/the-Polygon-thesis-strategic-focus-on-zk-technology-as-the-next-major-chapter-for-Polygon-1b-treasury-allocation
Anurag Arjun leaves Polygon with Avail: https://www.theblock.co/post/220454/Polygon-avail-anurag-arjun
Polygon team presentation: https://Polygon.technology/about
Polygon community development: https://Polygon.technology/blog/india-takes-center-stage-in-Polygons-web3-made-in-india-tour
Polygon on its protocol upgradability: https://wiki.Polygon.technology/docs/zkEVM/protocol/upgradability/
Polygon community guilds: https://Polygon.technology/guilds
Polygon community 2022 recap: https://Polygon.technology/blog/a-look-back-at-Polygon-guilds-advocates-in-2022