Arithmetization-Oriented Ciphers

A range of fancy cryptographic protocols, such as zero-knowledge proofs, multi-party computation, and fully homomorphic encryption, rely on arithmetization. This technique represents a general-purpose computation as a sequence of finite field operations. The complexity of a protocol using arithmetization is determined by the number and type of arithmetic operations, so it makes sense to measure computations by their arithmetic complexity.
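To make this metric concrete, here is a toy sketch (my own example, not from the text) that evaluates the polynomial x³ + x + 5 over a prime field as individual field gates and counts the multiplications, since multiplications dominate the cost in most arithmetization-based protocols:

```python
P = 2**61 - 1  # a Mersenne prime; any prime field works for this sketch

def evaluate_as_gates(x):
    """Evaluate x^3 + x + 5 mod P as a sequence of field gates."""
    mults = 0
    t1 = (x * x) % P   # multiplication gate
    mults += 1
    t2 = (t1 * x) % P  # multiplication gate
    mults += 1
    t3 = (t2 + x) % P  # addition gate (usually far cheaper in these protocols)
    out = (t3 + 5) % P # addition of a constant
    return out, mults

y, m = evaluate_as_gates(3)
assert y == (3**3 + 3 + 5) % P and m == 2
```

The arithmetic complexity of this computation is two multiplications, regardless of how many machine instructions a CPU would spend on it.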

Many concrete use cases of these protocols require the evaluation of a symmetric or unkeyed primitive such as a block cipher or hash function. Unfortunately, traditional block ciphers and hash functions are designed to optimize performance in software or hardware, not the performance of the advanced protocol that evaluates them. The need for primitives that are efficient with respect to arithmetic complexity has prompted several proposals in recent years, most notably MiMC, Poseidon, and Rescue.
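Of these, MiMC has the simplest description: every round adds the key and a round constant, then cubes the state. A toy sketch under my own assumptions (a 7-bit prime and five arbitrary round constants; real instantiations use a large prime field and on the order of log₃(p) rounds):

```python
# Toy MiMC-style cipher. Parameters are my own choices for illustration.
P = 101            # toy prime with gcd(3, P-1) = 1, so x -> x^3 permutes the field
assert P % 3 == 2  # equivalent condition
ROUND_CONSTANTS = [0, 7, 23, 42, 61]  # c_0 = 0 by convention; the rest arbitrary

def mimc_encrypt(x, k):
    """E_k(x): iterate x <- (x + k + c_i)^3 mod P, then add the key once more."""
    for c in ROUND_CONSTANTS:
        x = pow(x + k + c, 3, P)  # only two multiplications per round
    return (x + k) % P

def mimc_decrypt(y, k):
    d = pow(3, -1, P - 1)  # exponent for cube roots in F_P (Python 3.8+)
    y = (y - k) % P
    for c in reversed(ROUND_CONSTANTS):
        y = (pow(y, d, P) - k - c) % P
    return y

for x in range(P):
    assert mimc_decrypt(mimc_encrypt(x, 17), 17) == x
```

The whole cipher costs two field multiplications per round, which is exactly what an arithmetization-oriented design optimizes for; decryption is much more expensive (a full exponentiation per round), which is acceptable in most of the use cases below.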

Zero-Knowledge Contingent Payments. A seller wants to sell information, so he encrypts it before transmitting it to the buyer. With this ciphertext, he includes the hash of the encryption key, and a zero-knowledge proof that the unknown key is both the preimage of the hash and the decryption key for the ciphertext. The buyer makes a cryptocurrency transaction that is spendable by whoever produces a preimage for the given hash. To collect his funds, the seller must release this preimage. For this protocol to be efficient, both the hash function and the encryption scheme must have a low arithmetic complexity.
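The flow can be sketched structurally as follows. The names are my own, SHA-256 and a toy XOR stream cipher are stand-ins (a real deployment would pick arithmetization-oriented primitives precisely so the proof is cheap), and the zero-knowledge proof itself is elided:

```python
import hashlib, os

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()  # stand-in for an AO hash

def Enc(key: bytes, msg: bytes) -> bytes:
    # Toy stream cipher: XOR with a hash-derived keystream (NOT secure).
    stream = b""
    counter = 0
    while len(stream) < len(msg):
        stream += H(key + counter.to_bytes(8, "big"))
        counter += 1
    return bytes(m ^ s for m, s in zip(msg, stream))

# Seller side: encrypt the information and publish the key's hash.
key = os.urandom(32)
info = b"the secret the buyer is paying for"
ciphertext = Enc(key, info)
key_hash = H(key)
# ...plus a ZK proof that key_hash = H(key) and ciphertext = Enc(key, info).

# Buyer side: lock funds spendable by any preimage of key_hash.
# The seller claims the funds by revealing `key`; the buyer then decrypts:
assert Enc(key, ciphertext) == info  # the XOR stream cipher is its own inverse
```

The proof relates one evaluation of H and one of Enc, so the proving cost is driven directly by their arithmetic complexity.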

Merkle Tree Membership Proofs. A zero-knowledge prover wants to prove something about a leaf in a Merkle tree, without revealing that leaf or its position within the tree. In this case it pays to build the Merkle tree using an arithmetization-oriented hash function.
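A minimal sketch of such a membership proof (helper names are my own; SHA-256 stands in for the AO hash, and the verification loop is exactly the computation the prover must arithmetize, once per tree level):

```python
import hashlib

def H(left: bytes, right: bytes) -> bytes:
    return hashlib.sha256(left + right).digest()  # stand-in for an AO hash

def merkle_root(leaves):
    level = leaves
    while len(level) > 1:
        level = [H(level[i], level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def auth_path(leaves, index):
    """Sibling hashes on the path from leaf to root."""
    path, level = [], leaves
    while len(level) > 1:
        path.append(level[index ^ 1])  # sibling at this level
        level = [H(level[i], level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return path

def verify(leaf, index, path, root):
    node = leaf
    for sibling in path:
        node = H(node, sibling) if index % 2 == 0 else H(sibling, node)
        index //= 2
    return node == root

leaves = [hashlib.sha256(bytes([i])).digest() for i in range(8)]
root = merkle_root(leaves)
assert verify(leaves[5], 5, auth_path(leaves, 5), root)
```

In the zero-knowledge setting, leaf and index are witness data: the prover shows in zero knowledge that *some* leaf at *some* position hashes up to the public root.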

Compressing Blockchain History. A network verifier wants to verify the blockchain history without redoing all the work that verification normally requires. A SNARK comes to the rescue: by verifying the SNARK instead, the network verifier is assured of the same fact, namely that the claimed history is indeed valid. If the consensus rules depend on hash functions or block ciphers, it makes sense to use arithmetization-oriented primitives there, in order to facilitate producing the SNARK.

Recursive Transparent SNARKs. What do you get when your SNARK does not testify to the integrity of an entire computation, but only to the validity of the last computational step and the validity of a similar SNARK covering all the steps up to that point? By induction, the entire computation was executed correctly. If the SNARK verifier needs a hash function (for instance, to enable the Fiat-Shamir transform), then choose an arithmetization-oriented hash function — otherwise the complexity of recursion will explode.
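For reference, the Fiat-Shamir transform replaces the verifier's random challenge with a hash of the transcript so far, which makes the protocol non-interactive. A sketch with my own toy parameters (SHA-256 stands in for the AO hash, and the challenge field is an arbitrary choice); inside a recursive SNARK, this hash evaluation is itself part of the arithmetized verifier, hence the case for an AO hash:

```python
import hashlib

P = 2**61 - 1  # toy challenge field (my choice)

def fiat_shamir_challenge(transcript: bytes) -> int:
    """Derive the verifier's challenge deterministically from the transcript."""
    digest = hashlib.sha256(transcript).digest()
    return int.from_bytes(digest, "big") % P  # slight modular bias; fine for a sketch

# Prover and verifier hash the same transcript, so they agree on the challenge
# without any interaction:
c1 = fiat_shamir_challenge(b"statement:xyz|commitment:abc")
c2 = fiat_shamir_challenge(b"statement:xyz|commitment:abc")
assert c1 == c2 and 0 <= c1 < P
```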

Encryption of Sensitive Outputs. When the output of a multi-party computation (MPC) is confidential to a select number of (possibly unknown) parties, it pays to encrypt this output. To reduce the overhead of encrypting inside the MPC, it pays to use an arithmetization-oriented cipher.

Threshold Derandomized Signatures. Derandomization in signature schemes spares the signer from having to trust its possibly flawed source of randomness. In a threshold signature scheme, the secret key is spread across many participants, and a signature can only be generated if a threshold number of them collaborate. However, the two properties do not mix well: when naïvely combined, they enable attacks in which malicious co-signers extract their peers' key shares. One solution to this problem is for the co-signers to commit to their randomness in the first round; in the second round they then produce their signature contribution along with a zero-knowledge proof that the same randomness was used. Another solution is to use an MPC protocol to execute the exact same single-user derandomized signature generation, only by a threshold of participants.
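The first solution follows a plain commit-then-open pattern, sketched below with my own names (SHA-256 stands in for the AO hash, and the zero-knowledge proof that the signature share used the committed randomness is elided):

```python
import hashlib, os

def commit(nonce: bytes, blinder: bytes) -> bytes:
    """Hash commitment; the blinder hides low-entropy nonces."""
    return hashlib.sha256(blinder + nonce).digest()

# Round 1: each co-signer broadcasts a commitment to its nonce, before
# seeing anyone else's contribution.
nonce, blinder = os.urandom(32), os.urandom(32)
c = commit(nonce, blinder)

# Round 2: the co-signer opens the commitment alongside its signature share;
# peers check the opening before accepting the share.
assert commit(nonce, blinder) == c
# ...plus a ZK proof that the signature share was computed with this nonce,
# which is where an arithmetization-oriented hash pays off.
```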

Post-Quantum Signature Schemes. One of the strategies to generate post-quantum signature schemes is to start from a symmetric or unkeyed primitive and then prove in zero knowledge that certain facts about it are known. Using an arithmetization-oriented primitive here results in a smaller, faster-to-generate, and faster-to-verify signature, compared to using traditional primitives. Another strategy is to use the hypertree approach, where every leaf is a public key that signs a new Merkle root. When building these Merkle trees with an arithmetization-oriented hash function, it becomes easier to prove statements about signatures, and thereby achieve fancy features such as blind signatures, signature aggregation, or a threshold scheme.