Benjamin Wilson

Neural nets perform exceptionally well at a variety of classification tasks, such as determining whether an image contains an airplane or recognising handwritten digits. These models use millions or even billions of floating point parameters to compute their classifications via multiple layers of matrix multiplications and non-linearities. While these calculations can be carried out very efficiently, it remains challenging to efficiently prove that they were carried out correctly. Overcoming this challenge would allow slow computers (e.g. blockchains, or edge devices such as smartphones) to delegate neural network inference tasks to untrusted parties, enabling applications such as trustless biometric identification and smart contracts that are truly very smart.

The problem is that the primitives of ZK and ML are difficult to reconcile. ZK operates at a fundamental level with modular arithmetic (i.e. with discrete values over a finite field), whereas neural nets and most other machine learning models perform "smooth" operations on floating point numbers, called "weights". Existing approaches have attempted to bridge this divide by quantizing the weights of a neural net so that they can be represented as elements of the finite field. Care must be taken to avoid a "wrap-around" occurring in the (now, modular!) arithmetic of the quantized network: if an intermediate value, such as a sum of quantized products, exceeds the field size, it is silently reduced modulo the field characteristic, corrupting the result. And weight quantization can only decrease model accuracy. But more than anything, it does feel a little like trying to force a square peg into a round hole.

We propose a different approach: let's go back to a time before the NN paradigm was settled, to a time when a greater variety of neural nets roamed the earth, and let's find a machine learning model more amenable to ZKP. One such model is the "Weightless Neural Network". It's claimed to be the first ever neural net to be commercialized! Wow. But wow again, it is a very dusty dinosaur: over the decades, it has received very little attention compared to familiar NNs. We set out to develop a system for proving the inferences of this weightless wonder ... and we call it ... Zero Gravity (The Weight is Over).

Weightless means no weights, no floating point arithmetic, and no expensive linear algebra, let alone non-linearities, so none of the challenges mentioned above. Will there be different challenges, and will they be worse? This is what we set out to discover at the ZKHack hackathon (Lisbon, 2023).

What's a Weightless Neural Network?

A Weightless Neural Network (WNN) is entirely combinatorial. Its input is a bitstring (e.g. encoding an image), and its output is one of several predefined classes, e.g. corresponding to the ten digits. It learns from a dataset of (input, output) pairs by remembering observed bitstring patterns in a collection of devices called RAM cells, grouped into "discriminators" that correspond to the output classes. RAM cells are so called because they are really just big lookup tables, indexed by bitstring patterns; a cell stores a 1 at a given pattern when that pattern has been observed in an input string labeled with the class of the cell's discriminator.
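To make the RAM-cell picture concrete, here is a minimal Python sketch of a WiSARD-style WNN, the classic weightless architecture. The class name, the tuple size, and the random mapping of input bits to cells are illustrative assumptions for this sketch, not a description of the Zero Gravity implementation.

```python
import random

class WiSARD:
    """Minimal WiSARD-style weightless neural network (illustrative sketch).

    Each class gets a "discriminator": a list of RAM cells. Each RAM cell
    watches a fixed random tuple of input bit positions and remembers which
    bit patterns it has seen at those positions during training.
    """

    def __init__(self, num_classes, input_bits, tuple_size, seed=0):
        rng = random.Random(seed)
        positions = list(range(input_bits))
        rng.shuffle(positions)
        # Partition the (shuffled) input bit positions into tuples,
        # one tuple per RAM cell.
        self.tuples = [positions[i:i + tuple_size]
                       for i in range(0, input_bits, tuple_size)]
        # One discriminator per class; each RAM cell is modeled as the
        # set of addresses at which a 1 has been stored.
        self.discriminators = [[set() for _ in self.tuples]
                               for _ in range(num_classes)]

    def _addresses(self, bits):
        # Read each cell's tuple of input bits as an integer RAM address.
        for positions in self.tuples:
            addr = 0
            for p in positions:
                addr = (addr << 1) | bits[p]
            yield addr

    def train(self, bits, label):
        # Store a 1 at each observed address in the labeled discriminator.
        for cell, addr in zip(self.discriminators[label],
                              self._addresses(bits)):
            cell.add(addr)

    def classify(self, bits):
        # Score each class by how many of its RAM cells recognise the
        # input's pattern; predict the class with the highest score.
        addrs = list(self._addresses(bits))
        scores = [sum(addr in cell for cell, addr in zip(disc, addrs))
                  for disc in self.discriminators]
        return max(range(len(scores)), key=scores.__getitem__)


# Hypothetical usage: two classes, 8-bit inputs, 2-bit RAM cell tuples.
wnn = WiSARD(num_classes=2, input_bits=8, tuple_size=2)
wnn.train([1, 1, 0, 0, 1, 0, 1, 0], label=0)
wnn.train([0, 0, 1, 1, 0, 1, 0, 1], label=1)
print(wnn.classify([1, 1, 0, 0, 1, 0, 1, 0]))  # -> 0
```

Note that both training and inference here are pure table writes and lookups: no multiplications, no additions over a field, and nothing to quantize, which is exactly what makes this style of model an attractive target for ZK proving.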