A neural network written entirely in jq. The sample program classifies the MNIST handwritten-digit dataset with a success rate of 94%.
The run script first compiles make_json.c, which parses the MNIST data files and produces JSON. It then runs make_json and pipes its output into jq, which processes the training set first and then the test set. When finished, jq emits the error rate on the test set.
This usually takes days to run to completion.
A typical feedforward neural network is implemented by allocating several arrays and updating their values in place. Because data in jq is immutable, this continual state update is instead implemented as a reduction over the entire input stream, with the state of the network as the value accumulated by the reduction. One benefit of this design is that the network state can easily be saved to and loaded from JSON files between program executions.
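The reduction described above can be sketched with a much simpler accumulator. This is not the project's actual filter, just a minimal illustration of the `reduce inputs as $x (init; update)` pattern that threads immutable state through a stream of records:

```shell
# Feed a stream of JSON values into jq and accumulate state across them.
# The "network state" here is just a running count and sum, replaced
# (never mutated) on each input.
printf '1\n2\n3\n' | jq -nc '
  reduce inputs as $x (
    {count: 0, sum: 0};                      # initial state
    {count: (.count + 1), sum: (.sum + $x)}  # state update per record
  )'
# → {"count":3,"sum":6}
```

Because the accumulated state is itself plain JSON, dumping it to a file and resuming from it later falls out for free.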
Configuration is supplied via a JSON argument to jq (see run.sh for an example). A sample configuration is config-trivial.json.
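One standard way to pass such a JSON argument is jq's `--argjson` flag, which parses a JSON literal and binds it to a variable inside the program. This is a generic jq mechanism, not necessarily the exact flags run.sh uses, and the config fields below are invented for illustration; see config-trivial.json for the real ones:

```shell
# Bind a JSON value to $cfg inside the jq program.
# "layers" and "rate" are hypothetical field names.
jq -nc --argjson cfg '{"layers": [784, 16, 10], "rate": 0.1}' '$cfg.layers'
# → [784,16,10]
```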
The input is a stream of records like:
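The exact field names are determined by make_json.c, but an MNIST record plausibly carries the 28×28 image flattened to 784 pixel values plus its digit label, along these lines (field names hypothetical):

```json
{"label": 7, "pixels": [0, 0, 31, 255, ...]}
```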
It occasionally logs its progress to stderr in the form of jq debug statements:
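jq's built-in `debug` filter writes a `["DEBUG:", <value>]` line to stderr while passing its input through unchanged, so each progress line has that shape (the message text below is invented):

```shell
# debug prints its input to stderr as ["DEBUG:",<value>] and passes it along;
# empty suppresses the stdout copy so only the stderr log line remains.
jq -n '"trained on 1000 records" | debug | empty'
# stderr → ["DEBUG:","trained on 1000 records"]
```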
Its final output is the number of records processed and error rate:
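The exact keys are defined by the program, but the shape is a small JSON summary roughly like this (field names hypothetical; an error rate of 0.06 corresponds to the 94% success rate quoted above):

```json
{"records_processed": 10000, "error_rate": 0.06}
```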
The example program, configuration, and data live in example/example.jq. The neural network library is in neural_net.jq, and a sample configuration file for a tiny network is in config-trivial.json.