MothNet’s computer code, according to the boffins, contains layers of artificial neurons that simulate the bug’s antennal lobe and mushroom body, structures common to insect brains.

Crucially, instead of recognizing smells, the duo taught MothNet to identify handwritten digits in the MNIST dataset. This database is often used to train and test pattern recognition in computer vision applications.

The academics used supervised learning to train MothNet, feeding it about 15 to 20 images of each digit from zero to nine, and rewarding it when it recognized the numbers correctly.
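The 15-to-20-images-per-digit regime is what makes this a few-shot learning problem. As a rough illustration of how such a training set might be assembled, here is a sketch that samples a fixed number of examples per digit; the data here is random stand-in noise rather than real MNIST images, and the helper name is our own:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for MNIST: 1,000 fake 28x28 "images" with labels 0-9.
# (Hypothetical data; the real experiment used actual MNIST digits.)
images = rng.random((1000, 28, 28))
labels = rng.integers(0, 10, size=1000)

def few_shot_subset(images, labels, per_class=15):
    """Pick `per_class` examples of each digit, mirroring the paper's
    15-to-20-samples-per-digit training regime."""
    idx = np.concatenate([
        rng.choice(np.flatnonzero(labels == d), size=per_class, replace=False)
        for d in range(10)
    ])
    return images[idx], labels[idx]

train_x, train_y = few_shot_subset(images, labels, per_class=15)
print(train_x.shape)  # (150, 28, 28)
```

With only 150 training images in total, most conventional classifiers struggle, which is the regime the researchers targeted.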

Receptor neurons in the artificial brain processed the incoming images and passed the information on to the antennal lobe, which learned the features of each number. This lobe was connected, via a set of projection neurons, to the sparse mushroom body, which in turn was wired up to extrinsic neurons, each ultimately representing an individual integer between zero and nine.
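That pipeline — receptors to antennal lobe to projection neurons to a sparse mushroom body to ten extrinsic readout neurons — can be sketched schematically as below. The layer sizes, the 5 per cent sparsity level, and the perceptron-style reward update are all illustrative assumptions of ours, not the paper's exact model, which uses a biologically detailed, octopamine-gated Hebbian rule:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative layer sizes (assumed, not the paper's exact values).
N_RECEPTOR, N_LOBE, N_MUSHROOM, N_DIGITS = 784, 60, 2000, 10
SPARSITY = 0.05  # only the strongest ~5% of mushroom-body units fire

w_in = rng.normal(scale=0.1, size=(N_LOBE, N_RECEPTOR))    # receptors -> antennal lobe
w_proj = rng.normal(scale=0.1, size=(N_MUSHROOM, N_LOBE))  # projection neurons -> mushroom body
w_out = np.zeros((N_DIGITS, N_MUSHROOM))                   # mushroom body -> extrinsic neurons

def forward(image):
    lobe = np.maximum(w_in @ image.ravel(), 0.0)  # antennal-lobe activity
    mb = w_proj @ lobe
    # Sparse coding: keep only the top-k mushroom-body responses, zero the rest.
    k = int(SPARSITY * N_MUSHROOM)
    thresh = np.partition(mb, -k)[-k]
    mb = np.where(mb >= thresh, mb, 0.0)
    return mb, w_out @ mb  # ten extrinsic-neuron scores, one per digit

def train_step(image, label, lr=0.01):
    # Reward-style update: a simple perceptron-like stand-in for the
    # paper's reward-modulated Hebbian learning.
    mb, scores = forward(image)
    pred = int(np.argmax(scores))
    w_out[label] += lr * mb        # strengthen the correct digit's synapses
    if pred != label:
        w_out[pred] -= lr * mb     # weaken the wrongly winning digit

img = rng.random((28, 28))
train_step(img, label=3)
print(forward(img)[1].shape)  # (10,)
```

The sparse mushroom-body code is the key design choice: each image activates only a small, distinct subset of units, so a handful of reward-gated updates is enough to separate the ten digit classes.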
MothNet achieved 75 to 85 per cent accuracy, the paper stated, despite the relatively small number of training examples, seemingly outperforming more traditional neural networks given the same amount of training data.

The work shows that even the simple biological neural network of an insect brain can be taught basic image-recognition tasks, and can potentially outperform other models when training examples and processing resources are scarce. The researchers believe that these biological neural networks (BNNs) can be “combined and stacked into larger, deeper neural nets.”

Source: Roses are red, are you single, we wonder? ‘Cos this moth-brain AI can read your phone number • The Register