Some time ago, I wanted to learn more about neural-network principles by reading others' code. Most AI programming is done in Python, C, C++, etc., none of which I 'speak' fluently. But I enjoy shell-scripting, so naturally I wondered whether anyone had demonstrated a real learning AI in shell.
It took a long time, but I did find just such a demo: a small neural network with 2 inputs, 3 hidden neurons, and 1 output. The original training script is only 85 lines long. It uses bc and awk for all the math, calling bc over 30 times and awk 4 times for each training iteration. It was written for zsh/bash, so it needed some very slight alteration to run under 'lesser' shells. It does still need bc and awk. You can find the original file predict.sh here:
https://github.com/justinstaines/neuralnetwork
I'll attach my altered version predict.ash below, which makes no changes to the math itself, only some POSIX-friendly usage and streamlining. Also attached is a default dataset for demonstration. Remove the '.txt' suffixes for use, as usual.
If anyone here has bc and awk, you might try this out. I do know it runs under busybox ash (from Bionic64-8.0), but you need awk and real bc; dc will not do. Can anyone confirm whether this runs under any Puppy variety?
Edit: Corrected shebang in predict.ash