
AI - Puppy Style

Posted: Mon Jan 17, 2022 1:14 pm
by amigo

Some time ago, I wanted to learn more about neural-network principles by reading others' code. Most AI programming is done in Python, C or C++, etc., none of which I 'speak' fluently. But I enjoy shell scripting, so I immediately wondered whether anyone had demonstrated a real learning AI in shell.

It took a long time to find, but I did find just such a demo: a small neural network with 2 inputs, 3 hidden neurons, and 1 output. The original training script is only 85 lines long. It uses bc and awk for all the math, calling bc over 30 times and awk 4 times for each iteration of the learning. It was written for zsh/bash, so it needed some very slight alteration to run under 'lesser' shells; it does still need bc and awk. You can find the original file predict.sh here:
https://github.com/justinstaines/neuralnetwork

I'll attach my altered version predict.ash below, which has no changes to the math itself, just a bit of POSIX-friendly usage and streamlining. Also attached is a default dataset for demonstration. Remove the '.txt' suffixes for use, as usual.

If anyone here has bc and awk, you might try this out. I do know it runs under busybox ash (from Bionic64-8.0), but you do need awk and real bc; dc will not do. Can anyone confirm whether this runs under any Puppy variety?
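
A quick generic check (not part of the scripts) to see whether your Puppy has both tools on the path:

Code:

# prints the path of each tool if present, complains otherwise
command -v bc  || echo "bc is missing"
command -v awk || echo "awk is missing"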

Edit: Corrected shebang in predict.ash


Re: AI - Puppy Style

Posted: Mon Jan 17, 2022 2:40 pm
by Trapster

I have no idea what's going on, but here's the output from Slacko-7, 32-bit:

Code:

# ./predict.ash
begin:0.8 0.4 0.3 0.2 0.9 0.5 0.3 0.5 0.9    0.00000000
      1.2904400000 .6942640000 .4634800000 .6904400000 1.1942640000 .6634800000 .4471320000 .6471320000 1.0471320000     0.668188
      1.5328029430 .8618727220 .5676577650 .9328029430 1.3618727220 .7676577650 .5268136665 .7271469157 1.1300146986     0.760004
      1.7007566110 .9839020690 .6468031190 1.1007566110 1.4839020690 .8468031190 .5908294860 .7918501924 1.1973909907     0.786209
      1.8265950600 1.0782062430 .7097699820 1.2265950600 1.5782062430 .9097699820 .6440737015 .8458720384 1.2538373650     0.806024
      1.9257681420 1.1541454590 .7615553660 1.3257681420 1.6541454590 .9615553660 .6894933512 .8920715744 1.3022274464     0.821593
      2.0067578350 1.2171637310 .8052565850 1.4067578350 1.7171637310 1.0052565850 .7289935062 .9323226323 1.3444635879     0.834192
      2.0746781670 1.2706733320 .8428582610 1.4746781670 1.7706733320 1.0428582610 .7638734673 .9679139708 1.3818631549     0.844631
      2.1328178880 1.3169391060 .8757268600 1.5328178880 1.8169391060 1.0757268600 .7950568008 .9997671369 1.4153729647     0.853446
      2.1834007710 1.3575264240 .9048279970 1.5834007710 1.8575264240 1.1048279970 .8232183508 1.0285582992 1.4456901900     0.861007
      2.2279996280 1.3935635860 .9308715830 1.6279996280 1.8935635860 1.1308715830 .8488695364 1.0548015908 1.4733465192     0.867578
after:2.2279996280 1.3935635860 .9308715830 1.6279996280 1.8935635860 1.1308715830 .8488695364 1.0548015908 1.4733465192     0.867578
# 

Re: AI - Puppy Style

Posted: Mon Jan 17, 2022 5:02 pm
by amigo

Thanks for asking. The first line shows the beginning values of the weights (0.8 0.4 ...), and at the end the beginning certainty of the guess, which is of course zero at the start.
Then the calculations begin. Each line then shows the updated weights and, at the end, the certainty of the guess. Note that the certainty grows with each iteration; after 10 passes it is already at 86%. If you run it for 100 iterations, the accuracy goes up to ~98%.
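
If you just want to watch that last column grow, a one-liner like this works (not part of the script, just for convenience):

Code:

# print only the certainty, i.e. the last field on each output line
./predict.ash | awk '{ print $NF }'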

The writing of the new weights is disabled so that one can easily play with it without changing the original data. In real use, the data would be written, and you'd then use the 'predictinfer.sh' script to guess the results using the new data. I have not altered the original 'predictinfer.sh' for use with ash & Co.; I leave that to the user. Get the original file from the justinstaines github repo linked above.

As mentioned, it's a neural network with a 2-input, 3-hidden-node, 1-output architecture. That's not enough hidden neurons to do much, but it does illustrate the learning mechanism. It took me a few months to figure out what was going on in there. If you compare the original 2 scripts, you'll see there are far fewer lines in the predictinfer.sh script. The difference in predict.sh is the back-propagation, which adjusts the weights after each iteration.
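
To give a rough picture of a single forward pass through such a 2-3-1 net, here's a small sketch in the same bc/awk style. It is not the original code, and the particular weight-to-neuron layout (and the lack of bias terms) is only for illustration:

Code:

#!/bin/ash
# rough sketch of one forward pass through a 2-input, 3-hidden, 1-output net
i1=0.05 ; i2=0.10                                      # the two inputs
w1=0.8 ; w2=0.4 ; w3=0.3 ; w4=0.2 ; w5=0.9 ; w6=0.5    # input -> hidden weights
w7=0.3 ; w8=0.5 ; w9=0.9                               # hidden -> output weights

# squash any value into the range (0,1)
sigmoid() { awk "BEGIN { print 1/(1 + exp(-($1))) }"; }

h1=$(sigmoid "$(echo "$i1*$w1 + $i2*$w2" | bc -l)")    # hidden neuron 1
h2=$(sigmoid "$(echo "$i1*$w3 + $i2*$w4" | bc -l)")    # hidden neuron 2
h3=$(sigmoid "$(echo "$i1*$w5 + $i2*$w6" | bc -l)")    # hidden neuron 3
out=$(sigmoid "$(echo "$h1*$w7 + $h2*$w8 + $h3*$w9" | bc -l)")
echo "guess: $out"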

I won't spoil all the fun by dissecting the whole script/process, and I didn't add any comments to explain it either. The math being done by 'bc' is easy enough to understand. I'll save you some time, though, by saying that the lines using 'awk' are implementing the sigmoid (logistic) function and its first derivative, which is the slope. The logistic function is a standard, go-to building block of many neural networks. Its purpose is to flatten any incoming value, from (-infinity) to (+infinity), into a value between 0 and 1. The 4th line calling 'awk' flattens the sum of the hidden neurons' guesses, and that is then used to calculate the error for each neuron's output. The error value is then used to create delta values, which are applied to the weights of each neuron.
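
Roughly, the slope and delta step look like this. This is only a sketch; the numbers are picked for illustration (the guess is the first-pass certainty from the run above), and the script's exact formulas and learning rate may differ:

Code:

# if 's' is a neuron's sigmoid output, its first derivative (slope) is s*(1-s)
target=1 ; guess=0.668188 ; input=0.05 ; old_w=0.8 ; lr=1   # illustration values

slope=$(awk "BEGIN { s=$guess; print s*(1-s) }")   # slope of the sigmoid at the guess
error=$(echo "$target - $guess" | bc -l)           # how far off the guess was
delta=$(echo "$error * $slope" | bc -l)            # error scaled by the slope
new_w=$(echo "$old_w + $lr * $delta * $input" | bc -l)
echo "adjusted weight: $new_w"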

If you know nothing about NNs, don't feel bad; it's taken me 3 years to figure it out well enough to write that last paragraph. Was this helpful?


Re: AI - Puppy Style

Posted: Mon Jan 17, 2022 6:22 pm
by amigo

Now, let's make it interesting for all Puppy users, even those without bc or awk. In fact, you can do all that fancy math from within your initrd, anywhere you have a shell. I've attached a very small archive with all the needed files. It contains neuralnetweights.ssv and predict.ash, as above. It also contains a script named predict-able.sh, a translated version that completely replaces bc and awk by using the 'iq' calculator, which is also in the archive.

The whole kit, uncompressed, is 51K, so you shouldn't have any space problems, hehe. Everything in the kit already has the /bin/ash shebang, so it should run on any Puppy out of the box, I guess. Have fun!

In case you wonder about Justin Staines and his original scripts, I'm sure shell is something like his 42nd programming language. He holds patents on things like the Wacom tablet and Samsung stylus technology, so let's be nice and not complain about his script-fu. That little 85-liner of his really inspired me!

Edit: add note about justinstaines


Re: AI - Puppy Style

Posted: Sat Aug 13, 2022 1:38 pm
by sc0ttman

That's great.

I have nothing to add, but would love to know if you can list potential applications for this.

Would it be any use (in conjunction with other things, of course), for example, in doing any of the following?

- Sentiment analysis of text? (...given a particular string and a body of text, work out if the text is "for" or "against")

- Image analysis of any kind? (...with ImageMagick et al)?

- Automated (in some way) regression or unit testing?

- Sorting stuff (...getting vague here lol)?

...Sorry for the dumb questions, just trying to get my head around it.