Challenging the assumptions

You can challenge some of the assumptions underlying current artificial neural networks and broaden out the possibilities:
https://ai462qqq.blogspot.com/2019/11/artificial-neural-networks.html

Is there anything special you want to talk about with this link, or did you just want to share it with the community?

Well, it would be a good idea to be aware of the statistics of the weighted sum, which really come down to the variance equation for linear combinations of random variables. You can see that in certain circumstances over-parameterization leads to a particular type of error correction, not necessarily over-fitting.
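Roughly, for independent inputs that variance equation is Var(Σ wᵢxᵢ) = Σ wᵢ² Var(xᵢ). Here is a minimal numeric sketch of that identity, and of the averaging effect I mean by error correction: spreading the same total weight over more copies of a signal suppresses independent noise. (My own illustration, assuming NumPy, not code from the blog.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Var(sum_i w_i * x_i) = sum_i w_i^2 * Var(x_i) for independent x_i.
n = 10
w = rng.normal(size=n)
x = rng.normal(scale=2.0, size=(100_000, n))  # independent inputs, Var(x_i) = 4
print((x @ w).var(), (w**2).sum() * 4.0)      # empirical vs analytic: close agreement

# Error correction by averaging: unit total weight spread over n noisy
# copies of the same signal; the noise variance falls as roughly 1/n.
for n in (1, 10, 100):
    noisy = 1.0 + rng.normal(size=(100_000, n))  # signal 1.0 plus unit noise
    print(n, noisy.mean(axis=1).var())           # roughly 1/n
```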
Also, an electrical switch is linear when on: n volts in, n volts out. Zero volts out when off.
Usually in your house there is a fixed voltage on one side of the switch, so its linear aspect is slightly counter-intuitive: the switch is seen as entirely binary, when it is not exactly. That makes it easy to fail to recognize the ReLU activation function as a switch.
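To make that reading concrete, here is a small sketch (again my own illustration, assuming NumPy) showing that ReLU passes its input through unchanged when "on" and outputs zero when "off", and that once the on/off decisions are frozen, a ReLU network collapses to a plain linear map:

```python
import numpy as np

def relu(x):
    # A switch: output equals input when "on" (x > 0), zero out when "off".
    return np.where(x > 0, x, 0.0)

print(relu(np.array([-2.0, -0.5, 0.5, 3.0])))  # [0.  0.  0.5 3. ]

# With the switch states frozen, a two-layer ReLU net is linear in its input.
W1 = np.array([[1.0, -1.0], [0.5, 2.0]])
W2 = np.array([[1.0, 1.0]])
v = np.array([0.3, -0.2])

gates = (W1 @ v) > 0                              # which switches are on
linear_equiv = W2 @ np.diag(gates.astype(float)) @ W1
print(W2 @ relu(W1 @ v), linear_equiv @ v)        # identical outputs
```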

The ideas look interesting, but I can't really follow them, as the material appears to be very condensed.

I guess that is your blog, as it links out to your GitHub at the end.

If you can elaborate more thoroughly on what you want to point out there, that would be nice. I think most of the topics you are touching on would benefit from a full blog post explaining in more detail why they matter, what their real applications are, and how they differ from the status quo.

Yeah, it’s a hint sheet. I don’t know why the material is not covered in introductions to artificial neural networks; it is only high school math.
Unless some complete meltdown in scientific methodology happened and no one ever bothered to collect together the basic math of the weighted sum to see how it fits together and applies to neural networks.