>>16691139
Linear algebra is the most fundamental math for building neural networks. The "linearity" isn't some abstraction you can skip either: in practice it means multiplying matrices, and matrix multiplication is exactly what a linear map is. Abstract algebra is also important, and so is set theory. Any sort of data can be encoded as arrays of numbers and fed through a neural network, and the networks themselves can be configured in all kinds of ways. They can generate media and solve problems.
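To make that concrete, here's a minimal sketch in Python/NumPy (layer sizes and weights are made up for illustration): a two-layer network's forward pass is literally two matrix multiplications with a nonlinearity squeezed in between.

import numpy as np

rng = np.random.default_rng(0)

# Toy sizes picked for illustration: 4 input features, 8 hidden units, 2 outputs.
W1 = rng.normal(size=(4, 8))   # first layer weights: a linear map R^4 -> R^8
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 2))   # second layer weights: R^8 -> R^2
b2 = np.zeros(2)

def forward(x):
    """Forward pass: matrix multiply, nonlinearity, matrix multiply."""
    h = np.maximum(0, x @ W1 + b1)  # ReLU; without it the two layers collapse into one linear map
    return h @ W2 + b2

x = rng.normal(size=4)  # any data, encoded as a vector of numbers
print(forward(x))       # two output numbers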
You can even build a neural network that adjusts itself and "learns" how to do things by trial and error, like a human being. All of it runs on linear algebra. Mathematical logic can be modeled with computers too, which means we can use math to build programs that think like human beings, and maybe one day even build a self-aware intelligence out of pure logic.
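For a taste of how logic maps onto the same machinery, here's a single neuron with hand-picked (not learned) weights computing boolean AND and OR, the old McCulloch-Pitts idea:

import numpy as np

def neuron(x, w, bias):
    """One neuron: weighted sum of inputs, then a hard threshold (step function)."""
    return int(np.dot(x, w) + bias > 0)

# Hand-picked weights: with inputs in {0, 1}, these thresholds implement logic gates.
AND = lambda a, b: neuron([a, b], w=[1, 1], bias=-1.5)  # fires only if both inputs are 1
OR  = lambda a, b: neuron([a, b], w=[1, 1], bias=-0.5)  # fires if either input is 1

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", AND(a, b), "OR:", OR(a, b))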
By multiplying n-dimensional arrays of numbers in the right ways, you can build a generative model, give it a goal, simulate thousands of random attempts, and "evolve" a solution. In machine learning that's an evolutionary algorithm (EA): make random "mutations" to the parameters, test each one against the same goal inside a digital simulation, keep the fittest, and repeat until it converges on a good (though not necessarily optimal) solution. It's amazing, like growing an organic lifeform out of pure logic and math.
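A bare-bones sketch of that loop, a simple (1+1)-style hill climber where everything here (the target, mutation scale, iteration count) is invented for illustration: mutate the weights, keep the mutant if it scores better, repeat.

import numpy as np

rng = np.random.default_rng(1)

# Goal for the "organism": match a hidden target linear map on some sample inputs.
target = rng.normal(size=(3, 3))
X = rng.normal(size=(32, 3))

def fitness(W):
    """Higher is better: negative mean squared error against the target's outputs."""
    return -np.mean((X @ W - X @ target) ** 2)

W = np.zeros((3, 3))                                  # start from a blank genome
for generation in range(5000):
    mutant = W + rng.normal(scale=0.1, size=W.shape)  # random "mutation"
    if fitness(mutant) > fitness(W):                  # survival of the fitter
        W = mutant

print("final fitness:", fitness(W))  # climbs toward 0 as W evolves toward the target

Real EAs keep a whole population and recombine genomes instead of tracking one candidate, but this keep-the-fitter-mutant loop is the core of the idea.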