Search Results
7/6/2025, 10:43:10 AM
>>105815372
Can you calculate the local minima of mean squared error loss using a sigmoid function and multidimensional gradient descent? That is the difference between building the AI and being replaced by the AI.
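Strictly, gradient descent doesn't calculate a local minimum in closed form; it iterates toward one. A minimal sketch in Python of the combination named above, fitting a one-weight sigmoid model by minimizing MSE with plain gradient descent (the toy data, learning rate, and step count are assumptions for illustration):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# toy data (assumed for illustration)
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = np.array([0.0, 0.0, 0.5, 1.0, 1.0])

w, b = 0.0, 0.0   # parameters to learn
lr = 0.5          # learning rate

for step in range(2000):
    p = sigmoid(w * x + b)             # predictions
    err = p - y                        # residuals
    # MSE = mean(err^2); chain rule through the sigmoid,
    # using d(sigmoid)/dz = p * (1 - p)
    grad_z = 2.0 * err * p * (1.0 - p) / len(x)
    grad_w = np.sum(grad_z * x)        # partial derivative w.r.t. w
    grad_b = np.sum(grad_z)            # partial derivative w.r.t. b
    w -= lr * grad_w                   # step downhill toward a local minimum
    b -= lr * grad_b

print(f"w={w:.3f}, b={b:.3f}, MSE={np.mean((sigmoid(w*x+b)-y)**2):.5f}")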
6/23/2025, 7:58:10 AM
>>16705424
>We are talking thousands of dimensions and calculating the derivatives of a "slice" of each dimension one at a time to make tiny adjustments.
Past three dimensions my mind can't conceptualize it. I can visualize the derivative of a 3D surface guiding gradient descent steps toward a local minimum: it is like a ball rolling down into a dip in the surface.
4D? What does that look like? 100+ dimensions? All that just to draw a tiny image. I can only visualize the nodes. At this point the computer is thinking on a level the human brain just isn't capable of visualizing.
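The geometry stops being picturable past 3D, but the arithmetic doesn't change: each partial derivative is still a one-dimensional slope along one axis with every other coordinate frozen. A hedged sketch of that "one slice per dimension" idea in Python, using numerical central differences (the toy quadratic loss, step size, and dimension count are assumptions):

import numpy as np

def loss(theta):
    # a simple bowl with its minimum at theta = [1, 2, 3, ...]
    target = np.arange(1, len(theta) + 1)
    return np.sum((theta - target) ** 2)

def numerical_gradient(f, theta, h=1e-5):
    grad = np.zeros_like(theta)
    for i in range(len(theta)):        # one dimension at a time
        bump = np.zeros_like(theta)
        bump[i] = h
        # central difference along the i-th "slice",
        # all other coordinates held fixed
        grad[i] = (f(theta + bump) - f(theta - bump)) / (2 * h)
    return grad

theta = np.zeros(100)                  # 100 dimensions, same loop as 3
for step in range(200):
    theta -= 0.1 * numerical_gradient(loss, theta)

print(loss(theta))   # near 0: the "ball" settled into the bowl

The loop body is identical whether theta has 3 entries or 1000; the computer never needs to "see" the surface, only the per-axis slopes.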
6/18/2025, 10:32:51 AM
>>16700578
Things that writing neural networks for machine learning has taught me, coming from a background of calculus:
>Backpropagation
>Sigmoid function
>Partial derivatives
>Mean squared error
>Stochastic gradient descent
>Multidimensional linear regression
I will sit down and work through every part of the problem until I can wrap my head around it completely.
I especially like how partial derivatives are used to fine-tune neural networks toward local minima of loss functions spanning thousands of dimensions.
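A minimal sketch tying the listed pieces together: one hidden layer of sigmoid units, mean squared error, backpropagation via the chain rule of partial derivatives, and stochastic gradient descent (one random sample per update). The XOR data, layer sizes, seed, and hyperparameters are assumptions for illustration, not anyone's actual training setup:

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR: not linearly separable, so a hidden layer is needed
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)   # input -> hidden
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)   # hidden -> output
lr = 0.5

for step in range(30000):
    i = rng.integers(len(X))           # "stochastic": one random sample
    x, y = X[i], Y[i]

    # forward pass
    h = sigmoid(x @ W1 + b1)
    p = sigmoid(h @ W2 + b2)

    # backward pass: chain partial derivatives layer by layer
    d_p = 2.0 * (p - y)                # dMSE/dp
    d_z2 = d_p * p * (1 - p)           # through the output sigmoid
    d_W2 = np.outer(h, d_z2)
    d_h = W2 @ d_z2
    d_z1 = d_h * h * (1 - h)           # through the hidden sigmoid
    d_W1 = np.outer(x, d_z1)

    # SGD update; may settle in a local minimum depending on the
    # random initialization, which is exactly the behavior described above
    W2 -= lr * d_W2; b2 -= lr * d_z2
    W1 -= lr * d_W1; b1 -= lr * d_z1

print(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).round(2))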