The Mastery of Mathematics in Neural Networks: Unlocking the Secrets of AI - Part 1

Aditya Mangal
5 min read · Apr 3, 2023

Artificial intelligence (AI) has become an integral part of our lives. From voice assistants to self-driving cars, AI has changed the way we interact with technology. One of its key components is the neural network, a machine learning algorithm that mimics the structure and function of the human brain. But what lies at the heart of a neural network? The answer is mathematics.
In this blog, we will dive deep into the mathematics of neural networks and uncover the secrets of artificial intelligence. In part 1, I will work through the complete mathematics of a neural network using a concrete example, and in part 2, I will write the code to train that network using NumPy. With this foundation, we can understand the mathematics of neural networks and build our own architectures.

Linear Algebra:
Linear algebra is the foundation of neural networks. It is used to represent and manipulate data through vectors and matrices. In neural networks, data is represented as feature matrices, and linear algebra is used to perform operations on these matrices, such as addition, multiplication, and transposition. For example, an image can be represented as a matrix of pixels, and linear algebra is used to perform convolution and pooling operations.
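As a concrete illustration (the sizes and values here are mine, not from the article), a layer's forward computation is just a matrix product plus a bias vector:

```python
import numpy as np

# A batch of 4 samples, each with 3 features (illustrative sizes)
X = np.array([[0.1, 0.2, 0.3],
              [0.4, 0.5, 0.6],
              [0.7, 0.8, 0.9],
              [1.0, 1.1, 1.2]])

W = np.random.randn(3, 2)   # weight matrix: 3 inputs -> 2 neurons
b = np.zeros(2)             # bias vector, one entry per neuron

# One layer's linear step: each output is a weighted sum of the inputs
Z = X @ W + b
print(Z.shape)              # (4, 2): one 2-value output per sample
```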


Calculus:
Calculus is used to optimize the weights and biases of a neural network during training. The optimization process is called backpropagation. Backpropagation uses calculus to compute the gradient of the loss function with respect to the weights and biases. The loss function measures the difference between the predicted output and the actual output. The goal of training a neural network is to minimize the loss function, and calculus tells us how to update the weights and biases in the direction of the negative gradient.
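In symbols, gradient descent nudges every parameter against its gradient, where η is a learning rate we choose:

```latex
w \leftarrow w - \eta\,\frac{\partial E}{\partial w},
\qquad
b \leftarrow b - \eta\,\frac{\partial E}{\partial b}
```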


Activation Functions:
Activation functions introduce nonlinearity into neural networks. Without them, a stack of layers would collapse into a single linear operation, which would be insufficient for complex tasks such as image recognition and natural language processing. Because activation functions are nonlinear, they allow neural networks to model complex relationships between inputs and outputs.

Image Source — https://machinelearningknowledge.ai
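The example below uses the sigmoid, whose derivative has a convenient closed form that backpropagation exploits:

```latex
\sigma(x) = \frac{1}{1 + e^{-x}},
\qquad
\sigma'(x) = \sigma(x)\,\bigl(1 - \sigma(x)\bigr)
```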

Let’s take an example to understand the mathematics of a neural network.


Suppose we have two inputs, x1 and x2. The weights and biases are initialized with random values.

We choose a neural network with one hidden layer of two neurons, H1 and H2, and an output layer with two outputs, y1 and y2.
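The original figures with the concrete numbers are not reproduced here, so for the calculations below I assume the initial values of the well-known worked example this walkthrough follows; they are consistent with the error of 0.298371109 reported at the end:

```latex
x_1 = 0.05,\quad x_2 = 0.10, \qquad t_1 = 0.01,\quad t_2 = 0.99
w_1 = 0.15,\quad w_2 = 0.20,\quad w_3 = 0.25,\quad w_4 = 0.30,\quad b_1 = 0.35
w_5 = 0.40,\quad w_6 = 0.45,\quad w_7 = 0.50,\quad w_8 = 0.55,\quad b_2 = 0.60
```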

Now, we have to calculate the values of H1, H2, y1, and y2 in the forward pass.

Calculate H1

We use the sigmoid as the activation function.
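Since the original equations were images, here is the H1 computation reconstructed with the assumed values above:

```latex
\mathrm{net}_{H_1} = w_1 x_1 + w_2 x_2 + b_1 = 0.15 \cdot 0.05 + 0.20 \cdot 0.10 + 0.35 = 0.3775
\mathrm{out}_{H_1} = \sigma(\mathrm{net}_{H_1}) = \frac{1}{1 + e^{-0.3775}} \approx 0.593269992
```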

Similarly, we can calculate H2 and its activated output, out_H2.
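Under the same assumptions, the H2 computation is:

```latex
\mathrm{net}_{H_2} = w_3 x_1 + w_4 x_2 + b_1 = 0.25 \cdot 0.05 + 0.30 \cdot 0.10 + 0.35 = 0.3925
\mathrm{out}_{H_2} = \sigma(0.3925) \approx 0.596884378
```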

Calculate y1 and y2

Apply the sigmoid activation function to y1 and y2.
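Reconstructed with the assumed values, the output-layer computations are:

```latex
\mathrm{net}_{y_1} = w_5\,\mathrm{out}_{H_1} + w_6\,\mathrm{out}_{H_2} + b_2 \approx 1.105905967
\mathrm{out}_{y_1} = \sigma(1.105905967) \approx 0.751365070
\mathrm{net}_{y_2} = w_7\,\mathrm{out}_{H_1} + w_8\,\mathrm{out}_{H_2} + b_2 \approx 1.224921404
\mathrm{out}_{y_2} = \sigma(1.224921404) \approx 0.772928465
```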

Now, we have to find the total error, which is the sum of the squared differences between the predicted outputs and the target outputs (each term scaled by ½ so the derivative comes out cleaner).
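With the assumed targets, the total error works out as follows; note that it matches the 0.298371109 reported at the end of this article:

```latex
E_{\mathrm{total}} = \tfrac{1}{2}(t_1 - \mathrm{out}_{y_1})^2 + \tfrac{1}{2}(t_2 - \mathrm{out}_{y_2})^2
\approx \tfrac{1}{2}(0.01 - 0.751365070)^2 + \tfrac{1}{2}(0.99 - 0.772928465)^2
\approx 0.274811083 + 0.023560026 = 0.298371109
```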

We have to minimize this error through backpropagation, updating the weights and biases accordingly.

Let’s first update w5
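Expanding the chain rule for w5 (here η = 0.5 is an assumed learning rate, consistent with the numbers this walkthrough reproduces):

```latex
\frac{\partial E}{\partial w_5}
= \frac{\partial E}{\partial\,\mathrm{out}_{y_1}}
  \cdot \frac{\partial\,\mathrm{out}_{y_1}}{\partial\,\mathrm{net}_{y_1}}
  \cdot \frac{\partial\,\mathrm{net}_{y_1}}{\partial w_5}
= (\mathrm{out}_{y_1} - t_1)\cdot \mathrm{out}_{y_1}(1 - \mathrm{out}_{y_1})\cdot \mathrm{out}_{H_1}
\approx 0.741365 \cdot 0.186816 \cdot 0.593270 \approx 0.082167
w_5^{\mathrm{new}} = w_5 - \eta\,\frac{\partial E}{\partial w_5} \approx 0.40 - 0.5 \cdot 0.082167 \approx 0.358916
```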

Similarly, we can update w6, w7, and w8.

Now, we have to update w1, w2, w3, and w4. Let’s first update w1. Note that w1 affects both y1 and y2 through H1, so the chain rule must sum the error contributions from both outputs.
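A sketch of the hidden-layer gradient, writing δ for each output neuron’s error signal and using the assumed values:

```latex
\delta_{y_k} = (\mathrm{out}_{y_k} - t_k)\,\mathrm{out}_{y_k}(1 - \mathrm{out}_{y_k})
\frac{\partial E}{\partial w_1}
= \bigl(\delta_{y_1} w_5 + \delta_{y_2} w_7\bigr)\cdot \mathrm{out}_{H_1}(1 - \mathrm{out}_{H_1})\cdot x_1
\approx 0.036350 \cdot 0.241301 \cdot 0.05 \approx 0.000439
w_1^{\mathrm{new}} = w_1 - \eta\,\frac{\partial E}{\partial w_1} \approx 0.15 - 0.5 \cdot 0.000439 \approx 0.149781
```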

Finally, we have updated w1. Similarly, we can update w2, w3, and w4.

With the initial weights, feeding the inputs 0.05 and 0.1 forward produced a total error of 0.298371109. After the first round of backpropagation, the total error dropped to 0.291027924. Repeating this process 10,000 times reduced the total error to 0.0000351085. At that point, feeding 0.05 and 0.1 forward produced outputs of 0.015912196 and 0.984065734, which were close to the target values. In part 2, we will verify this with code using NumPy.
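As a preview of part 2, here is a minimal NumPy sketch of this training loop. The initial weights, targets, and learning rate are the assumed values listed earlier, and the variable names are mine:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Assumed initial values (see the list above)
x = np.array([0.05, 0.10])            # inputs x1, x2
t = np.array([0.01, 0.99])            # target outputs t1, t2
W1 = np.array([[0.15, 0.20],          # w1, w2 -> H1
               [0.25, 0.30]])         # w3, w4 -> H2
b1 = 0.35
W2 = np.array([[0.40, 0.45],          # w5, w6 -> y1
               [0.50, 0.55]])         # w7, w8 -> y2
b2 = 0.60
lr = 0.5                              # assumed learning rate

for step in range(10000):
    # Forward pass
    h = sigmoid(W1 @ x + b1)          # hidden activations out_H1, out_H2
    y = sigmoid(W2 @ h + b2)          # output activations out_y1, out_y2
    E = 0.5 * np.sum((t - y) ** 2)    # total squared error

    # Backward pass (chain rule through the sigmoid)
    delta_y = (y - t) * y * (1 - y)            # error signal at the outputs
    delta_h = (W2.T @ delta_y) * h * (1 - h)   # error signal at the hidden layer

    # Gradient descent updates; outer products give per-weight gradients.
    # Biases are left fixed here, as in the worked example.
    W2 -= lr * np.outer(delta_y, h)
    W1 -= lr * np.outer(delta_h, x)

print(E, y)  # error shrinks toward ~3.5e-5; outputs approach the targets
```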

Conclusion

In conclusion, mastering the mathematics of neural networks is essential for unlocking the secrets of artificial intelligence. Linear algebra, differential calculus, and activation functions are the key mathematical concepts that form the foundation of successful neural networks. As AI advances, the importance of mathematics in neural networks will only grow.
