DNN from scratch

It contains the code, along with a short explanation, for implementing a neural network from scratch using NumPy and testing it on a simple, synthetically created dataset.

What to expect from this?

We will walk through a general implementation of a neural network with as many hidden layers as we want, using the sigmoid activation function, for binary classification.
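As a preview of what this structure looks like, here is a minimal sketch of such a network's forward pass; the function names, the weight shape convention (`W: (n_out, n_in)`, columns of `X` are samples), and the initialization scale are my assumptions, not necessarily what the final code uses.

```python
import numpy as np

def sigmoid(z):
    # Logistic activation, applied element-wise
    return 1.0 / (1.0 + np.exp(-z))

def init_params(layer_sizes, seed=0):
    # layer_sizes, e.g. [2, 4, 1]: input dim, hidden widths, one output unit
    rng = np.random.default_rng(seed)
    params = []
    for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
        W = rng.normal(0.0, 0.1, size=(n_out, n_in))  # small random weights
        b = np.zeros((n_out, 1))
        params.append((W, b))
    return params

def forward(X, params):
    # X has shape (n_features, n_samples); every layer uses sigmoid here
    A = X
    for W, b in params:
        A = sigmoid(W @ A + b)
    return A  # shape (1, n_samples): P(y = 1) for each sample

params = init_params([2, 4, 1])
X = np.random.default_rng(1).normal(size=(2, 5))
probs = forward(X, params)
print(probs.shape)  # (1, 5)
```

Adding more hidden layers is just a matter of extending `layer_sizes`; the loop handles any depth.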

You can extend this to any activation function, and add a softmax layer at the end; the code stays pretty much the same. For further customizations, such as adding dropout, take a look at the blog linked below.
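For instance, the softmax output layer mentioned above could look like the sketch below (the column-per-sample layout is an assumption carried over from the binary case):

```python
import numpy as np

def softmax(Z):
    # Subtract the per-column max for numerical stability; columns are samples
    Z = Z - Z.max(axis=0, keepdims=True)
    e = np.exp(Z)
    return e / e.sum(axis=0, keepdims=True)

logits = np.array([[2.0, 0.5],
                   [1.0, 0.5],
                   [0.1, 0.5]])  # 3 classes, 2 samples
P = softmax(logits)
print(P.sum(axis=0))  # each column sums to 1
```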

The main focus of this article is to understand how to vectorize forward and backward propagation in an MLP. It is a simplified version of the incredible Medium blog linked below, filling in missing pieces such as the explanation of how the gradients are derived in vectorized form. To get the complete picture, I recommend reading both the Medium blog and this article.
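To make "vectorized" concrete, here is a sketch of one forward/backward pass for a single hidden layer, processing all `m` samples at once; the variable names (`Z`, `A`, `dW`, `db`) follow a common convention and are my assumption, as is the use of binary cross-entropy with a sigmoid output, which gives the clean `dZ2 = A2 - Y` form.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Shape convention: W is (n_out, n_in); activations A are (n_units, m),
# where m is the number of samples stacked as columns.
rng = np.random.default_rng(0)
m = 8
X = rng.normal(size=(3, m))            # inputs
Y = rng.integers(0, 2, size=(1, m))    # binary labels
W1, b1 = rng.normal(size=(4, 3)) * 0.1, np.zeros((4, 1))
W2, b2 = rng.normal(size=(1, 4)) * 0.1, np.zeros((1, 1))

# Vectorized forward pass over all m samples at once
Z1 = W1 @ X + b1;  A1 = sigmoid(Z1)
Z2 = W2 @ A1 + b2; A2 = sigmoid(Z2)

# Vectorized backward pass (binary cross-entropy + sigmoid output):
# the sample dimension m is summed away by the matrix products.
dZ2 = A2 - Y                           # (1, m)
dW2 = (dZ2 @ A1.T) / m                 # (1, 4), matches W2
db2 = dZ2.mean(axis=1, keepdims=True)  # (1, 1)
dZ1 = (W2.T @ dZ2) * A1 * (1 - A1)     # (4, m); sigmoid' = A * (1 - A)
dW1 = (dZ1 @ X.T) / m                  # (4, 3), matches W1
db1 = dZ1.mean(axis=1, keepdims=True)  # (4, 1)
```

The key observation is that each `dW` has exactly the shape of its `W`, so a single matrix product replaces a loop over samples.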

Medium Blog

Why should you read this?

The blog above covers everything you need to know about implementing an MLP with different activations, along with dropout. If that feels like a lot to take in, I suggest reading this article first and then moving on to that blog for further reading.

I have simplified the procedure so that you can understand why we are doing things the way we are, and in the end, you are the one who will write all the code. Think of this as an exercise to test your understanding of the concepts and your coding skills.
