# DNN from scratch

This page contains the code, along with a short explanation, for implementing a neural network from scratch using numpy and testing it on a simple synthetic dataset.

### What to expect from this?

We will look at a general implementation of a neural network with as many hidden layers as we want, using the sigmoid activation function, for binary classification.

{% hint style="info" %}
You can extend this to any activation function, with a softmax layer at the end; the code is pretty much the same. For further customizations, take a look at the blog mentioned below. It has some extra features, such as adding dropout.
{% endhint %}
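Before diving in, it helps to have the activation function itself in front of us. The sketch below is a minimal version of the sigmoid and its derivative (the derivative is what backpropagation will need later); the function names are my own choices, not fixed by the article.

```python
import numpy as np

def sigmoid(z):
    """Elementwise sigmoid activation: squashes any real number into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_derivative(z):
    """Derivative of sigmoid, sigma(z) * (1 - sigma(z)); used during backprop."""
    s = sigmoid(z)
    return s * (1.0 - s)

# Because the output lands in (0, 1), we can read the final-layer
# activation as the probability of the positive class.
z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z))  # sigmoid(0) is exactly 0.5; the others land in (0, 1)
```

Because sigmoid outputs a probability, it is a natural fit for the single output unit of a binary classifier.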

The main focus of this blog is to understand how to vectorize the forward and backward propagation in an MLP. It is a simplified version of the incredible [Medium blog](https://medium.com/@udaybhaskarpaila/multilayered-neural-network-from-scratch-using-python-c0719a646855), filling in pieces that are missing there, such as an explanation of how the gradients are obtained in vectorized form. To get the complete picture, I would recommend reading the Medium blog as well as this article.

{% embed url="https://medium.com/@udaybhaskarpaila/multilayered-neural-network-from-scratch-using-python-c0719a646855" %}
Medium Blog
{% endembed %}
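To give a taste of what "vectorized forward and backward propagation" means, here is a compact sketch of both passes for an all-sigmoid network with binary cross-entropy loss. The shapes, parameter-dictionary layout, and function names are assumptions for illustration; the article walks you through deriving these gradients yourself.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_params(layer_dims, seed=0):
    # layer_dims, e.g. [2, 3, 1]: input size, one hidden layer, one output unit
    rng = np.random.default_rng(seed)
    params = {}
    for l in range(1, len(layer_dims)):
        params["W" + str(l)] = 0.1 * rng.standard_normal((layer_dims[l], layer_dims[l - 1]))
        params["b" + str(l)] = np.zeros((layer_dims[l], 1))
    return params

def forward(X, params):
    # X: (n_features, m) -- one column per training example,
    # so every layer is a single matrix multiply over the whole batch
    L = len(params) // 2
    A = X
    cache = {"A0": X}
    for l in range(1, L + 1):
        Z = params["W" + str(l)] @ A + params["b" + str(l)]
        A = sigmoid(Z)
        cache["A" + str(l)] = A
    return A, cache

def backward(Y, params, cache):
    # Y: (1, m) binary labels; returns gradients of the mean cross-entropy loss
    L = len(params) // 2
    m = Y.shape[1]
    grads = {}
    # with sigmoid output + cross-entropy, the output-layer error is just A - Y
    dZ = cache["A" + str(L)] - Y
    for l in range(L, 0, -1):
        A_prev = cache["A" + str(l - 1)]
        grads["dW" + str(l)] = (dZ @ A_prev.T) / m
        grads["db" + str(l)] = dZ.sum(axis=1, keepdims=True) / m
        if l > 1:
            # push the error through W, then through sigmoid'(Z) = A * (1 - A)
            dZ = (params["W" + str(l)].T @ dZ) * A_prev * (1.0 - A_prev)
    return grads
```

A training step is then just `params["W1"] -= lr * grads["dW1"]` (and likewise for every `W` and `b`), repeated in a loop.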

### Why you should read this?

The above blog has everything you need to know about implementing an MLP with different activations, along with dropout. If that feels like a lot to take in, I would suggest reading this article first and then going to that blog for further reading.

I have simplified the procedure so that you can understand why we are doing things the way we are. In the end, you are the one who is going to write all the code. Think of this as an exercise to test your understanding of the concept and your coding skills.

{% hint style="warning" %}
The complete code for the exercises is not given, as many people would be tempted to look at it instead of trying the exercises on their own.
{% endhint %}
