
Deep Learning Error Analysis (bias/variance, train/train-dev/dev/test splits)

Why use softmax only in the output layer and not in hidden layers? Further explanation

Tradeoff batch size vs. number of iterations to train a neural network

Write your own custom activation function from scratch

Write your own custom activation function from TensorFlow primitives
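As a minimal sketch of the "from scratch" variant, here is a custom activation (the swish function, x·sigmoid(x)) with its analytic derivative in plain NumPy; the function names are my own, not from any of the linked articles:

```python
import numpy as np

def sigmoid(x):
    # standard logistic function
    return 1.0 / (1.0 + np.exp(-x))

def swish(x):
    # swish(x) = x * sigmoid(x), a smooth, non-monotonic activation
    return x * sigmoid(x)

def swish_grad(x):
    # product rule: d/dx [x * s(x)] = s(x) + x * s(x) * (1 - s(x))
    s = sigmoid(x)
    return s + x * s * (1.0 - s)
```

The same forward pass written with `tf.math` ops would give you the TensorFlow-primitives version, with the gradient supplied automatically by autodiff.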

What do 1D, 2D and 3D convolutions mean in a CNN?

Common causes of NaNs during deep network training

Introducing both L2 regularization and dropout into the network: does it make sense?

How do CNNs Deal with Position Differences?

How many images do you need to train a neural network?

What is GEMM in deep learning?

Know more about gradient descent

NLP best practices

Should activation functions always be differentiable?

Know the internals of a neural network & train it efficiently (1)

Efficient BackProp by Yann LeCun

tanh vs. sigmoid activation functions

How to take the derivative of the sigmoid function
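The key identity is that the sigmoid's derivative can be written in terms of its own output, σ'(x) = σ(x)(1 − σ(x)), which is why the forward activation can be reused during backprop. A quick NumPy sketch with a finite-difference sanity check:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_prime(x):
    # sigma'(x) = sigma(x) * (1 - sigma(x)); reuses the forward value
    s = sigmoid(x)
    return s * (1.0 - s)
```

Note that σ' peaks at 0.25 (at x = 0) and decays toward zero in both tails, which is the root of the vanishing-gradient issue mentioned below.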

The Right Way to Oversample in Predictive Modeling

Implement neural networks from scratch

Implement logistic regression from scratch

Understanding Convolutions

Vanishing gradient problem
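Since σ'(x) ≤ 0.25 everywhere, chaining n sigmoid layers multiplies the backpropagated gradient by at most 0.25 per layer, so it shrinks at least as fast as 0.25ⁿ. A toy demonstration (my own illustration of the effect):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gradient_through_sigmoids(x0, n_layers):
    """Feed x0 through n stacked sigmoids while accumulating
    d(output)/d(input) via the chain rule."""
    x, grad = x0, 1.0
    for _ in range(n_layers):
        s = sigmoid(x)
        grad *= s * (1.0 - s)  # multiply by the local derivative
        x = s
    return grad
```

With only 10 layers the accumulated gradient is already below 10⁻⁵, which is why deep sigmoid stacks train so slowly and why ReLU-family activations became the default.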

Walkthrough of backpropagation

Derivation of backpropagation
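The derivation for a one-hidden-layer network with sigmoid activations and MSE loss reduces to two chain-rule steps: an output-layer delta, then a hidden-layer delta obtained by pushing it back through the weights. A minimal sketch under those assumptions (the function names are mine):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(X, W1, b1, W2, b2):
    """One sigmoid hidden layer, sigmoid output."""
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    return h, out

def loss(out, y):
    return 0.5 * np.mean((out - y) ** 2)

def backward(X, y, W1, b1, W2, b2):
    """Analytic gradients of the MSE loss via the chain rule."""
    h, out = forward(X, W1, b1, W2, b2)
    n = out.size
    d_out = (out - y) * out * (1 - out) / n  # delta at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)       # delta pushed back through W2
    return X.T @ d_h, d_h.sum(0), h.T @ d_out, d_out.sum(0)
```

A finite-difference gradient check (perturb one weight, re-run the forward pass, compare slopes) is the standard way to verify such a derivation before trusting it for training.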

Why do we use activation functions?