Here goes:

  1. Deep learning error analysis (bias/variance, train/train-dev/dev/test splits)

  2. Why use softmax only in the output layer and not in hidden layers? (further explanation; see the sketch for this item after the list)

  3. Trade-off between batch size and number of iterations when training a neural network

  4. Write your own custom activation function, both from scratch and from TensorFlow primitives (sketch after the list)

  5. What do you mean by 1D, 2D and 3D convolutions in a CNN?

  6. Common causes of NaNs during deep-network training

  7. Introducing both L2 regularization and dropout into the network: does it make sense?

  8. How do CNNs Deal with Position Differences?

  9. How many images do you need to train a neural network?

  10. What is GEMM in deep learning? (sketch after the list)

  11. Learn more about gradient descent

  12. NLP best practices

  13. Should an activation function always be differentiable?

  14. Know the internals of neural networks and how to train them efficiently - 1

  15. Efficient BackProp by Yann LeCun

  16. tanh vs. sigmoid activation functions (comparison after the list)

  17. How to take the derivative of the sigmoid function (derivation after the list)

  18. The Right Way to Oversample in Predictive Modeling (sketch after the list)

  19. Implement from scratch - Neural networks (sketch after the list)

  20. Implement from scratch - Logistic regression (sketch after the list)

  21. Understanding Convolutions

  22. Vanishing gradient problem (note after the list)

  23. Walkthrough of back-propagation

  24. Derivation of back-propagation

  25. Why do we use activation functions? (note after the list)
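
Below are a few illustrative sketches and worked notes for the items marked above. All function names, sizes, and hyperparameters in them are assumptions chosen for illustration, not prescriptions.

Sketch for item 2: a minimal, numerically stable softmax in NumPy. The max-subtraction is only for stability; the function's defining property is that outputs are non-negative and sum to 1, which is why it belongs on the output layer (where a probability distribution is wanted) rather than in hidden layers, where it would couple every unit's gradient to every other unit's.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)  # shift for stability; result unchanged
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

logits = np.array([[2.0, 1.0, 0.1]])
print(softmax(logits))          # roughly [[0.659 0.242 0.099]]
print(softmax(logits).sum())    # each row sums to 1.0
```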
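
Sketch for item 4: a Swish-style activation assembled from TensorFlow primitives (`x * sigmoid(x)`). Because it is built from differentiable primitives, autodiff supplies the gradient for free; `tf.custom_gradient` is only needed when the forward pass is non-differentiable and you want to substitute a surrogate gradient. The surrogate below is a straight-through-style choice, purely illustrative.

```python
import tensorflow as tf

def swish(x):
    """Swish activation from TF primitives: x * sigmoid(x)."""
    return x * tf.sigmoid(x)

# Usable anywhere Keras accepts an activation callable:
layer = tf.keras.layers.Dense(64, activation=swish)

@tf.custom_gradient
def hard_step(x):
    y = tf.cast(x > 0, x.dtype)                 # non-differentiable forward pass
    def grad(dy):
        # illustrative triangular surrogate gradient around 0
        return dy * tf.nn.relu(1.0 - tf.abs(x))
    return y, grad
```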
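
Sketch for item 10: GEMM is the GEneral Matrix-to-Matrix Multiply at the heart of most deep-learning libraries; convolutions are commonly lowered to a single GEMM via im2col. A toy single-channel version, assuming "valid" padding and stride 1 (and computing cross-correlation, as deep-learning "convolutions" conventionally do):

```python
import numpy as np

def conv2d_as_gemm(image, kernel):
    """Compute a valid 2D convolution as one matrix multiply (GEMM)."""
    H, W = image.shape
    kH, kW = kernel.shape
    oH, oW = H - kH + 1, W - kW + 1
    # im2col: each output position becomes one row of the patch matrix
    cols = np.empty((oH * oW, kH * kW))
    for i in range(oH):
        for j in range(oW):
            cols[i * oW + j] = image[i:i + kH, j:j + kW].ravel()
    # GEMM: patch matrix times flattened kernel
    return (cols @ kernel.ravel()).reshape(oH, oW)

img = np.arange(16.0).reshape(4, 4)
k = np.ones((2, 2))
print(conv2d_as_gemm(img, k))   # each output = sum of a 2x2 window
```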
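
Note for item 16: the two functions are affine relatives of each other, which makes the comparison concrete:

$$
\tanh(x) = 2\,\sigma(2x) - 1, \qquad \sigma(x) = \frac{1}{1+e^{-x}}
$$

So tanh is a rescaled, shifted sigmoid with zero-centered output in $(-1, 1)$, and its maximal slope $\tanh'(0) = 1$ is four times the sigmoid's $\sigma'(0) = \tfrac{1}{4}$, both of which tend to help gradient flow in hidden layers.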
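
Note for item 17: the derivative follows from the chain rule and simplifies into the function itself:

$$
\sigma(x) = \frac{1}{1+e^{-x}}
\;\Longrightarrow\;
\sigma'(x) = \frac{e^{-x}}{(1+e^{-x})^{2}}
= \frac{1}{1+e^{-x}} \cdot \frac{e^{-x}}{1+e^{-x}}
= \sigma(x)\,\bigl(1-\sigma(x)\bigr)
$$

This identity is why sigmoid gradients are so cheap to compute in backprop: the forward activation is reused directly.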
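
Sketch for item 18: the usual mistake is oversampling before the train/test split, which leaks duplicated minority rows into the test set and inflates scores. A hedged scikit-learn/NumPy outline (data and seeds are arbitrary):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.utils import resample

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = (rng.random(1000) < 0.1).astype(int)   # ~10% minority class

# Split FIRST, so the test set is never touched by resampling
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

# Then oversample the minority class inside the training set only
minority = y_train == 1
X_min_up, y_min_up = resample(
    X_train[minority], y_train[minority],
    replace=True, n_samples=int((~minority).sum()), random_state=0)
X_train_bal = np.vstack([X_train[~minority], X_min_up])
y_train_bal = np.concatenate([y_train[~minority], y_min_up])
```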
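
Sketch for item 19: a two-layer network with sigmoid units, trained on XOR with plain NumPy backprop. The layer width, learning rate, and step count are arbitrary choices that usually suffice for this toy problem.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)          # XOR targets

W1 = rng.normal(size=(2, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(size=(8, 1)); b2 = np.zeros((1, 1))

lr = 1.0
for step in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # backward pass (squared-error loss; uses sigmoid' = s * (1 - s))
    dp = (p - y) * p * (1 - p)
    dh = (dp @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ dp; b2 -= lr * dp.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ dh; b1 -= lr * dh.sum(axis=0, keepdims=True)

print(np.round(p.ravel(), 2))   # should approach [0, 1, 1, 0]
```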
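
Sketch for item 20: batch gradient descent on the mean cross-entropy loss. The gradient $X^\top(p - y)/n$ falls straight out of the sigmoid-derivative identity in the note for item 17. Toy data and hyperparameters are assumptions.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=1000):
    """Batch gradient descent on mean cross-entropy; returns (w, b)."""
    w = np.zeros(X.shape[1]); b = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted probabilities
        grad_w = X.T @ (p - y) / len(y)          # d(loss)/dw
        grad_b = (p - y).mean()                  # d(loss)/db
        w -= lr * grad_w; b -= lr * grad_b
    return w, b

# Toy separable data: label is 1 when x0 + x1 > 0
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X.sum(axis=1) > 0).astype(float)
w, b = fit_logistic(X, y)
print((((X @ w + b) > 0) == (y == 1)).mean())   # accuracy should be high here
```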
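
Note for item 22, the mechanism in one line: backprop multiplies one Jacobian factor per layer, and with sigmoid units each factor is small:

$$
\frac{\partial L}{\partial h^{(1)}}
= \left[\prod_{k=2}^{n} W^{(k)\top}\,\operatorname{diag}\!\left(\sigma'\!\left(z^{(k)}\right)\right)\right]
\frac{\partial L}{\partial h^{(n)}},
\qquad 0 < \sigma'(z) \le \tfrac{1}{4}
$$

With $n$ layers of sigmoid units and modest weights, the gradient magnitude decays roughly like $(1/4)^{n}$, so early layers receive almost no learning signal. ReLU activations, careful initialization, and residual connections are the standard mitigations.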
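
Note for item 25, the short answer in math: without a nonlinearity, stacking layers collapses to a single linear map, so depth buys nothing:

$$
W_2 \left( W_1 x + b_1 \right) + b_2 = \left( W_2 W_1 \right) x + \left( W_2 b_1 + b_2 \right) = W' x + b'
$$

A nonlinear activation between layers breaks this collapse, and it is what gives deep networks their extra expressive power.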