Topics to Look For

  • Mini-batch gradient descent
  • Momentum
  • RMSprop
  • Adam optimization algorithm
  • Learning rate decay (a minimal sketch of these update rules follows this list)
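
As a quick illustration of how the topics above fit together, here is a minimal NumPy sketch of an Adam-style update (which combines the momentum and RMSprop accumulators) running inside a mini-batch loop with learning rate decay. The toy regression problem, the variable names (alpha0, beta1, beta2, decay_rate), and the hyperparameter values are illustrative assumptions, not code from the course or the book.

  # Minimal sketch: mini-batch Adam with learning rate decay on toy data.
  # All names and values below are assumptions for illustration only.
  import numpy as np

  rng = np.random.default_rng(0)

  # Toy data: y = 3x + 2 plus a little noise.
  X = rng.normal(size=(1000, 1))
  y = 3.0 * X[:, 0] + 2.0 + 0.1 * rng.normal(size=1000)

  w, b = 0.0, 0.0                 # parameters
  vw = vb = 0.0                   # momentum (first-moment) accumulators
  sw = sb = 0.0                   # RMSprop (second-moment) accumulators
  alpha0, beta1, beta2, eps = 0.1, 0.9, 0.999, 1e-8
  decay_rate = 0.1
  batch_size = 64
  t = 0                           # Adam timestep for bias correction

  for epoch in range(20):
      # Learning rate decay: shrink the step size as training progresses.
      alpha = alpha0 / (1.0 + decay_rate * epoch)

      # Mini-batch gradient descent: shuffle, then loop over small batches.
      perm = rng.permutation(len(X))
      for start in range(0, len(X), batch_size):
          idx = perm[start:start + batch_size]
          xb, yb = X[idx, 0], y[idx]

          # Gradients of mean squared error with respect to w and b.
          err = (w * xb + b) - yb
          dw = 2.0 * np.mean(err * xb)
          db = 2.0 * np.mean(err)

          t += 1
          # Momentum: exponentially weighted average of the gradients.
          vw = beta1 * vw + (1 - beta1) * dw
          vb = beta1 * vb + (1 - beta1) * db
          # RMSprop: exponentially weighted average of the squared gradients.
          sw = beta2 * sw + (1 - beta2) * dw ** 2
          sb = beta2 * sb + (1 - beta2) * db ** 2
          # Adam: combine both, apply bias correction, then update.
          vw_hat, vb_hat = vw / (1 - beta1 ** t), vb / (1 - beta1 ** t)
          sw_hat, sb_hat = sw / (1 - beta2 ** t), sb / (1 - beta2 ** t)
          w -= alpha * vw_hat / (np.sqrt(sw_hat) + eps)
          b -= alpha * vb_hat / (np.sqrt(sb_hat) + eps)

  print(f"learned w={w:.3f}, b={b:.3f}  (target: w=3, b=2)")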

Resources

Supplemental

  • Chapter 8 of the Deep Learning book covers these topics as well; as usual for this resource, it is fairly math-heavy and quite detailed. I recommend using it as a reference for looking up specific topics from this week, to gain a different perspective and possibly a few more details.