Learning Deep Learning and the Latest AI Technologies

Deep learning has become one of the most innovative technologies, and our society is facing the changes brought on by its enormous impact. In the future, no matter what kind of job or field you are involved in, you will be influenced by advanced AI technologies. Deep Learning JP offers a series of educational programs, including this course, “Deep Learning Basics”, as preparation for that future.

Deep Learning Basics

This course is designed to cover a wide range of deep learning technologies commonly used in the latest research and development. It starts with basic topics in machine learning, such as logistic regression, MLPs, and the fundamentals of neural networks. After mastering these basics, it moves on to more advanced topics such as autoencoders, CNNs, generative models, RNNs, and language models. One of the most important characteristics of this course is that it is fully implementation-oriented, with practical exercises: in each chapter, you will tackle a number of problems and homework assignments that we provide. To let you focus on learning the principles of deep learning rather than managing a computational environment, we provide iLect.net, an online GPU programming environment running on fully virtualized servers.

  • Every Wed. 16:50 – 18:35
  • Engineering building #2 and #3, the University of Tokyo
  • Application Form (Deadline: 17th Oct.)
  • This is an independent course; no university credit will be given.


  1. Introduction (2016/10/19)
    What is deep learning? Its impact and influence on society. Overview of the course.
  2. Machine Learning 1 (2016/10/26)
    Machine learning basics, k-NN, logistic regression; training and testing
  3. Machine Learning 2 (2016/11/2)
    NumPy, SciPy, scikit-learn, NumPy idioms, advanced slicing, etc.
  4. Perceptron + Feedforward Network, Gradient Descent (2016/11/9)
    Neural network basics; gradient descent: the principle and its implementation in NumPy
  5. Gradient Descent, Stochastic Gradient Descent, Optimizers (2016/11/16)
    Theano, optimizers
  6. Autoencoders (2016/11/30)
    Pretraining, encoder-decoder models, denoising autoencoders, stacked denoising autoencoders (SdA), modern autoencoders
  7. Convolutional Neural Networks (CNN) (2016/12/7)
    CNN basics, a detailed look at image processing, data augmentation, etc.
  8. Convolutional Neural Networks (CNN) 2 (2016/12/14)
    Advanced CNNs.
  9. Generative models (2016/12/21)
    Generative models, RBM, VAE, GAN
  10. Recurrent Neural Networks (RNN) (2017/1/11)
    Sequential data and RNNs
  11. Deep Learning and Language Models (2017/1/18)
    Word embeddings, LSTM, language models
  12. The Final Presentation (2017/1/21)
    Project presentations
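
To give a taste of the implementation-oriented exercises, here is a minimal sketch, in the spirit of lecture 4's "gradient descent: the principle and implementation on NumPy". It is our own illustration under assumed data and hyperparameters (a noiseless linear-regression problem, learning rate 0.1), not actual course material:

```python
import numpy as np

# Minimal gradient descent on a linear model, implemented with NumPy.
# Data, weights, and the learning rate below are illustrative assumptions.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))        # 100 samples, 3 features
true_w = np.array([2.0, -1.0, 0.5])  # ground-truth weights (for this demo)
y = X @ true_w                       # noiseless targets, for clarity

w = np.zeros(3)                      # initial parameters
lr = 0.1                             # learning rate

for _ in range(500):
    # Gradient of the mean squared error (1/n) * ||X w - y||^2
    grad = 2 * X.T @ (X @ w - y) / len(X)
    w -= lr * grad                   # one gradient descent step

print(w)                             # should approach true_w
```

The same loop structure carries over to the stochastic and mini-batch variants covered in lecture 5, where the gradient is computed on a random subset of the data at each step.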