Understanding Variational Lower Bound

Variational Bayesian (VB) methods are a popular family of techniques in statistical machine learning (alongside Generative Adversarial Networks (GANs) [1] and fully visible belief networks (FVBNs) [2]), particularly for training generative models. VB methods recast statistical inference problems as optimization problems that can be solved with modern optimization algorithms. The resulting objective, known as the Variational Lower Bound - also called the Evidence Lower Bound (ELBO) - is central to Variational Autoencoders (VAE) [3], one of the simplest forms of VB methods. The goal of this post is to learn and take notes about this Variational Lower Bound.
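As a preview of where the post is headed, the bound has a standard form (this is the usual notation for latent-variable models, not taken from the post itself): for an observation $x$, latent variable $z$, and approximate posterior $q(z|x)$,

```latex
\log p(x)
  = \underbrace{\mathbb{E}_{q(z|x)}\!\left[\log \frac{p(x, z)}{q(z|x)}\right]}_{\mathrm{ELBO}}
  + \mathrm{KL}\!\left(q(z|x)\,\middle\|\,p(z|x)\right)
  \;\ge\; \mathrm{ELBO},
```

where the inequality holds because the KL divergence is non-negative, so maximizing the ELBO simultaneously pushes up $\log p(x)$ and pulls $q(z|x)$ toward the true posterior.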

Read More

Learning basic TensorFlow and Keras through examples of CNN implementation! - Part I

In this tutorial, we learn TensorFlow and Keras by going step by step from simple examples to recent state-of-the-art neural networks in computer vision. At the beginning of the tutorial, we learn how to implement Convolutional Neural Networks (CNN) with TensorFlow and with the more convenient Keras API. Towards the end of this tutorial, you will be able to implement from scratch state-of-the-art CNN architectures such as VGG, Inception V4, and DenseNet. If you are not familiar with CNNs, I recommend taking a look at this tutorial first: http://cs231n.github.io/convolutional-networks/
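To give a flavor of the Keras API the tutorial builds on, here is a minimal sketch of a small CNN classifier (the 28x28 grayscale input shape and layer widths are illustrative assumptions, not taken from the tutorial):

```python
import tensorflow as tf

# A tiny CNN sketch: conv -> pool -> flatten -> softmax classifier.
# The 28x28x1 input shape and layer widths are illustrative assumptions.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# A standard optimizer/loss pair for integer-labeled classification.
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

After `model.compile`, the model is ready for `model.fit(x_train, y_train)` on image batches of shape `(batch, 28, 28, 1)`; Keras handles the training loop that raw TensorFlow would require you to write by hand.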

Read More