Similarity-based Contrastive Divergence Methods for Energy-based Deep Learning Models

Sankar, Adepu Ravi and Balasubramanian, Vineeth N (2015) Similarity-based Contrastive Divergence Methods for Energy-based Deep Learning Models. In: Proceedings of the 7th Asian Conference on Machine Learning (ACML), 20-22 November 2015, Hong Kong.

Energy-based deep learning models such as Restricted Boltzmann Machines are increasingly used in real-world applications. However, these models inherently depend on the Contrastive Divergence (CD) method for training, i.e., for maximizing the log likelihood of the given data distribution. CD, which internally uses Gibbs sampling, often performs poorly due to issues such as biased samples, poor mixing of Markov chains, and high-mass probability modes. Variants of CD such as PCD, Fast PCD, and Tempered MCMC have been proposed to address these issues. In this work, we propose a new approach to CD-based methods, called Diss-CD, which uses dissimilar data to allow the Markov chain to explore new modes in the probability space. This method can be used with all variants of CD (or PCD), and across all energy-based deep learning models. Our experiments on standard datasets, including MNIST, Caltech-101 Silhouettes, and Synthetic Transformations, demonstrate the promise of this approach, showing fast convergence of the learning error and a better approximation of the log likelihood of the data.
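The paper's Diss-CD implementation is not reproduced on this record page. As background for the abstract's discussion, the sketch below shows standard CD-1 training of a Bernoulli-Bernoulli RBM in NumPy: a positive phase driven by the data, one Gibbs step for the negative phase, and the resulting approximate log-likelihood gradient update. All names and hyperparameters here (`RBM`, `cd1_update`, the learning rate) are illustrative choices, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Bernoulli-Bernoulli RBM trained with CD-1 (illustrative sketch)."""

    def __init__(self, n_vis, n_hid, lr=0.1):
        self.W = rng.normal(0.0, 0.01, (n_vis, n_hid))
        self.b = np.zeros(n_vis)   # visible bias
        self.c = np.zeros(n_hid)   # hidden bias
        self.lr = lr

    def cd1_update(self, v0):
        # Positive phase: sample hidden units conditioned on the data.
        ph0 = sigmoid(v0 @ self.W + self.c)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        # Negative phase: one Gibbs step back to visible, then to hidden.
        pv1 = sigmoid(h0 @ self.W.T + self.b)
        v1 = (rng.random(pv1.shape) < pv1).astype(float)
        ph1 = sigmoid(v1 @ self.W + self.c)
        # CD-1 approximation of the log-likelihood gradient.
        batch = v0.shape[0]
        self.W += self.lr * (v0.T @ ph0 - v1.T @ ph1) / batch
        self.b += self.lr * (v0 - v1).mean(axis=0)
        self.c += self.lr * (ph0 - ph1).mean(axis=0)
        return float(np.mean((v0 - pv1) ** 2))  # reconstruction error

# Toy usage: learn a tiny binary pattern set.
data = np.array([[1, 1, 0, 0], [0, 0, 1, 1]], dtype=float)
rbm = RBM(n_vis=4, n_hid=3)
errs = [rbm.cd1_update(data) for _ in range(200)]
```

The abstract's criticism applies to the negative phase above: chains started at the data tend to stay near the modes already covered, so samples are biased. Diss-CD, as described, instead injects dissimilar data so the chain can reach new modes; the exact mechanism is in the paper itself.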

IITH Creators: Balasubramanian, Vineeth N (ORCiD: UNSPECIFIED)
Item Type: Conference or Workshop Item (Paper)
Subjects: Computer science
Divisions: Department of Computer Science & Engineering
Depositing User: Team Library
Date Deposited: 19 Jun 2019 10:35
Last Modified: 19 Jun 2019 10:35
Publisher URL:
Related URLs:
