
Convolutional Deep Belief Networks

This program is an implementation of Convolutional Deep Belief Networks with ‘MATLAB’, ‘MEX’, and ‘CUDA’ versions. Both binary and Gaussian visible unit types are supported, and CUDA acceleration is included. We provide several demo programs to show how to use the code. Currently only 2D data are supported; a 3D version is under development.

The source code can be downloaded at https://github.com/lonl/CDBN

Requirements
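
At a minimum you will need MATLAB. Building the MEX and CUDA versions additionally requires a C compiler supported by MEX and the CUDA toolkit; their installation paths (‘MATLAB_DIR’ and ‘CUDA_DIR’) are set in the ‘Makefile’, as described in the Build section below.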

Build

  1. Change directory to ‘CDBN/toolbox/CDBNLIB/mex’.
  2. Edit ‘Makefile’ and set ‘MATLAB_DIR’ and ‘CUDA_DIR’ to the correct paths for your installation.
  3. Run ‘make’ (a MATLAB-prompt equivalent of these steps is sketched below).
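
For reference, here is a minimal sketch of the same steps run from the MATLAB prompt, assuming you start in the directory that contains the CDBN checkout and have already edited the ‘Makefile’ paths:

% Build the MEX/CUDA binaries from inside MATLAB (equivalent to steps 1-3 above)
cd('CDBN/toolbox/CDBNLIB/mex');   % 1. change to the mex directory
% 2. edit Makefile and set MATLAB_DIR and CUDA_DIR before building
system('make');                   % 3. invoke make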

Run the program
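
After building, a minimal usage sketch is to add the toolbox to the MATLAB path and run one of the demo scripts, for example ‘DemoCDBN_Binary_2D.m’. The path setup below is an assumption (it mirrors the paths used elsewhere in this README); adjust it to your checkout:

% Assumed layout: run from the directory that contains the CDBN folder
addpath(genpath('CDBN/toolbox'));  % add the toolbox and its subfolders to the path
DemoCDBN_Binary_2D;                % demo with binary visible units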

Experiments

We have conducted classification experiments with Convolutional Deep Belief Networks (CDBN), Deep Belief Networks (DBN), and a direct Softmax classifier on MNIST data (2,000 training samples and 2,000 test samples). The detailed parameters of these three methods can be found in the code.

The comparison results (accuracy) are as follows:

  Noise in test data   CDBN    DBN     Softmax
  No noise             95.1%   91.5%   87.7%
  10% noise            92.8%   86.7%   83.2%
  20% noise            84.4%   60.1%   74.7%

Note

Different computation methods can be selected. Currently, MATLAB matrix computation, MEX, and CUDA are supported. You can change the computation method globally in ‘CDBN/toolbox/CDBNLIB/default_layer2D.m’ by selecting one of the methods:

layer.matlab_use = 0; layer.mex_use = 1; layer.cuda_use = 0;

Alternatively, you can change the computation method in a layer definition. For example, you can add the following lines to ‘DemoCDBN_Binary_2D.m’ in layer 1’s definition:

layer{1}.matlab_use = 0; layer{1}.mex_use = 0; layer{1}.cuda_use = 1;

The speedup from the CUDA version is not obvious in the first layer, but it may be more significant in later layers and for larger images.

Contact

If you have any problems or suggestions regarding this code, please contact Pengcheng Han at hanpc839874404@163.com. Thank you very much!