- Features:
- An object-oriented toolbox providing the core capabilities needed to implement DBNs.
- Following object-oriented design, DeeBNet is modular, extensible, and reusable, and can easily be modified and extended.
- Runs under both MATLAB and Octave and is platform independent (Windows and Linux).
- Implements several sampling methods, including Gibbs, CD, PCD, and our new FEPCD method.
- Includes several sparsity methods: quadratic, rate distortion, and our new normal sparsity method.
- Supports different RBM types, including generative and discriminative.
- Uses GPU power efficiently (high GPU load).
- Applicable to many tasks, such as classification, feature extraction, data reconstruction, noise reduction, and generating new data.
- Data management via the DataStore class, with code optimized for working with big data in some functions.
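As a rough illustration of what DataStore-style normalization and shuffling do, here is a generic NumPy sketch. This is Python, not DeeBNet's MATLAB API, and the array sizes are made up for the example:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy dataset: 100 samples, 3 features (illustrative only, not DeeBNet code)
X = rng.normal(loc=5.0, scale=2.0, size=(100, 3))
y = np.arange(100)  # labels

# "normalize": rescale each feature to zero mean and unit variance
X_norm = (X - X.mean(axis=0)) / X.std(axis=0)

# "shuffle": permute samples and labels with the same permutation,
# so each sample keeps its label
perm = rng.permutation(len(X_norm))
X_shuf, y_shuf = X_norm[perm], y[perm]
```

Shuffling samples and labels with a single shared permutation is the key detail; shuffling them independently would destroy the data/label correspondence.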
- Description:
The DeeBNet (Deep Belief Network) toolbox is an object-oriented MATLAB and Octave toolbox providing tools for research with Deep Belief Networks. It contains two packages with classes and functions for data management and sampling, as well as classes defining different RBM types and the DBN itself.
- The DataClasses package has one class, DataStore, for managing training, test, and validation data; it provides useful functions such as normalize and shuffle for normalizing and shuffling data.
- The SamplingClasses package implements several sampling methods: Gibbs, CD, PCD, and FEPCD, the last of which is our new sampling method.
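For readers unfamiliar with these samplers, here is a generic sketch of one CD-1 (one-step contrastive divergence) update in Python/NumPy. This illustrates the idea only, not DeeBNet's implementation; persistent variants such as PCD keep the Gibbs chain alive between updates instead of restarting it from the data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary RBM: 6 visible, 4 hidden units (illustrative sizes)
W = rng.normal(scale=0.1, size=(6, 4))
b_v = np.zeros(6)  # visible bias
b_h = np.zeros(4)  # hidden bias

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, lr=0.05):
    """One CD-1 step: positive phase, one Gibbs step, parameter update."""
    global W, b_v, b_h
    p_h0 = sigmoid(v0 @ W + b_h)                 # P(h=1 | v0), positive phase
    h0 = (rng.random(p_h0.shape) < p_h0) * 1.0   # sample hidden states
    p_v1 = sigmoid(h0 @ W.T + b_v)               # reconstruction P(v=1 | h0)
    p_h1 = sigmoid(p_v1 @ W + b_h)               # hidden probs of reconstruction
    # Gradient approximation: positive statistics minus negative statistics
    W += lr * (np.outer(v0, p_h0) - np.outer(p_v1, p_h1))
    b_v += lr * (v0 - p_v1)
    b_h += lr * (p_h0 - p_h1)
    return np.mean((v0 - p_v1) ** 2)             # reconstruction error

v = (rng.random(6) < 0.5) * 1.0                  # a random binary visible vector
err = cd1_update(v)
```

Pure Gibbs sampling would run this visible/hidden alternation for many steps before each update; CD-1 truncates the chain after a single step, trading bias for speed.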
The toolbox also provides six RBM classes:
- RBM: an abstract class defining the functions (such as the training method) and properties (such as the sampler object) shared by all RBM types. It cannot be instantiated directly; the other RBM classes inherit from it.
- GenerativeRBM
- DiscriminativeRBM
- SparseRBM: supports three different types of sparsity methods, one of which is our new "normal sparse RBM" method.
- SparseGenerativeRBM
- SparseDiscriminativeRBM
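The class hierarchy above follows a standard pattern: an abstract base class that cannot be instantiated, with concrete subclasses overriding the training behavior. A minimal Python sketch of that pattern (the class names mirror the toolbox, but this is not DeeBNet's actual MATLAB code):

```python
from abc import ABC, abstractmethod

class RBM(ABC):
    """Abstract base: holds shared state; subclasses define the objective."""
    def __init__(self, n_visible, n_hidden):
        self.n_visible, self.n_hidden = n_visible, n_hidden

    @abstractmethod
    def train_step(self, batch):
        """Each RBM type supplies its own parameter update."""

class GenerativeRBM(RBM):
    def train_step(self, batch):
        return "generative update"       # e.g. a CD/PCD gradient on P(v)

class DiscriminativeRBM(RBM):
    def train_step(self, batch):
        return "discriminative update"   # e.g. a gradient on P(label | v)
```

In MATLAB the same effect is achieved with an `Abstract` method attribute in the base `classdef`; attempting to construct the base class fails, while subclasses share its concrete members.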
In addition, the toolbox includes useful code: functions for reading different datasets and demo scripts showing how the toolbox can be used in different applications:
- MNIST for image recognition
- ISOLET for speech recognition
- 20 Newsgroups for text categorization
The toolbox can be applied to different problems:
- classification
- feature extraction
- data reconstruction
- noise reduction
- new data generation
- ...
More explanation and documentation are available on the project site.
- Changes to previous version:
New in toolbox
- Using GPU in Backpropagation
- Revision of some demo scripts
- Function approximation with multiple outputs
- Feature extraction with GRBM in first layer
- BibTeX Entry: Download
- Supported Operating Systems: Platform Independent
- Data Formats: Matlab, Octave
- Tags: Classification, Deep Belief Networks, Feature Extraction, Artificial Neural Network, Matlab Toolbox, Restricted Boltzmann Machine
- Archive: download here
Other available revisions:
Version 3.2 (June 26, 2016, 16:19:55). New in toolbox:
- Using GPU in backpropagation
- Revision of some demo scripts
- Function approximation with multiple outputs
- Feature extraction with GRBM in the first layer

Version 3.1 (January 19, 2016, 08:13:41). New in toolbox:
- Bug fix in changing the learning rate
- Expanded the generateData function for use after backpropagation
- Expanded the reconstructData function for use after backpropagation

Version 3.0 (January 9, 2016, 10:44:13). New in toolbox:
- Edited the toolbox for use in Octave

Version 2.2 (September 5, 2015, 14:14:16). New in toolbox:
- Fixed a bug in the computeBatchSize function on Linux
- Revision of some demo scripts

Version 2.1 (July 23, 2015, 07:12:22). New features:
- GPU support (about 5 times faster than CPU; tested with an NVIDIA GeForce GTX 780 GPU and an AMD FX 8150 eight-core 3.6 GHz CPU)
- Cast DBN parameters to single and double data types
- Sparsity in RBM with three different methods
- Plotting basis functions
- Classification and feature extraction on the 20 Newsgroups dataset
- Code correction in using backpropagation
- Runtime and memory code optimization in normalization and shuffling

Version 2.0 (July 11, 2015, 14:09:24):
- Initial announcement on mloss.org