Projects running under Windows.
Showing items 41-60 of 212 (page 3 of 11).

ChaLearn Gesture Challenge code by Jun Wan 2.0

by joewan - September 29, 2015, 08:50:22 CET [ BibTeX BibTeX for corresponding Paper Download ] 8877 views, 1971 downloads, 2 subscriptions

About: This code is provided by Jun Wan and was used in the ChaLearn one-shot-learning gesture challenge (round 2). It includes bag-of-features encoding, 3D MoSIFT-based features (3D MoSIFT, 3D EMoSIFT and 3D SMoSIFT), and the MFSK feature.
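
The package itself is not Python; the sketch below only illustrates the generic bag-of-features step the description mentions (a k-means codebook followed by histogram encoding of local descriptors), using scikit-learn. The descriptor arrays and parameter values are placeholders, not part of the package.

```python
# Illustrative only: a generic bag-of-features encoding, not the package's own
# code. `descriptors_per_clip` stands in for 3D MoSIFT-style local descriptors
# extracted per video clip (hypothetical input).
import numpy as np
from sklearn.cluster import KMeans

def build_codebook(all_descriptors, n_words=500, seed=0):
    """Cluster pooled local descriptors into a visual vocabulary."""
    return KMeans(n_clusters=n_words, random_state=seed).fit(all_descriptors)

def encode(local_descriptors, codebook):
    """L1-normalised histogram of visual-word assignments for one clip."""
    words = codebook.predict(local_descriptors)
    hist = np.bincount(words, minlength=codebook.n_clusters).astype(float)
    return hist / max(hist.sum(), 1.0)

# Usage sketch: descriptors_per_clip is a list of (n_i, d) arrays.
# codebook = build_codebook(np.vstack(descriptors_per_clip))
# features = np.array([encode(d, codebook) for d in descriptors_per_clip])
```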

Changes:

Initial Announcement on mloss.org.


JMLR Darwin 1.9

by sgould - September 8, 2015, 06:50:37 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 70399 views, 14365 downloads, 4 subscriptions

About: A platform-independent C++ framework for machine learning, graphical models, and computer vision research and development.

Changes:

Version 1.9:

  • Replaced drwnInPaint class with drwnImageInPainter class and added inPaint application
  • Added function to read CIFAR-10 and CIFAR-100 style datasets (see http://www.cs.utoronto.ca/~kriz/cifar.html); a minimal format-reading sketch follows this list
  • Added drwnMaskedPatchMatch, drwnBasicPatchMatch, drwnSelfPatchMatch and basicPatchMatch application
  • drwnPatchMatchGraph now allows multiple matches to the same image
  • Upgraded wxWidgets to 3.0.2 (to address problems on Mac OS X)
  • Switched Mac OS X compilation to libc++ instead of libstdc++
  • Added Python scripts for running experiments and regression tests
  • Refactored drwnGrabCutInstance class to support both GMM and colour histogram models
  • Added cacheSortIndex to drwnDecisionTree for trading off speed versus memory usage
  • Added mexLoadPatchMatchGraph for loading drwnPatchMatchGraph objects into Matlab
  • Improved documentation, other bug fixes and performance improvements
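
For reference, the CIFAR-10 binary layout such a reader has to parse is simple: each record is one label byte followed by 3072 pixel bytes (three 32x32 colour planes in R, G, B order). The NumPy sketch below is an illustration of the format only, not Darwin's drwn C++ API.

```python
# Illustrative only: reading the CIFAR-10 binary ("bin") batch format in
# Python/NumPy, rather than through Darwin's own C++ reader.
import numpy as np

def read_cifar10_batch(path):
    """Return (labels, images) from one CIFAR-10 binary batch file."""
    raw = np.fromfile(path, dtype=np.uint8).reshape(-1, 1 + 32 * 32 * 3)
    labels = raw[:, 0]
    images = raw[:, 1:].reshape(-1, 3, 32, 32)   # (N, channel, row, col)
    return labels, images

# labels, images = read_cifar10_batch("cifar-10-batches-bin/data_batch_1.bin")
```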

jLDADMM 1.0

by dqnguyen - August 19, 2015, 12:52:36 CET [ Project Homepage BibTeX Download ] 2540 views, 604 downloads, 2 subscriptions

About: The Java package jLDADMM is released to provide alternative choices for topic modeling on normal or short texts. It provides implementations of the Latent Dirichlet Allocation topic model and the one-topic-per-document Dirichlet Multinomial Mixture model (i.e. mixture of unigrams), using collapsed Gibbs sampling. In addition, jLDADMM supplies a document clustering evaluation to compare topic models.
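
jLDADMM itself is a Java package; the compact NumPy sketch below only illustrates the collapsed Gibbs update for LDA that it implements, and is not the package's own code or interface. The corpus representation (lists of word ids) and hyperparameter values are assumptions.

```python
# Illustrative only: a compact collapsed Gibbs sampler for LDA, sketching the
# update jLDADMM implements in Java (this is not its code).
import numpy as np

def lda_gibbs(docs, V, K=10, alpha=0.1, beta=0.01, iters=200, seed=0):
    """docs: list of lists of word ids in [0, V). Returns topic-word counts."""
    rng = np.random.default_rng(seed)
    ndk = np.zeros((len(docs), K))          # document-topic counts
    nkw = np.zeros((K, V))                  # topic-word counts
    nk = np.zeros(K)                        # topic totals
    z = [rng.integers(K, size=len(d)) for d in docs]
    for d, doc in enumerate(docs):          # initialise counts from random z
        for i, w in enumerate(doc):
            ndk[d, z[d][i]] += 1; nkw[z[d][i], w] += 1; nk[z[d][i]] += 1
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                # collapsed conditional: p(z=k | rest)
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + V * beta)
                k = rng.choice(K, p=p / p.sum())
                z[d][i] = k
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    return nkw
```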

Changes:

Initial Announcement on mloss.org.


JMLR libDAI 0.3.2

by jorism - July 17, 2015, 15:59:55 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 61832 views, 11806 downloads, 4 subscriptions

Rating: 5/5 stars (based on 1 vote)

About: libDAI provides free & open source implementations of various (approximate) inference methods for graphical models with discrete variables, including Bayesian networks and Markov Random Fields.

Changes:

Release 0.3.2 fixes various bugs and adds GLC (Generalized Loop Corrections) written by Siamak Ravanbakhsh.


ABACOC (Adaptive Ball Cover for Classification) 2.0

by kikot - May 29, 2015, 11:57:28 CET [ BibTeX BibTeX for corresponding Paper Download ] 7086 views, 1759 downloads, 3 subscriptions

About: An incremental (online) nonparametric classifier. It can classify both points (the standard setting) and matrices (multivariate time series). Java and Matlab code are already available.

Changes:

Version 2: parameterless operation, constant model size, and prediction confidence (for active learning).

New: a C++ version is available at https://github.com/ilaria-gori/ABACOC


About: Code accompanying the paper: Jie Gui et al., "How to estimate the regularization parameter for spectral regression discriminant analysis and its kernel version?", IEEE Transactions on Circuits and Systems for Video Technology, vol. 24, no. 2, pp. 211-223, 2014.

Changes:

Initial Announcement on mloss.org.


About: Code accompanying the paper: Jie Gui, Zhenan Sun, Guangqi Hou, Tieniu Tan, "An optimal set of code words and correntropy for rotated least squares regression", International Joint Conference on Biometrics, 2014, pp. 1-6.

Changes:

Initial Announcement on mloss.org.


XGBoost v0.4.0

by crowwork - May 12, 2015, 08:57:16 CET [ Project Homepage BibTeX Download ] 19774 views, 3454 downloads, 3 subscriptions

About: xgboost (eXtreme Gradient Boosting) is an efficient and scalable implementation of the gradient boosting framework. The package includes an efficient linear model solver and a tree learning algorithm. It can automatically do parallel computation with OpenMP and can be more than 10 times faster than existing gradient boosting packages such as gbm or sklearn.GBM. It supports various objective functions, including regression, classification and ranking, and is designed to be extensible, so that users can easily define their own objectives. The newest version of xgboost supports distributed learning on platforms such as Hadoop and MPI, and scales to even larger problems.
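
A minimal sketch of the Python workflow described above, assuming the xgboost Python module is installed; the data file names and parameter values are placeholders, not part of the distribution.

```python
# Minimal sketch of training with the xgboost Python module; file names and
# parameter values below are placeholders chosen for illustration.
import xgboost as xgb

dtrain = xgb.DMatrix("train.libsvm")     # any libsvm-format training file
dvalid = xgb.DMatrix("valid.libsvm")
params = {"objective": "binary:logistic", "eta": 0.1, "max_depth": 6,
          "nthread": 4}                  # OpenMP parallelism
bst = xgb.train(params, dtrain, num_boost_round=200,
                evals=[(dvalid, "valid")], early_stopping_rounds=10)
bst.save_model("model.bin")              # native save/load, as noted in the changes
pred = bst.predict(dvalid)
```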

Changes:
  • Distributed version of xgboost that runs on YARN, scales to billions of examples

  • Direct save/load data and model from/to S3 and HDFS

  • Feature importance visualization in R module, by Michael Benesty

  • Predict leaf index

  • Poisson regression for count data

  • Early stopping option in training

  • Native save/load support in R and Python

  • xgboost models now can be saved using save/load in R

  • xgboost Python model is now picklable

  • sklearn wrapper is supported in python module

  • Experimental external-memory version


LOMO feature extraction and XQDA metric learning for person re-identification 1.0

by openpr_nlpr - May 6, 2015, 11:38:32 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 3830 views, 567 downloads, 3 subscriptions

Rating: 0/5 stars (based on 1 vote)

About: This MATLAB package provides the LOMO feature extraction and the XQDA metric learning algorithms proposed in our CVPR 2015 paper. It is fast and effective for person re-identification. For more details, please visit http://www.cbsr.ia.ac.cn/users/scliao/projects/lomo_xqda/.
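
The package is MATLAB code; the sketch below is only a hedged illustration of how an XQDA-style learned metric (a projection W and core matrix M, assumed here to have been learned already) is typically applied at ranking time, and is not the package's actual interface.

```python
# Illustrative only: applying a learned XQDA-style metric for re-id ranking.
# W (d x r projection) and M (r x r core matrix) are assumed to come from
# training; this is not the MATLAB package's interface.
import numpy as np

def xqda_style_distances(probe, gallery, W, M):
    """Squared distances d(x, y) = (x - y)^T W M W^T (x - y) for every pair."""
    P = probe @ W                                 # (n_p, r) projected probes
    G = gallery @ W                               # (n_g, r) projected gallery
    diff = P[:, None, :] - G[None, :, :]          # (n_p, n_g, r)
    return np.einsum("pgr,rs,pgs->pg", diff, M, diff)

# ranking = np.argsort(xqda_style_distances(probe_feats, gallery_feats, W, M), axis=1)
```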

Changes:

Initial Announcement on mloss.org.


BLOG 0.9.1

by jxwuyi - April 27, 2015, 06:52:05 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 3844 views, 830 downloads, 3 subscriptions

About: Bayesian Logic (BLOG) is a probabilistic modeling language. It is designed for representing relations and uncertainties among real-world objects.

Changes:

Initial Announcement on mloss.org.


FsAlg 0.5.4

by gbaydin - April 25, 2015, 02:11:03 CET [ Project Homepage BibTeX Download ] 2569 views, 692 downloads, 1 subscription

About: FsAlg is a linear algebra library that supports generic types.

Changes:

Initial Announcement on mloss.org.


Java Machine Learning Platform (Jmlp) 1.0

by openpr_nlpr - April 2, 2015, 09:02:14 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 3479 views, 633 downloads, 2 subscriptions

About: Jmlp is a Java platform for both machine learning experiments and applications. It has been tested on Windows, but it should also run on Linux since Java is cross-platform. It contains classical classification algorithms (Discrete AdaBoost.MH, Real AdaBoost.MH, SVM, KNN, MCE, MLP, NB) and feature reduction methods (KPCA, PCA, whitening), among others.

Changes:

Initial Announcement on mloss.org.


libcmaes 0.9.5

by beniz - March 9, 2015, 09:05:22 CET [ Project Homepage BibTeX Download ] 14506 views, 2748 downloads, 3 subscriptions

About: libcmaes is a multithreaded C++11 library (with Python bindings) for high-performance blackbox stochastic optimization of difficult, possibly non-linear and non-convex functions, using CMA-ES (Covariance Matrix Adaptation Evolution Strategy). libcmaes can be used to minimize or maximize any function, without requiring gradients or differentiability.
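
The library's own C++/Python API is not reproduced here. As a generic illustration of derivative-free blackbox minimization of the kind described above, the sketch below uses a deliberately simplified (mu, lambda) evolution strategy with a fixed isotropic search distribution; real CMA-ES (and libcmaes) additionally adapts the full covariance matrix and the step size.

```python
# Illustrative only: a much-simplified (mu, lambda) evolution strategy, not
# CMA-ES and not libcmaes's API.
import numpy as np

def rosenbrock(x):
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

def simple_es(f, x0, sigma=0.5, lam=20, mu=5, iters=500, seed=0):
    rng = np.random.default_rng(seed)
    mean = np.asarray(x0, dtype=float)
    for _ in range(iters):
        pop = mean + sigma * rng.standard_normal((lam, mean.size))
        order = np.argsort([f(x) for x in pop])
        mean = pop[order[:mu]].mean(axis=0)   # recombine the mu best samples
        sigma *= 0.98                         # crude cooling, not CMA-ES's rule
    return mean, f(mean)

# x_best, f_best = simple_es(rosenbrock, x0=np.zeros(5))
```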

Changes:

This is a major release, with several novelties, improvements and fixes, among which:

  • step-size two-point adaptation scheme for improved performance in some settings, ref #88

  • important bug fixes to the ACM surrogate scheme, ref #57, #106

  • simple high-level workflow under Python, ref #116

  • improved performance in high dimensions, ref #97

  • improved profile likelihood and contour computations, including under geno/pheno transforms, ref #30, #31, #48

  • elitist mechanism for forcing best solutions during evolution, ref #103

  • new legacy plotting function, ref #110

  • optional initial function value, ref #100

  • improved C++ API, ref #89

  • Python bindings support with Anaconda, ref #111

  • configure script now tries to detect numpy when building Python bindings, ref #113

  • Python bindings now have embedded documentation, ref #114

  • support for Travis continuous integration, ref #122

  • lower resolution random seed initialization


CN24 Convolutional Neural Networks for Semantic Segmentation 1.0

by erik - February 23, 2015, 09:02:06 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 3851 views, 789 downloads, 1 subscription

About: CN24 is a complete semantic segmentation framework using fully convolutional networks.

Changes:

Initial Announcement on mloss.org.


JMLR DLLearner 1.0

by Jens - February 13, 2015, 11:39:46 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 27717 views, 5755 downloads, 6 subscriptions

Rating: 4.5/5 stars (based on 3 votes)

About: The DL-Learner framework contains several algorithms for supervised concept learning in Description Logics (DLs) and OWL.

Changes:

See http://dl-learner.org/development/changelog/.


Auto-encoder Based Data Clustering Toolkit 1.0

by openpr_nlpr - February 10, 2015, 08:30:55 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 3746 views, 658 downloads, 2 subscriptions

About: The auto-encoder based data clustering toolkit provides a quick start for clustering based on deep auto-encoder nets. The toolkit can cluster data in feature space with deep nonlinear nets.

Changes:

Initial Announcement on mloss.org.


fertilized forests 1.0beta

by Chrisl_S - January 23, 2015, 16:04:31 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 3563 views, 840 downloads, 1 subscription

About: The fertilized forests project aims to provide an easy-to-use, easy-to-extend, yet fast library for decision forests. It summarizes research in this field and provides a solid platform for extending it. Consistent interfaces to C++, Python and Matlab, together with availability for all major compilers, give the user high flexibility when using the library.

Changes:

Initial Announcement on mloss.org.


Hub Miner 1.1

by nenadtomasev - January 22, 2015, 16:33:51 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 6058 views, 1045 downloads, 2 subscriptions

About: Hubness-aware Machine Learning for High-dimensional Data
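
"Hubness" is the tendency of some points in high-dimensional data to appear in unusually many k-nearest-neighbour lists. The sketch below, which uses scikit-learn and SciPy rather than Hub Miner's Java API, only illustrates the phenomenon by measuring the skewness of the k-occurrence counts.

```python
# Illustrative only: measuring hubness as the skewness of k-occurrence counts,
# not Hub Miner's own implementation.
import numpy as np
from scipy.stats import skew
from sklearn.neighbors import NearestNeighbors

def hubness_skewness(X, k=10):
    """Skewness of N_k(x): how often each point occurs in others' k-NN lists."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)   # +1: the point itself is returned
    _, idx = nn.kneighbors(X)
    counts = np.bincount(idx[:, 1:].ravel(), minlength=len(X))
    return skew(counts)                 # large positive values indicate hubs

# X = np.random.default_rng(0).standard_normal((2000, 100))
# print(hubness_skewness(X, k=10))      # high-dimensional Gaussian data shows hubness
```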

Changes:
  • BibTeX support for all algorithm implementations, making all of them easy to reference (via the algref package).

  • Two more hubness-aware approaches (meta-metric-learning and feature construction)

  • An implementation of Hit-Miss networks for analysis.

  • Several minor bug fixes.

  • The following instance selection methods were added: HMScore, Carving, Iterative Case Filtering, ENRBF.

  • The following clustering quality indexes were added: Fowlkes-Mallows, Calinski-Harabasz, PBM, G+, Tau, Point-Biserial, Hubert's statistic, McClain-Rao, C-root-k.

  • Some more experimental scripts have been included.

  • Extensions in the estimation of hubness risk.

  • Alias and weighted reservoir methods for weight-proportional random selection.


JMLR RL library 3.00.00

by frezza - January 13, 2015, 04:15:16 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 5671 views, 1665 downloads, 2 subscriptions

About: A template-based C++ reinforcement learning library.

Changes:

Initial Announcement on mloss.org.


JEMLA 1.0

by bathaeian - January 4, 2015, 08:34:49 CET [ Project Homepage BibTeX Download ] 2580 views, 802 downloads, 3 subscriptions

About: A Java package for calculating entropy for machine learning applications. It implements several methods of handling missing values, so it can be used as a lab for examining missing-value handling.

Changes:

Discretization of numerical values has been added, in order to compute the mode of values and perform fractional replacement of missing ones. The class diagram is available at http://profs.basu.ac.ir/bathaeian/free_space/jemla.rar.
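
As a rough illustration of the quantity involved (not JEMLA's Java API), the sketch below computes the Shannon entropy of a discrete attribute with one simple fractional treatment of missing values: each missing entry is spread over the observed categories in proportion to their frequencies. The strategies JEMLA actually implements may differ in detail.

```python
# Illustrative only: Shannon entropy with a simple fractional treatment of
# missing values (None). JEMLA's Java implementation may differ in detail.
import math
from collections import Counter

def entropy_with_missing(values):
    """H = -sum_v p(v) log2 p(v); missing entries are spread fractionally
    over the observed categories according to their relative frequencies."""
    observed = [v for v in values if v is not None]
    if not observed:
        return 0.0
    counts = Counter(observed)
    n_missing = len(values) - len(observed)
    total = len(values)
    h = 0.0
    for v, c in counts.items():
        # each missing entry adds c / len(observed) fractional weight to v
        weight = c + n_missing * c / len(observed)
        p = weight / total
        h -= p * math.log2(p)
    return h

# entropy_with_missing(["a", "b", "a", None, "b", "a"])  # ~0.971 bits
```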

