- Description:
The Cognitive Foundry is a modular Java software library for the research and development of cognitive systems. It is designed to be easy to plug into applications to provide adaptive behaviors.
The main part of the Foundry is the Machine Learning package, which provides reusable components and algorithms for machine learning and statistics, including many algorithms for supervised and unsupervised learning as well as statistical modeling. It is interface-centric and uses generics to make it easy to customize to the needs of individual applications.
The Cognitive Foundry's development is led by Sandia National Laboratories and is released under the open source BSD License. It requires Java 1.6.
- Changes to previous version:
- Upgraded to mtj-0.9.14 and added the netlib-java-0.9.3 library, which MTJ now depends on.
- Common Core:
- ParallelUtil: Added executeInParallel(tasks, algorithm)
- CollectionUtil: Added removeElement.
- Added MutableDouble class, which is like Double but with a mutable value. It also implements Ring and Vectorizable (a usage sketch follows this list).
- UnivariateStatisticsUtil: Improved stability of computeMean with large values.
- VectorReader: Changed to use a Collection of tokens instead of requiring an ArrayList.
- AbstractSingularValueDecomposition: Fixed a bug for certain types of rectangular matrices.
- DiagonalMatrixMTJ: Made inverse faster.
- ArgumentChecker: Added assertIsInRangeExclusive and fixed a formatting issue with some exception messages.
- Added Identified interface.
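For the MutableDouble addition above, a minimal usage sketch is shown below. The class name comes from the change list; the package path, the constructor taking a double, the public value field, and the in-place plusEquals operation from the Ring interface are assumptions about the API rather than verified signatures.

    import gov.sandia.cognition.math.MutableDouble;

    public class MutableDoubleExample
    {
        public static void main(String[] args)
        {
            // Accumulate into one object instead of re-boxing a new Double on
            // every iteration (the motivation for a mutable boxed double).
            MutableDouble sum = new MutableDouble(0.0);    // assumed constructor
            for (double x : new double[] { 1.0, 2.5, 3.5 })
            {
                sum.plusEquals(new MutableDouble(x));      // assumed in-place Ring operation
            }
            System.out.println(sum.value);                 // assumed public value field; prints 7.0

            // Vectorizable: view the value as a Vector for code that expects one.
            System.out.println(sum.convertToVector());     // assumed Vectorizable method
        }
    }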
- Learning Core:
- Added BatchAndIncrementalLearner interface, which is for algorithms that can be used in both batch and incremental modes. For consistency with existing batch (non-incremental) algorithms, its BatchLearner side takes a Collection of data; however, it also includes a learn method that takes an Iterable. AbstractBatchAndIncrementalLearner now implements it.
- Added the SupervisedIncrementalLearner and SupervisedBatchAndIncrementalLearner interfaces (the supervised counterparts) and the abstract class AbstractSupervisedBatchAndIncrementalLearner. Many incremental learning algorithms now implement these interfaces and extend this abstract class, including OnlineBaggingCategorizerLearner, AbstractOnlineLinearBinaryCategorizerLearner, OnlinePassiveAggressivePerceptron, OnlinePerceptron, OnlineVotedPerceptron, and Winnow (a usage sketch follows this list).
- VectorNaiveBayesCategorizer: Now has a generic parameter for the distribution type used to represent each feature.
- BaggingCategorizerLearner: Added a protected fillBag method, which can be overridden to implement a different sampling approach.
- Added BatchMultiPerceptron, which is an implementation of a multi-class Perceptron that keeps one Perceptron per class.
- Added MultiCategoryAdaBoost, which is an implementation of the AdaBoost.M1 algorithm.
- DecisionTree: Added findTerminalNode methods.
- CrossFoldCreator: Added constructor that takes just a number of folds.
- KernelBinaryCategorizer is now an interface with a default implementation in DefaultKernelBinaryCategorizer. KernelPerceptron, KernelAdatron, SequentialMinimalOptimization, and SuccessiveOverrelaxation now all use the new interface or default class.
- Added LinearMultiCategorizer class, which keeps a LinearBinaryCategorizer for each class and assigns an input to the class whose categorizer gives the highest score (a self-contained sketch of this per-class scheme also follows this list).
- Removed GeneralizedScalarRadialBasisKernel class.
- GaussianContextRecognizer: Now requires MixtureOfGaussians.PDF.
- DefaultConfusionMatrix: Added copy constructor.
- DiscreteDistribution: Added getDomainSize and implemented in subclasses.
- DiscreteSamplingUtil: Added sampleWithReplacementInto method.
- ProbabilityMassFunctionUtil: Added sampleMultiple and sampleSingle.
- ScalarProbabilityDensityFunction: Added logEvaluate method and implemented in subclasses.
- BayesianLinearRegression: Added incremental learner.
- BayesianRobustLinearRegression: Added incremental learner.
- Refactored LinearMixtureModel, MixtureOfGaussians, and ScalarMixtureDensityModel to be more consistent with other Foundry statistics classes.
- MultivariateGaussian: Added incremental estimator.
- Added MultivariateMixtureDensityModel class.
- UnivariateGaussian: Added incremental estimator.
- Added FriedmanConfidence, NemenyiConfidence, and TukeyRangeConfidence, which implement statistical tests for comparing multiple treatments.
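To illustrate the batch-and-incremental learner pattern described at the top of this section, here is a rough usage sketch built around OnlinePerceptron. The class names are taken from the change list, but the package paths, the createInitialLearnedObject and update method names, and the DefaultInputOutputPair/VectorFactory helpers are assumptions about the Foundry API and may not match the actual signatures.

    import java.util.Arrays;
    import java.util.List;

    import gov.sandia.cognition.learning.algorithm.perceptron.OnlinePerceptron;
    import gov.sandia.cognition.learning.data.DefaultInputOutputPair;
    import gov.sandia.cognition.learning.data.InputOutputPair;
    import gov.sandia.cognition.learning.function.categorization.LinearBinaryCategorizer;
    import gov.sandia.cognition.math.matrix.Vector;
    import gov.sandia.cognition.math.matrix.VectorFactory;

    public class BatchAndIncrementalExample
    {
        public static void main(String[] args)
        {
            VectorFactory<?> vf = VectorFactory.getDefault();
            List<InputOutputPair<Vector, Boolean>> data = Arrays.asList(
                DefaultInputOutputPair.create(vf.copyValues(1.0, 1.0), true),
                DefaultInputOutputPair.create(vf.copyValues(-1.0, -1.0), false));

            OnlinePerceptron learner = new OnlinePerceptron();

            // Batch mode: the BatchLearner side takes a whole Collection at once.
            LinearBinaryCategorizer batch = learner.learn(data);

            // Incremental mode: start from an initial learned object and feed
            // one labeled example at a time.
            LinearBinaryCategorizer online = learner.createInitialLearnedObject();
            for (InputOutputPair<Vector, Boolean> example : data)
            {
                learner.update(online, example);
            }

            System.out.println(batch.evaluate(vf.copyValues(1.0, 1.0)));
            System.out.println(online.evaluate(vf.copyValues(1.0, 1.0)));
        }
    }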
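The BatchMultiPerceptron and LinearMultiCategorizer entries above both keep one linear categorizer per class and pick the class with the highest score. The following self-contained sketch in plain Java (not the Foundry's actual classes) shows that scheme:

    import java.util.HashMap;
    import java.util.Map;

    // Self-contained illustration of the one-linear-categorizer-per-class idea.
    public class PerClassLinearSketch
    {
        // One weight vector and bias per category.
        private final Map<String, double[]> weights = new HashMap<String, double[]>();
        private final Map<String, Double> biases = new HashMap<String, Double>();

        public void setCategory(String category, double[] w, double bias)
        {
            this.weights.put(category, w);
            this.biases.put(category, bias);
        }

        // Assign the input to the category whose linear score w . x + b is highest.
        public String evaluate(double[] x)
        {
            String best = null;
            double bestScore = Double.NEGATIVE_INFINITY;
            for (Map.Entry<String, double[]> entry : this.weights.entrySet())
            {
                double[] w = entry.getValue();
                double score = this.biases.get(entry.getKey());
                for (int i = 0; i < w.length; i++)
                {
                    score += w[i] * x[i];
                }
                if (score > bestScore)
                {
                    bestScore = score;
                    best = entry.getKey();
                }
            }
            return best;
        }
    }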
- Text Core:
- LatentSemanticAnalysis: Added better handling of low-rank matrices.
- Supported Operating Systems: Agnostic, Platform Independent
- Data Formats: Matlab, CSV, XML, XStream
- Tags: Classification, Clustering, AdaBoost, Decision Tree Learning, Algorithms, Gaussian Mixture Models, Bagging, Ensemble Methods, Gaussian Processes, Affinity Propagation, BFGS, Generics, Genetic Algorithm