Projects that are tagged with inference.


Aika 0.8

by molzberger - September 19, 2017, 18:10:43 CET [ Project Homepage BibTeX Download ] 1398 views, 436 downloads, 3 subscriptions

About: Aika is an open source text mining engine. It can automatically extract and annotate semantic information in text. If this information is ambiguous, Aika generates several hypothetical interpretations of the text's meaning and retrieves the most likely one.
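To make the idea of "retrieving the most likely interpretation" concrete, here is a minimal C++ sketch of ranking competing readings of an ambiguous span by an accumulated weight and keeping the strongest one. The Interpretation struct, the example weights, and the resolve function are illustrative assumptions, not part of Aika's actual API.

```cpp
// Conceptual sketch only: ranks competing interpretations of an ambiguous
// text span by accumulated weight and keeps the strongest one.
// The types and weights below are illustrative assumptions, not Aika's API.
#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

struct Interpretation {
    std::string label;   // e.g. "city" or "person name"
    double weight;       // support accumulated from contextual cues
};

// Return the highest-weighted interpretation of an ambiguous token.
const Interpretation& resolve(const std::vector<Interpretation>& candidates) {
    return *std::max_element(
        candidates.begin(), candidates.end(),
        [](const Interpretation& a, const Interpretation& b) {
            return a.weight < b.weight;
        });
}

int main() {
    // "Jackson" could denote a person or a city; context cues have
    // contributed different amounts of support to each reading.
    std::vector<Interpretation> candidates = {
        {"person name", 0.7},
        {"city",        1.3},   // e.g. preceded by "lives in"
    };
    std::cout << "chosen: " << resolve(candidates).label << "\n";
    return 0;
}
```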

Changes:

Aika Version 0.8 (2017-09-17)
- Optimization of the interpretation search using an upper bound on the interpretation weights (a generic branch-and-bound sketch follows this changelog).
- Support for very large models with millions of neurons by suspending rarely used neurons to disk.

Aika Version 0.7 (2017-08-06)
- Refactoring of the range model: the range begin and the range end can now be treated independently of each other. Synapses now have three properties: range match, range output, and range mapping.
- The Iteration class has been merged into the Document class.
- Performance optimizations for the interpretation search in the SearchNode class.
- Test case fixes.
- Class renaming: Option -> InterprNode, ExpandNode -> SearchNode.
- Extensive Javadoc additions.

Aika Version 0.6 (2017-07-01) - Mainly optimizations
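The 0.8 changelog item about pruning the interpretation search with an upper bound can be illustrated generically. The sketch below is not Aika's SearchNode code; it assumes a set of weighted candidate annotations with simple mutual-exclusion conflicts, and shows how an optimistic upper bound lets a branch-and-bound search discard branches that cannot beat the best selection found so far.

```cpp
// Generic branch-and-bound sketch (not Aika's SearchNode implementation):
// search over include/exclude decisions for weighted candidates, pruning
// branches whose optimistic upper bound cannot beat the current best.
#include <cstddef>
#include <iostream>
#include <vector>

struct Candidate {
    double weight;              // support for including this candidate
    std::vector<int> conflicts; // indices of mutually exclusive candidates
};

struct Search {
    const std::vector<Candidate>& cands;
    double best = 0.0;          // empty selection is a valid baseline

    explicit Search(const std::vector<Candidate>& c) : cands(c) {}

    // Optimistic bound: assume every remaining positive-weight candidate
    // could still be included without violating any conflict.
    double upperBound(std::size_t next, double current) const {
        double bound = current;
        for (std::size_t i = next; i < cands.size(); ++i)
            if (cands[i].weight > 0) bound += cands[i].weight;
        return bound;
    }

    void expand(std::size_t next, double current, std::vector<bool>& chosen) {
        if (upperBound(next, current) <= best) return;          // prune
        if (next == cands.size()) { best = current; return; }   // leaf

        // Branch 1: include the candidate if no chosen candidate conflicts.
        bool ok = true;
        for (int c : cands[next].conflicts)
            if (chosen[c]) ok = false;
        if (ok) {
            chosen[next] = true;
            expand(next + 1, current + cands[next].weight, chosen);
            chosen[next] = false;
        }
        // Branch 2: exclude the candidate.
        expand(next + 1, current, chosen);
    }
};

int main() {
    // Candidates 0 and 1 are mutually exclusive readings of the same span.
    std::vector<Candidate> cands = {
        {1.0, {1}}, {1.4, {0}}, {0.6, {}}, {-0.3, {}},
    };
    Search s(cands);
    std::vector<bool> chosen(cands.size(), false);
    s.expand(0, 0.0, chosen);
    std::cout << "best total weight: " << s.best << "\n";   // 1.4 + 0.6 = 2.0
    return 0;
}
```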


OpenGM 2 2.0.2 beta

by opengm - June 1, 2012, 14:33:53 CET [ Project Homepage BibTeX Download ] 4886 views, 1135 downloads, 1 subscription

About: A C++ Library for Discrete Graphical Models

Changes:

Initial Announcement on mloss.org.


Gibbs RTSS 1.0

by marc - April 4, 2011, 19:58:43 CET [ BibTeX BibTeX for corresponding Paper Download ] 4626 views, 1259 downloads, 1 subscription

About: The software provides an implementation of a filter/smoother based on Gibbs sampling, which can be used for inference in dynamical systems.
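The following is not the Gibbs RTSS code itself, but a minimal C++ sketch, assuming a scalar linear-Gaussian state-space model, of how Gibbs sampling can be used for smoothing in a dynamical system: each state is repeatedly resampled from its Gaussian full conditional given its neighbours and its own observation, and averaging the sweeps approximates the smoothing mean.

```cpp
// Minimal sketch (assumed scalar linear-Gaussian model, not the Gibbs RTSS
// implementation): Gibbs sampling for smoothing in a dynamical system.
// Model: x_t = a*x_{t-1} + w_t, w_t ~ N(0,q);  y_t = c*x_t + v_t, v_t ~ N(0,r).
#include <cmath>
#include <iostream>
#include <random>
#include <vector>

int main() {
    const double a = 0.9, c = 1.0, q = 0.1, r = 0.5;
    std::mt19937 rng(0);
    std::normal_distribution<double> stdn(0.0, 1.0);

    // Simulate a short trajectory and noisy observations (x[0] is known).
    const int T = 50;
    std::vector<double> x(T + 1, 0.0), y(T + 1, 0.0);
    for (int t = 1; t <= T; ++t) {
        x[t] = a * x[t - 1] + std::sqrt(q) * stdn(rng);
        y[t] = c * x[t] + std::sqrt(r) * stdn(rng);
    }

    // Gibbs sweeps: draw each state from its full conditional
    // p(x_t | x_{t-1}, x_{t+1}, y_t), which is Gaussian in this model.
    std::vector<double> s(T + 1, 0.0), mean(T + 1, 0.0);
    const int burnIn = 200, sweeps = 2000;
    for (int it = 0; it < burnIn + sweeps; ++it) {
        for (int t = 1; t <= T; ++t) {
            double prec = 1.0 / q + c * c / r;
            double num  = a * s[t - 1] / q + c * y[t] / r;
            if (t < T) {                  // interior states also see x_{t+1}
                prec += a * a / q;
                num  += a * s[t + 1] / q;
            }
            const double var = 1.0 / prec;
            s[t] = num * var + std::sqrt(var) * stdn(rng);
        }
        if (it >= burnIn)
            for (int t = 1; t <= T; ++t) mean[t] += s[t] / sweeps;
    }

    std::cout << "true x_T = " << x[T]
              << ", smoothed estimate = " << mean[T] << "\n";
    return 0;
}
```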

Changes:

Initial Announcement on mloss.org.


About: OpenGM is a free C++ template library, a command line tool and a set of MATLAB functions for optimization in higher order graphical models. Graphical models of any order and structure can be built either in C++ or in MATLAB, using simple and intuitive commands. These models can be stored in HDF5 files and subjected to state-of-the-art optimization algorithms via the OpenGM command line optimizer. All library functions can also be called directly from C++ code. OpenGM realizes the Inference Algorithm Interface (IAI), a concept that makes it easy for programmers to use their own algorithms and factor classes with OpenGM.
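To make the "higher order graphical models" that such optimizers operate on concrete, here is a conceptual C++ sketch, not the OpenGM API: a discrete graphical model given as factors over labelled variables, where the energy of a labelling is the sum of its factor values. A tiny brute-force search stands in for the state-of-the-art optimization algorithms a library like OpenGM provides.

```cpp
// Conceptual sketch (not the OpenGM API): a discrete graphical model as a
// collection of factors over labelled variables. The energy of a labelling
// is the sum of its factor values; brute force stands in for real optimizers.
#include <iostream>
#include <limits>
#include <vector>

struct Factor {
    std::vector<int> vars;        // variable indices this factor depends on
    std::vector<double> values;   // value table, indexed by the joint label
};

// Look up a factor's value for a given full labelling.
double factorValue(const Factor& f, const std::vector<int>& labels,
                   const std::vector<int>& numLabels) {
    int idx = 0;
    for (int v : f.vars) idx = idx * numLabels[v] + labels[v];
    return f.values[idx];
}

double energy(const std::vector<Factor>& factors, const std::vector<int>& labels,
              const std::vector<int>& numLabels) {
    double e = 0.0;
    for (const Factor& f : factors) e += factorValue(f, labels, numLabels);
    return e;
}

int main() {
    // Two binary variables, a unary factor on each, and a pairwise factor
    // that penalizes disagreement (a tiny second-order model).
    std::vector<int> numLabels = {2, 2};
    std::vector<Factor> factors = {
        {{0}, {0.0, 1.5}},                 // unary on variable 0
        {{1}, {1.0, 0.2}},                 // unary on variable 1
        {{0, 1}, {0.0, 1.0, 1.0, 0.0}},    // pairwise: prefer equal labels
    };

    // Brute-force MAP: enumerate all labellings and keep the minimum energy.
    std::vector<int> best, labels = {0, 0};
    double bestE = std::numeric_limits<double>::infinity();
    for (labels[0] = 0; labels[0] < numLabels[0]; ++labels[0])
        for (labels[1] = 0; labels[1] < numLabels[1]; ++labels[1]) {
            double e = energy(factors, labels, numLabels);
            if (e < bestE) { bestE = e; best = labels; }
        }
    std::cout << "optimal labelling: (" << best[0] << ", " << best[1]
              << "), energy = " << bestE << "\n";
    return 0;
}
```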

Changes:

Initial Announcement on mloss.org.