4 projects found that use Cython as the programming language.


Logo RLScore 0.7

by aatapa - September 20, 2016, 09:51:25 CET [ Project Homepage BibTeX Download ] 6821 views, 2070 downloads, 0 subscriptions

About: RLScore is a package of regularized least-squares based machine learning algorithms.
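
The core technique behind the package is regularized least-squares (ridge regression). As an illustration of that technique only, and not of RLScore's own API, a minimal NumPy sketch of the closed-form solution could look like this:

    import numpy as np

    def rls_fit(X, y, regparam=1.0):
        # Closed-form regularized least-squares: w = (X^T X + lambda * I)^{-1} X^T y
        n_features = X.shape[1]
        A = X.T @ X + regparam * np.eye(n_features)
        return np.linalg.solve(A, X.T @ y)

    def rls_predict(X, w):
        # Predictions are projections onto the learned weight vector.
        return X @ w

    # Toy usage on synthetic data (not RLScore's interface).
    rng = np.random.RandomState(0)
    X = rng.randn(100, 5)
    y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1 * rng.randn(100)
    w = rls_fit(X, y, regparam=1.0)
    print(rls_predict(X[:3], w))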

Changes:

Initial Announcement on mloss.org.


Logo RLPy 1.3a

by bobklein2 - August 28, 2014, 14:34:35 CET [ Project Homepage BibTeX Download ] 15675 views, 3605 downloads, 0 subscriptions

About: RLPy is a framework for performing reinforcement learning (RL) experiments in Python. RLPy provides a large library of agent and domain components, and a suite of tools to aid in experiments (parallelization, hyperparameter optimization, code profiling, and plotting).
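
As a rough illustration of how an experiment is assembled from agent and domain components, here is a sketch patterned on the project's GridWorld tutorial; the module layout and constructor arguments are assumptions for the 1.3 release and may differ between versions:

    # Sketch of a small RLPy experiment, modeled on the GridWorld tutorial.
    # Class names and keyword arguments are assumptions; check the RLPy docs.
    from rlpy.Domains import GridWorld
    from rlpy.Agents import Q_Learning
    from rlpy.Representations import Tabular
    from rlpy.Policies import eGreedy
    from rlpy.Experiments import Experiment

    domain = GridWorld()                           # default grid-world map
    representation = Tabular(domain)               # one feature per discrete state
    policy = eGreedy(representation, epsilon=0.2)  # epsilon-greedy exploration
    agent = Q_Learning(policy=policy, representation=representation,
                       discount_factor=domain.discount_factor,
                       initial_learn_rate=0.1)
    experiment = Experiment(agent=agent, domain=domain, exp_id=1,
                            max_steps=10000, num_policy_checks=10,
                            checks_per_policy=10, path="./Results/GridWorld")
    experiment.run()   # random seeding is derived from exp_id
    experiment.save()  # write results under the experiment path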

Changes:
  • Fixed a bug where results using the same random seed differed with visualization turned on or off
  • Created an RLPy package on PyPI (available at https://pypi.python.org/pypi/rlpy)
  • Switched from a custom logger class to Python's default logging
  • Added unit tests
  • Code readability improvements (formatting, variable names/ordering)
  • Restructured TD Learning hierarchy
  • Updated tutorials

Logo OptWok 0.3.1

by ong - May 2, 2013, 10:46:11 CET [ Project Homepage BibTeX Download ] 27876 views, 5750 downloads, 0 subscriptions

About: A collection of Python code for research in optimization. The aim is to provide reusable components that can be quickly applied to machine learning problems. Used in: ellipsoidal multiple instance learning; difference of convex functions algorithms for sparse classification; a contextual bandits upper confidence bound algorithm (using Gaussian processes); and learning output kernels, that is, kernels between the labels of a classifier.
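
Since the package's own interfaces are not documented here, the following is a generic NumPy sketch of the upper confidence bound idea mentioned in the description (a Gaussian-process posterior scored as mean plus an exploration bonus); it illustrates the technique only and assumes nothing about OptWok's API:

    import numpy as np

    def rbf_kernel(A, B, lengthscale=1.0):
        # Squared-exponential kernel between rows of A and B.
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / lengthscale ** 2)

    def gp_ucb_scores(X_obs, y_obs, X_cand, beta=2.0, noise=1e-2):
        # GP posterior mean + beta * std for each candidate arm's features.
        K = rbf_kernel(X_obs, X_obs) + noise * np.eye(len(X_obs))
        Ks = rbf_kernel(X_cand, X_obs)
        mean = Ks @ np.linalg.solve(K, y_obs)
        var = 1.0 - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
        return mean + beta * np.sqrt(np.maximum(var, 0.0))

    # Toy usage: pick the arm with the highest optimistic score.
    rng = np.random.RandomState(0)
    X_obs = rng.rand(20, 3)             # past context/arm features
    y_obs = np.sin(X_obs.sum(axis=1))   # observed rewards
    X_cand = rng.rand(5, 3)             # candidate arms for the next round
    print("chosen arm:", int(np.argmax(gp_ucb_scores(X_obs, y_obs, X_cand))))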

Changes:
  • minor bugfix

Logo HDDM 0.5

by Wiecki - April 24, 2013, 02:53:07 CET [ Project Homepage BibTeX Download ] 18741 views, 4457 downloads, 0 subscriptions

About: HDDM is a Python toolbox for hierarchical Bayesian parameter estimation of the Drift Diffusion Model (via PyMC). Drift Diffusion Models are widely used in psychology and cognitive neuroscience to study decision making.

Changes:
  • New and improved HDDM model with the following changes:
    • Priors: by default the model uses informative priors (see http://ski.clps.brown.edu/hddm_docs/methods.html#hierarchical-drift-diffusion-models-used-in-hddm). For uninformative priors, set informative=False (a brief usage sketch follows this list).
    • Sampling: this model uses slice sampling, which converges faster even though each individual sample is slower to generate. In our experiments, a burn-in of 20 is often sufficient.
    • Inter-trial variability parameters are only estimated at the group level, not for individual subjects.
    • The old model has been renamed to HDDMTransformed.
    • HDDMRegression and HDDMStimCoding also use this model.
  • HDDMRegression takes patsy model specification strings. See http://ski.clps.brown.edu/hddm_docs/howto.html#estimate-a-regression-model and http://ski.clps.brown.edu/hddm_docs/tutorial_regression_stimcoding.html#chap-tutorial-hddm-regression
  • Improved online documentation at http://ski.clps.brown.edu/hddm_docs
  • A new HDDM demo at http://ski.clps.brown.edu/hddm_docs/demo.html
  • Ratcliff's quantile optimization method for single subjects and groups using the .optimize() method
  • Maximum likelihood optimization.
  • Many bugfixes and better test coverage.
  • The hddm_fit.py command line utility is deprecated.
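
For orientation, here is a minimal usage sketch combining the informative-prior switch, the short burn-in, and the quantile/ML optimization mentioned above. It follows the patterns in the online HDDM documentation, but the exact call signatures are assumptions and should be checked against the 0.5 docs at http://ski.clps.brown.edu/hddm_docs:

    # HDDM 0.5-style sketch; names and arguments follow the online docs but are
    # assumptions here, so verify against the documentation before use.
    import hddm

    data = hddm.load_csv('mydata.csv')   # expects columns such as rt, response, subj_idx

    # Hierarchical Bayesian fit; informative=False switches to uninformative priors.
    m = hddm.HDDM(data, informative=False)
    m.sample(2000, burn=20)               # slice sampling; a short burn-in often suffices
    m.print_stats()

    # Alternatively, quantile-based or maximum likelihood optimization via .optimize().
    m_quick = hddm.HDDM(data)
    m_quick.optimize('chisquare')         # quantile method; 'ML' for maximum likelihood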