Project details for DiffSharp

DiffSharp 0.7.5

by gbaydin - January 4, 2016, 00:54:33 CET


Description:

DiffSharp: Differentiable Functional Programming

DiffSharp is a functional automatic differentiation (AD) library.

AD allows exact and efficient calculation of derivatives by systematically invoking the chain rule of calculus at the elementary operator level during program execution. AD differs from numerical differentiation, which is prone to truncation and round-off errors, and from symbolic differentiation, which suffers from expression swell and cannot fully handle algorithmic control flow.
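
To make the principle concrete, here is a minimal, self-contained sketch of forward-mode AD using dual numbers. This is for illustration only and is not DiffSharp's implementation: each elementary operation propagates a value together with a derivative, applying the chain rule at every step.

    // Minimal forward-mode AD sketch (illustrative; not DiffSharp's internals).
    // A dual number carries a value and a derivative; each elementary
    // operation applies the chain rule to both components.
    type Dual =
        { Value : float; Deriv : float }
        static member ( + ) (a : Dual, b : Dual) =
            { Value = a.Value + b.Value; Deriv = a.Deriv + b.Deriv }
        static member ( * ) (a : Dual, b : Dual) =
            { Value = a.Value * b.Value
              Deriv = a.Deriv * b.Value + a.Value * b.Deriv }

    // Chain rule for an elementary function: (sin u)' = u' * cos u
    let sinD (a : Dual) = { Value = sin a.Value; Deriv = a.Deriv * cos a.Value }

    // Differentiate f at x by seeding the derivative component with 1.0
    let diffAt (f : Dual -> Dual) (x : float) =
        (f { Value = x; Deriv = 1.0 }).Deriv

    // d/dx sin(x * x) at x = 2 is 2x cos(x^2) = 4 cos 4, exact to machine
    // precision, with no truncation error
    let d = diffAt (fun x -> sinD (x * x)) 2.0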

Using the DiffSharp library, differentiation (gradients, Hessians, Jacobians, directional derivatives, and matrix-free Hessian- and Jacobian-vector products) is performed through higher-order functions, that is, functions that take other functions as arguments. Your functions can use the full expressive capability of the language, including control flow. DiffSharp allows differentiation operations to be composed, nesting forward and reverse AD to any level, which means you can compute exact higher-order derivatives or differentiate functions that internally make use of differentiation. Please see the API Overview page for the full list of available operations.
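
A brief usage sketch follows the DiffSharp 0.7 API overview; the namespace and functions used below (DiffSharp.AD.Float64, toDV, grad, diff, and the D and DV types) are taken from that documentation and may differ in other versions.

    open DiffSharp.AD.Float64

    // Gradient of a vector-to-scalar function at a point
    let f (x : DV) = sin (x.[0] * x.[1])
    let g = grad f (toDV [1.2; 2.3])

    // Nested differentiation: an exact second derivative, obtained by
    // differentiating the derivative function itself
    let f2 (x : D) = sin (x * x)
    let d2 = diff (diff f2) (D 0.5)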

The library is developed by Atılım Güneş Baydin and Barak A. Pearlmutter mainly for research applications in machine learning, as part of their work at the Brain and Computation Lab, Hamilton Institute, National University of Ireland Maynooth.

DiffSharp is implemented in F# and can be used from C# and other languages running on Mono, .NET Core, or the .NET Framework, targeting the 64-bit platform. It is tested on Linux and Windows. We are working on interfaces/ports to other languages.

Changes to previous version:

Improved: Performance, thanks to faster Array2D.copy operations (thank you Don Syme @dsyme)

Improved: Significantly faster matrix transposition using extended BLAS operations cblas_?omatcopy provided by OpenBLAS

Improved: Performance, by disabling the parts of the OpenBLAS backend that used System.Threading.Tasks and were interfering with OpenBLAS's own multithreading. Pending further tests.

Updated: The Win64 binaries of OpenBLAS to version 0.2.15 (27 October 2015), which includes bug fixes and optimizations.

Fixed: Bugs in the reverse AD operations Sub_D_DV and Sub_D_DM (fixes #8, thank you @mrakgr)

Fixed: A bug in the benchmarking module that caused incorrect reporting of the overhead factor of the AD grad operation

Improved: Documentation

BibTeX Entry: Download
Corresponding Paper BibTeX Entry: Download
Supported Operating Systems: Linux, Windows
Data Formats: Agnostic
Tags: Optimization, Automatic Differentiation, Symbolic Differentiation, Backpropagation, Numerical Differentiation
Archive: download here
