- Description:
DiffSharp is an automatic differentiation (AD) library implemented in the F# language.
AD allows exact and efficient calculation of derivatives by systematically applying the chain rule of calculus at the elementary operator level. AD differs from numerical differentiation, which is prone to truncation and round-off errors, and from symbolic differentiation, which is exact but inefficient for run-time calculations and can only handle closed-form mathematical expressions.
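To make the contrast with numerical differentiation concrete, here is a self-contained F# sketch (independent of DiffSharp; the function f and the step size h are illustrative choices) comparing a central-difference approximation, whose accuracy is limited by truncation and round-off error, against the exact derivative:

    // Central-difference approximation of f'(x); its accuracy depends on
    // the step size h and suffers from truncation and round-off error.
    let numericalDiff (f: float -> float) (h: float) (x: float) =
        (f (x + h) - f (x - h)) / (2.0 * h)

    let f x = sin (exp x)
    let f' x = cos (exp x) * exp x   // exact derivative, for reference

    printfn "central difference: %.15f" (numericalDiff f 1e-5 1.5)
    printfn "exact:              %.15f" (f' 1.5)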
Using the DiffSharp library, derivative calculations (gradients, Hessians, Jacobians, directional derivatives, and matrix-free Hessian- and Jacobian-vector products) can be incorporated with minimal change into existing algorithms. Operations can be nested to any level, meaning that you can compute exact higher-order derivatives and differentiate functions that internally make use of differentiation. Please see the API Overview page for a list of available operations.
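As a minimal sketch of how such calls can look (the module name DiffSharp.AD.Float64, the types D and DV, and the operations diff, grad, and hessian below are assumptions based on the library's published API Overview and may differ between versions):

    open DiffSharp.AD.Float64

    // Scalar-to-scalar: diff returns the exact first derivative.
    let f (x: D) = sin (exp x)
    let df = diff f (D 1.5)

    // Vector-to-scalar: gradient and Hessian evaluated at a point.
    let g (v: DV) = v.[0] * v.[1] + sin v.[0]
    let gradG = grad g (toDV [1.2; 2.3])
    let hessG = hessian g (toDV [1.2; 2.3])

    // Nesting: differentiate a function whose body itself calls diff;
    // the inner derivative is taken with respect to y, with x captured.
    let h (x: D) = x * diff (fun y -> x * sin y) x
    let dh = diff h (D 0.7)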
The library is under active development by Atılım Güneş Baydin and Barak A. Pearlmutter mainly for research applications in machine learning, as part of their work at the Brain and Computation Lab, Hamilton Institute, National University of Ireland Maynooth.
- Changes to previous version:
Initial Announcement on mloss.org.
- Supported Operating Systems: Linux, Mac OS X, Windows
- Data Formats: Agnostic
- Tags: Optimization, Automatic Differentiation, Symbolic Differentiation, Backpropagation, Numerical Differentiation