Automatic Differentiation in R through Julia
Automatic differentiation (AD) is a set of techniques for calculating derivatives automatically.
It generally outperforms non-AD methods such as symbolic differentiation and numerical approximation
in speed and/or accuracy.
It has important applications in many fields, such as optimization, machine learning,
Bayesian statistics, and differential equations.
Julia is a high-level, high-performance dynamic
programming language for numerical computing.
While R lacks a mature automatic differentiation package,
Julia has mature automatic differentiation packages,
such as ForwardDiff.jl
for forward-mode AD and
ReverseDiff.jl
for reverse-mode AD.
The aim of this project is to develop an R wrapper for the Julia AD packages
ForwardDiff.jl and ReverseDiff.jl, using R packages that embed Julia
in R, such as JuliaCall.
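As a rough illustration of the mechanism this wrapper would build on, here is a minimal sketch (assuming JuliaCall and the Julia package ForwardDiff are installed) of driving ForwardDiff from R:

```r
library(JuliaCall)
julia_setup()                 # locate Julia and start an embedded session
julia_library("ForwardDiff")  # load the Julia AD package

## Define a Julia function and take its gradient in forward mode.
julia_command("f(x) = sum(sin, x) + sum(sqrt, x)")
julia_eval("ForwardDiff.gradient(f, [1.0, 1.0, 1.0])")
```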
Related work in R includes:
- R package Deriv for symbolic differentiation.
- R package numDeriv for numerical differentiation. It provides numerical gradients, Jacobians, and Hessians, computed by finite differences, Richardson extrapolation, and other numerical approximation methods.
- R package radx for automatic differentiation, but it imposes restrictions on the functions that can be handled, and its functionality is not complete.
- R package nlsr, which solves nonlinear least squares problems and has some symbolic derivative features.
- From Perry de Valpine of Berkeley, we have learned that there is development in the Nimble project (https://r-nimble.org/) to incorporate AD via C++. A similar capability was in the AD Model Builder software, as described by Ben Bolker in https://cran.r-project.org/web/packages/R2admb/vignettes/R2admb.pdf.
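For example, the existing numerical and symbolic tools can already be used as follows (a small illustrative sketch; the function is arbitrary):

```r
f <- function(x) sum(sin(x)) + sum(sqrt(x))

## Numerical gradient by Richardson extrapolation (numDeriv's default method).
numDeriv::grad(f, rep(1, 5))

## Symbolic differentiation of a scalar version of the same expression.
g <- Deriv::Deriv(function(x) sin(x) + sqrt(x))
g(1)
```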
TODO
Since one of the most important goals of this project is to ensure the correctness of
automatic differentiation, a comprehensive test suite covering many kinds of functions
will be written for the project,
including native R functions, Julia functions
through JuliaCall, and Rcpp functions.
In the tests, the automatic differentiation results for (essentially) the same function
implemented in Julia, R, and Rcpp
will be checked against each other,
and will also be checked for correctness against the numerical
and symbolic differentiation results from packages like numDeriv and Deriv.
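As a hypothetical illustration, such a correctness test might look like the sketch below; ad_grad() is an assumed name for the wrapper's gradient function, which does not exist yet:

```r
library(testthat)

f <- function(x) sum(sin(x)) + sum(sqrt(x))
x <- rep(1, 5)

test_that("AD gradient agrees with the numerical gradient", {
  ## ad_grad() is hypothetical: forward-mode AD via ForwardDiff.jl.
  expect_equal(ad_grad(f, x), numDeriv::grad(f, x), tolerance = 1e-6)
})
```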
There should also be benchmark tests, which check the performance of the code
before/after a given commit or pull request,
and compare its performance with that of numerical
and symbolic differentiation by packages like numDeriv and Deriv.
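A benchmark along these lines could be sketched as follows, again using the hypothetical ad_grad() from above:

```r
library(microbenchmark)

f <- function(x) sum(sin(x)) + sum(sqrt(x))
x <- rep(1, 1000)

microbenchmark(
  ad        = ad_grad(f, x),         # hypothetical AD wrapper
  numerical = numDeriv::grad(f, x),  # finite-difference baseline
  times = 10
)
```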
Every important commit or pull request should pass the tests, so CI systems will be configured: Travis CI for macOS and Linux, and AppVeyor for Windows.
The project is expected to lead to an R wrapper for Julia's AD packages
ForwardDiff.jl and ReverseDiff.jl. It should be able to do both forward-mode and
reverse-mode AD for the following (a usage sketch follows the list):
- native R functions with little or no modification,
- Julia functions in R through the JuliaCall package,
- "typical" Rcpp functions with little modification,
- combinations of the three kinds of functions.
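Here is a minimal sketch of what the envisioned interface might look like; the function name ad_grad and its mode argument are illustrative assumptions, not a settled API:

```r
f <- function(x) sum(sin(x)) + sum(sqrt(x))

## Hypothetical wrapper calls; ForwardDiff.jl and ReverseDiff.jl would
## perform the actual differentiation under the hood.
ad_grad(f, rep(1, 5), mode = "forward")
ad_grad(f, rep(1, 5), mode = "reverse")
```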
Mentors: Erin Hodgess and John Nash.
John Nash is a retired Professor of Management in the Telfer School of Management, University of Ottawa. He has worked for approximately half a century on numerical methods for linear algebra and function minimization, in particular with R; see optim() and packages such as optimx, Rvmmin, Rtnmin, Rcgmin, lbfgsb3 and nlsr. He has also been active in trying to simplify and unify access to the tools R makes available, that is, the navigation of the package space.
Students, please do one or more of the following tests before contacting the mentors above.
- Use the Julia package ForwardDiff to get gradient and Hessian functions for a simple Julia function f(x::Vector) = sum(sin, x) + sum(sqrt, x);, and evaluate the gradient and Hessian functions on the vector [1.0, 1.0, 1.0, 1.0, 1.0].
- Use the R package JuliaCall to get the gradient and Hessian functions for an R function f <- function(x) {sum(sin(x)) + sum(sqrt(x))} through ForwardDiff, and evaluate the gradient and Hessian functions on the vector rep(1, 5). This should be quite straightforward, given that JuliaCall already overloads R functions like sum, sin and sqrt.
- This project will also involve programming in Rcpp. You should be able to deal with JuliaObject from JuliaCall, which is an R6 object, in Rcpp. For this test, write an Rcpp function that returns the type string of a JuliaObject.
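As a hedged starting point for the first two tests (not a complete solution), the JuliaCall calls involved might look roughly like this; whether ForwardDiff can consume the R function directly depends on JuliaCall's conversion of R functions, which the second test hints at:

```r
library(JuliaCall)
julia_setup()
julia_library("ForwardDiff")

## Test 1: differentiate a pure Julia function inside Julia.
julia_command("f(x::Vector) = sum(sin, x) + sum(sqrt, x);")
julia_eval("ForwardDiff.gradient(f, [1.0, 1.0, 1.0, 1.0, 1.0])")
julia_eval("ForwardDiff.hessian(f, [1.0, 1.0, 1.0, 1.0, 1.0])")

## Test 2: an R function handed to ForwardDiff through JuliaCall;
## this relies on JuliaCall's overloading of sum, sin and sqrt.
g <- function(x) sum(sin(x)) + sum(sqrt(x))
julia_call("ForwardDiff.gradient", g, rep(1, 5))
```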
Students, please post a link to your test results here.
Name: Changcheng Li
Email: [email protected], [email protected]
Solution: https://github.com/Non-Contradiction/AD-R-GSOC/blob/master/Test.md