A plan to try out the Julia language.
Three months ago, version 1.0 of Julia was released, marking the language as more or less mature, especially for scientific computing. For Internet applications, some say that everything that can be written in JavaScript will eventually be written in JavaScript. I wonder whether the same holds for Julia: in scientific computing, will everything that can be done in Julia eventually be done in Julia?
This is not baseless wondering. Julia appears to be fast, perhaps close to C, according to the official benchmarks. At the same time, Julia is a dynamic language and is likely much easier to use than static languages like C. As the developers put it, Julia was created in the hope to “eliminate the performance trade-off and provide a single environment productive enough for prototyping and efficient enough for deploying performance-intensive applications”.
I am particularly interested in a few features of Julia claimed by its developers: (1) designed for parallelism and distributed computation; (2) a powerful type system where user-defined types are as fast and compact as built-ins; (3) the ability to call C functions directly, without wrappers or special APIs. Item 1 suggests a certain level of automation in parallelizing code. Item 2 suggests that some computational techniques, such as automatic differentiation (AD), can be implemented efficiently. If that is true, much of the trouble and pain of AD’ing C/C++/Fortran codes can be avoided. Item 3 could be a huge plus for a Cython user like me - I have been gluing C/C++ libraries with Python all along. Then again, if Julia is fast enough, there may be no need for C/C++ acceleration at all.
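For contrast with item 3, this is roughly what "calling C with wrappers" looks like in my current Python workflow, using the standard `ctypes` module (a minimal sketch calling `cos` from the C math library; the library-name fallback is an assumption about the platform):

```python
import ctypes
import ctypes.util

# Load the C math library and declare the signature of cos by hand --
# exactly the kind of boilerplate Julia claims to remove.
libm = ctypes.CDLL(ctypes.util.find_library("m") or "libm.so.6")
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]

result = libm.cos(0.0)  # call the C function through the declared signature
```

Every argument and return type must be spelled out before the first call; forgetting one silently truncates doubles to ints. Removing this declaration step is what "no wrappers or special APIs" would mean in practice.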
To try out and assess Julia, my short-term plan is to rewrite and expand the Python library Mushin that I developed earlier. The rewrite will be divided into the following steps/modules:
- Basics I: The computation of the element internal force vectors and stiffness matrices; the assembly of the element vectors and matrices; the solution of the nonlinear system of static mechanics.
- Basics II: The inclusion of mass terms and the capability for solving dynamic problems.
- Parallelization using MPI.
- Optimization: Development of the adjoint of the problem using AD techniques.
- Uncertainty quantification (UQ) of the time response.
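To fix ideas for Basics I, the element-assembly step can be sketched in the Python/SciPy terms I already know from Mushin. A minimal, hypothetical example: assembling the stiffness matrices of 1D linear bar elements into a global sparse matrix (all numbers and names are illustrative, not Mushin code):

```python
import numpy as np
from scipy.sparse import coo_matrix

# Hypothetical mesh: n_el linear bar elements of total length L, stiffness EA.
n_el, EA, L = 4, 1.0, 1.0
le = L / n_el
ke = (EA / le) * np.array([[1.0, -1.0],
                           [-1.0, 1.0]])   # 2x2 element stiffness matrix

rows, cols, vals = [], [], []
for e in range(n_el):
    dofs = [e, e + 1]                       # global DOF numbers of element e
    for i in range(2):
        for j in range(2):
            rows.append(dofs[i])
            cols.append(dofs[j])
            vals.append(ke[i, j])

# COO format sums duplicate (row, col) entries on conversion --
# that summation *is* the assembly step.
K = coo_matrix((vals, (rows, cols)), shape=(n_el + 1, n_el + 1)).tocsr()
```

The question for the Julia port is whether SparseArrays offers an equally convenient triplet-based assembly, and whether it stays convenient once the matrix is distributed.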
Modules 2, 3, and 4 depend on Module 1 but are independent of one another. Module 5 depends on Modules 1 and 2. There are a few questions to consider when implementing these modules:
- How to implement sparse and distributed vectors and matrices? Would SparseArrays work?
- More fundamentally, how to do large-scale linear algebra in Julia? Is there an AD-able counterpart of the PETSc library for C?
- How to solve a large-scale nonlinear system? Would NLopt work?
- How to AD the code? How would JuliaDiff work?
- Is it better to implement the time integration manually, or to use DifferentialEquations directly? What impact would each choice have on the UQ applications?
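Whatever library answers the nonlinear-system question, the core of the static solver is a Newton loop over the residual and tangent stiffness. A dense toy sketch in Python (hypothetical names; a real solver would use sparse factorizations and line search):

```python
import numpy as np

def newton(residual, jacobian, u0, tol=1e-10, maxit=20):
    """Minimal Newton iteration for R(u) = 0: at each step solve
    J(u) du = R(u) and update u <- u - du."""
    u = u0.copy()
    for _ in range(maxit):
        r = residual(u)
        if np.linalg.norm(r) < tol:
            break
        u -= np.linalg.solve(jacobian(u), r)
    return u

# Toy nonlinear system standing in for the static equilibrium equations:
# R(u) = [u0^2 - 1, u1 - u0], with root u = [1, 1].
R = lambda u: np.array([u[0]**2 - 1.0, u[1] - u[0]])
J = lambda u: np.array([[2.0 * u[0], 0.0],
                        [-1.0,       1.0]])
u = newton(R, J, np.array([2.0, 0.0]))
```

In the finite element setting, `residual` is the assembled internal-minus-external force vector and `jacobian` the assembled tangent stiffness, which ties this question back to the sparse-matrix one above.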
The ultimate goal of the short-term plan is to check whether Julia is suitable for a computational library: fast, with easy parallelization, AD, and UQ. If everything works out, the long-term plan is to develop a pure-Julia library for general finite element analysis, parallelized and with AD and UQ capabilities.
Finally, a few notes on the basics of AD. Start with the Wikipedia article. Mathematically, the forward and reverse modes of AD are conceptually based on tangent and dual (cotangent) vectors, respectively. There are two main implementation strategies: source code transformation (SCT) and operator overloading (OO). AD results are often verified against the complex-step method. There are many resources on GitHub for both SCT and OO.
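To make the OO strategy and the complex-step check concrete, here is a minimal Python sketch (all names hypothetical): a dual-number class implements forward-mode AD by overloading `+` and `*`, and the result is compared against the complex-step formula f'(x) ≈ Im f(x + ih) / h.

```python
import cmath
import math

class Dual:
    """Dual number a + b*eps with eps^2 = 0: carrying the derivative
    (tangent) alongside the value is forward-mode AD via OO."""
    def __init__(self, val, dot=0.0):
        self.val = val   # function value
        self.dot = dot   # derivative component
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):  # product rule
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)
    __rmul__ = __mul__

def dsin(x):
    """sin extended to dual numbers via the chain rule."""
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

def f(x, sin=math.sin):
    return x * x + sin(x)

x0 = 1.5
ad = f(Dual(x0, 1.0), sin=dsin).dot            # forward mode, seed dx/dx = 1
h = 1e-20
cs = f(complex(x0, h), sin=cmath.sin).imag / h  # complex-step derivative
exact = 2 * x0 + math.cos(x0)                   # analytic f'(x) = 2x + cos x
```

The complex step avoids the subtractive cancellation of finite differences, so h can be taken absurdly small; that is why it serves as a machine-precision reference for AD results.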
Some references:
- A website collecting AD libraries.
- Introduction to AD.
- Evaluating Derivatives, Griewank2008.
- Numerical Optimization, Nocedal2006; a chapter is devoted to derivative evaluation, including coloring schemes.
- Andersson2012, where the library CasADi is presented and eight “flavors” of AD are discussed.
- Martins2013.