NiLang.jl (逆lang) is a reversible domain-specific language (DSL) that allows a program to go back to the past.

  • Requires Julia version >= 1.3.
  • A dataview of a variable is now specified as x |> bijection; e.g. the previous grad(x) should now be written as x |> grad in the reversible context (see the sketch after this list).
  • Our paper uses version v0.6, which might differ from the master branch.
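
As a minimal sketch of the dataview notation above (not from the original README; it assumes the GVar wrapper and the grad accessor from NiLang.AD):

julia> using NiLang, NiLang.AD

julia> x = GVar(2.0, 0.5);   # a value (2.0) carrying a gradient field (0.5)

julia> grad(x)               # ordinary read of the gradient field
0.5

Inside an @i reversible function, the same field is taken as a view and written x |> grad rather than grad(x).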

NiLang features:

  • any program written in NiLang is differentiable,
  • a reversible language with abstraction and arrays,
  • complex values,
  • a reversible logarithmic number system.


Start from our Pluto notebook here.

"The strangeness of reversible computing is mainly due to our lack of experience with it." (Henry Baker, 1992)

To Start

pkg> add NiLang
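
Equivalently, from a normal Julia session (standard Pkg API, not specific to NiLang):

julia> using Pkg; Pkg.add("NiLang")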

An example: Compute the norm of a vector

julia> using NiLang

julia> @i function f(res, y, x)
           for i = 1:length(x)
               y += x[i] ^ 2
           end
           res += sqrt(y)
       end

julia> res_out, y_out, x_out = f(0.0, 0.0, [1, 2, 3.0])
(3.7416573867739413, 14.0, [1.0, 2.0, 3.0])

julia> (~f)(res_out, y_out, x_out)  # automatically generated inverse program
(0.0, 0.0, [1.0, 2.0, 3.0])

julia> ∂res, ∂y, ∂x = NiLang.AD.gradient(Val(1), f, (0.0, 0.0, [1, 2, 3.0]))  # automatic differentiation; Val(1) means the first argument of f is the loss
(1.0, 0.1336306209562122, [0.2672612419124244, 0.5345224838248488, 0.8017837257372732])
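
As a follow-up sketch (not part of the original example; norm_only is a hypothetical name), the generated inverse ~f can be combined with the reversible copy out += identity(res) in the usual compute-copy-uncompute pattern, so that the norm is returned while the intermediate accumulator y is restored to zero:

julia> @i function norm_only(out, res, y, x)
           f(res, y, x)            # compute: y accumulates the sum of squares, res the norm
           out += identity(res)    # reversibly copy the result out
           ~f(res, y, x)           # uncompute: res and y return to zero
       end

julia> norm_only(0.0, 0.0, 0.0, [1, 2, 3.0])   # expected: (3.7416573867739413, 0.0, 0.0, [1.0, 2.0, 3.0])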

The performance of automatic differentiation via reversible programming is much better than that of most traditional frameworks; see "how it works" for why and how.

Check out our paper:

@misc{Liu2020,
    title={Differentiate Everything with a Reversible Programming Language},
    author={Jin-Guo Liu and Taine Zhao},
    year={2020},
    eprint={2003.04617},
    archivePrefix={arXiv},
    primaryClass={cs.PL}
}
