README.md (46 additions & 62 deletions)

@@ -19,8 +19,7 @@ A dynamic expression is a snippet of code that can change throughout runtime - c
3. It then generates specialized [evaluation kernels](https://github.com/SymbolicML/DynamicExpressions.jl/blob/fe8e6dfa160d12485fb77c226d22776dd6ed697a/src/EvaluateEquation.jl#L29-L66) for the space of potential operators.
4. It also generates kernels for the [first-order derivatives](https://github.com/SymbolicML/DynamicExpressions.jl/blob/fe8e6dfa160d12485fb77c226d22776dd6ed697a/src/EvaluateEquationDerivative.jl#L139-L175), using [Zygote.jl](https://github.com/FluxML/Zygote.jl).
5. DynamicExpressions.jl can also operate on arbitrary other types (vectors, tensors, symbols, strings, or even unions) - see last part below.
6. It also has import and export functionality with [SymbolicUtils.jl](https://github.com/JuliaSymbolics/SymbolicUtils.jl).
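
For instance, a round trip might look like the following. This is a hedged sketch rather than the README's own example: it assumes an expression tree `tree` and an `OperatorEnum` named `operators` like the ones constructed in the Example section below, that SymbolicUtils.jl is installed, and that the conversion helpers are `node_to_symbolic`/`symbolic_to_node` as documented upstream (keyword arguments may differ between versions).

```julia
using DynamicExpressions, SymbolicUtils

# Convert a DynamicExpressions tree into a SymbolicUtils expression, and back again.
symbolic = node_to_symbolic(tree, operators)
tree_again = symbolic_to_node(symbolic, operators)
```
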
## Example
@@ -29,18 +28,17 @@ It also has import and export functionality with [SymbolicUtils.jl](https://gith
(We can construct this expression with normal operators, since calling `OperatorEnum()` will `@eval` new functions on `Node` that use the specified enum.)
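
A minimal sketch of that construction, mirroring the operators and formula used in the Speed section below (an illustration, not necessarily the README's exact example code):

```julia
using DynamicExpressions

# Three binary operators (so a random `op` in 1:3 is meaningful later) and one unary operator.
operators = OperatorEnum(; binary_operators=[+, -, *], unary_operators=[cos])

# Feature nodes: x1 reads row 1 of the data matrix, x2 reads row 2.
x1 = Node{Float64}(; feature=1)
x2 = Node{Float64}(; feature=2)

# Built with ordinary Julia operators, thanks to the methods `@eval`ed by `OperatorEnum()`:
tree = x1 * cos(x2 - 3.2)
```

In the benchmarks below, `expression` bundles such a tree together with its operators, which is why it can be evaluated as `expression(X)` (with `X` a 2×N data matrix) without passing `operators` explicitly.
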
## Speed
First, what happens if we naively use Julia symbols to define and then evaluate this expression?
@@ -53,27 +51,25 @@ First, what happens if we naively use Julia symbols to define and then evaluate
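
As a rough illustration of the "naive symbols" approach (a sketch, not the README's exact benchmark; it builds a stand-in data matrix and calls `eval` on a quoted `Expr` every time):

```julia
using BenchmarkTools

X = randn(Float64, 2, 1_000)  # stand-in data matrix; the README defines its own `X`

# Each call to `eval` lowers and compiles the expression again, which dominates the runtime:
@btime eval(:(X[1, :] .* cos.(X[2, :] .- 3.2)))
```
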
This is quite slow, meaning it will be hard to quickly search over the space of expressions. Let's see how DynamicExpressions.jl compares:

```julia
@btime expression(X)
# 607 ns
```

Much faster! And we didn't even need to compile it. (Internally, this is calling `eval_tree_array(expression, X)`).
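
Calling `eval_tree_array` directly exposes a completion flag alongside the output (a small sketch, using the `expression` and `X` from above):

```julia
# Equivalent to `expression(X)`; `completed` is `false` if the evaluation
# encountered a NaN/Inf and was aborted early.
output, completed = eval_tree_array(expression, X)
```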

If we change `expression` dynamically with a random number generator, it will have the same performance:

```julia
@btime ex(X) setup=(ex = copy(expression); ex.tree.op = rand(1:3) #= random operator in [+, -, *] =#)
# 640 ns
```

Now, let's see the performance if we had hard-coded these expressions:

```julia
f(X) = X[1, :] .* cos.(X[2, :] .- 3.2)
@btime f(X)
# 629 ns
```

So, our dynamic expression evaluation is about the same speed as (or even a bit faster than) evaluating a basic hard-coded expression! Let's see if we can optimize the speed of the hard-coded version:
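
One generic way to speed up the hard-coded version is to fuse the work into a single non-allocating loop. This is a hedged sketch of that idea, assuming `X` and BenchmarkTools from earlier (the helper `f_fused!` is hypothetical and not part of the README, whose own optimization may take a different approach):

```julia
# Hypothetical in-place variant of `f`, writing into a preallocated output vector.
function f_fused!(out, X)
    @inbounds @simd for i in axes(X, 2)
        out[i] = X[1, i] * cos(X[2, i] - 3.2)
    end
    return out
end

out = similar(X, size(X, 2))
@btime f_fused!(out, X)
```
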
@@ -102,49 +98,37 @@ We can also compute gradients with the same speed:
Internally, this is calling the `eval_grad_tree_array` function, which performs forward-mode automatic differentiation on the expression tree with Zygote-compiled kernels. We can also compute the derivative with respect to constants:

```julia
result, grad, did_finish = eval_grad_tree_array(expression, X; variable=false)
```

I'm so glad you asked. `DynamicExpressions.jl` actually will work for **arbitrary types**! However, to work on operators other than real scalars, you need to use the `GenericOperatorEnum <: AbstractOperatorEnum` instead of the normal `OperatorEnum`. Let's try it with strings!

```julia
x1 = Node{String}(; feature=1)
```

This node will be used to index input data (whatever it may be) with either `data[feature]` (1D abstract arrays) or `selectdim(data, 1, feature)` (ND abstract arrays). Let's now define some operators to use:

```julia
using DynamicExpressions: @declare_expression_operator

my_string_func(x::String) = "ello $x"
```
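
For context, such string operators would then be collected into a `GenericOperatorEnum`. A hedged sketch of that construction (the keyword names are assumed to mirror `OperatorEnum`):

```julia
# `*` already concatenates strings, so it can serve directly as a binary operator here.
operators = GenericOperatorEnum(; binary_operators=[*], unary_operators=[my_string_func])
```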