# Symmetric Meta Programming

- Symmetric meta programming is a new framework for staging and certain
+ Symmetric meta programming is a new framework for staging and for some
forms of macros. It is expressed as strongly and statically typed
code using two fundamental operations: quotations and splicing. A
novel aspect of the approach is that these two operations are
@@ -116,27 +116,30 @@ PCP. This is explained further in a later section.

## Details

- ### `Expr` as an Applicative
+ ### From `Expr`s to Functions and Back

- We postulate an implicit "Apply" decorator that turns a tree
+ The `Expr` companion object contains an "AsFunction" decorator that turns a tree
describing a function into a function mapping trees to trees.

-     implicit class AsApplicative[T, U](f: Expr[T => U]) extends AnyVal {
-       def apply(x: Expr[T]): Expr[U] = ???
+     object Expr {
+       ...
+       implicit class AsFunction[T, U](f: Expr[T => U]) extends AnyVal {
+         def apply(x: Expr[T]): Expr[U] = ???
+       }
      }

- This decorator turns `Expr` into an applicative functor, where `Expr`s
+ This decorator gives `Expr` the `apply` operation of an applicative functor, where `Expr`s
over function types can be applied to `Expr` arguments. The definition
- of `AsApplicative(f).apply(x)` is assumed to be functionally the same as
+ of `AsFunction(f).apply(x)` is assumed to be functionally the same as
`’((~f)(~x))`, however it should optimize this call by returning the
result of beta-reducing `f(x)` if `f` is a known lambda expression.

- The `AsApplicative` decorator distributes applications of `Expr` over function
+ The `AsFunction` decorator distributes applications of `Expr` over function
arrows:

-     AsApplicative(_).apply: Expr[S => T] => (Expr[S] => Expr[T])
+     AsFunction(_).apply: Expr[S => T] => (Expr[S] => Expr[T])

- The dual of expansion, let’s call it `reflect`, can be defined as follows:
+ Its dual, let’s call it `reflect`, can be defined as follows:

    def reflect[T, U](f: Expr[T] => Expr[U]): Expr[T => U] = ’{
      (x: T) => ~f(’(x))
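The beta-reduction optimization described for `AsFunction(f).apply(x)` can be sketched on a toy untyped AST. This is an illustration only, not the real `Expr` API: the names `Tree`, `Lam`, `applyExpr`, and the capture-naive substitution are all hypothetical stand-ins.

```scala
// Illustration only: a tiny untyped AST standing in for `Expr`, showing
// how an `AsFunction`-style apply can beta-reduce when the function tree
// is a known lambda, instead of emitting a residual application node.
sealed trait Tree
case class Lam(param: String, body: Tree) extends Tree
case class App(fun: Tree, arg: Tree) extends Tree
case class Ref(name: String) extends Tree
case class Lit(value: Int) extends Tree

// Capture-naive substitution [arg/name]body; enough for the sketch.
def subst(t: Tree, name: String, arg: Tree): Tree = t match {
  case Ref(`name`) => arg
  case Lam(p, b)   => if (p == name) t else Lam(p, subst(b, name, arg))
  case App(f, a)   => App(subst(f, name, arg), subst(a, name, arg))
  case other       => other
}

// Analogue of `AsFunction(f).apply(x)`: beta-reduce a known lambda,
// otherwise fall back to building the application tree `(~f)(~x)`.
def applyExpr(f: Tree, x: Tree): Tree = f match {
  case Lam(p, body) => subst(body, p, x)
  case _            => App(f, x)
}
```

A real implementation would also need hygienic, capture-avoiding substitution, which this sketch omits.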
@@ -253,7 +256,7 @@ Here’s an application of `map` and how it rewrites to optimized code:

### Relationship with Inline and Macros

- Seen by itself, principled meta-programming looks more like a
+ Seen by itself, symmetric meta-programming looks more like a
framework for staging than one for compile-time meta programming with
macros. But combined with Dotty’s `inline` it can be turned into a
compile-time system. The idea is that macro elaboration can be
@@ -307,7 +310,7 @@ If `program` is treated as a quoted expression, the call to
`Macro.assertImpl` becomes phase correct even if macro library and
program are conceptualized as local definitions.

- But what about the call from `assert` to `assertImpl? Here, we need a
+ But what about the call from `assert` to `assertImpl`? Here, we need a
tweak of the typing rules. An inline function such as `assert` that
contains a splice operation outside an enclosing quote is called a
_macro_. Macros are supposed to be expanded in a subsequent phase,
@@ -318,7 +321,7 @@ the call from `assert` to `assertImpl` phase-correct, even if we
assume that both definitions are local.

The second role of `inline` in Dotty is to mark a `val` that is
- constant or a parameter that will be constant when instantiated. This
+ either a constant or is a parameter that will be a constant when instantiated. This
aspect is also important for macro expansion. To illustrate this,
consider an implementation of the `power` function that makes use of a
statically known exponent:
@@ -331,7 +334,7 @@ statically known exponent:
      else if (n % 2 == 0) ’{ { val y = ~x * ~x; ~powerCode(n / 2, ’(y)) } }
      else ’{ ~x * ~powerCode(n - 1, x) }

- The usage of `n` as an argument in `~powerCode(n, ’(x))` is not
+ The reference to `n` as an argument in `~powerCode(n, ’(x))` is not
phase-consistent, since `n` appears in a splice without an enclosing
quote. Normally that would be a problem because it means that we need
the _value_ of `n` at compile time, which is not available for general
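The specialization performed by `powerCode` can be simulated with plain strings in place of `Expr[Double]`. This is an illustration only, not the real quoting API, and unlike the real implementation it does not hygienically rename the bound `y`:

```scala
// Illustration only: `powerCode` with strings standing in for
// `Expr[Double]`, showing how a statically known exponent `n` drives
// code specialization by squaring for even `n`.
def powerCode(n: Int, x: String): String =
  if (n == 0) "1.0"
  else if (n == 1) x
  else if (n % 2 == 0) s"{ val y = $x * $x; ${powerCode(n / 2, "y")} }"
  else s"$x * ${powerCode(n - 1, x)}"
```

For example, `powerCode(2, "x")` yields the code `{ val y = x * x; y }`, with no loop or recursion left in the generated code.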
@@ -435,10 +438,10 @@ Running `compile(letExp, Map())` would yield the following Scala code:

    ’{ val y = 3; (2 + y) + 4 }

- The body the first clause (`case Num(n) => n`) looks suspicious. `n`
+ The body of the first clause, `case Num(n) => n`, looks suspicious. `n`
is declared as an `Int`, yet the result of `compile` is declared to be
- `Expr[Int]`. Shouldn’t `n be quoted? The answer is that this would not
- work since replacing `n by `’n` in the clause would not be phase
+ `Expr[Int]`. Shouldn’t `n` be quoted? In fact this would not
+ work since replacing `n` by `’n` in the clause would not be phase
correct.

What happens instead "under the hood" is an implicit conversion: `n`
@@ -480,7 +483,7 @@ tree machinery:
      }
    }

- Since `Liftable` is a type class, instances can be conditional. For instance
+ Since `Liftable` is a type class, its instances can be conditional. For example,
a `List` is liftable if its element type is:

    implicit def ListIsLiftable[T: Liftable]: Liftable[List[T]] = new {
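The conditional-instance idea can be demonstrated with ordinary Scala implicits, using a string-based stand-in for `Expr`. This is an illustration only; the instance and helper names here are hypothetical, not the document's API:

```scala
// Illustration only: a `Liftable`-style type class over a string-based
// stand-in for `Expr`, with a List instance that is conditional on an
// instance for the element type, mirroring `ListIsLiftable`.
trait Liftable[T] { def toExpr(x: T): String }

object Liftable {
  implicit val intIsLiftable: Liftable[Int] = new Liftable[Int] {
    def toExpr(x: Int): String = x.toString
  }
  // Conditional instance: List[T] is liftable only if T is.
  implicit def listIsLiftable[T](implicit lt: Liftable[T]): Liftable[List[T]] =
    new Liftable[List[T]] {
      def toExpr(xs: List[T]): String =
        xs.map(lt.toExpr).mkString("List(", ", ", ")")
    }
}

// Analogue of the implicit lifting conversion: summon the instance.
def lift[T](x: T)(implicit l: Liftable[T]): String = l.toExpr(x)
```

With these instances in scope, `lift(List(1, 2, 3))` resolves `listIsLiftable` applied to `intIsLiftable`, while lifting a list of an unliftable element type fails at compile time.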
@@ -505,7 +508,7 @@ need a syntax change that introduces prefix operators as types.
    SimpleType ::= ...
                   [‘-’ | ‘+’ | ‘~’ | ‘!’] StableId

- Analogously to the situation with expressions a prefix type operator
+ Analogously to the situation with expressions, a prefix type operator
such as `~ e` is treated as a shorthand for the type `e.unary_~`.

Quotes are supported by introducing new tokens `’(`, `’{`, and `’[`
@@ -521,7 +524,7 @@ Syntax changes are given relative to the [Dotty reference
grammar](../internal/syntax.md).

An alternative syntax would treat `’` as a separate operator. This
- would be attractive since it enables quoting single identifies as
+ would be attractive since it enables quoting single identifiers as
e.g. `’x` instead of `’(x)`. But it would clash with symbol
literals. So it could be done only if symbol literals were abolished.
@@ -538,7 +541,7 @@ that method. With the restrictions on splices that are currently in
place that’s all that’s needed. We might allow more interpretation in
splices in the future, which would allow us to loosen the
restriction. Quotes in spliced, interpreted code are kept as they
- are, after splices nested in the quotes are expanded recursively.
+ are, after splices nested in the quotes are expanded.

If the outermost scope is a quote, we need to generate code that
constructs the quoted tree at run-time. We implement this by
@@ -575,7 +578,7 @@ The syntax of terms, values, and types is given as follows:

Typing rules are formulated using a stack of environments
`Es`. Individual environments `E` consist as usual of variable
- bindings `x: T`. Environments can be combined using one of two
+ bindings `x: T`. Environments can be combined using the two
combinators `’` and `~`.

    Environment E ::= () empty
@@ -608,7 +611,7 @@ We define a small step reduction relation `-->` with the following rules:
The first rule is standard call-by-value beta-reduction. The second
rule says that splice and quotes cancel each other out. The third rule
is a context rule; it says that reduction is allowed in the hole `[ ]`
- position of an evaluation contexts. Evaluation contexts `e` and
+ position of an evaluation context. Evaluation contexts `e` and
splice evaluation context `e_s` are defined syntactically as follows:

    Eval context    e  ::=  [ ] | e t | v e | ’e_s[~e]
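The second rule, splice cancelling quote, can be sketched as a one-step reduction function on a toy term syntax. This is an illustration only (the `Term` constructors and the simplified context rule are hypothetical, not the calculus as formally given):

```scala
// Illustration only: the cancellation rule `~’t --> t` on a toy term
// syntax, with a simplified context rule that searches for a redex
// under quotes and splices.
sealed trait Term
case class Quote(t: Term) extends Term
case class Splice(t: Term) extends Term
case class Var(name: String) extends Term

def step(t: Term): Option[Term] = t match {
  case Splice(Quote(u)) => Some(u)                 // ~’t --> t
  case Quote(u)         => step(u).map(Quote(_))   // reduce inside the quote
  case Splice(u)        => step(u).map(Splice(_))  // reduce inside the splice
  case _                => None                    // no redex: a value
}
```

For instance, `step(Quote(Splice(Quote(Var("x")))))` cancels the inner splice/quote pair, leaving `Quote(Var("x"))`.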
@@ -662,7 +665,7 @@ environments and terms.

## Going Further

- The presented meta-programming framework is so far quite restrictive
+ The meta-programming framework as presented and currently implemented is quite restrictive
in that it does not allow for the inspection of quoted expressions and
types. It’s possible to work around this by providing all necessary
information as normal, unquoted inline parameters. But we would gain
@@ -687,11 +690,11 @@ implementation of power otherwise.
This assumes a `Constant` extractor that maps tree nodes representing
constants to their values.

- Once we allow for inspection of code, the "AsApplicative" operation
+ With the right extractors the "AsFunction" operation
that maps expressions over functions to functions over expressions can
be implemented in user code:

-     implicit class AsApplicative[T, U](f: Expr[T => U]) extends AnyVal {
+     implicit class AsFunction[T, U](f: Expr[T => U]) extends AnyVal {
      def apply(x: Expr[T]): Expr[U] =
        f match {
          case Lambda(g) => g(x)
@@ -705,7 +708,7 @@ This assumes an extractor
    }

Once we allow inspection of code via extractors, it’s tempting to also
- add constructors that construct typed trees directly without going
+ add constructors that create typed trees directly without going
through quotes. Most likely, those constructors would work over `Expr`
types which lack a known type argument. For instance, an `Apply`
constructor could be typed as follows:
@@ -722,13 +725,14 @@ implemented as a primitive; it would check that the computed type
structure of `Expr` is a subtype of the type structure representing
`T`.

- Before going down that route, we should carefully evaluate its
- tradeoffs. Constructing trees that are only verified _a posteriori_
+ Before going down that route, we should evaluate in detail the tradeoffs it
+ presents. Constructing trees that are only verified _a posteriori_
to be type correct loses a lot of guidance for constructing the right
trees. So we should wait with this addition until we have more
use-cases that help us decide whether the loss in type-safety is worth
- the gain in flexibility.
-
+ the gain in flexibility. In this context, it seems that deconstructing types is
+ less error-prone than deconstructing terms, so one might also
+ envisage a solution that allows the former but not the latter.

## Conclusion