diff --git a/_overviews/collections/seqs.md b/_overviews/collections/seqs.md
index 76e33c3065..a2479e5c1a 100644
--- a/_overviews/collections/seqs.md
+++ b/_overviews/collections/seqs.md
@@ -74,7 +74,7 @@ If a sequence is mutable, it offers in addition a side-effecting `update` method
 | `xs union ys` |Multiset union; same as `xs ++ ys`.|
 | `xs.distinct` |A subsequence of `xs` that contains no duplicated element.|
 
-Trait [Seq](http://www.scala-lang.org/api/current/scala/collection/Seq.html) has two subtraits [LinearSeq](http://www.scala-lang.org/api/current/scala/collection/IndexedSeq.html), and [IndexedSeq](http://www.scala-lang.org/api/current/scala/collection/IndexedSeq.html). These do not add any new operations, but each offers different performance characteristics: A linear sequence has efficient `head` and `tail` operations, whereas an indexed sequence has efficient `apply`, `length`, and (if mutable) `update` operations. Frequently used linear sequences are `scala.collection.immutable.List` and `scala.collection.immutable.Stream`. Frequently used indexed sequences are `scala.Array` and `scala.collection.mutable.ArrayBuffer`. The `Vector` class provides an interesting compromise between indexed and linear access. It has both effectively constant time indexing overhead and constant time linear access overhead. Because of this, vectors are a good foundation for mixed access patterns where both indexed and linear accesses are used. You'll learn more on vectors [later](http://docs.scala-lang.org/overviews/collections/concrete-immutable-collection-classes.html#vectors).
+Trait [Seq](http://www.scala-lang.org/api/current/scala/collection/Seq.html) has two subtraits, [LinearSeq](http://www.scala-lang.org/api/current/scala/collection/LinearSeq.html) and [IndexedSeq](http://www.scala-lang.org/api/current/scala/collection/IndexedSeq.html). These do not add any new operations, but each offers different performance characteristics: A linear sequence has efficient `head` and `tail` operations, whereas an indexed sequence has efficient `apply`, `length`, and (if mutable) `update` operations. Frequently used linear sequences are `scala.collection.immutable.List` and `scala.collection.immutable.Stream`. Frequently used indexed sequences are `scala.Array` and `scala.collection.mutable.ArrayBuffer`. The `Vector` class provides an interesting compromise between indexed and linear access. It has both effectively constant time indexing overhead and constant time linear access overhead. Because of this, vectors are a good foundation for mixed access patterns where both indexed and linear accesses are used. You'll learn more about vectors [later](concrete-immutable-collection-classes.html#vectors).
 
 ### Buffers ###
 
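As an aside on the `Seq` paragraph above, a minimal sketch (standard library only) of the access patterns it contrasts:

    val xs = List(1, 2, 3)      // a linear sequence: head and tail are cheap
    val ys = Vector(1, 2, 3)    // an indexed sequence: apply, length and updated are effectively constant time

    xs.head             // 1
    xs.tail             // List(2, 3)
    ys(2)               // 3
    ys.length           // 3
    ys.updated(0, 42)   // Vector(42, 2, 3), returned as a new vector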
diff --git a/_overviews/collections/sets.md b/_overviews/collections/sets.md
index 8e773555a1..027d3b4af3 100644
--- a/_overviews/collections/sets.md
+++ b/_overviews/collections/sets.md
@@ -107,7 +107,7 @@ Comparing the two interactions shows an important principle. You often can repla
 
 Mutable sets also provide add and remove as variants of `+=` and `-=`. The difference is that `add` and `remove` return a Boolean result indicating whether the operation had an effect on the set.
 
-The current default implementation of a mutable set uses a hashtable to store the set's elements. The default implementation of an immutable set uses a representation that adapts to the number of elements of the set. An empty set is represented by just a singleton object. Sets of sizes up to four are represented by a single object that stores all elements as fields. Beyond that size, immutable sets are implemented as [hash tries](concrete-immutable-collection-classes.html#hash_tries).
+The current default implementation of a mutable set uses a hashtable to store the set's elements. The default implementation of an immutable set uses a representation that adapts to the number of elements of the set. An empty set is represented by just a singleton object. Sets of sizes up to four are represented by a single object that stores all elements as fields. Beyond that size, immutable sets are implemented as [hash tries](concrete-immutable-collection-classes.html#hash-tries).
 
 A consequence of these representation choices is that, for sets of small sizes (say up to 4), immutable sets are usually more compact and also more efficient than mutable sets. So, if you expect the size of a set to be small, try making it immutable.
 
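A quick sketch of the `add`/`remove` behaviour described in the sets.md hunk above (standard library only):

    import scala.collection.mutable

    val s = mutable.Set(1, 2, 3)
    s += 4          // returns the set itself, handy for chaining
    s.add(4)        // false: 4 was already there, the set is unchanged
    s.add(5)        // true: the set did change
    s.remove(42)    // false: nothing to remove
    s -= 1          // returns the set itself
    // s now contains 2, 3, 4, 5 (iteration order of a hash set is unspecified)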
diff --git a/_overviews/core/actors-migration-guide.md b/_overviews/core/actors-migration-guide.md
index 11aab78b7d..4d2604f9bf 100644
--- a/_overviews/core/actors-migration-guide.md
+++ b/_overviews/core/actors-migration-guide.md
@@ -14,7 +14,7 @@ permalink: /overviews/core/:title.html
 ## Introduction
 
 Starting with Scala 2.11.0, the Scala
-[Actors](http://docs.scala-lang.org/overviews/core/actors.html)
+[Actors](actors.html)
 library is deprecated. Already in Scala 2.10.0 the default actor library is
 [Akka](http://akka.io).
 
diff --git a/_overviews/core/futures.md b/_overviews/core/futures.md
index 6c3704b3ca..b31d7008e2 100644
--- a/_overviews/core/futures.md
+++ b/_overviews/core/futures.md
@@ -132,7 +132,7 @@ Fortunately the concurrent package provides a convenient way for doing so:
       }
     }
 
-Note that `blocking` is a general construct that will be discussed more in depth [below](#in_a_future).
+Note that `blocking` is a general construct that will be discussed more in depth [below](#blocking-inside-a-future).
 
 Last but not least, you must remember that the `ForkJoinPool` is not designed for long lasting blocking operations.
 Even when notified with `blocking` the pool might not spawn new workers as you would expect,
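For context, the `blocking` construct mentioned in the futures.md hunk is used roughly like this (a sketch assuming the global execution context; the sleep stands in for any long blocking call):

    import scala.concurrent._
    import ExecutionContext.Implicits.global

    val f: Future[String] = Future {
      blocking {
        Thread.sleep(1000)   // stand-in for a long-lasting blocking operation
      }
      "done"
    }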
diff --git a/_overviews/macros/blackbox-whitebox.md b/_overviews/macros/blackbox-whitebox.md
index 180099943f..45c7b3cd89 100644
--- a/_overviews/macros/blackbox-whitebox.md
+++ b/_overviews/macros/blackbox-whitebox.md
@@ -35,7 +35,7 @@ compehensibility.
 
 However sometimes def macros transcend the notion of "just a regular method". For example, it is possible for a macro expansion to yield an expression of a type that is more specific than the return type of a macro. In Scala 2.10, such expansion will retain its precise type as highlighted in the ["Static return type of Scala macros"](http://stackoverflow.com/questions/13669974/static-return-type-of-scala-macros) article at Stack Overflow.
 
-This curious feature provides additional flexibility, enabling [fake type providers](http://meta.plasm.us/posts/2013/07/11/fake-type-providers-part-2/), [extended vanilla materialization](/sips/pending/source-locations.html), [fundep materialization](/overviews/macros/implicits.html#fundep_materialization) and [extractor macros](https://github.com/scala/scala/commit/84a335916556cb0fe939d1c51f27d80d9cf980dc), but it also sacrifices clarity - both for humans and for machines.
+This curious feature provides additional flexibility, enabling [fake type providers](http://meta.plasm.us/posts/2013/07/11/fake-type-providers-part-2/), [extended vanilla materialization](/sips/pending/source-locations.html), [fundep materialization](/overviews/macros/implicits.html#fundep-materialization) and [extractor macros](https://github.com/scala/scala/commit/84a335916556cb0fe939d1c51f27d80d9cf980dc), but it also sacrifices clarity - both for humans and for machines.
 
 To concretize the crucial distinction between macros that behave just like normal methods and macros that refine their return types, we introduce the notions of blackbox macros and whitebox macros. Macros that faithfully follow their type signatures are called **blackbox macros** as their implementations are irrelevant to understanding their behaviour (could be treated as black boxes). Macros that can't have precise signatures in Scala's type system are called **whitebox macros** (whitebox def macros do have signatures, but these signatures are only approximations).
 
@@ -50,7 +50,7 @@ We express the distinction by replacing `scala.reflect.macros.Context` with `sca
 Blackbox def macros are treated differently from def macros of Scala 2.10. The following restrictions are applied to them by the Scala typechecker:
 
 1. When an application of a blackbox macro expands into tree `x`, the expansion is wrapped into a type ascription `(x: T)`, where `T` is the declared return type of the blackbox macro with type arguments and path dependencies applied in consistency with the particular macro application being expanded. This invalidates blackbox macros as an implementation vehicle of [type providers](http://meta.plasm.us/posts/2013/07/11/fake-type-providers-part-2/).
-1. When an application of a blackbox macro still has undetermined type parameters after Scala's type inference algorithm has finished working, these type parameters are inferred forcedly, in exactly the same manner as type inference happens for normal methods. This makes it impossible for blackbox macros to influence type inference, prohibiting [fundep materialization](/overviews/macros/implicits.html#fundep_materialization).
+1. When an application of a blackbox macro still has undetermined type parameters after Scala's type inference algorithm has finished working, these type parameters are inferred forcedly, in exactly the same manner as type inference happens for normal methods. This makes it impossible for blackbox macros to influence type inference, prohibiting [fundep materialization](/overviews/macros/implicits.html#fundep-materialization).
 1. When an application of a blackbox macro is used as an implicit candidate, no expansion is performed until the macro is selected as the result of the implicit search. This makes it impossible to [dynamically calculate availability of implicit macros](/sips/rejected/source-locations.html).
 1. When an application of a blackbox macro is used as an extractor in a pattern match, it triggers an unconditional compiler error, preventing [customizations of pattern matching](https://github.com/paulp/scala/commit/84a335916556cb0fe939d1c51f27d80d9cf980dc) implemented with macros.
 
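To make the blackbox/whitebox distinction above concrete, a schematic pair of macros (a sketch for Scala 2.11; the object and method names are illustrative):

    import scala.language.experimental.macros
    import scala.reflect.macros.{blackbox, whitebox}

    object Macros {
      // Blackbox: the typechecker trusts the declared return type (Int) and
      // ascribes the expansion with it.
      def answer: Int = macro answerImpl
      def answerImpl(c: blackbox.Context): c.Tree = {
        import c.universe._
        q"42"
      }

      // Whitebox: the expansion may end up with a type more precise than the
      // declared one (here, Int rather than Any).
      def preciseAnswer: Any = macro preciseAnswerImpl
      def preciseAnswerImpl(c: whitebox.Context): c.Tree = {
        import c.universe._
        q"42"
      }
    }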
diff --git a/_overviews/macros/bundles.md b/_overviews/macros/bundles.md
index a204cd338c..f698a304a7 100644
--- a/_overviews/macros/bundles.md
+++ b/_overviews/macros/bundles.md
@@ -27,7 +27,7 @@ following reasons:
 
 1. Being limited to functions makes modularizing complex macros awkward. It's quite typical to see macro logic concentrate in helper traits outside macro implementations, turning implementations into trivial wrappers, which just instantiate and call helpers.
-2. Moreover, since macro parameters are path-dependent on the macro context, [special incantations](/overviews/macros/overview.html#writing_bigger_macros) are required to wire implementations and helpers together.
+2. Moreover, since macro parameters are path-dependent on the macro context, [special incantations](/overviews/macros/overview.html#writing-bigger-macros) are required to wire implementations and helpers together.
 
 Macro bundles provide a solution to these problems by allowing macro implementations to be declared in classes that
 take `c: scala.reflect.macros.blackbox.Context` or `c: scala.reflect.macros.whitebox.Context` as their constructor
 parameters, relieving macro implementations from having
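The bundle shape that bundles.md describes, as a minimal sketch for Scala 2.11 (the class and method names are illustrative):

    import scala.language.experimental.macros
    import scala.reflect.macros.blackbox.Context

    // The context arrives through the constructor, so helper methods can use it
    // without any extra wiring.
    class Impl(val c: Context) {
      import c.universe._
      def helper: Tree = q"""println("helping")"""
      def hello: Tree = q"{ $helper; () }"
    }

    object Macros {
      def hello: Unit = macro Impl.hello
    }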
diff --git a/_overviews/macros/changelog211.md b/_overviews/macros/changelog211.md
index 28851fcb26..bed801af57 100644
--- a/_overviews/macros/changelog211.md
+++ b/_overviews/macros/changelog211.md
@@ -25,21 +25,21 @@ Quasiquotes is the single most impressive upgrade for reflection and macros in S
 
 ### New macro powers
 
-1) **[Fundep materialization](http://docs.scala-lang.org/overviews/macros/implicits.html#fundep_materialization)**. Since Scala 2.10.2, implicit whitebox macros can be used to materialize instances of type classes, however such materialized instances can't guide type inference. In Scala 2.11.0, materializers can also affect type inference, helping scalac to infer type arguments for enclosing method applications, something that's used with great success in Shapeless. Even more, with the fix of [SI-3346](https://issues.scala-lang.org/browse/SI-3346), this inference guiding capability can affect both normal methods and implicit conversions alike. Please note, however, that fundep materialization doesn't let one change how Scala's type inference works, but merely provides a way to throw more type constraints into the mix, so it's, for example, impossible to make type inference flow from right to left using fundep materializers.
+1) **[Fundep materialization](implicits.html#fundep-materialization)**. Since Scala 2.10.2, implicit whitebox macros can be used to materialize instances of type classes, however such materialized instances can't guide type inference. In Scala 2.11.0, materializers can also affect type inference, helping scalac to infer type arguments for enclosing method applications, something that's used with great success in Shapeless. Even more, with the fix of [SI-3346](https://issues.scala-lang.org/browse/SI-3346), this inference guiding capability can affect both normal methods and implicit conversions alike. Please note, however, that fundep materialization doesn't let one change how Scala's type inference works, but merely provides a way to throw more type constraints into the mix, so it's, for example, impossible to make type inference flow from right to left using fundep materializers.
 
 2) **[Extractor macros](https://github.com/paulp/scala/commit/84a335916556cb0fe939d1c51f27d80d9cf980dc)**. A prominent new feature in Scala 2.11.0 is [name-based extractors](https://github.com/scala/scala/pull/2848) implemented by Paul Phillips. And as usual, when there's a Scala feature, it's very likely that macros can make use of it. Indeed, with the help of structural types, whitebox macros can be used to write extractors than refine the types of extractees on case-by-case basis. This is the technique that we use internally to implement quasiquotes.
 
 3) **[Named and default arguments in macros](https://github.com/scala/scala/pull/3543)**. This is something that strictly speaking shouldn't belong to this changelog, because this feature was reverted shortly after being merged into Scala 2.11.0-RC1 due to a tiny mistake that led to a regression, but we've got a patch that makes the macro engine understand named/default arguments in macro applications. Even though the code freeze won't let us bring this change in Scala 2.11.0, we expect to merge it in Scala 2.11.1 at an earliest opportunity.
 
-4) **[Type macros](http://docs.scala-lang.org/overviews/macros/typemacros.html) and [macro annotations](http://docs.scala-lang.org/overviews/macros/annotations.html)**. Neither type macros, not macro annotations are included of Scala 2.11.0. It is highly unlikely that type macros will ever be included in Scala, but we still deliberate on macro annotations. However, macro annotations are available both for Scala 2.10.x and for Scala 2.11.0 via the [macro paradise plugin](http://docs.scala-lang.org/overviews/macros/annotations.html).
+4) **[Type macros](typemacros.html) and [macro annotations](annotations.html)**. Neither type macros nor macro annotations are included in Scala 2.11.0. It is highly unlikely that type macros will ever be included in Scala, but we still deliberate on macro annotations. However, macro annotations are available both for Scala 2.10.x and for Scala 2.11.0 via the [macro paradise plugin](annotations.html).
 
 5) **@compileTimeOnly**. Standard library now features a new `scala.annotations.compileTimeOnly` annotation that tells scalac that its annottees should not be referred to after type checking (which includes macro expansion). The main use case for this annotation is marking helper methods that are only supposed be used only together with an enclosing macro call to indicate parts of arguments of that macro call that need special treatment (e.g. `await` in scala/async or `value` in sbt's new macro-based DSL). For example, scala/async's `await` marked as `@compileTimeOnly` only makes sense inside an `async { ... }` block that compiles it away during its transformation, and using it outside of `async` is a compile-time error thanks to the new annotation.
 
 ### Changes to the macro engine
 
-6) **[Blackbox/whitebox separation](http://docs.scala-lang.org/overviews/macros/blackbox-whitebox.html)**. Macros whose macro implementations use `scala.reflect.macros.blackbox.Context` (new in Scala 2.11.0) are called blackbox, have reduced power in comparison to macros in 2.10.x, better support in IDEs and better perspectives in becoming part of Scala. Macros whose macro implementations use `scala.reflect.macros.whitebox.Context` (new in Scala 2.11.0) or `scala.reflect.macros.Context` (the only context in Scala 2.10.x, deprecated in Scala 2.11.0) are called whitebox and have at least the same power as macros in 2.10.x.
+6) **[Blackbox/whitebox separation](blackbox-whitebox.html)**. Macros whose macro implementations use `scala.reflect.macros.blackbox.Context` (new in Scala 2.11.0) are called blackbox, have reduced power in comparison to macros in 2.10.x, better support in IDEs and better perspectives in becoming part of Scala. Macros whose macro implementations use `scala.reflect.macros.whitebox.Context` (new in Scala 2.11.0) or `scala.reflect.macros.Context` (the only context in Scala 2.10.x, deprecated in Scala 2.11.0) are called whitebox and have at least the same power as macros in 2.10.x.
 
-7) **[Macro bundles](http://docs.scala-lang.org/overviews/macros/bundles.html)**. It is well-known that path-dependent nature of the current reflection API (that's there in both Scala 2.10.x and Scala 2.11.0) makes it difficult to modularize macros. There are [design patterns](http://docs.scala-lang.org/overviews/macros/overview.html#writing_bigger_macros) that help to overcome this difficulty, but that just leads to proliferation of boilerplate. One of the approaches to dealing with this problem is doing away with cakes altogether, and that's what we're pursing in Project Palladium, but that was too big of a change to pull off in Scala 2.11.0, so we've come up with a workaround that would alleviate the problem until the real solution arrives. Macro bundles are classes that have a single public field of type `Context` and any public method inside a bundle can be referred to as a macro implementation. Such macro implementations can then easily call into other methods of the same class or its superclasses without having to carry the context around, because the bundle already carries the context that everyone inside it can see and refer to. This significantly simplifies writing and maintaining complex macros.
+7) **[Macro bundles](bundles.html)**. It is well-known that the path-dependent nature of the current reflection API (that's there in both Scala 2.10.x and Scala 2.11.0) makes it difficult to modularize macros. There are [design patterns](overview.html#writing-bigger-macros) that help to overcome this difficulty, but that just leads to proliferation of boilerplate. One of the approaches to dealing with this problem is doing away with cakes altogether, and that's what we're pursuing in Project Palladium, but that was too big of a change to pull off in Scala 2.11.0, so we've come up with a workaround that would alleviate the problem until the real solution arrives. Macro bundles are classes that have a single public field of type `Context` and any public method inside a bundle can be referred to as a macro implementation. Such macro implementations can then easily call into other methods of the same class or its superclasses without having to carry the context around, because the bundle already carries the context that everyone inside it can see and refer to. This significantly simplifies writing and maintaining complex macros.
 
 8) **Relaxed requirements for signatures of macro implementations**. With the advent of quasiquotes, reify is quickly growing out of favor as being too clunky and inflexible. To recognize that we now allow both arguments and return types of macro implementations to be of type `c.Tree` rather than `c.Expr[Something]`. There's no longer a need to write huge type signatures and then spend time and lines of code trying to align your macro implementations with those types. Just take trees in and return trees back - the boilerplate is gone.
 
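Item 8's relaxed signatures, sketched for Scala 2.11 (illustrative names; both variants expand `twice(e)` into `e + e`, so the argument expression is evaluated twice):

    import scala.language.experimental.macros
    import scala.reflect.macros.blackbox.Context

    object Arith {
      // 2.10 style: c.Expr[...] in the signature, reify/splice in the body.
      def twice(x: Int): Int = macro twiceImpl
      def twiceImpl(c: Context)(x: c.Expr[Int]): c.Expr[Int] = {
        import c.universe._
        reify(x.splice + x.splice)
      }

      // 2.11 style: plain trees in, plain trees out.
      def twiceTree(x: Int): Int = macro twiceTreeImpl
      def twiceTreeImpl(c: Context)(x: c.Tree): c.Tree = {
        import c.universe._
        q"$x + $x"
      }
    }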
@@ -120,11 +120,11 @@ Quasiquotes is the single most impressive upgrade for reflection and macros in S
 
 ### How to make your 2.11.0 macros work in 2.10.x
 
-34) **Quasiquotes**. We don't plan to release quasiquotes as part of the Scala 2.10.x distribution, but they can still be used in Scala 2.10.x by the virtue of the macro paradise plugin. Read [paradise documentation](http://docs.scala-lang.org/overviews/macros/paradise.html) to learn more about what's required to use the compiler plugin, what are the binary compatibility consequences and what are the support guarantees.
+34) **Quasiquotes**. We don't plan to release quasiquotes as part of the Scala 2.10.x distribution, but they can still be used in Scala 2.10.x by virtue of the macro paradise plugin. Read [paradise documentation](paradise.html) to learn more about what's required to use the compiler plugin, what the binary compatibility consequences are and what the support guarantees are.
 
-35) **Most of the new functionality doesn't have equivalents in 2.10.x**. We don't plan to backport any of the new functionality, e.g. fundep materialization or macro bundles, to Scala 2.10.x (except for maybe thread safety for runtime reflection). Consult [the roadmap of macro paradise for Scala 2.10.x](http://docs.scala-lang.org/overviews/macros/roadmap.html) to see what features are supported in paradise.
+35) **Most of the new functionality doesn't have equivalents in 2.10.x**. We don't plan to backport any of the new functionality, e.g. fundep materialization or macro bundles, to Scala 2.10.x (except for maybe thread safety for runtime reflection). Consult [the roadmap of macro paradise for Scala 2.10.x](roadmap.html) to see what features are supported in paradise.
 
-36) **Blackbox/whitebox**. If you're determined to have your macros blackbox, it's going to require additional effort to have those macros working consistently in both 2.10.x and 2.11.0, because in 2.10.x all macros are whitebox. First of all, make sure that you're not actually using any of [whitebox powers](http://docs.scala-lang.org/overviews/macros/blackbox-whitebox.html#codifying_the_distinction), otherwise you'll have to rewrite your macros first. Secondly, before returning from your macros impls, explicitly upcast the expansions to the type required by their macro defs. (Of course, none of this applies to whitebox macros. If don't mind your macros being whitebox, then you don't have to do anything to ensure cross-compatibility).
+36) **Blackbox/whitebox**. If you're determined to have your macros blackbox, it's going to require additional effort to have those macros working consistently in both 2.10.x and 2.11.0, because in 2.10.x all macros are whitebox. First of all, make sure that you're not actually using any of the [whitebox powers](blackbox-whitebox.html#codifying-the-distinction), otherwise you'll have to rewrite your macros first. Secondly, before returning from your macro impls, explicitly upcast the expansions to the type required by their macro defs. (Of course, none of this applies to whitebox macros. If you don't mind your macros being whitebox, then you don't have to do anything to ensure cross-compatibility.)
 
     object Macros {
       def impl(c: Context) = {
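Since quasiquotes come up throughout the changelog above, a tiny runtime-universe sketch of building and taking apart trees with them (requires scala-reflect on the classpath; output shown in comments is approximate):

    import scala.reflect.runtime.universe._

    val tree = q"List(1, 2, 3).map(x => x + 1)"   // construct a tree from Scala-looking syntax
    val q"$receiver.map($fn)" = tree              // deconstruct it with the same notation
    println(showCode(receiver))                   // prints something like List(1, 2, 3)
    println(showCode(fn))                         // prints something like ((x) => x.+(1))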
diff --git a/_overviews/macros/overview.md b/_overviews/macros/overview.md
index b9d2628c4b..f099131408 100644
--- a/_overviews/macros/overview.md
+++ b/_overviews/macros/overview.md
@@ -23,7 +23,7 @@ A subset of def macros, pending a thorough specification, is tentatively schedul
 so naturally the contents of the document are outdated. Nevertheless, this guide is not obsolete -
 everything written here will still work in both Scala 2.10.x and Scala 2.11.x, so it will be helpful to read it through.
 After reading the guide, take a look at the docs on [quasiquotes](/overviews/quasiquotes/intro.html)
-and [macro bundles](/overviews/macros/bundles.html) to familiarize yourself with latest developments
+and [macro bundles](bundles.html) to familiarize yourself with latest developments
 that dramatically simplify writing macros. Then it might be a good idea to follow
 [our macro workshop](https://github.com/scalamacros/macrology201) for more in-depth examples.
 
diff --git a/_overviews/macros/paradise.md b/_overviews/macros/paradise.md
index 130d293789..775b3ebc75 100644
--- a/_overviews/macros/paradise.md
+++ b/_overviews/macros/paradise.md
@@ -42,7 +42,7 @@ Proceed to the [the feature list](/overviews/macros/roadmap.html) document for m
 Consult [https://github.com/scalamacros/sbt-example-paradise](https://github.com/scalamacros/sbt-example-paradise)
 for an end-to-end example, but in a nutshell working with macro paradise is as easy as adding the following two lines
-to your build (granted you’ve already [set up sbt](/overviews/macros/overview.html#using_macros_with_maven_or_sbt)
+to your build (granted you’ve already [set up sbt](/overviews/macros/overview.html#using-macros-with-maven-or-sbt)
 to use macros).
 
     resolvers += Resolver.sonatypeRepo("releases")
 
diff --git a/_overviews/macros/roadmap.md b/_overviews/macros/roadmap.md
index 6140808a38..6c24216c8f 100644
--- a/_overviews/macros/roadmap.md
+++ b/_overviews/macros/roadmap.md
@@ -32,7 +32,7 @@ and become the new standard way of doing metaprogramming in Scala.
 | [Def macros](/overviews/macros/overview.html) | Yes | Yes 1 | Yes | Yes 1 | Yes | Yes 1 |
 | [Macro bundles](/overviews/macros/bundles.html) | No | No 1 | Yes | Yes 1 | Yes | Yes 1 |
 | [Implicit macros](/overviews/macros/implicits.html) | Yes (since 2.10.2) | Yes 1 | Yes | Yes 1 | Yes | Yes 1 |
-| [Fundep materialization](/overviews/macros/implicits.html#fundep_materialization) | Yes (since 2.10.5) 3 | Yes 2 | Yes | Yes 1 | Yes | Yes 1 |
+| [Fundep materialization](/overviews/macros/implicits.html#fundep-materialization) | Yes (since 2.10.5) 3 | Yes 2 | Yes | Yes 1 | Yes | Yes 1 |
 | [Type providers](/overviews/macros/typeproviders.html) | Partial support (see docs) | Yes 2 | Partial support (see docs) | Yes 2 | Partial support (see docs) | Yes 2 |
 | [Quasiquotes](/overviews/quasiquotes/intro.html) | No | Yes 1 | Yes | Yes 1 | Yes | Yes 1 |
 | [Type macros](/overviews/macros/typemacros.html) | No | No | No | No | No | No |
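For reference, the two build lines the paradise.md hunk alludes to look roughly like this (a sketch; the plugin version is only an example, so pick the paradise release that matches your Scala version):

    // build.sbt (sketch)
    resolvers += Resolver.sonatypeRepo("releases")
    addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.1" cross CrossVersion.full)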
diff --git a/_overviews/parallel-collections/concrete-parallel-collections.md b/_overviews/parallel-collections/concrete-parallel-collections.md
index 558755cb55..d171a4968a 100644
--- a/_overviews/parallel-collections/concrete-parallel-collections.md
+++ b/_overviews/parallel-collections/concrete-parallel-collections.md
@@ -32,9 +32,9 @@ arrays in the sense that their size is constant.
     res1: scala.collection.parallel.mutable.ParArray[Int] = ParArray(0, 1, 2, 3, 4, 5, 6, 7,...
 
 Internally, splitting a parallel array
-[splitter]({{ site.baseurl }}/overviews/parallel-collections/architecture.html#core_abstractions)
+[splitter]({{ site.baseurl }}/overviews/parallel-collections/architecture.html#core-abstractions)
 amounts to creating two new splitters with their iteration indices updated.
-[Combiners]({{ site.baseurl }}/overviews/parallel-collections/architecture.html#core_abstractions)
+[Combiners]({{ site.baseurl }}/overviews/parallel-collections/architecture.html#core-abstractions)
 are slightly more involved.Since for most transformer methods (e.g.
 `flatMap`, `filter`, `takeWhile`, etc.) we don't know the number of elements (and hence, the array size) in advance,
 each combiner is essentially a variant of an array buffer with an
@@ -65,9 +65,9 @@ update time.
     res0: scala.collection.parallel.immutable.ParVector[Int] = ParVector(0, 2, 4, 6, 8, 10, 12, 14, 16, 18,...
 
 Immutable vectors are represented by 32-way trees, so
-[splitter]({{ site.baseurl }}/overviews/parallel-collections/architecture.html#core_abstractions)s
+[splitter]({{ site.baseurl }}/overviews/parallel-collections/architecture.html#core-abstractions)s
 are split by assigning subtrees to each splitter.
-[Combiners]({{ site.baseurl }}/overviews/parallel-collections/architecture.html#core_abstractions)
+[Combiners]({{ site.baseurl }}/overviews/parallel-collections/architecture.html#core-abstractions)
 currently keep a vector of elements and are combined by lazily copying
 the elements. For this reason, transformer methods are less scalable
 than those of a parallel array. Once the
@@ -94,7 +94,7 @@ created in a similar way as the sequential
     res1: scala.collection.parallel.immutable.ParRange = ParRange(15, 13, 11, 9, 7, 5)
 
 Just as sequential ranges have no builders, parallel ranges have no
-[combiner]({{ site.baseurl }}/overviews/parallel-collections/architecture.html#core_abstractions)s.
+[combiner]({{ site.baseurl }}/overviews/parallel-collections/architecture.html#core-abstractions)s.
 Mapping the elements of a parallel range produces a parallel vector.
 Sequential ranges and parallel ranges can be converted efficiently one
 from another using the `seq` and `par` methods.
@@ -153,7 +153,7 @@ and
     res0: Int = 332833500
 
 Similar to parallel hash tables, parallel hash trie
-[combiners]({{ site.baseurl }}/overviews/parallel-collections/architecture.html#core_abstractions)
+[combiners]({{ site.baseurl }}/overviews/parallel-collections/architecture.html#core-abstractions)
 pre-sort the elements into buckets and construct the resulting hash
 trie in parallel by assigning different buckets to different
 processors, which construct the
@@ -194,7 +194,7 @@ following example which outputs square roots of number from 1 to 99:
     ...
 
-[Combiners]({{ site.baseurl }}/overviews/parallel-collections/architecture.html#core_abstractions)
+[Combiners]({{ site.baseurl }}/overviews/parallel-collections/architecture.html#core-abstractions)
 are implemented as `TrieMap`s under the hood-- since this is a
 concurrent data structure, only one combiner is constructed for the
 entire transformer method invocation and shared by all the processors.
 
diff --git a/_overviews/parallel-collections/overview.md b/_overviews/parallel-collections/overview.md
index d3ac71baeb..c3b69b97ad 100644
--- a/_overviews/parallel-collections/overview.md
+++ b/_overviews/parallel-collections/overview.md
@@ -82,7 +82,7 @@ only. As a general heuristic, speed-ups tend to be noticeable when the size of
 the collection is large, typically several thousand elements. (For more
 information on the relationship between the size of a parallel collection and
 performance, please see the
-[appropriate subsection]({{ site.baseurl}}/overviews/parallel-collections/performance.html#how_big_should_a_collection_be_to_go_parallel) of the [performance]({{ site.baseurl }}/overviews/parallel-collections/performance.html)
+[appropriate subsection]({{ site.baseurl}}/overviews/parallel-collections/performance.html#how-big-should-a-collection-be-to-go-parallel) of the [performance]({{ site.baseurl }}/overviews/parallel-collections/performance.html)
 section of this guide.)
 
 #### map
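A compact sketch of the parallel-collections machinery discussed above (standard library only, on Scala versions where parallel collections ship with it; actual speed-ups depend on collection size and hardware):

    import scala.collection.parallel.immutable.ParVector

    val pv: ParVector[Int] = Vector.tabulate(10000)(identity).par   // .par on a Vector yields a ParVector
    val doubled = pv.map(_ * 2)                                     // transformer methods run in parallel

    val pr = (1 to 10000).par                                       // a parallel range
    val roots = pr.map(math.sqrt(_))                                // mapping a parallel range produces a parallel vector
    val back = pr.seq                                               // cheap conversion back to the sequential range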
diff --git a/getting-started.md b/getting-started.md
index 340f89d8c8..c93106ed54 100644
--- a/getting-started.md
+++ b/getting-started.md
@@ -53,9 +53,9 @@ Scala project. -->
 ## Next Steps
 
 Once you've finished these tutorials, check out
-* [The Tour of Scala](http://docs.scala-lang.org/tutorials/tour/tour-of-scala.html) for bite-sized introductions to Scala's features.
-* [Learning Resources](learn), which includes online interactive tutorials and courses.
-* [Our list of some popular Scala books]({{ site.baseurl }}/books.html).
+* [The Tour of Scala](tutorials/tour/tour-of-scala.html) for bite-sized introductions to Scala's features.
+* [Learning Resources](learn.html), which includes online interactive tutorials and courses.
+* [Our list of some popular Scala books](books.html).
 
 ## Getting Help
 There are a multitude of mailing lists and real-time chat channels in case you want to quickly connect with other Scala users. Check out our [community](https://scala-lang.org/community/) page a list of these resources and where to reach out for help.