diff --git a/_includes/footer.html b/_includes/footer.html index 6d233e5c8e..eb43f821b8 100644 --- a/_includes/footer.html +++ b/_includes/footer.html @@ -53,7 +53,7 @@ - + {% if page.layout == "sips"%} diff --git a/_overviews/FAQ/collections.md b/_overviews/FAQ/collections.md index b044191727..d78a2340a9 100644 --- a/_overviews/FAQ/collections.md +++ b/_overviews/FAQ/collections.md @@ -167,7 +167,7 @@ The operation is meant to traverse all elements of the collection, and apply the given operation f to each element. The application is done for its side effect only; in fact any function result of f is discarded by foreach. -Traversible objects can be finite or infinite. An example of an infinite +Traversable objects can be finite or infinite. An example of an infinite traversable object is the stream of natural numbers `Stream.from(0)`. The method `hasDefiniteSize` indicates whether a collection is possibly infinite. If `hasDefiniteSize` returns true, the collection is certainly finite. If it @@ -250,7 +250,7 @@ it. Also available are some traits with further refinements, such as * `LinearSeq` -- A trait for linear sequences, with efficient time for `isEmpty`, `head` and `tail`. * `immutable.LinearSeq` - * `immutable.List` -- An immutable, singlely-linked, list implementation. + * `immutable.List` -- An immutable, singly-linked, list implementation. * `immutable.Stream` -- A lazy-list. Its elements are only computed on-demand, but memoized (kept in memory) afterwards. It can be theoretically infinite. * `mutable.LinearSeq` * `mutable.DoublyLinkedList` -- A list with mutable `prev`, `head` (`elem`) and `tail` (`next`). @@ -343,7 +343,7 @@ it. Also available are some traits with further refinements, such as This was done to achieve maximum code reuse. The concrete *generic* implementation for classes with a certain structure (a traversable, a map, etc) is done in the Like classes. 
The classes intended for general consumption, -then, override selected methods that can be optmized. +then, override selected methods that can be optimized. * What the companion methods are for (e.g. List.companion)? diff --git a/_overviews/FAQ/context-bounds.md b/_overviews/FAQ/context-bounds.md index fa0beaaaa6..ad79c2990d 100644 --- a/_overviews/FAQ/context-bounds.md +++ b/_overviews/FAQ/context-bounds.md @@ -40,7 +40,7 @@ Another very common example in the library is a bit more complex: def f[A : Ordering](a: A, b: A) = implicitly[Ordering[A]].compare(a, b) -Here, `implicitly` is used to retrive the implicit value we want, one of type +Here, `implicitly` is used to retrieve the implicit value we want, one of type `Ordering[A]`, which class defines the method `compare(a: A, b: A): Int`. We'll see another way of doing this below. diff --git a/_overviews/FAQ/finding-implicits.md b/_overviews/FAQ/finding-implicits.md index 015216f28c..69a1c4a49e 100644 --- a/_overviews/FAQ/finding-implicits.md +++ b/_overviews/FAQ/finding-implicits.md @@ -219,7 +219,7 @@ by a `t` that is not implicit, so no implicit `T` is in scope. The invocation of `f` was enabled by importing from `Y.X.`. But it is not convenient to require an import to access implicit values -providied by a package. +provided by a package. If an implicit value is not found in lexical scope, implicit search continues in implicit scope. diff --git a/_overviews/FAQ/finding-symbols.md b/_overviews/FAQ/finding-symbols.md index 0512856737..aca08ae12f 100644 --- a/_overviews/FAQ/finding-symbols.md +++ b/_overviews/FAQ/finding-symbols.md @@ -160,7 +160,7 @@ object of type that is receiving the method. For example, consider `"a" -> 1`. W to look for an implicit which works on `"a"`, and so it can take `String`, one of its supertypes (`AnyRef` or `Any`) or a type parameter. In this case, we find `implicit final class ArrowAssoc[A](private val self: A)` which makes this implicit -avaialable on all types. 
+available on all types. Other implicit conversions may be visible in your scope depending on imports, extended types or self-type annotations. See [Finding implicits](finding-implicits.html) for details. diff --git a/_overviews/collections/migrating-from-scala-27.md b/_overviews/collections/migrating-from-scala-27.md index c2c2bd9935..87f557ca4f 100644 --- a/_overviews/collections/migrating-from-scala-27.md +++ b/_overviews/collections/migrating-from-scala-27.md @@ -41,7 +41,7 @@ Generally, the old functionality of Scala 2.7 collections has been left in place There are two parts of the old libraries which have been replaced wholesale, and for which deprecation warnings were not feasible. -1. The previous `scala.collection.jcl` package is gone. This package tried to mimick some of the Java collection library design in Scala, but in doing so broke many symmetries. Most people who wanted Java collections bypassed `jcl` and used `java.util` directly. Scala 2.8 offers automatic conversion mechanisms between both collection libraries in the [JavaConversions]({{ site.baseurl }}/overviews/collections/conversions-between-java-and-scala-collections.html) object which replaces the `jcl` package. +1. The previous `scala.collection.jcl` package is gone. This package tried to mimic some of the Java collection library design in Scala, but in doing so broke many symmetries. Most people who wanted Java collections bypassed `jcl` and used `java.util` directly. Scala 2.8 offers automatic conversion mechanisms between both collection libraries in the [JavaConversions]({{ site.baseurl }}/overviews/collections/conversions-between-java-and-scala-collections.html) object which replaces the `jcl` package. 2. Projections have been generalized and cleaned up and are now available as views. It seems that projections were used rarely, so not much code should be affected by this change. So, if your code uses either `jcl` or projections there might be some minor rewriting to do. 
diff --git a/_overviews/core/architecture-of-scala-collections.md b/_overviews/core/architecture-of-scala-collections.md index 5cb894f1e9..3a2dcf6fb4 100644 --- a/_overviews/core/architecture-of-scala-collections.md +++ b/_overviews/core/architecture-of-scala-collections.md @@ -907,7 +907,7 @@ return with `None`. The combined selection over an option value `opt` is elegantly expressed using `opt.flatMap(x => f(x))`. When applied to an optional value that is `None`, it returns `None`. Otherwise `opt` is `Some(x)` and the function `f` is applied to the encapsulated value `x`, -yielding a new option, which is returned by the flatmap. +yielding a new option, which is returned by the flatMap. The next two methods to implement for a mutable map are `+=` and `-=`. In the implementation of `PrefixMap`, these are defined in terms of two @@ -948,7 +948,7 @@ using the map's `+=` method. For immutable maps, the non-destructive element addition method `+` is used instead of method `+=`. Sets work in the same way. -However, in all these cases, to build the right kind of colletion +However, in all these cases, to build the right kind of collection you need to start with an empty collection of that kind. This is provided by the `empty` method, which is the last method defined in `PrefixMap`. This method simply returns a fresh `PrefixMap`. diff --git a/_overviews/core/binary-compatibility-of-scala-releases.md b/_overviews/core/binary-compatibility-of-scala-releases.md index c12db85230..749a31f6cc 100644 --- a/_overviews/core/binary-compatibility-of-scala-releases.md +++ b/_overviews/core/binary-compatibility-of-scala-releases.md @@ -11,8 +11,8 @@ When two versions of Scala are binary compatible, it is safe to compile your pro We check binary compatibility automatically with [MiMa](https://github.com/lightbend/migration-manager). 
We strive to maintain a similar invariant for the `behavior` (as opposed to just linkage) of the standard library, but this is not checked mechanically (Scala is not a proof assistant so this is out of reach for its type system). -#### Forwards and Back -We distinguish forwards and backwards compatibility (think of these as properties of a sequence of versions, not of an individual version). Maintaining backwards compatibility means code compiled on an older version will link with code compiled with newer ones. Forwards compatibility allows you to compile on new versions and run on older ones. +#### Forward and Back +We distinguish forward and backward compatibility (think of these as properties of a sequence of versions, not of an individual version). Maintaining backward compatibility means code compiled on an older version will link with code compiled with newer ones. Forward compatibility allows you to compile on new versions and run on older ones. Thus, backwards compatibility precludes the removal of (non-private) methods, as older versions could call them, not knowing they would be removed, whereas forwards compatibility disallows adding new (non-private) methods, because newer programs may come to depend on them, which would prevent them from running on older versions (private methods are exempted here as well, as their definition and call sites must be in the same compilation unit). diff --git a/_overviews/core/collections-migration-213.md b/_overviews/core/collections-migration-213.md index 71b9cbf815..f2b33ae434 100644 --- a/_overviews/core/collections-migration-213.md +++ b/_overviews/core/collections-migration-213.md @@ -29,7 +29,7 @@ The most important changes in the Scala 2.13 collections library are: The [scala-collection-compat](https://github.com/scala/scala-collection-compat) is a library released for 2.11, 2.12 and 2.13 that provides some of the new APIs from Scala 2.13 for the older versions. This simplifies cross-building projects.
-The module also provides [migratrion rules](https://github.com/scala/scala-collection-compat#migration-tool) for [scalafix](https://scalacenter.github.io/scalafix/docs/users/installation.html) that can update a project's source code to work with the 2.13 collections library. +The module also provides [migration rules](https://github.com/scala/scala-collection-compat#migration-tool) for [scalafix](https://scalacenter.github.io/scalafix/docs/users/installation.html) that can update a project's source code to work with the 2.13 collections library. ## scala.Seq, varargs and scala.IndexedSeq migration diff --git a/_overviews/core/futures.md b/_overviews/core/futures.md index c753fca53d..a0eea92daf 100644 --- a/_overviews/core/futures.md +++ b/_overviews/core/futures.md @@ -665,16 +665,16 @@ multiple `andThen` calls are ordered, as in the following example which stores the recent posts from a social network to a mutable set and then renders all the posts to the screen: - val allposts = mutable.Set[String]() + val allPosts = mutable.Set[String]() Future { session.getRecentPosts } andThen { - case Success(posts) => allposts ++= posts + case Success(posts) => allPosts ++= posts } andThen { case _ => clearAll() - for (post <- allposts) render(post) + for (post <- allPosts) render(post) } In summary, the combinators on futures are purely functional. diff --git a/_overviews/core/implicit-classes.md b/_overviews/core/implicit-classes.md index 9b3d819d87..501724cdb4 100644 --- a/_overviews/core/implicit-classes.md +++ b/_overviews/core/implicit-classes.md @@ -72,8 +72,8 @@ Implicit classes have the following restrictions: implicit class RichDate(date: java.util.Date) // OK! - implicit class Indexer[T](collecton: Seq[T], index: Int) // BAD! - implicit class Indexer[T](collecton: Seq[T])(implicit index: Index) // OK! + implicit class Indexer[T](collection: Seq[T], index: Int) // BAD! + implicit class Indexer[T](collection: Seq[T])(implicit index: Index) // OK! 
While it's possible to create an implicit class with more than one non-implicit argument, such classes diff --git a/_overviews/macros/blackbox-whitebox.md b/_overviews/macros/blackbox-whitebox.md index aade9367f2..f243e4ac74 100644 --- a/_overviews/macros/blackbox-whitebox.md +++ b/_overviews/macros/blackbox-whitebox.md @@ -29,7 +29,7 @@ In fact, macros became an important part of our ecosystem so quickly that just a Macro flavors are plentiful, so we decided to carefully examine them to figure out which ones should be put in the standard. This entails answering a few important questions. Why are macros working so well? Why do people use them? Our hypothesis is that this happens because the hard to comprehend notion of metaprogramming expressed in def macros piggybacks on the familiar concept of a typed method call. Thanks to that, the code that users write can absorb more meaning without becoming bloated or losing -compehensibility. +comprehensibility. ## Blackbox and whitebox macros diff --git a/_overviews/macros/changelog211.md b/_overviews/macros/changelog211.md index 0b91acfb4c..c4753e56ea 100644 --- a/_overviews/macros/changelog211.md +++ b/_overviews/macros/changelog211.md @@ -60,7 +60,7 @@ Quasiquotes is the single most impressive upgrade for reflection and macros in S 16) **[knownDirectSubclasses is deemed to be officially broken](https://issues.scala-lang.org/browse/SI-7046)**. A lot of users who tried to traverse sealed hierarchies of classes have noticed that `ClassSymbol.knownDirectSubclasses` only works if invocations of their macros come after the definitions of those hierarchies in Scala's compilation order. For instance, if a sealed hierarchy is defined in the bottom of a source file, and a macro application is written in the top of the file, then knownDirectSubclasses will return an empty list. This is an issue that is deeply rooted in Scala's internal architecture, and we can't provide a fix for it in the near future. -17) **showCode**. 
Along with `Tree.toString` that prints Scala-ish source code and `showRaw(tree)` that prints internal structures of trees, we now have `showCode` that prints compileable Scala source code corresponding to the provided tree, courtesy of Vladimir Nikolaev, who's done an amazing work of bringing this to life. We plan to eventually replace `Tree.toString` with `showCode`, but in Scala 2.11.0 these are two different methods. +17) **showCode**. Along with `Tree.toString` that prints Scala-ish source code and `showRaw(tree)` that prints internal structures of trees, we now have `showCode` that prints compilable Scala source code corresponding to the provided tree, courtesy of Vladimir Nikolaev, who's done amazing work bringing this to life. We plan to eventually replace `Tree.toString` with `showCode`, but in Scala 2.11.0 these are two different methods. 18) **[It is now possible to typecheck in type and pattern modes](https://issues.scala-lang.org/browse/SI-6814)**. A very convenient `Context.typeCheck` and `ToolBox.typeCheck` functionality of Scala 2.10.x had a significant inconvenience - it only worked for expressions, and typechecking something as a type or as a pattern required building dummy expressions. Now `typeCheck` has the mode parameter that take case of that difficulty. @@ -170,7 +170,7 @@ Quasiquotes is the single most impressive upgrade for reflection and macros in S // provides a source compatibility stub // in Scala 2.10.x, it will make `import compat._` compile just fine, // even though `c.universe` doesn't have `compat` - // in Scala 2.11.0, it will be ignored, becase `import c.universe._` + // in Scala 2.11.0, it will be ignored, because `import c.universe._` // brings its own `compat` in scope and that one takes precedence private object HasCompat { val compat = ???
}; import HasCompat._ diff --git a/_overviews/parallel-collections/architecture.md b/_overviews/parallel-collections/architecture.md index d5173e915f..487c6deac9 100644 --- a/_overviews/parallel-collections/architecture.md +++ b/_overviews/parallel-collections/architecture.md @@ -37,7 +37,7 @@ terms of two core abstractions-- `Splitter`s and `Combiner`s. ### Splitters The job of a `Splitter`, as its name suggests, is to split a parallel -collection into a non-trival partition of its elements. The basic idea is to +collection into a non-trivial partition of its elements. The basic idea is to split the collection into smaller parts until they are small enough to be operated on sequentially. @@ -55,7 +55,7 @@ subsets of elements of the whole parallel collection. And similar to normal `Iterator`s, a `Splitter` is invalidated after its `split` method is invoked. In general, collections are partitioned using `Splitter`s into subsets of -roughly the same size. In cases where more arbitrarily-sized partions are +roughly the same size. In cases where more arbitrarily-sized partitions are required, in particular on parallel sequences, a `PreciseSplitter` is used, which inherits `Splitter` and additionally implements a precise split method, `psplit`. @@ -82,7 +82,7 @@ and the type of the resulting collection, respectively. _Note:_ Given two `Combiner`s, `c1` and `c2` where `c1 eq c2` is `true` (meaning they're the same `Combiner`), invoking `c1.combine(c2)` always does -nothing and simpy returns the receiving `Combiner`, `c1`. +nothing and simply returns the receiving `Combiner`, `c1`. 
## Hierarchy diff --git a/_overviews/quasiquotes/definition-details.md b/_overviews/quasiquotes/definition-details.md index d859b6d136..7d3c38e3cc 100644 --- a/_overviews/quasiquotes/definition-details.md +++ b/_overviews/quasiquotes/definition-details.md @@ -216,7 +216,7 @@ Abstract type definitions have the following shape: low: universe.Tree = high: universe.Tree = List[T] -Whenever one of the bounds isn\'t available, it gets represented as an [empty tree](expression-details.html#empty). Here each of the type arguments is a type definition itself. +Whenever one of the bounds isn't available, it gets represented as an [empty tree](expression-details.html#empty). Here each of the type arguments is a type definition itself. Another form of type definition is a type alias: @@ -259,7 +259,7 @@ Alternatively you can also deconstruct arguments, separating implicit and non-im implparams: List[universe.ValDef] = List(implicit val y: Int = _) body: universe.Tree = x.$plus(y) -This way of handling parameters will still work if the method doesn\'t have any implicit parameters and `implparams` will get extracted as an empty list: +This way of handling parameters will still work if the method doesn't have any implicit parameters and `implparams` will get extracted as an empty list: scala> val q"def g(...$paramss)(implicit ..$implparams) = $rhs" = q"def g(x: Int)(y: Int) = x + y" @@ -344,7 +344,7 @@ Packages are a fundamental primitive to organize source code. You can express th } }) -Quasiquotes don\'t support the inline package definition syntax that is usually used in the header of the source file (but it's equivalent to the supported one in terms of ASTs). +Quasiquotes don't support the inline package definition syntax that is usually used in the header of the source file (but it's equivalent to the supported one in terms of ASTs). 
## Package Object Definition @@ -352,7 +352,7 @@ Package objects are a cross between packages and object: q"package object $tname extends { ..$earlydefns } with ..$parents { $self => ..$stats }" -All of the handling properties are equivalent to those of objects apart from the fact that they don\'t have [modifiers](#modifiers). +All of the handling properties are equivalent to those of objects apart from the fact that they don't have [modifiers](#modifiers). Even though package and regular objects seem to be quite similar syntactically, they don't match one another: diff --git a/_overviews/quasiquotes/expression-details.md b/_overviews/quasiquotes/expression-details.md index 371b7508dc..19cbc5ee67 100644 --- a/_overviews/quasiquotes/expression-details.md +++ b/_overviews/quasiquotes/expression-details.md @@ -290,7 +290,7 @@ At the moment, tuples are only supported up to an arity of 22, but this is just scala> val `tuple 23 supported?` = definitions.TupleClass(23) != NoSymbol tuple 23 supported?: Boolean = false -Despited the fact that `Tuple1` class exists there is no built-in syntax for it. Single parens around expression do not change its meaning: +Despite the fact that the `Tuple1` class exists, there is no built-in syntax for it.
Single parens around an expression do not change its meaning: scala> val inparens = q"(a)" inparens: universe.Ident = a diff --git a/_overviews/quasiquotes/intro.md b/_overviews/quasiquotes/intro.md index fbebb00f77..274a0d936f 100644 --- a/_overviews/quasiquotes/intro.md +++ b/_overviews/quasiquotes/intro.md @@ -73,7 +73,7 @@ Each of these contexts is covered by a separate interpolator: tq | [types]({{ site.baseurl }}/overviews/quasiquotes/syntax-summary.html#types) pq | [patterns]({{ site.baseurl }}/overviews/quasiquotes/syntax-summary.html#patterns) -Syntactical similarity between different contexts doesn\'t imply similarity between underlying trees: +Syntactical similarity between different contexts doesn't imply similarity between underlying trees: scala> println(q"List[Int]" equalsStructure tq"List[Int]") false @@ -110,7 +110,7 @@ Unquote splicing is a way to unquote a variable number of elements: scala> val fab = q"f(..$ab)" fab: universe.Tree = f(a, b) -Dots before the unquotee annotate indicate a degree of flattenning and are called a *splicing rank*. `..$` expects the argument to be an `Iterable[Tree]` and `...$` expects an `Iterable[Iterable[Tree]]`. +Dots before the unquotee indicate a degree of flattening and are called a *splicing rank*. `..$` expects the argument to be an `Iterable[Tree]` and `...$` expects an `Iterable[Iterable[Tree]]`. Splicing can easily be combined with regular unquotation: diff --git a/_overviews/quasiquotes/terminology.md b/_overviews/quasiquotes/terminology.md index d528551c4c..3b153a2274 100644 --- a/_overviews/quasiquotes/terminology.md +++ b/_overviews/quasiquotes/terminology.md @@ -19,6 +19,6 @@ permalink: /overviews/quasiquotes/:title.html * **Tree deconstruction** refers to usages of quasiquotes as patterns to structurally tear apart trees. * **Unquoting** is a way of either putting things in or extracting things out of quasiquotes. Can be performed with `$` syntax within a quasiquote.
* **Unquote splicing** (or just splicing) is another form of unquoting that flattens contents of the unquotee into a tree. Can be performed with either `..$` or `...$` syntax. -* **Rank** is a degree of flattenning of unquotee: `rank($) == 0`, `rank(..$) == 1`, `rank(...$) == 2`. +* **Rank** is a degree of flattening of the unquotee: `rank($) == 0`, `rank(..$) == 1`, `rank(...$) == 2`. * [**Lifting**](lifting.html) is a way to unquote non-tree values and transform them into trees with the help of the `Liftable` typeclass. * [**Unlifting**](unlifting.html) is a way to unquote non-tree values out of quasiquote patterns with the help of the `Unliftable` typeclass. diff --git a/_overviews/quasiquotes/type-details.md b/_overviews/quasiquotes/type-details.md index b9d6929f22..9258d5928e 100644 --- a/_overviews/quasiquotes/type-details.md +++ b/_overviews/quasiquotes/type-details.md @@ -92,7 +92,7 @@ Lastly and [similarly to expressions](expression-details.html#super-and-this) on ## Applied Type -Instantiations of parametized types can be expressed with the help of applied types (type-level equivalent of type application): +Instantiations of parameterized types can be expressed with the help of applied types (type-level equivalent of type application): scala> val applied = tq"Foo[A, B]" applied: universe.Tree = Foo[A, B] @@ -151,7 +151,7 @@ Existential types consist of a type tree and a list of definitions: tpt: universe.Tree = List[T] defns: List[universe.MemberDef] = List(type T) -Alternatively there is also an underscrore notation: +Alternatively, there is also an underscore notation: scala> val tq"$tpt forSome { ..$defns }" = tq"List[_]" tpt: universe.Tree = List[_$1] diff --git a/_overviews/tutorials/binary-compatibility-for-library-authors.md b/_overviews/tutorials/binary-compatibility-for-library-authors.md index d8cf3002da..655317c95d 100644 --- a/_overviews/tutorials/binary-compatibility-for-library-authors.md +++
b/_overviews/tutorials/binary-compatibility-for-library-authors.md @@ -79,23 +79,23 @@ Two library versions are **Binary Compatible** with each other if the compiled b ### Relationship between source and binary compatibility While breaking source compatibility often results in binary incompatibilities as well, they are actually orthogonal -- breaking one does not imply breaking the other. -#### Forwards and Backwards Compatibility +#### Forward and Backward Compatibility There are two "directions" when we describe compatibility of a library release: -**Backwards Compatible** means that a newer library version can be used in an environment where an older version is expected. When talking about binary and source compatibility, +**Backward Compatible** means that a newer library version can be used in an environment where an older version is expected. When talking about binary and source compatibility, this is the common and implied direction. -**Forwards Compatible** means that an older library can be used in an environment where a newer version is expected. +**Forward Compatible** means that an older library can be used in an environment where a newer version is expected. Forward compatibility is generally not upheld for libraries. Let's look at an example where library `A v1.0.0` is compiled with library `C v1.1.0`. -![Forwards and Backwards Compatibility]({{ site.baseurl }}/resources/images/library-author-guide/fowards_backwards_compatibility.png){: style="width: 65%; margin: auto; display: block"} +![Forward and Backward Compatibility]({{ site.baseurl }}/resources/images/library-author-guide/fowards_backwards_compatibility.png){: style="width: 65%; margin: auto; display: block"} `C v1.1.0 ` is **Forwards Binary Compatible** with `v1.0.0` if we can use `v1.0.0`'s JAR at runtime instead of `v1.1.0`'s JAR without any linkage errors.
-`C v1.2.0 ` is **Backwards Binary Compatible** with `v1.1.0` if we can use `v1.2.0`'s JAR at runtime instead of `v1.1.0`'s JAR without any linkage errors. +`C v1.2.0 ` is **Backward Binary Compatible** with `v1.1.0` if we can use `v1.2.0`'s JAR at runtime instead of `v1.1.0`'s JAR without any linkage errors. ## Why binary compatibility matters diff --git a/_sips/minutes/2016-09-20-sip-20th-september-minutes.md b/_sips/minutes/2016-09-20-sip-20th-september-minutes.md index fe5bd46fe5..584975a64d 100644 --- a/_sips/minutes/2016-09-20-sip-20th-september-minutes.md +++ b/_sips/minutes/2016-09-20-sip-20th-september-minutes.md @@ -154,12 +154,12 @@ issue, two things are required: The second option is not feasible because unsigned numbers are AnyVals, and they can only extend `Object`. Working around this in the backend is, in Sébastien's opinion, not an exciting adventure to embark on: a lot of patches and quirky -fixes are required in the compiler. Sébastien, recognizing his unability to fix +fixes are required in the compiler. Sébastien, recognizing his inability to fix the issue, recommends to reject the proposal. Josh needs to leave. Eugene wonders if these problems are only JVM-specific. Sébastien replies that both yes and no, and he confirms that unsigned integers -will be implemented in Scalajs alone, so the implementation won't be +will be implemented in Scala.js alone, so the implementation won't be platform-independent. Eugene is interested in knowing if there will be any code duplication in the implementation, and Sébastien doesn't think so, since Scala Native implements unsigned integer in a different way. diff --git a/_sips/minutes/2016-11-29-sip-minutes.md b/_sips/minutes/2016-11-29-sip-minutes.md index a3736d79a0..c970ddb730 100644 --- a/_sips/minutes/2016-11-29-sip-minutes.md +++ b/_sips/minutes/2016-11-29-sip-minutes.md @@ -66,7 +66,7 @@ The main motivation is to prepare for inline. 
Inline won't work very well withou ### [SIP-NN:Static](https://github.com/scala/docs.scala-lang/pull/491/files) -Iulian says too much code is generated by annotations. We could solve name clashes the way ScalaJS does by specifying the exported name. How can we wake code generation predictable without looking at annotations? How do we emit public static field without accessors? Having everything emitted as static and object where possible is going to simplify reasoning about how things are initialized. +Iulian says too much code is generated by annotations. We could solve name clashes the way Scala.js does by specifying the exported name. How can we make code generation predictable without looking at annotations? How do we emit public static fields without accessors? Having everything emitted as static and object where possible is going to simplify reasoning about how things are initialized. How should a user decide when to use static? It is platform-dependent. diff --git a/_sips/minutes/2017-02-14-sip-minutes.md b/_sips/minutes/2017-02-14-sip-minutes.md index a1d8785ba0..ac54f216d3 100644 --- a/_sips/minutes/2017-02-14-sip-minutes.md +++ b/_sips/minutes/2017-02-14-sip-minutes.md @@ -147,7 +147,7 @@ The current SIP tries to make it behave as expected by the users in common cases **Jorge** We have to pass here both on this proposal as is right now but I think this could be dangerous in the case where we don't have an implementation for Scalac because maybe the details change and assume something in Scalac that the SIP is not able to predict or guard against it. Let's wait until next month and I will double check whether this is possible or not. Then I will get in touch with the Lightbend team to see whether this can be implemented or not. We'll decide in a month whether it should be accepted.
-**Sébastien** ScalaJS already implemented it under another name but it's supposed to be conservative with respect to the aesthetic SIP in the sense that things that are allowed now with @jsstatic will also be allowed with @static. @static might open up a little bit more. +**Sébastien** Scala.js already implemented it under another name but it's supposed to be conservative with respect to the aesthetic SIP in the sense that things that are allowed now with @jsstatic will also be allowed with @static. @static might open up a little bit more. **Conclusion** The static SIP proposal has to be implemented in Scala, as it's already present in Dotty. Triplequote (Iulian Dragos and Mirco Dotta) has offered to provide an implementation targeting 2.12.3. @@ -221,7 +221,7 @@ There are multiple use cases covered by this SIP. I think the two most important **Martin** If you don't emit a Scala signature then you can't have a co- or contravariant type parameter because they are only expressed in Scala signatures, in Java it's not there. I don't see how that follows from the current proposal. Also, isn't it platform dependent? -**Sébastien** We do have a Java signature. Scala-JS doesn't disable classfile emission. When you say quickly compile, it uses the classfiles to quickly compile. when you use macros, it will extend from those classfiles. When you use an IDE it reduces the classfiles to identify things. When you use sbt, it uses classfiles to detect the changes. However, they aren't used by the ScalaJS linker. +**Sébastien** We do have a Java signature. Scala.js doesn't disable classfile emission. When you say quickly compile, it uses the classfiles to quickly compile. when you use macros, it will extend from those classfiles. When you use an IDE it reduces the classfiles to identify things. When you use sbt, it uses classfiles to detect the changes. However, they aren't used by the Scala.js linker. 
**Seth** Does this need to be part of the compiler or can it move forward as a plugin or just as a check performed in MiMa? MiMa just compares two different APIs. Can it have this other job as well: seeing if it does anything outside of the boundaries. diff --git a/_sips/minutes/2017-05-08-sip-minutes.md b/_sips/minutes/2017-05-08-sip-minutes.md index e5d2e52ccf..752873ef1b 100644 --- a/_sips/minutes/2017-05-08-sip-minutes.md +++ b/_sips/minutes/2017-05-08-sip-minutes.md @@ -47,12 +47,12 @@ Minutes were taken by Darja Jovanovic. Proposal aims to introduce new syntax from comprehension for monads to comonads. Martin is the reviewer. He asks others attendees for their opinion on this. Everyone had read the SIP. -**Eugene** referred to original proposal and wishes to see a better motivation for this language feature encouraging use of “plain English” to simplify the use of Scala as practice oriented language. He believes that it could be critical how this SIP can be improved. During the recent conference, organized by Facebook, he spoke with typescript guys that are developing idiomic solutions that would benefit typescript and javascript and allow community users to give their inputs. +**Eugene** referred to the original proposal and wishes to see a better motivation for this language feature encouraging use of “plain English” to simplify the use of Scala as a practice-oriented language. He believes that it could be critical how this SIP can be improved. During the recent conference, organized by Facebook, he spoke with TypeScript guys that are developing idiomatic solutions that would benefit TypeScript and JavaScript and allow community users to give their inputs. Refers to the paper “Denotation” he linked in a proposal, that is not enough for Scala, but a good start. **Jorge** is getting back discussion on voting on this proposal and he mentioned that Josh insisted on more examples and suggestions on motivation of this SIP.
-**Eugene** wanted to add more syntax (map and flatmap), but **Martin** opposed to that saying that Scala is quite serious program and needs more reason to add any additional syntax to it. **Martin** would like to see more widespread use of comonadic constructs and Libraries, and before doing that, he wouldn’t consider any further change. **Sebastian** agrees with Martin and says that he doesn’t really understand Josh’s and Eugene’s proposal. **Iulian** agrees that the proposal is quite complicated and he wonders how it can be useful. He believes that it is an interesting research direction, but that it needs more users feedbacks in aim to be included in the Scala, therefore questioning if the proposal should be numbered in the current form. Seth and Adriaan agree with Martin and Iulian.
+**Eugene** wanted to add more syntax (map and flatMap), but **Martin** opposed that, saying that Scala is quite a serious program and needs more reason to add any additional syntax to it. **Martin** would like to see more widespread use of comonadic constructs and libraries, and before doing that, he wouldn’t consider any further change. **Sebastian** agrees with Martin and says that he doesn’t really understand Josh’s and Eugene’s proposal. **Iulian** agrees that the proposal is quite complicated and he wonders how it can be useful. He believes that it is an interesting research direction, but that it needs more user feedback before being included in Scala, therefore questioning if the proposal should be numbered in the current form. Seth and Adriaan agree with Martin and Iulian.
**Conclusion** Proposal discarded unanimously. They will send the feedback to the author.
diff --git a/_sips/minutes/2018-05-18-sip-minutes.md b/_sips/minutes/2018-05-18-sip-minutes.md
index 3f4c390f78..b707f52f73 100644
--- a/_sips/minutes/2018-05-18-sip-minutes.md
+++ b/_sips/minutes/2018-05-18-sip-minutes.md
@@ -62,7 +62,7 @@ Given the short time and amount of decisions that need to be made, the Committee
*Structure*
a) Have a **list of changes**, grouped in batches that would be decided within the next year, meeting once a month
-b) **Plan** - full and strucutured list of changes that need to be implemented consolidated between the Committee members using a shared a Google doc
+b) **Plan** - full and structured list of changes that need to be implemented consolidated between the Committee members using a shared Google doc
c) **Public comments** - each batch should be published on the Contributors thread, for a month at a time in order to have community involved, share their opinion and contribute. Advise was proposed - each thread should clearly point out start and end date of collecting the comments/suggestions.
*Organisation*
@@ -80,10 +80,10 @@ The above mentioned structure and organisation was gathered throughout the meeti
4. Other: spec, quorum
-**Heather** bings up an important question "What about Scala spec" ([YouTube time: 4'49''](https://youtu.be/q2LVmTe9qmU?t=289)) to which **Martin** responds within the next year we should know which features are included as a first priority but that spec should not be left for the last minute.
+**Heather** brings up an important question "What about Scala spec" ([YouTube time: 4'49''](https://youtu.be/q2LVmTe9qmU?t=289)) to which **Martin** responds within the next year we should know which features are included as a first priority but that spec should not be left for the last minute.
**Miles** ([YouTube time: 8'45](https://youtu.be/q2LVmTe9qmU?t=525))suggested that SIP proposals should include draft specification changes to save time and effort pulling the eventual spec update together.
**Martin** ([YouTube time: 37'59''](https://youtu.be/q2LVmTe9qmU?t=2279)) also raised a question about the decision making process, asking if it would be better to change to simple majority when it comes to voting. This was rejected by most of the Members and agreed it should be discussed in a different meeting or time.
-**Conclusion** The first batch should be agreed upon, posted on the Contributors thread for public comments. Such discussion should be summaraized and included in the next meeting (22nd June 2018, after ScalaDays NewYork).
+**Conclusion** The first batch should be agreed upon, posted on the Contributors thread for public comments. Such discussion should be summarized and included in the next meeting (22nd June 2018, after Scala Days New York).
diff --git a/_sips/minutes/2018-08-30-sip-minutes.md b/_sips/minutes/2018-08-30-sip-minutes.md
index 30fbc4bc49..48bc21c12f 100644
--- a/_sips/minutes/2018-08-30-sip-minutes.md
+++ b/_sips/minutes/2018-08-30-sip-minutes.md
@@ -141,7 +141,7 @@ Comments
+ Interaction with nullary constructors. Currently `class Foo` is interpreted as `class Foo()`. Mutatis mutandis for case class `apply` methods.
-**Martin** ([Youtube time: 11.37](https://youtu.be/gnlL4PlstFY?t=688)) mentiones that this proposal also came about due to New collection usecases that surfaced in recent work - showing that without a strict rule there is a high amount of "un-disciplined" use of (). But he agrees with Miles about merging the two proposals together.
+**Martin** ([Youtube time: 11.37](https://youtu.be/gnlL4PlstFY?t=688)) mentioned that this proposal also came about due to new collection use cases that surfaced in recent work - showing that without a strict rule there is a high amount of "un-disciplined" use of (). But he agrees with Miles about merging the two proposals together.
**Conclusion** **Jorge** takes the task to merge the proposals and extend the motivation.
@@ -210,7 +210,7 @@ Discussion:
**Josh** ([YouTube time: 27’16’’](https://youtu.be/gnlL4PlstFY?t=1636)) clarifies that in order to replace the libraries one would need a proof of concept, and currently there is none.
-**Adriaan** ([YouTube time 30’](https://youtu.be/gnlL4PlstFY?t=1796)) summarises the discussion, pointing out that Committee needs to answer a question *will we support XML in some way* and *waht would be the most "Scala-like" way to do so* and *who will be maintaing it*.
+**Adriaan** ([YouTube time 30’](https://youtu.be/gnlL4PlstFY?t=1796)) summarises the discussion, pointing out that the Committee needs to answer the questions *will we support XML in some way*, *what would be the most "Scala-like" way to do so*, and *who will be maintaining it*.
**Seth** ([YouTube time 35’57’’](https://youtu.be/gnlL4PlstFY?t=2157)) is under the impression that large portion of XML user base are the ones using it to do generation and rarer to be reading in XML using the existing Scala XML support and asks others to share their impressions. **Martin** re-phrases it as “using XML for pattern matching”.
@@ -269,7 +269,7 @@ Counter Proposals
- Multiple "def" keywords, one which would mean side-effecting function := for side effects
-**Josh** concludes: big point to debate would the language consistency be worth the change to more verbose expresion.
+**Josh** concludes: the big point to debate is whether the language consistency would be worth the change to a more verbose expression.
**Iulian** ([You/tube time: ]( https://youtu.be/gnlL4PlstFY?t=2928)) adds that 1. last 5 years Syntax procedure was anyway deprecated; 2. going forward we should consider Scala 3 in a light of next 15 years, now is the right moment to clean up the language and 3.
this is “the easiest refactoring to automate the code base” that could be a “zero cost migration” diff --git a/_sips/sip-tutorial.md b/_sips/sip-tutorial.md index 680eee2d73..b5ed7e03ae 100644 --- a/_sips/sip-tutorial.md +++ b/_sips/sip-tutorial.md @@ -11,7 +11,7 @@ The process to submit is simple: * Fork the [Scala documentation repository](http://github.com/scala/docs.scala-lang) and clone it. * Create a new SIP file in the `sips/pending/_posts/`. Use the [S(L)IP template](https://github.com/scala/docs.scala-lang/blob/master/_sips/sip-template.md) - * Make sure the new file follows the format: `YYYY-MM-dd-{title}.md`. Use the proposal date for `YYYY-MM-dd`. + * Make sure the new file follows the format: `YYYY-MM-dd-{title}.md`. Use the proposal date for `YYYY-MM-dd`. * Use the [Markdown Syntax](http://daringfireball.net/projects/markdown/syntax) to write your SIP. * Follow the instructions in the [README](https://github.com/scala/docs.scala-lang/blob/master/README.md) to build your SIP locally so you can ensure that it looks correct on the website. * Create a link to your SIP in the "pending sips" section of `index.md`. @@ -21,7 +21,7 @@ The process to submit is simple: ## SIP Post Format ## -First, create a new SIP file in the `pending/_posts` directory. Make sure the new file follows the format: `YYYY-MM-dd-{title}.md`. Where: +First, create a new SIP file in the `pending/_posts` directory. Make sure the new file follows the format: `YYYY-MM-dd-{title}.md`. Where: * `YYYY` is the current year when the proposal originated. * `MM` is the current month (`01` = January, `12` = December) when the proposal originated. * `dd` is the day of the month when the proposal originated. 
diff --git a/_sips/sips/2010-01-22-named-and-default-arguments.md b/_sips/sips/2010-01-22-named-and-default-arguments.md
index fecaed1b71..a3d5a96165 100644
--- a/_sips/sips/2010-01-22-named-and-default-arguments.md
+++ b/_sips/sips/2010-01-22-named-and-default-arguments.md
@@ -16,13 +16,13 @@ The second language feature discussed in this document, default arguments, in ge
## Named Arguments
-In Scala 2.8, method arguments can be specified in _named style_ using the same syntax as variable assignments:
+In Scala 2.8, method arguments can be specified in _named style_ using the same syntax as variable assignments:
def f[T](a: Int, b: T)
f(b = getT(), a = getInt())
-The argument expressions are evaluated in call-site order, so in the above example `getT()` is executed before `getInt()`f. Mixing named and positional arguments is allowed as long as the positional part forms a prefix of the argument list:
+The argument expressions are evaluated in call-site order, so in the above example `getT()` is executed before `getInt()`. Mixing named and positional arguments is allowed as long as the positional part forms a prefix of the argument list:
f(0, b = "1") // valid
f(b = "1", a = 0) // valid
@@ -51,7 +51,7 @@ The following list shows how named arguments interfere with other language featu
**By-Name Parameters** continue to work as expected when using named arguments. The expression is only (and repeatedly) evaluated when the body of the method accesses the parameter.
-**Repeated Parameters** When an application uses named arguments, the repeated parameter has to be specified exactly once. Using the same parameter name multiple times is disallowed.
+**Repeated Parameters** When an application uses named arguments, the repeated parameter has to be specified exactly once. Using the same parameter name multiple times is disallowed.
**Functional values** A functional value in Scala is an instance of a class which implements a method called apply.
One can use the parameter names of that apply method for a named application. For functional values whose static type is scala.FunctionN, the parameter names of that apply method can be used.
@@ -68,17 +68,17 @@ The following list shows how named arguments interfere with other language featu
val a: A = new B
a.f(a = 1) // OK
-**Overloading Resolution** When a method application refers to an overloaded method, first the set of applicable alternatives is determined and then the most specific alternative is chosen (see [1], Chapter 6.26.3).
+**Overloading Resolution** When a method application refers to an overloaded method, first the set of applicable alternatives is determined and then the most specific alternative is chosen (see [1], Chapter 6.26.3).
-The presence of named argument influences the set of applicable alternatives, the argument types have to be matched against the corresponding parameter types based on the names. In the following example, the second alternative is applicable:
+The presence of named arguments influences the set of applicable alternatives; the argument types have to be matched against the corresponding parameter types based on the names. In the following example, the second alternative is applicable:
def f() // #1
def f(a: Int, b: String) // #2
f(b = "someString", a = 1) // using #2
-If multiple alternatives are applicable, the most specific one is determined. This process is independent of the argument names used in a specific application and only looks at the method signature (for a detailed description, see [1], Chapter 6.26.3).
+If multiple alternatives are applicable, the most specific one is determined. This process is independent of the argument names used in a specific application and only looks at the method signature (for a detailed description, see [1], Chapter 6.26.3).
-In the following example, both alternatives are applicable, but none of them is more specific than the other because the argument types are compared based on their position, not on the argument name: +In the following example, both alternatives are applicable, but none of them is more specific than the other because the argument types are compared based on their position, not on the argument name: def f(a: Int, b: String) // #1 def f(b: Object, a: Int) // #2 @@ -113,7 +113,7 @@ A default argument may be an arbitrary expression. Since the scope of a paramete // def f(a: Int = 0, b: Int = a + 1) // "error: not found: value a" f(10)() // returns 11 (not 1) -A special expected type is used for type-checking the default argument `expr` of a method parameter `”x: T = expr”`: it is obtained by replacing all occurrences of type parameters of the method (type parameters of the class for constructors) with the undefined type. This allows specifying default arguments for polymorphic methods and classes: +A special expected type is used for type-checking the default argument `expr` of a method parameter `”x: T = expr”`: it is obtained by replacing all occurrences of type parameters of the method (type parameters of the class for constructors) with the undefined type. This allows specifying default arguments for polymorphic methods and classes: def f[T](a: T = 1) = a f() // returns 1: Int @@ -157,7 +157,7 @@ During type-checking, the static type is used to determine whether a parameter h def f(a: String, b: Int = 1) // #2 f("str") // both are applicable, #1 is selected -**Case Classes** For every case class, a method named `”copy”` is now generated which allows to easily create modified copies of the class’s instances. The copy method takes the same type and value parameters as the primary constructor of the case class, and every parameter defaults to the corresponding constructor parameter. 
+**Case Classes** For every case class, a method named `”copy”` is now generated which allows to easily create modified copies of the class’s instances. The copy method takes the same type and value parameters as the primary constructor of the case class, and every parameter defaults to the corresponding constructor parameter. case class A[T](a: T, b: Int) { // def copy[T’](a’: T’ = a, b’: Int = b): A[T’] = @@ -215,4 +215,4 @@ For constructor defaults, these methods are added to the companion object of the // } ## References -1. Odersky, M. _The Scala Language Specification, Version 2.11_. Available online at [http://www.scala-lang.org/files/archive/spec/2.11/](http://www.scala-lang.org/files/archive/spec/2.11/) +1. Odersky, M. _The Scala Language Specification, Version 2.11_. Available online at [http://www.scala-lang.org/files/archive/spec/2.11/](http://www.scala-lang.org/files/archive/spec/2.11/) diff --git a/_sips/sips/2010-01-22-scala-2-8-arrays.md b/_sips/sips/2010-01-22-scala-2-8-arrays.md index 407dc19bf1..3411fa929c 100644 --- a/_sips/sips/2010-01-22-scala-2-8-arrays.md +++ b/_sips/sips/2010-01-22-scala-2-8-arrays.md @@ -26,8 +26,8 @@ First, there’s actually not a single array type representation in Java but nine different ones: One representation for arrays of reference type and another eight for arrays of each of the primitive types `byte`, `char`, `short`, `int`, `long`, `float`, `double`, and `boolean`. There is no common -type for these different representations which is more specific than just -`java.lang.Object`, even though there are some reflective methods to deal with +type for these different representations which is more specific than just +`java.lang.Object`, even though there are some reflective methods to deal with arrays of arbitrary type in `java.lang.reflect.Array`. Second, there’s no way to create an array of a generic type; only monomorphic array creations are allowed. 
Third, the only operations supported by arrays are indexing, updates, @@ -35,14 +35,14 @@ and get length. Contrast this with what we would like to have in Scala: Arrays should slot into the collections hierarchy, supporting the hundred or so methods that are -defined on sequences. And they should certainly be generic, so that one can +defined on sequences. And they should certainly be generic, so that one can create an `Array[T]` where `T` is a type variable. ### The Past How to combine these desirables with the representation restrictions imposed by Java interoperability and performance? There’s no easy answer, and I -believe we got it wrong the first time when we designed Scala. The Scala +believe we got it wrong the first time when we designed Scala. The Scala language up to 2.7.x “magically” wrapped and unwrapped arrays when required in a process called boxing and unboxing, similarly to what is done to treat primitive numeric types as objects. “Magically” means: the compiler generated @@ -89,7 +89,7 @@ proposal is that one would not normally refer to Scala native arrays in user code, just as one rarely referred to RichString in Scala. One would only rely on the implicit conversion to add the necessary methods and traits to Java arrays. Unfortunately, the String/RichString experience has shown that this is -also problematic. In par- ticular, in pre 2.8 versions of Scala, one had the +also problematic. 
In particular, in pre 2.8 versions of Scala, one had the non-intuitive property that "abc".reverse.reverse == "abc" //, yet diff --git a/_sips/sips/2011-10-13-uncluttering-control.md b/_sips/sips/2011-10-13-uncluttering-control.md index 5f4dea8d64..e1ceb5d790 100644 --- a/_sips/sips/2011-10-13-uncluttering-control.md +++ b/_sips/sips/2011-10-13-uncluttering-control.md @@ -71,7 +71,7 @@ For while loops: To write a `do-while` inside a `while` loop you will need braces, like this: - while (expression1) { do expression2 while epression3 } + while (expression1) { do expression2 while expression3 } 3. In Scala 2.11: Disallow diff --git a/_sips/sips/2012-01-21-futures-promises.md b/_sips/sips/2012-01-21-futures-promises.md index a1b8d43666..11983d2201 100644 --- a/_sips/sips/2012-01-21-futures-promises.md +++ b/_sips/sips/2012-01-21-futures-promises.md @@ -109,7 +109,7 @@ We do so by calling the method `getRecentPosts` which returns a `List[String]`: f onComplete { case Right(posts) => for (post <- posts) render(post) - case Left(t) => render("An error has occured: " + t.getMessage) + case Left(t) => render("An error has occurred: " + t.getMessage) } The `onComplete` method is general in the sense that it allows the @@ -387,16 +387,16 @@ multiple `andThen` calls are ordered, as in the following example which stores the recent posts from a social network to a mutable set and then renders all the posts to the screen: - val allposts = mutable.Set[String]() + val allPosts = mutable.Set[String]() Future { session.getRecentPosts } andThen { - case Success(posts) => allposts ++= posts + case Success(posts) => allPosts ++= posts } andThen { case _ => clearAll() - for (post <- allposts) render(post) + for (post <- allPosts) render(post) } In summary, the combinators on futures are purely functional. 
diff --git a/_sips/sips/2013-05-31-improved-lazy-val-initialization.md b/_sips/sips/2013-05-31-improved-lazy-val-initialization.md index b3fc711ced..2ff2dfee7c 100644 --- a/_sips/sips/2013-05-31-improved-lazy-val-initialization.md +++ b/_sips/sips/2013-05-31-improved-lazy-val-initialization.md @@ -529,7 +529,7 @@ Note that this class is extracted from other place in standard library that uses - it requires usage of `identityHashCode` that is stored for every object inside object header. - as global arrays are used to store monitors, seemingly unrelated things may create contention. This is addressed in detail in evaluation section. -Both absence of monitor expansion and usage of `idetityHashCode` interact with +Both absence of monitor expansion and usage of `identityHashCode` interact with each other, as both of them operate on the object header. \[[12][12]\] presents the complete graph of transitions between possible states of the object header. What can be seen from this transition graph is that in the contended case, @@ -679,7 +679,7 @@ For those wishing to reproduce the results, the benchmarking suite takes 90 minu The final result of those benchmarks is that amount proposed versions, the two that worth considering are (V4-general) and (V6). They both perform better than the current implementation in all the contended case. -Specifically, in the contended case, V6 is 2 times fater than V1, while V4-general is 4 times faster. +Specifically, in the contended case, V6 is 2 times faster than V1, while V4-general is 4 times faster. Unfortunately V4-general is 30% slower in the uncontended case than current implementation(V1), while V6 is in the same ballpark, being up to 5% slower or faster depending on the setup of the benchmark. Based on this, we propose V6 to be used as default in future versions of Scala. @@ -699,7 +699,7 @@ Both Dotty and released Scala 2.12 already implement "Elegant Local lazy vals". 
### Unsafe ### The proposed version, V6 relies on `sun.misc.Unsafe` in order to implement it's behaviour. -While `sun.misc.Unsafe` will remain availabe in Java9 there's an intention to deprecate it and replace it with VarHandles.\[[20][20]\]. +While `sun.misc.Unsafe` will remain available in Java9 there's an intention to deprecate it and replace it with VarHandles.\[[20][20]\]. The proposed version V6 can be implemented with using functionality present in Var Handles. ## Acknowledgements ## diff --git a/_sips/sips/2014-06-27-42.type.md b/_sips/sips/2014-06-27-42.type.md index 0c4e208810..afe0aeaeae 100644 --- a/_sips/sips/2014-06-27-42.type.md +++ b/_sips/sips/2014-06-27-42.type.md @@ -455,7 +455,7 @@ applications which work with large datasets. ``` This SIP updates the specification to match the current implementation and then adds the further - refinement that an explict upper bound of `Singleton` indicates that a singleton type should be + refinement that an explicit upper bound of `Singleton` indicates that a singleton type should be inferred. Given, @@ -592,7 +592,7 @@ the singleton types). This is desirable, but due to value class restrictions, en primitive types (such as `Int`). If we implemented `ValueOf[A]` as an opaque type instead of a value class, then this boxing -would be ellided, and the `valueOf[A]` method would be compiled to an identity function. +would be elided, and the `valueOf[A]` method would be compiled to an identity function. ## Related Scala issues resolved by the literal types implementation diff --git a/_sips/sips/2016-01-11-static-members.md b/_sips/sips/2016-01-11-static-members.md index 9fff070f02..3d98d6f5fe 100644 --- a/_sips/sips/2016-01-11-static-members.md +++ b/_sips/sips/2016-01-11-static-members.md @@ -97,7 +97,7 @@ object O { } {% endhighlight %} -Under the proposed scheme users will be able to opt-in to have the field `f` defined in the inner object `I` emited as a static field. 
+Under the proposed scheme users will be able to opt-in to have the field `f` defined in the inner object `I` emitted as a static field. In case `O.d` is annotated with `@static` the field will be created as a static field `d` in `class O`. If not annotated, it will be created in the companion module with a static forwarder `d` in `class O`. @@ -109,7 +109,7 @@ The following rules ensure that methods can be correctly compiled into static me 2. The fields annotated with `@static` should precede any non-`@static` fields. This ensures that we do not introduce surprises for users in initialization order of this class. -3. The right hand side of a method or field annotated with `@static` can only refer to top-level classes, members of globally accessible objects and `@static` members. In particular, for non-static objects `this` is not accesible. `super` is never accessible. +3. The right hand side of a method or field annotated with `@static` can only refer to top-level classes, members of globally accessible objects and `@static` members. In particular, for non-static objects `this` is not accessible. `super` is never accessible. 4. If a member `foo` of an `object C` is annotated with `@static`, the companion class `C` is not allowed to define term members with name `foo`. @@ -153,7 +153,7 @@ This means that no code precedes the `@static` field initialization which makes since fields are initialized in the order `as written`, similar to how normal fields are initialized. The `@static` proposal is similar to `@tailrec` in a sense that it fails compilation in the case where the user did not write code that follows the aforementioned rules. 
-These rules exist to enforce the unlikelyhood of an observable difference in semantics if `@static` annotations are dropped;
+These rules exist to enforce the unlikelihood of an observable difference in semantics if `@static` annotations are dropped;
The restrictions in this SIP make it hard to observe changes in initialization within the same object. It is still possible to observe those changes using multiple classes and side effects within initializers:
diff --git a/_sips/sips/2017-02-07-priority-based-infix-type-precedence.md b/_sips/sips/2017-02-07-priority-based-infix-type-precedence.md
index 95c73ab90d..8e2341c1eb 100644
--- a/_sips/sips/2017-02-07-priority-based-infix-type-precedence.md
+++ b/_sips/sips/2017-02-07-priority-based-infix-type-precedence.md
@@ -104,7 +104,7 @@ val fails : 1 + 2 * 3 + 4 = 11 //left associative:(((1+2)*3)+4))) = 13
```
#### Developer issues example
-[This](http://stackoverflow.com/questions/23333882/scala-infix-type-aliasing-for-2-type-parameters) stackoverflow question demonstrate developers are 'surprised' by the difference in infix precedence, expecting infix type precedence to act the same as expression operations.
+[This](http://stackoverflow.com/questions/23333882/scala-infix-type-aliasing-for-2-type-parameters) Stack Overflow question demonstrates that developers are 'surprised' by the difference in infix precedence, expecting infix type precedence to act the same as expression operations.
---
diff --git a/_sips/sips/2017-09-20-opaque-types.md b/_sips/sips/2017-09-20-opaque-types.md
index 7c8d5e3444..bbd4a798d3 100644
--- a/_sips/sips/2017-09-20-opaque-types.md
+++ b/_sips/sips/2017-09-20-opaque-types.md
@@ -885,7 +885,7 @@ there will certainly be an impact on the bytecode produced (and possibly
the runtime performance).
By contrast, replacing `String` with `Digits` is guaranteed to have no -impact (all occurances of `Digits` are guaranteed to be erased to +impact (all occurrences of `Digits` are guaranteed to be erased to `String`). Aside from the ergonomics of calling the `fromString` and `asString` methods, there's no runtime impact versus using the underlying type. @@ -953,7 +953,7 @@ but it is *not* a `List[AnyVal]`. Since value classes do have a runtime representation, they do increase the size of runtime artifacts produced (whether a JAR file, a -javascript file, or something else). Their methods are also compiled +JavaScript file, or something else). Their methods are also compiled to multiple representations (i.e. they support both the boxed and unboxed forms via extensions methods). Again, this comes at a cost. diff --git a/_sips/sips/2018-07-31-interpolation-quote-escape.md b/_sips/sips/2018-07-31-interpolation-quote-escape.md index bd5658d618..a04703d2a9 100644 --- a/_sips/sips/2018-07-31-interpolation-quote-escape.md +++ b/_sips/sips/2018-07-31-interpolation-quote-escape.md @@ -21,7 +21,7 @@ rather passes the raw string to the interpolator, which then has the option to process escapes itself as it sees fit. That means there are no lexing rules that process the escape, and the sequence `\"` simply terminates the interpolation. -Interpolations have a different meta-charcter -- the `$` character -- which is +Interpolations have a different meta-character -- the `$` character -- which is treated specially. Interpolations use this escape to splice in arguments, and it can also be used to escape itself as the sequence `$$` to represent a literal `$` character. @@ -90,7 +90,7 @@ on the original ticket ## Implementation The implementation is simple to the point of being trivial: see -[the implementation][5] for the actual change in functonality and the rest of +[the implementation][5] for the actual change in functionality and the rest of that PR for the spec and test changes. 
## Drawbacks @@ -101,12 +101,12 @@ the language. An argument could be made that this change makes that worse rather than better. Because it affects parsing, this change may affect syntax highlighters. Syntax -highlighters tend to already stuggle around "funky" strings and interpolations. +highlighters tend to already struggle around "funky" strings and interpolations. ## Alternatives More ambitious proposals around interpolations are possible, and have been -propsed in different forms before. [This PR][6] in particular shows more options +proposed in different forms before. [This PR][6] in particular shows more options around using `\` as a meta character in interpolations. It stranded somewhere between red tape, ambition and changing processes. diff --git a/_tour/automatic-closures.md b/_tour/automatic-closures.md index 37b82524ef..e65c7130af 100644 --- a/_tour/automatic-closures.md +++ b/_tour/automatic-closures.md @@ -7,7 +7,7 @@ discourse: true partof: scala-tour --- -Scala allows parameterless function names as parameters of methods. When such a method is called, the actual parameters for parameterless function names are not evaluated and a nullary function is passed instead which encapsulates the computation of the corresponding parameter (so-called *call-by-name* evalutation). +Scala allows parameterless function names as parameters of methods. When such a method is called, the actual parameters for parameterless function names are not evaluated and a nullary function is passed instead which encapsulates the computation of the corresponding parameter (so-called *call-by-name* evaluation). 
 
 The following code demonstrates this mechanism:
diff --git a/_tour/traits.md b/_tour/traits.md
index 121255457d..37bf615c6c 100644
--- a/_tour/traits.md
+++ b/_tour/traits.md
@@ -45,7 +45,7 @@ trait Iterator[A] {
 class IntIterator(to: Int) extends Iterator[Int] {
   private var current = 0
   override def hasNext: Boolean = current < to
-  override def next(): Int = { 
+  override def next(): Int = {
     if (hasNext) {
       val t = current
       current += 1
diff --git a/contribute.md b/contribute.md
index 9f4692f679..5f4c2b6850 100644
--- a/contribute.md
+++ b/contribute.md
@@ -79,7 +79,7 @@ If you have something you're thinking about contributing, or that you're thinkin
 
 ### Guides/Overviews
 
-A guide or an overview that can be logically placed on **one** page must be placed in the directory `overviews/RELEVANT-CATEGORY/_posts` with the file name in the format `YYYY-MM-dd-title-separarted-by-dashes.md`, and header:
+A guide or an overview that can be logically placed on **one** page must be placed in the directory `overviews/RELEVANT-CATEGORY/_posts` with the file name in the format `YYYY-MM-dd-title-separated-by-dashes.md`, and header:
 
     ---
     layout: overview
diff --git a/getting-started-intellij-track/building-a-scala-project-with-intellij-and-sbt.md b/getting-started-intellij-track/building-a-scala-project-with-intellij-and-sbt.md
index 74bfff8b5b..021aed1bbf 100644
--- a/getting-started-intellij-track/building-a-scala-project-with-intellij-and-sbt.md
+++ b/getting-started-intellij-track/building-a-scala-project-with-intellij-and-sbt.md
@@ -19,11 +19,11 @@ Started with Scala and sbt on the Command Line]({{site.baseurl}}/getting-started
 here to the section "Writing Scala code".
 
 1. If you didn't create the project from the command line, open up IntelliJ and select "Create New Project"
-    * On the left panel, select Scala and on the right panel, select SBT
+    * On the left panel, select Scala and on the right panel, select sbt
     * Click **Next**
-    * Name the project "SBTExampleProject"
+    * Name the project "SbtExampleProject"
 1. If you already created the project on the command line, open up IntelliJ, select *Import Project* and open the `build.sbt` file for your project
-1. Make sure the **JDK Version** is 1.8 and the **SBT Version** is at least 0.13.13
+1. Make sure the **JDK version** is 1.8 and the **sbt version** is at least 0.13.13
 1. Select **Use auto-import** so dependencies are automatically downloaded when available
 1. Select **Finish**
@@ -47,7 +47,7 @@ but here's a glance at what everything is for:
 
 ## Writing Scala code
 
-1. On the **Project** panel on the left, expand `SBTExampleProject` => `src`
+1. On the **Project** panel on the left, expand `SbtExampleProject` => `src`
 => `main`
 1. Right-click `scala` and select **New** => **Package**
 1. Name the package `example` and click **OK**.
@@ -68,9 +68,9 @@ to see if sbt can run your project on the command line.
 ## Running the project
 
 1. From the **Run** menu, select **Edit configurations**
-1. Click the **+** button and select **SBT Task**.
+1. Click the **+** button and select **sbt Task**.
 1. Name it `Run the program`.
-1. In the **Tasks** field, type `~run`. The `~` causes SBT to rebuild and rerun the project
+1. In the **Tasks** field, type `~run`. The `~` causes sbt to rebuild and rerun the project
 when you save changes to a file in the project.
 1. Click **OK**.
 1. On the **Run** menu. Click **Run 'Run the program'**.
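To make the "Running the project" steps in the hunk above concrete: the `~run` task re-runs whatever entry point sbt discovers in the project. A minimal sketch of such an entry point follows; the object name and message are assumptions for illustration, not taken from the tutorial's sources:

```scala
package example

// A minimal entry point that sbt's `run` (and `~run`) task can discover.
// Extending App turns the object body into the program's main method.
object Main extends App {
  def greeting: String = "Hello from SbtExampleProject"
  println(greeting)
}
```

With `~run` configured as described, saving any source file triggers an incremental recompile and re-executes this `main`.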
diff --git a/getting-started-sbt-track/testing-scala-with-sbt-on-the-command-line.md b/getting-started-sbt-track/testing-scala-with-sbt-on-the-command-line.md
index 388f76a1de..ba38a030ee 100644
--- a/getting-started-sbt-track/testing-scala-with-sbt-on-the-command-line.md
+++ b/getting-started-sbt-track/testing-scala-with-sbt-on-the-command-line.md
@@ -20,9 +20,9 @@ We assume you know [how to create a Scala project with sbt](getting-started-with
 ```
 sbt test
-[info] Loading global plugins from /Users/travislee/.sbt/0.13/plugins
-[info] Loading project definition from /Users/travislee/workspace/sandbox/my-something-project/project
-[info] Set current project to scalatest-example (in build file:/Users/travislee/workspace/sandbox/my-something-project/)
+[info] Loading global plugins from /Users/username/.sbt/0.13/plugins
+[info] Loading project definition from /Users/username/workspace/sandbox/my-something-project/project
+[info] Set current project to scalatest-example (in build file:/Users/username/workspace/sandbox/my-something-project/)
 [info] CubeCalculatorTest:
 [info] - CubeCalculator.cube
 [info] Run completed in 267 milliseconds.
diff --git a/news/_posts/2012-12-12-functional-programming-principles-in-scala-impressions-and-statistics.md b/news/_posts/2012-12-12-functional-programming-principles-in-scala-impressions-and-statistics.md
index 7735015903..60c52a7875 100644
--- a/news/_posts/2012-12-12-functional-programming-principles-in-scala-impressions-and-statistics.md
+++ b/news/_posts/2012-12-12-functional-programming-principles-in-scala-impressions-and-statistics.md
@@ -6,7 +6,7 @@ discourse: true
 
 ###### By Heather Miller and Martin Odersky
 
- In this post, we discuss our experience giving the popular MOOC Functional Programming Principles in Scala, and provide some insight into who our course participants were, how, overall, students performed in the course, and how students felt about the course. We visualize a lot of these statistics in a number of interactive plots, and we go on to publicly release the data and the code to generate these plots within a fun Scala-based project aimed at allowing you to manipulate these statistics with functional programming in Scala, to generate HTML/Javascript for easily visualizing and sharing them. We encourage you to share what you find with us— we'll share a number of your plots in a follow-up post!
+ In this post, we discuss our experience giving the popular MOOC Functional Programming Principles in Scala, and provide some insight into who our course participants were, how, overall, students performed in the course, and how students felt about the course. We visualize a lot of these statistics in a number of interactive plots, and we go on to publicly release the data and the code to generate these plots within a fun Scala-based project aimed at allowing you to manipulate these statistics with functional programming in Scala, to generate HTML/JavaScript for easily visualizing and sharing them. We encourage you to share what you find with us— we'll share a number of your plots in a follow-up post!
 
 [_Functional Programming Principles in Scala_](https://www.coursera.org/course/progfun) is a [MOOC](http://en.wikipedia.org/wiki/Massive_open_online_course) given by [our research group](http://lamp.epfl.ch) at [EPFL](http://www.epfl.ch), whose first edition was recently completed on [Coursera](http://www.coursera.org). The certificates of completion for those who passed the course have been released, and in looking back as the dust settles— it was a great experience to have done a class like that which greatly exceeded our expectations in more than one dimension.
diff --git a/resources/images/library-author-guide/fowards_backwards_compatibility.png b/resources/images/library-author-guide/forward_backward_compatibility.png
similarity index 100%
rename from resources/images/library-author-guide/fowards_backwards_compatibility.png
rename to resources/images/library-author-guide/forward_backward_compatibility.png
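Relatedly, for the `sbt test` transcript patched in `testing-scala-with-sbt-on-the-command-line.md` above: the `CubeCalculatorTest:` / `- CubeCalculator.cube` lines in that log would come from a suite roughly like the following. This is a hedged reconstruction; the tutorial's actual sources are not part of this diff, and the ScalaTest style shown (`FunSuite`, era-appropriate for sbt 0.13) is an assumption.

```scala
import org.scalatest.FunSuite

// Hypothetical object under test, inferred from the test name in the log.
object CubeCalculator {
  def cube(x: Int): Int = x * x * x
}

// One test named "CubeCalculator.cube", which would produce the
// "[info] - CubeCalculator.cube" line in the sbt transcript.
class CubeCalculatorTest extends FunSuite {
  test("CubeCalculator.cube") {
    assert(CubeCalculator.cube(3) === 27)
  }
}
```

Note that `org.scalatest.FunSuite` is an external dependency (`libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.x" % Test` in `build.sbt`), not part of the standard library.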