Backport "Fix assorted typos" to LTS #19110

Merged: 1 commit, Dec 8, 2023
2 changes: 1 addition & 1 deletion compiler/src/dotty/tools/dotc/ast/Trees.scala
@@ -763,7 +763,7 @@ object Trees {
}

/** Tree that replaces a level 1 splices in pickled (level 0) quotes.
* It is only used when picking quotes (will never be in a TASTy file).
* It is only used when pickling quotes (will never be in a TASTy file).
*
* @param isTerm If this hole is a term, otherwise it is a type hole.
* @param idx The index of the hole in it's enclosing level 0 quote.
6 changes: 3 additions & 3 deletions compiler/src/dotty/tools/dotc/core/ConstraintHandling.scala
@@ -102,7 +102,7 @@ trait ConstraintHandling {
*
* If we trust bounds, then the lower bound of `X` is `x.M` since `x.M >: 1`.
* Then even if we correct levels on instantiation to eliminate the local `x`,
* it is alreay too late, we'd get `Int & String` as instance, which does not
* it is already too late, we'd get `Int & String` as instance, which does not
* satisfy the original constraint `X >: 1`.
*
* But if `trustBounds` is false, we do not conclude the `x.M >: 1` since
@@ -708,8 +708,8 @@ trait ConstraintHandling {
// Widening can add extra constraints, in particular the widened type might
// be a type variable which is now instantiated to `param`, and therefore
// cannot be used as an instantiation of `param` without creating a loop.
// If that happens, we run `instanceType` again to find a new instantation.
// (we do not check for non-toplevel occurences: those should never occur
// If that happens, we run `instanceType` again to find a new instantiation.
// (we do not check for non-toplevel occurrences: those should never occur
// since `addOneBound` disallows recursive lower bounds).
if constraint.occursAtToplevel(param, widened) then
instanceType(param, fromBelow, widenUnions, maxLevel)
2 changes: 1 addition & 1 deletion compiler/src/dotty/tools/dotc/core/Symbols.scala
@@ -77,7 +77,7 @@ object Symbols {

/** Does this symbol retain its definition tree?
* A good policy for this needs to balance costs and benefits, where
* costs are mainly memoty leaks, in particular across runs.
* costs are mainly memory leaks, in particular across runs.
*/
def retainsDefTree(using Context): Boolean =
ctx.settings.YretainTrees.value ||
2 changes: 1 addition & 1 deletion compiler/src/dotty/tools/dotc/core/TypeApplications.scala
@@ -406,7 +406,7 @@ class TypeApplications(val self: Type) extends AnyVal {
if (typeParams.nonEmpty) appliedTo(args) else self

/** A cycle-safe version of `appliedTo` where computing type parameters do not force
* the typeconstructor. Instead, if the type constructor is completing, we make
* the type constructor. Instead, if the type constructor is completing, we make
* up hk type parameters matching the arguments. This is needed when unpickling
* Scala2 files such as `scala.collection.generic.Mapfactory`.
*/
6 changes: 3 additions & 3 deletions compiler/src/dotty/tools/dotc/core/TypeComparer.scala
@@ -1751,7 +1751,7 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling
* any, or no constraint at all.
*
* Otherwise, we infer _sufficient_ constraints: we try to keep the smaller of
* the two constraints, but if never is smaller than the other, we just pick
* the two constraints, but if neither is smaller than the other, we just pick
* the first one.
*/
protected def either(op1: => Boolean, op2: => Boolean): Boolean =
@@ -1961,7 +1961,7 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling
// is that if the refinement does not refer to a member symbol, we will have to
// resort to reflection to invoke the member. And Java reflection needs to know exact
// erased parameter types. See neg/i12211.scala. Other reflection algorithms could
// conceivably dispatch without knowning precise parameter signatures. One can signal
// conceivably dispatch without knowing precise parameter signatures. One can signal
// this by inheriting from the `scala.reflect.SignatureCanBeImprecise` marker trait,
// in which case the signature test is elided.
def sigsOK(symInfo: Type, info2: Type) =
@@ -2785,7 +2785,7 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling
else
false
case (AppliedType(tycon1, args1), AppliedType(tycon2, args2)) if isSame(tycon1, tycon2) =>
// It is possible to conclude that two types applies are disjoint by
// It is possible to conclude that two types applied are disjoint by
// looking at covariant type parameters if the said type parameters
// are disjoin and correspond to fields.
// (Type parameter disjointness is not enough by itself as it could
4 changes: 2 additions & 2 deletions compiler/src/dotty/tools/dotc/core/Types.scala
@@ -3340,7 +3340,7 @@ object Types {
def isAnd: Boolean = true
private var myBaseClassesPeriod: Period = Nowhere
private var myBaseClasses: List[ClassSymbol] = _
/** Base classes of are the merge of the operand base classes. */
/** Base classes are the merge of the operand base classes. */
override final def baseClasses(using Context): List[ClassSymbol] = {
if (myBaseClassesPeriod != ctx.period) {
val bcs1 = tp1.baseClasses
@@ -3433,7 +3433,7 @@ object Types {
def isSoft: Boolean
private var myBaseClassesPeriod: Period = Nowhere
private var myBaseClasses: List[ClassSymbol] = _
/** Base classes of are the intersection of the operand base classes. */
/** Base classes are the intersection of the operand base classes. */
override final def baseClasses(using Context): List[ClassSymbol] = {
if (myBaseClassesPeriod != ctx.period) {
val bcs1 = tp1.baseClasses
@@ -263,7 +263,7 @@ class TreeUnpickler(reader: TastyReader,
/** Read reference to definition and return symbol created at that definition */
def readSymRef()(using Context): Symbol = symbolAt(readAddr())

/** The symbol at given address; createa new one if none exists yet */
/** The symbol at given address; create a new one if none exists yet */
def symbolAt(addr: Addr)(using Context): Symbol = symAtAddr.get(addr) match {
case Some(sym) =>
sym
8 changes: 4 additions & 4 deletions compiler/src/dotty/tools/dotc/inlines/Inliner.scala
@@ -177,7 +177,7 @@ class Inliner(val call: tpd.Tree)(using Context):
/** A map from the classes of (direct and outer) this references in `rhsToInline`
* to references of their proxies.
* Note that we can't index by the ThisType itself since there are several
* possible forms to express what is logicaly the same ThisType. E.g.
* possible forms to express what is logically the same ThisType. E.g.
*
* ThisType(TypeRef(ThisType(p), cls))
*
@@ -338,7 +338,7 @@ class Inliner(val call: tpd.Tree)(using Context):

protected def hasOpaqueProxies = opaqueProxies.nonEmpty

/** Map first halfs of opaqueProxies pairs to second halfs, using =:= as equality */
/** Map first halves of opaqueProxies pairs to second halves, using =:= as equality */
private def mapRef(ref: TermRef): Option[TermRef] =
opaqueProxies.collectFirst {
case (from, to) if from.symbol == ref.symbol && from =:= ref => to
@@ -1047,13 +1047,13 @@ class Inliner(val call: tpd.Tree)(using Context):
val evaluatedSplice = inContext(quoted.MacroExpansion.context(inlinedFrom)) {
Splicer.splice(body, splicePos, inlinedFrom.srcPos, MacroClassLoader.fromContext)
}
val inlinedNormailizer = new TreeMap {
val inlinedNormalizer = new TreeMap {
override def transform(tree: tpd.Tree)(using Context): tpd.Tree = tree match {
case Inlined(EmptyTree, Nil, expr) if enclosingInlineds.isEmpty => transform(expr)
case _ => super.transform(tree)
}
}
val normalizedSplice = inlinedNormailizer.transform(evaluatedSplice)
val normalizedSplice = inlinedNormalizer.transform(evaluatedSplice)
if (normalizedSplice.isEmpty) normalizedSplice
else normalizedSplice.withSpan(splicePos.span)
}
4 changes: 2 additions & 2 deletions compiler/src/dotty/tools/dotc/quoted/Interpreter.scala
@@ -47,7 +47,7 @@ class Interpreter(pos: SrcPos, classLoader0: ClassLoader)(using Context):

/** Returns the result of interpreting the code in the tree.
* Return Some of the result or None if the result type is not consistent with the expected type.
* Throws a StopInterpretation if the tree could not be interpreted or a runtime exception ocurred.
* Throws a StopInterpretation if the tree could not be interpreted or a runtime exception occurred.
*/
final def interpret[T](tree: Tree)(using ct: ClassTag[T]): Option[T] =
interpretTree(tree)(using emptyEnv) match {
@@ -59,7 +59,7 @@ class Interpreter(pos: SrcPos, classLoader0: ClassLoader)(using Context):
}

/** Returns the result of interpreting the code in the tree.
* Throws a StopInterpretation if the tree could not be interpreted or a runtime exception ocurred.
* Throws a StopInterpretation if the tree could not be interpreted or a runtime exception occurred.
*/
protected def interpretTree(tree: Tree)(using Env): Object = tree match {
case Literal(Constant(value)) =>
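To put the two doc comments above in context: this `Interpreter` is used when evaluating macro implementation calls at expansion time. A minimal, hypothetical macro sketch (the names `hello` and `helloImpl` are invented, and in real code the definition must live in a different compilation unit than its call sites):

```scala
import scala.quoted.*

// Definition side of a macro. When a call site of `hello` is expanded, the
// compiler interprets the splice below, i.e. it reflectively invokes
// `helloImpl` and checks the result against the expected Expr type.
inline def hello(inline name: String): String = ${ helloImpl('name) }

def helloImpl(name: Expr[String])(using Quotes): Expr[String] =
  Expr("Hello, " + name.valueOrAbort)
```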
2 changes: 1 addition & 1 deletion compiler/src/dotty/tools/dotc/transform/MegaPhase.scala
@@ -26,7 +26,7 @@ object MegaPhase {
*
* - Stats: to prepare/transform a statement sequence in a block, template, or package def,
* - Unit : to prepare/transform a whole compilation unit
* - Other: to prepape/transform a tree that does not have a specific prepare/transform
* - Other: to prepare/transform a tree that does not have a specific prepare/transform
* method pair.
*/
abstract class MiniPhase extends Phase {
2 changes: 1 addition & 1 deletion compiler/src/dotty/tools/dotc/transform/Recheck.scala
@@ -497,7 +497,7 @@ abstract class Recheck extends Phase, SymTransformer:
throw ex
}

/** Typing and previous transforms sometiems leaves skolem types in prefixes of
/** Typing and previous transforms sometimes leaves skolem types in prefixes of
* NamedTypes in `expected` that do not match the `actual` Type. -Ycheck does
* not complain (need to find out why), but a full recheck does. We compensate
* by de-skolemizing everywhere in `expected` except when variance is negative.
@@ -85,7 +85,7 @@ class SpecializeFunctions extends MiniPhase {
case Select(qual, _) =>
val qual1 = qual.tpe.widen match
case defn.ByNameFunction(res) =>
// Need to cast to regular function, since specialied apply methods
// Need to cast to regular function, since specialized apply methods
// are not members of ContextFunction0. The cast will be eliminated in
// erasure.
qual.cast(defn.FunctionOf(Nil, res))
2 changes: 1 addition & 1 deletion compiler/src/dotty/tools/dotc/transform/SymUtils.scala
@@ -101,7 +101,7 @@ object SymUtils:
else if (self.is(Abstract)) "it is an abstract class"
else if (self.primaryConstructor.info.paramInfoss.length != 1) "it takes more than one parameter list"
else if (isDerivedValueClass(self)) "it is a value class"
else if (!(companionMirror || canAccessCtor)) s"the constructor of $self is innaccessible from the calling scope."
else if (!(companionMirror || canAccessCtor)) s"the constructor of $self is inaccessible from the calling scope."
else ""
end whyNotGenericProduct

2 changes: 1 addition & 1 deletion compiler/src/dotty/tools/dotc/transform/patmat/Space.scala
@@ -444,7 +444,7 @@ object SpaceEngine {
*
* We cannot use type erasure here, as it would lose the constraints
* involving GADTs. For example, in the following code, type
* erasure would loose the constraint that `x` and `y` must be
* erasure would lose the constraint that `x` and `y` must be
* the same type, resulting in false inexhaustive warnings:
*
* sealed trait Expr[T]
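To make the truncated comment above concrete, here is a self-contained sketch in the same spirit (invented names; the file's own `Expr` example is elided above):

```scala
sealed trait Expr[T]
case class IntLit(value: Int) extends Expr[Int]
case class BoolLit(value: Boolean) extends Expr[Boolean]

// Exhaustive: an Expr[Int] can only be an IntLit. If the space engine erased
// the type argument, it would lose that constraint, treat BoolLit as a missing
// case, and emit a spurious "match may not be exhaustive" warning.
def evalInt(e: Expr[Int]): Int = e match
  case IntLit(i) => i
```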
2 changes: 1 addition & 1 deletion compiler/src/dotty/tools/dotc/typer/Checking.scala
@@ -371,7 +371,7 @@ object Checking {

/** Check that `info` of symbol `sym` is not cyclic.
* @pre sym is not yet initialized (i.e. its type is a Completer).
* @return `info` where every legal F-bounded reference is proctected
* @return `info` where every legal F-bounded reference is protected
* by a `LazyRef`, or `ErrorType` if a cycle was detected and reported.
*/
def checkNonCyclic(sym: Symbol, info: Type, reportErrors: Boolean)(using Context): Type = {
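For readers unfamiliar with the term, an F-bounded reference is one where a type parameter's bound mentions the parameter itself; a small illustrative sketch (invented names, not from the patch):

```scala
// The bound of Self refers back to Self: this is the shape of (legal) cyclic
// reference that checkNonCyclic protects behind a LazyRef instead of rejecting.
trait Pipeline[Self <: Pipeline[Self]]:
  def combine(other: Self): Self

case class Doubler(factor: Int) extends Pipeline[Doubler]:
  def combine(other: Doubler): Doubler = Doubler(factor * other.factor)
```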
14 changes: 7 additions & 7 deletions compiler/src/dotty/tools/dotc/typer/Inferencing.scala
@@ -141,7 +141,7 @@ object Inferencing {
* 3. T is minimized if it has a lower bound (different from Nothing) in the
* current constraint (the bound might come from T's declaration).
* 4. Otherwise, T is maximized if it has an upper bound (different from Any)
* in the currented constraint (the bound might come from T's declaration).
* in the current constraint (the bound might come from T's declaration).
* 5. Otherwise, T is not instantiated at all.

* If (1) and (2) do not apply, and minimizeSelected is not set:
@@ -244,16 +244,16 @@ object Inferencing {
* relationship _necessarily_ must hold.
*
* We accomplish that by:
* - replacing covariant occurences with upper GADT bound
* - replacing contravariant occurences with lower GADT bound
* - leaving invariant occurences alone
* - replacing covariant occurrences with upper GADT bound
* - replacing contravariant occurrences with lower GADT bound
* - leaving invariant occurrences alone
*
* Examples:
* - If we have GADT cstr A <: Int, then for all A <: Int, Option[A] <: Option[Int].
* Therefore, we can approximate Option[A] ~~ Option[Int].
* - If we have A >: S <: T, then for all such A, A => A <: S => T. This
* illustrates that it's fine to differently approximate different
* occurences of same type.
* occurrences of same type.
* - If we have A <: Int and F <: [A] => Option[A] (note the invariance),
* then we should approximate F[A] ~~ Option[A]. That is, we should
* respect the invariance of the type constructor.
@@ -453,7 +453,7 @@ object Inferencing {
* +1 means: only covariant occurrences
* 0 means: mixed or non-variant occurrences
*
* We need to take the occurences in `pt` into account because a type
* We need to take the occurrences in `pt` into account because a type
* variable created when typing the current tree might only appear in the
* bounds of a type variable in the expected type, for example when
* `ConstraintHandling#legalBound` creates type variables when approximating
@@ -568,7 +568,7 @@ trait Inferencing { this: Typer =>
* Eligible for interpolation are all type variables owned by the current typerstate
* that are not in `locked` and whose `nestingLevel` is `>= ctx.nestingLevel`.
* Type variables occurring co- (respectively, contra-) variantly in the tree type
* or expected type are minimized (respectvely, maximized). Non occurring type variables are minimized if they
* or expected type are minimized (respectively, maximized). Non occurring type variables are minimized if they
* have a lower bound different from Nothing, maximized otherwise. Type variables appearing
* non-variantly in the type are left untouched.
*
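The first bullet of the GADT approximation comment in the `-244,16` hunk above (covariant occurrences may be replaced by the upper GADT bound) rests on an ordinary subtyping fact that can be shown at the user level; a small sketch with invented names:

```scala
sealed trait Tag[A]
case object IntTag extends Tag[Int]

// Inside the IntTag case the GADT constraint pins A to Int, so a covariant
// occurrence such as List[A] can safely be used where List[Int] is expected,
// mirroring the Option[A] ~~ Option[Int] approximation described above.
def widen[A](t: Tag[A], xs: List[A]): List[Int] = t match
  case IntTag => xs
```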
2 changes: 1 addition & 1 deletion compiler/src/dotty/tools/dotc/typer/TypeAssigner.scala
@@ -122,7 +122,7 @@ trait TypeAssigner {
val qualType0 = qual1.tpe.widenIfUnstable
val qualType =
if !qualType0.hasSimpleKind && tree.name != nme.CONSTRUCTOR then
// constructors are selected on typeconstructor, type arguments are passed afterwards
// constructors are selected on type constructor, type arguments are passed afterwards
errorType(em"$qualType0 takes type parameters", qual1.srcPos)
else if !qualType0.isInstanceOf[TermType] && !qualType0.isError then
errorType(em"$qualType0 is illegal as a selection prefix", qual1.srcPos)
4 changes: 2 additions & 2 deletions compiler/src/dotty/tools/dotc/typer/Typer.scala
@@ -2138,7 +2138,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer
// any references to other parameter types of the underlying hk lambda
// in order not to get orphan parameters. Test case in pos/i15564.scala.
// Note 1: It would be better to substitute actual arguments for corresponding
// formal paramaters, but it looks very hard to do this at the point where
// formal parameters, but it looks very hard to do this at the point where
// a bound type variable is created.
// Note 2: If the type constructor is a class type, no sanitization is needed
// since we can refer to the other paraeters with dependent types C[...]#X.
@@ -3485,7 +3485,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer
val app = tryExtMethod(alt)(using nestedCtx)
(if nestedCtx.reporter.hasErrors then failures else successes)
+= ((app, nestedCtx.typerState))
typr.println(i"multiple extensioin methods, success: ${successes.toList}, failure: ${failures.toList}")
typr.println(i"multiple extension methods, success: ${successes.toList}, failure: ${failures.toList}")

def pick(alt: (Tree, TyperState)): Tree =
val (app, ts) = alt
2 changes: 1 addition & 1 deletion dist/bin/common.bat
@@ -13,7 +13,7 @@ if defined JAVACMD (
set __JAVA_BIN_DIR=
for /f "delims=" %%i in ('where /f java.exe') do (
set "__PATH=%%~dpi"
@rem we take first occurence and ignore Oracle path for java executable
@rem we take first occurrence and ignore Oracle path for java executable
if not defined __JAVA_BIN_DIR if "!__PATH!"=="!__PATH:javapath=!" set "__JAVA_BIN_DIR=!__PATH!"
)
if defined __JAVA_BIN_DIR set "_JAVACMD=!__JAVA_BIN_DIR!\java.exe"
2 changes: 1 addition & 1 deletion docs/_docs/reference/dropped-features/existential-types.md
@@ -10,7 +10,7 @@ have been dropped. The reasons for dropping them are:

- Existential types violate a type soundness principle on which DOT
and Scala 3 are constructed. That principle says that every
prefix (`p`, respectvely `S`) of a type selection `p.T` or `S#T`
prefix (`p`, respectively `S`) of a type selection `p.T` or `S#T`
must either come from a value constructed at runtime or refer to a
type that is known to have only good bounds.

2 changes: 1 addition & 1 deletion docs/_docs/reference/metaprogramming/compiletime-ops.md
@@ -30,7 +30,7 @@ enabling us to handle situations where a value is not present. Note that `S` is
the type of the successor of some singleton type. For example the type `S[1]` is
the singleton type `2`.

Since tuples are not constant types, even if their constituants are, there is `constValueTuple`, which given a tuple type `(X1, ..., Xn)`, returns a tuple value `(constValue[X1], ..., constValue[Xn])`.
Since tuples are not constant types, even if their constituents are, there is `constValueTuple`, which given a tuple type `(X1, ..., Xn)`, returns a tuple value `(constValue[X1], ..., constValue[Xn])`.

### `erasedValue`

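Stepping outside the diff for a moment, a small usage sketch of the operations this hunk documents (`constValue`, `S`, and `constValueTuple`); illustrative only:

```scala
import scala.compiletime.{constValue, constValueTuple}
import scala.compiletime.ops.int.S

val two: 2 = constValue[S[1]]             // S[1] reduces to the singleton type 2
val tup = constValueTuple[(1, "two", 3L)] // yields the value (1, "two", 3L)
```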
@@ -10,7 +10,7 @@ have been dropped. The reasons for dropping them are:

- Existential types violate a type soundness principle on which DOT
and Scala 3 are constructed. That principle says that every
prefix (`p`, respectvely `S`) of a type selection `p.T` or `S#T`
prefix (`p`, respectively `S`) of a type selection `p.T` or `S#T`
must either come from a value constructed at runtime or refer to a
type that is known to have only good bounds.
