
Commit 7750694

Merge pull request #102 from scala/invisible-whitespaces
Remove invisible whitespaces
2 parents 946abec + b53c4bc commit 7750694

11 files changed: +159 -159 lines changed

(Every change in this commit removes invisible whitespace characters, so each removed (-) line and its added (+) replacement below render identically.)
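The cleanup itself is mechanical. As a rough illustration only (a hypothetical sketch, not the tooling actually used for this pull request), a small Scala script along the following lines strips trailing whitespace and non-breaking spaces from the Markdown sources:

```scala
// Hypothetical sketch (not the actual tooling behind this PR): strip trailing
// whitespace and non-breaking spaces from every Markdown file under a directory.
import java.nio.file.{Files, Path}
import scala.jdk.CollectionConverters.*

@main def stripInvisibleWhitespace(root: String): Unit =
  Files.walk(Path.of(root)).iterator().asScala
    .filter(_.toString.endsWith(".md"))
    .foreach { p =>
      val cleaned = Files.readString(p)
        .linesIterator
        .map(_.replaceAll("[\\s\\u00A0]+$", "")) // drop trailing whitespace and NBSP
        .mkString("", "\n", "\n")                // normalise to LF line endings
      Files.writeString(p, cleaned)
    }
```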

content/42.type.md

Lines changed: 1 addition & 1 deletion
@@ -582,7 +582,7 @@ terms.
 
 ### Byte and short literals
 
-`Byte` and `Short` have singleton types, but lack any corresponding syntax either at the type or at the term level.
+`Byte` and `Short` have singleton types, but lack any corresponding syntax either at the type or at the term level.
 These types are important in libraries which deal with low-level numerics and protocol implementation
 (see eg. [Spire](https://github.com/non/spire) and [Scodec](https://github.com/scodec/scodec)) and
 elsewhere, and the ability to, for instance, index a type class by a byte or short literal would be

content/alternative-bind-variables.md

Lines changed: 19 additions & 19 deletions
@@ -49,7 +49,7 @@ Typically, the commands are tokenized and parsed. After a parsing stage we may e
 enum Word
   case Get, North, Go, Pick, Up
   case Item(name: String)
-
+
 case class Command(words: List[Word])
 ```
 
@@ -64,7 +64,7 @@ matching on a single stable identifier, `North` and the code would look like thi
 
 ~~~ scala
 import Command.*
-
+
 def loop(cmd: Command): Unit =
   cmd match
     case Command(North :: Nil) => // Code for going north
@@ -107,7 +107,7 @@ def loop(cmd: Cmd): Unit =
     case Command(Get :: Item(name)) => pickUp(name)
 ~~~
 
-Or any number of different encodings. However, all of them are less intuitive and less obvious than the code we tried to write.
+Or any number of different encodings. However, all of them are less intuitive and less obvious than the code we tried to write.
 
 ## Commentary
 
@@ -147,7 +147,7 @@ type, like so:
 enum Foo:
   case Bar(x: Int)
   case Baz(y: Int)
-
+
   def fun = this match
     case Bar(z) | Baz(z) => ... // z: Int
 ~~~
@@ -161,11 +161,11 @@ Removing the restriction would also allow recursive alternative patterns:
 enum Foo:
   case Bar(x: Int)
   case Baz(x: Int)
-
+
 enum Qux:
   case Quux(y: Int)
   case Corge(x: Foo)
-
+
   def fun = this match
     case Quux(z) | Corge(Bar(z) | Baz(z)) => ... // z: Int
 ~~~
@@ -177,8 +177,8 @@ We also expect to be able to use an explicit binding using an `@` like this:
 enum Foo:
   case Bar()
   case Baz(bar: Bar)
-
-  def fun = this match
+
+  def fun = this match
     case Baz(x) | x @ Bar() => ... // x: Foo.Bar
 ~~~
 
@@ -191,7 +191,7 @@ inferred within within each branch.
 enum Foo:
   case Bar(x: Int)
   case Baz(y: String)
-
+
   def fun = this match
     case Bar(x) | Baz(x) => // x: Int | String
 ~~~
@@ -203,26 +203,26 @@ the following case to match all instances of `Bar`, regardless of the type of `A
 enum Foo[A]:
   case Bar(a: A)
   case Baz(i: Int) extends Foo[Int]
-
+
   def fun = this match
-    case Baz(x) | Bar(x) => // x: Int | A
+    case Baz(x) | Bar(x) => // x: Int | A
 ~~~
 
 ### Given bind variables
 
-It is possible to introduce bindings to the contextual scope within a pattern match branch.
+It is possible to introduce bindings to the contextual scope within a pattern match branch.
 
 Since most bindings will be anonymous but be referred to within the branches, we expect the _types_ present in the contextual scope for each branch to be the same rather than the _names_.
 
 ~~~ scala
 case class Context()
-
+
 def run(using ctx: Context): Unit = ???
-
+
 enum Foo:
   case Bar(ctx: Context)
   case Baz(i: Int, ctx: Context)
-
+
   def fun = this match
     case Bar(given Context) | Baz(_, given Context) => run // `Context` appears in both branches
 ~~~
@@ -233,7 +233,7 @@ This begs the question of what to do in the case of an explicit `@` binding wher
 enum Foo:
   case Bar(s: String)
   case Baz(i: Int)
-
+
   def fun = this match
     case Bar(x @ given String) | Baz(x @ given Int) => ???
 ~~~
@@ -254,13 +254,13 @@ However, since untagged unions are part of Scala 3 and the fact that both are re
 
 #### Type ascriptions in alternative branches
 
-Another suggestion is that an _explicit_ type ascription by a user ought to be defined for all branches. For example, in the currently proposed rules, the following code would infer the return type to be `Int | A` even though the user has written the statement `id: Int`.
+Another suggestion is that an _explicit_ type ascription by a user ought to be defined for all branches. For example, in the currently proposed rules, the following code would infer the return type to be `Int | A` even though the user has written the statement `id: Int`.
 
 ~~~scala
 enum Foo[A]:
   case Bar[A](a: A)
   case Baz[A](a: A)
-
+
   def test = this match
     case Bar(id: Int) | Baz(id) => id
 ~~~
@@ -295,7 +295,7 @@ If `p_i` is a quoted pattern binding a variable or type variable, the alternativ
 
 Each $`p_n`$ must introduce the same set of bindings, i.e. for each $`n`$, $`\Gamma_n`$ must have the same **named** members $`\Gamma_{n+1}`$ and the set of $`{T_0, ... T_n}`$ must be the same.
 
-If $`X_{n,i}`$, is the type of the binding $`x_i`$ within an alternative $`p_n`$, then the consequent type, $`X_i`$, of the
+If $`X_{n,i}`$, is the type of the binding $`x_i`$ within an alternative $`p_n`$, then the consequent type, $`X_i`$, of the
 variable $`x_i`$ within the pattern scope, $`\Gamma`$ is the least upper-bound of all the types $`X_{n, i}`$ associated with
 the variable, $`x_i`$ within each branch.

content/better-fors.md

Lines changed: 4 additions & 4 deletions
@@ -54,7 +54,7 @@ There are some clear pain points related to Scala'3 `for`-comprehensions and tho
 
    This complicates the code, even in this simple example.
 2. The simplicity of desugared code
-
+
    The second pain point is that the desugared code of `for`-comprehensions can often be surprisingly complicated.
 
    e.g.
@@ -92,7 +92,7 @@ There are some clear pain points related to Scala'3 `for`-comprehensions and tho
 This SIP suggests the following changes to `for` comprehensions:
 
 1. Allow `for` comprehensions to start with pure aliases
-
+
    e.g.
    ```scala
    for
@@ -103,7 +103,7 @@ This SIP suggests the following changes to `for` comprehensions:
    ```
 2. Simpler conditional desugaring of pure aliases. i.e. whenever a series of pure aliases is not immediately followed by an `if`, use a simpler way of desugaring.
 
-   e.g.
+   e.g.
    ```scala
    for
      a <- doSth(arg)
@@ -250,7 +250,7 @@ A new desugaring rules will be introduced for simple desugaring.
 For any N:
 for (P <- G; P_1 = E_1; ... P_N = E_N; ...)
 ==>
-G.flatMap (P => for (P_1 = E_1; ... P_N = E_N; ...))
+G.flatMap (P => for (P_1 = E_1; ... P_N = E_N; ...))
 
 And:

content/byname-implicits.md

Lines changed: 16 additions & 16 deletions
@@ -167,7 +167,7 @@ object Semigroup {
   }
 }
 ```
-
+
 then we can manually write instances for, for example, tuples of types which have `Semigroup`
 instances,
 
@@ -387,7 +387,7 @@ val showListInt: Show[List[Int]] =
       showUnit
     )
   )
-```
+```
 
 where at least one argument position between the val definition and the recursive occurrence of
 `showListInt` is byname.
@@ -499,16 +499,16 @@ any _T<sub>j</sub>_, where _i_ < _j_.
 The essence of the algorithm described in the Scala Language Specification is as follows,
 
 > Call the sequence of open implicit types _O_. This is initially empty.
->
-> To resolve an implicit of type _T_ given stack of open implicits _O_,
->
+>
+> To resolve an implicit of type _T_ given stack of open implicits _O_,
+>
 > + Identify the definition _d_ which satisfies _T_.
->
+>
 > + If the core type of _T_ dominates any element of _O_ then we have observed divergence and we're
 >   done.
->
+>
 > + If _d_ has no implicit arguments then the result is the value yielded by _d_.
->
+>
 > + Otherwise for each implicit argument _a_ of _d_, resolve _a_ against _O+T_, and the result is the
 >   value yielded by _d_ applied to its resolved arguments.
 
@@ -550,15 +550,15 @@ divergence check across the set of relevant implicit definitions.
 
 This gives us the following,
 
-> To resolve an implicit of type _T_ given stack of open implicits _O_,
->
+> To resolve an implicit of type _T_ given stack of open implicits _O_,
+>
 > + Identify the definition _d_ which satisfies _T_.
->
+>
 > + If the core type of _T_ dominates the type _U_ of some element _<d, U>_ of _O_ then we have
 >   observed divergence and we're done.
->
+>
 > + If _d_ has no implicit arguments then the result is the value yielded by _d_.
->
+>
 > + Otherwise for each implicit argument _a_ of _d_, resolve _a_ against _O+<d, T>_, and the result is
 >   the value yielded by _d_ applied to its resolved arguments.
 
@@ -646,8 +646,8 @@ larger than _U_ despite using only elements that are present in _U_.
 
 This gives us the following,
 
-> To resolve an implicit of type _T_ given stack of open implicits _O_,
->
+> To resolve an implicit of type _T_ given stack of open implicits _O_,
+>
 > + Identify the definition _d_ which satisfies _T_.
 >
 > + if there is an element _e_ of _O_ of the form _<d, T>_ such that at least one element between _e_
@@ -658,7 +658,7 @@ This gives us the following,
 >   observed divergence and we're done.
 >
 > + If _d_ has no implicit arguments then the result is the value yielded by _d_.
->
+>
 > + Otherwise for each implicit argument _a_ of _d_, resolve _a_ against _O+<d, T>_, and the result is
 >   the value yielded by _d_ applied to its resolved arguments.

content/clause-interleaving.md

Lines changed: 1 addition & 1 deletion
@@ -71,7 +71,7 @@ This definition provides the expected source API at call site, but it has two is
 
 Another workaround is to return a polymorphic function, for example:
 ~~~scala
-def getOrElse(k:Key): [V >: k.Value] => (default: V) => V =
+def getOrElse(k:Key): [V >: k.Value] => (default: V) => V =
   [V] => (default: V) => ???
 ~~~
 While again, this provides the expected API at call site, it also has issues:

content/drop-stdlib-forwards-bin-compat.md

Lines changed: 1 addition & 1 deletion
@@ -210,7 +210,7 @@ repositories {
 dependencies {
   implementation 'org.scala-lang:scala-library:2.13.8'
   implementation 'com.softwaremill.sttp.client3:core_2.13:3.8.3'
-  implementation 'com.softwaremill.sttp.shared:ws_2.13:1.2.7'
+  implementation 'com.softwaremill.sttp.shared:ws_2.13:1.2.7'
 }
 $> gradle dependencies --configuration runtimeClasspath

content/interpolation-quote-escape.md

Lines changed: 1 addition & 1 deletion
@@ -38,7 +38,7 @@ escape a `"` character to represent a literal `"` withing a string.
 ## Motivating Example
 
 That the `"` character can't be easily escaped in interpolations has been an
-open issue since at least 2012[^1], and how to deal with this issue is a
+open issue since at least 2012[^1], and how to deal with this issue is a
 somewhat common SO question[^2][^3]
 
 {% highlight Scala %}

content/polymorphic-eta-expansion.md

Lines changed: 13 additions & 13 deletions
@@ -21,7 +21,7 @@ permalink: /sips/:title.html
 - For a first-time reader, a high-level overview of what they should expect to see in the proposal.
 - For returning readers, a quick reminder of what the proposal is about. -->
 
-We propose to extend eta-expansion to polymorphic methods.
+We propose to extend eta-expansion to polymorphic methods.
 This means automatically transforming polymorphic methods into corresponding polymorphic functions when required, for example:
 
 ~~~ scala
@@ -44,10 +44,10 @@ This section should clearly express the scope of the proposal. It should make it
 
 Regular eta-expansion is so ubiquitous that most users are not aware of it, for them it is intuitive and obvious that methods can be passed where functions are expected.
 
-When manipulating polymorphic methods, we wager that most users find it confusing not to be able to do the same.
+When manipulating polymorphic methods, we wager that most users find it confusing not to be able to do the same.
 This is the main motivation of this proposal.
 
-It however remains to be demonstrated that such cases appear often enough for time and maintenance to be devoted to fixing it.
+It however remains to be demonstrated that such cases appear often enough for time and maintenance to be devoted to fixing it.
 To this end, the remainder of this section will show a manufactured example with tuples, as well as real-world examples taken from the [Shapeless-3](https://index.scala-lang.org/typelevel/shapeless-3) and [kittens](https://index.scala-lang.org/typelevel/kittens) libraries.
 
 
@@ -89,7 +89,7 @@ There is however the following case, where a function is very large:
     case (acc, Some(t)) => Some((t, acc._1))
   }
 }
-~~~
+~~~
 
 By factoring out the function, it is possible to make the code more readable:
 
@@ -113,7 +113,7 @@ By factoring out the function, it is possible to make the code more readable:
    case (acc, Some(t)) => Some((t, acc._1))
  }
 }
-~~~
+~~~
 
 It is natural at this point to want to transform the function into a method, as the syntax for the latter is more familiar, and more readable:
 
@@ -139,7 +139,7 @@ It is natural at this point to want to transform the function into a method, as
 }
 ~~~
 
-However, this does not compile.
+However, this does not compile.
 Only monomorphic eta-expansion is applied, leading to the same issue as with our previous `Tuple.map` example.
 
 #### Kittens ([source](https://github.com/typelevel/kittens/blob/e10a03455ac3dd52096a1edf0fe6d4196a8e2cad/core/src/main/scala-3/cats/derived/DerivedTraverse.scala#L44-L48))
@@ -251,7 +251,7 @@ For example, if the syntax of the language is changed, this section should list
 
 Before we go on, it is important to clarify what we mean by "polymorphic method", we do not mean, as one would expect, "a method taking at least one type parameter clause", but rather "a (potentially partially applied) method whose next clause is a type clause", here is an example to illustrate:
 
-~~~ scala
+~~~ scala
 extension (x: Int)
   def poly[T](x: T): T = x
 // signature: (Int)[T](T): T
@@ -279,15 +279,15 @@ Note: Polymorphic functions always take term parameters (but `k` can equal zero
 1. Copies of `T_i`s are created, and replaced in `U_i`s, `L_i`s, `A_i`s and `R`, noted respectively `T'_i`, `U'_i`, `L'_i`, `A'_i` and `R'`.
 
 2. Is the expected type a polymorphic context function ?
-   * 1. If yes then `m` is replaced by the following:
+   * 1. If yes then `m` is replaced by the following:
      ~~~ scala
-     [T'_1 <: U'_1 >: L'_1, ... , T'_n <: U'_n >: L'_n]
+     [T'_1 <: U'_1 >: L'_1, ... , T'_n <: U'_n >: L'_n]
      => (a_1: A'_1 ..., a_k: A'_k)
      ?=> m[T'_1, ..., T'_n]
      ~~~
-   * 2. If no then `m` is replaced by the following:
+   * 2. If no then `m` is replaced by the following:
      ~~~ scala
-     [T'_1 <: U'_1 >: L'_1, ... , T'_n <: U'_n >: L'_n]
+     [T'_1 <: U'_1 >: L'_1, ... , T'_n <: U'_n >: L'_n]
      => (a_1: A'_1 ..., a_k: A'_k)
      => m[T'_1, ..., T'_n](a_1, ..., a_k)
      ~~~
@@ -321,7 +321,7 @@ extension [A](x: A)
   def foo[B](y: B) = (x, y)
 
 val voo: [T] => T => [U] => U => (T, U) = foo
-// foo expands to:
+// foo expands to:
 // [T'] => (t: T') => ( foo[T'](t) with expected type [U] => U => (T', U) )
 // [T'] => (t: T') => [U'] => (u: U') => foo[T'](t)[U'](u)
 ~~~
@@ -384,7 +384,7 @@ Not included in this proposal are:
 * Polymorphic SAM conversion
 * Polymorphic functions from wildcard: `foo[_](_)`
 
-While all of the above could be argued to be valuable, we deem they are out of the scope of this proposal.
+While all of the above could be argued to be valuable, we deem they are out of the scope of this proposal.
 
 We encourage the creation of follow-up proposals to motivate their inclusion.

0 commit comments
