Commit cc19d7e

committed
Add draft text of some chapters
1 parent 297d9f1 commit cc19d7e

File tree

3 files changed (+170 −68 lines)


docs/src/concepts/futures.md

Lines changed: 62 additions & 66 deletions

# Futures

A notable point about Rust is [*fearless concurrency*](https://blog.rust-lang.org/2015/04/10/Fearless-Concurrency.html). That is the notion that you should be empowered to do concurrent things, without giving up safety. Also, Rust being a low-level language, it's about fearless concurrency *without picking a specific implementation strategy*. This means we *must* abstract over the strategy, to allow choice *later*, if we want to have any way to share code between users of different strategies.

Futures abstract over *computation*. They describe the "what", independent of the "where" and the "when". For that, they aim to break code into small, composable actions that can then be executed by a part of our system. Let's take a tour through what it means to compute things to find where we can abstract.
## Send and Sync

Luckily, concurrent Rust already has two well-known and effective concepts abstracting over sharing between concurrent parts of a Rust program: `Send` and `Sync`. Notably, both the `Send` and `Sync` traits abstract over *strategies* of concurrent work, compose neatly, and don't prescribe an implementation.

As a quick summary, `Send` abstracts over passing data in a computation over to another concurrent computation (let's call it the receiver), losing access to it on the sender side. In many programming languages this strategy is commonly implemented, but due to missing support from the language side, you are expected to uphold this behaviour yourself. This is a regular source of bugs: senders keeping handles to sent things around and maybe even working with them after sending. Rust mitigates this problem by making this behaviour known. Types can be `Send` or not (by implementing the appropriate marker trait), allowing or disallowing sending them around.
Note how we avoided any word like *"thread"*, but instead opted for "computation". The full power of `Send` (and subsequently also `Sync`) is that they relieve you of the burden of knowing *what* shares. At the point of implementation, you only need to know which method of sharing is appropriate for the type at hand. This keeps reasoning local and is not influenced by whatever implementation the user of that type later uses.

`Sync` is about sharing data between two concurrent parts of a program. This is another common pattern: as writing to a memory location, or reading while another party is writing, is inherently unsafe, this access needs to be moderated through synchronisation.[^1] There are many common ways for two parties to agree on not using the same part of memory at the same time, for example mutexes and spinlocks. Again, Rust gives you the option of (safely!) not caring. Rust gives you the ability to express that something *needs* synchronisation while not being specific about the *how*.
`Send` and `Sync` can be composed in interesting fashions, but that's beyond the scope here. You can find examples in the [Rust Book](https://doc.rust-lang.org/book/ch16-04-extensible-concurrency-sync-and-send.html).

To sum up: Rust gives us the ability to safely abstract over an important property of concurrent programs: their data sharing. It does so in a very lightweight fashion: the language itself only knows about the two markers `Send` and `Sync` and helps us a little by deriving them itself, when possible. The rest is a library concern.
## An easy view of computation

While computation is a subject to write a whole [book](https://computationbook.com/) about, a very simplified view of it suffices for us:

- computation is a sequence of composable operations
- they can branch based on a decision
- they either run to completion and yield a result, or they can yield an error
## Deferring computation

As mentioned above, `Send` and `Sync` are about data. But programs are not only about data; they also talk about *computing* the data. And that's what [Futures](https://doc.rust-lang.org/std/future/trait.Future.html) do. We are going to have a close look at how that works in the next chapter. Let's look at what Futures allow us to express, in English. Futures go from this plan:

- Do X
- If X succeeds, do Y

towards:

- Start doing X
- Once X succeeds, start doing Y

Remember the talk about "deferred computation" in the intro? That's all it is. Instead of telling the computer what to execute and decide upon *now*, you tell it what to start doing and how to react to potential events in the... well... `Future`.

## Orienting towards the beginning

Let's have a look at a simple function, specifically the return value:

```rust
use std::fs::File;
use std::io::{self, Read};

fn read_file(path: &str) -> Result<String, io::Error> {
    let mut file = File::open(path)?;
    let mut contents = String::new();
    file.read_to_string(&mut contents)?;
    Ok(contents)
}
```
You can call that at any time, so you are in full control over when you call it. But here's the problem: the moment you call it, you transfer control to the called function until it returns a value.

Note that this return value talks about the past. The past has a drawback: all decisions have been made. It has an advantage: the outcome is visible. We can unwrap the presents of program past and then decide what to do with it.

But here's a problem: we wanted to abstract over *computation* to be allowed to let someone else choose how to run it. That's fundamentally incompatible with looking at the results of previous computation all the time. So, let's find a type that describes a computation without running it. Let's look at the function again:

```rust
fn read_file(path: &str) -> Result<String, io::Error> {
    let mut file = File::open(path)?;
    let mut contents = String::new();
    file.read_to_string(&mut contents)?;
    Ok(contents)
}
```

Speaking in terms of time, we can only take action *before* calling the function or *after* the function has returned. This is not desirable, as it takes from us the ability to do something *while* it runs. When working with parallel code, this would take from us the ability to start a parallel task while the first runs (because we gave away control).

This is the moment where we could reach for [threads](https://en.wikipedia.org/wiki/Thread_(computing)). But threads are a very specific concurrency primitive and we said that we are searching for an abstraction.
What we are searching for is something that represents ongoing work towards a result in the future. Whenever we say `something` in Rust, we almost always mean a trait. Let's start with an incomplete definition of the `Future` trait:

```rust
use std::pin::Pin;
use std::task::{Context, Poll};

trait Future {
    type Output;

    fn poll(self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Self::Output>;
}
```

Ignore `Pin` and `Context` for now, you don't need them for a high-level understanding. Looking at it closely, we see the following: it is generic over the `Output`. It provides a function called `poll`, which allows us to check on the state of the current computation.
Every call to `poll()` can result in one of these two cases:

1. The future is done, `poll` will return [`Poll::Ready`](https://doc.rust-lang.org/std/task/enum.Poll.html#variant.Ready)
2. The future has not finished executing, it will return [`Poll::Pending`](https://doc.rust-lang.org/std/task/enum.Poll.html#variant.Pending)

This allows us to externally check if a `Future` has finished doing its work, or is finally done and can give us the value. The simplest (but not efficient) way would be to just constantly poll futures in a loop. There are optimisations possible here, and this is what a good runtime does for you.

Note that calling `poll` again after case 1 happened may result in confusing behaviour. See the [futures-docs](https://doc.rust-lang.org/std/future/trait.Future.html) for details.

## Async

While the `Future` trait has existed in Rust for a while, it was inconvenient to build and describe futures with it. For this, Rust now has a special syntax: `async`. The example from above, implemented in `async-std`, would look like this:

```rust
use async_std::fs::File;
use async_std::io;
use async_std::prelude::*;

async fn read_file(path: &str) -> Result<String, io::Error> {
    let mut file = File::open(path).await?;
    let mut contents = String::new();
    file.read_to_string(&mut contents).await?;
    Ok(contents)
}
```

Amazingly little difference, right? All we did is label the function `async` and insert two `.await` points.

This `async` function sets up a deferred computation. When this function is called, it will produce a `Future<Output = Result<String, io::Error>>` instead of immediately returning that result. (Or, more precisely, it will generate a type for you that implements `Future<Output = Result<String, io::Error>>`.)
## What does `.await` do?

The `.await` postfix does exactly what it says on the tin: the moment you use it, the code will wait until the requested action (e.g. opening a file or reading all data in it) is finished. `.await?` is not special, it's just the application of the `?` operator to the result of `.await`. So, what is gained over the initial code example? We're getting futures and then immediately waiting for them?

The `.await` points act as a marker. Here, the code will wait for a `Future` to produce its value. How will a future finish? You don't need to care! The marker allows the component later *executing* this piece of code (usually called the "runtime") to take care of all the other things it has to do while this operation takes some time. It will come back to this point when the operation you are doing in the background is done. This is why this style of programming is also called *evented programming*. We are waiting for *things to happen* (e.g. a file to be opened) and then react (by starting to read).

When executing two or more of these functions at the same time, our runtime system is then able to fill the wait time with handling *all the other events* currently going on.
## Conclusion

Working from values, we searched for something that expresses *working towards a value available sometime later*. From there, we talked about the concept of polling.

A `Future` is any data type that does not represent a value, but the ability to *produce a value at some point in the future*. Implementations of this are very varied and detailed depending on the use-case, but the interface is simple.

Next, we will introduce you to `tasks`, which we need to actually *run* Futures.

[^1]: Two parties reading while it is guaranteed that no one is writing is always safe.

docs/src/concepts/tasks.md

Lines changed: 81 additions & 1 deletion
# Tasks

Now that we know what Futures are, we want to run them!

In `async-std`, the `tasks` (TODO: link) module is responsible for this. The simplest way is using the `block_on` function:
```rust
use async_std::fs::File;
use async_std::io;
use async_std::prelude::*;
use async_std::task;

async fn read_file(path: &str) -> Result<String, io::Error> {
    let mut file = File::open(path).await?;
    let mut contents = String::new();
    file.read_to_string(&mut contents).await?;
    Ok(contents)
}

fn main() {
    let task = task::spawn(async {
        let result = read_file("data.csv").await;
        match result {
            Ok(s) => println!("{}", s),
            Err(e) => println!("Error reading file: {:?}", e),
        }
    });
    println!("Started task!");
    task::block_on(task);
    println!("Stopped task!");
}
```
This asks the runtime baked into `async_std` to execute the code that reads a file. Let's go through it step by step, from the inside out.
```rust
async {
    let result = read_file("data.csv").await;
    match result {
        Ok(s) => println!("{}", s),
        Err(e) => println!("Error reading file: {:?}", e),
    }
}
```

This is an `async` *block*. Async blocks are necessary to call `async` functions, and will instruct the compiler to include all the relevant instructions to do so. In Rust, all blocks return a value and `async` blocks happen to return a value of the kind `Future`.
But let's get to the interesting part:

```rust
task::spawn(async { })
```

`spawn` takes a `Future` and starts running it on a `Task`. It returns a `JoinHandle`. Futures in Rust are sometimes called *cold* Futures: you need something that starts running them. To run a Future, there may be some additional bookkeeping required, e.g. tracking whether it's running or finished, where it is placed in memory and what its current state is. This bookkeeping part is abstracted away in a `Task`. A `Task` is similar to a `Thread`, with some minor differences: it will be scheduled by the program instead of the operating system kernel, and if it encounters a point where it needs to wait, the program itself is responsible for waking it up again. We'll talk a little bit about that later. An `async_std` task also has a name and an ID, just like a thread.

For now, it is enough to know that once you have `spawn`ed a task, it will continue running in the background. The `JoinHandle` is in itself a future that will finish once the `Task` has run to conclusion. Much like with `thread`s and the `join` function, we can now call `block_on` on the handle to *block* the program (or the calling thread, to be specific) and wait for it to finish.
## Tasks in `async_std`

Tasks in `async_std` are one of the core abstractions. Much like Rust's `thread`s, they provide some practical functionality over the raw concept. `Tasks` have a relationship to the runtime, but they are in themselves separate. `async_std` tasks have a number of desirable properties:

- They are single-allocated
- All tasks have a *backchannel*, which allows them to propagate results and errors to the spawning task through the `JoinHandle`
- They carry desirable metadata for debugging
- They support task local storage

`async_std`'s task API handles setup and teardown of a backing runtime for you and doesn't rely on a runtime being started.
## Blocking

TODO: fill me in

## Errors and panics

TODO: fill me in

## Conclusion

`async_std` comes with a useful `Task` type that works with an API similar to `std::thread`. It covers error and panic behaviour in a structured and defined way.

Tasks are separate concurrent units and sometimes they need to communicate. That's where `Stream`s come in.
