Commit 3787658 (parent 1a75eee)

smallscale changes to multiple chapters.
6 files changed: +64 −27 lines

chapters/euclidean_algorithm/euclidean.md

Lines changed: 6 additions & 2 deletions
```diff
@@ -52,7 +52,9 @@ The algorithm is a simple way to find the *greatest common divisor* (GCD) of two
 
 Here, we simply line the two numbers up every step and subtract the lower value from the higher one every timestep. Once the two values are equal, we call that value the greatest common divisor. A graph of `a` and `b` as they change every step would look something like this:
 
-![Subtraction-based Euclidean algorithm](res/subtraction.png)
+<p align="center">
+    <img src="res/subtraction.png" width="500" height="500" />
+</p>
 
 Modern implementations, though, often use the modulus operator (%) like so
 
@@ -81,7 +83,9 @@ Modern implementations, though, often use the modulus operator (%) like so
 
 Here, we set `b` to be the remainder of `a%b` and `a` to be whatever `b` was last timestep. Because of how the modulus operator works, this will provide the same information as the subtraction-based implementation, but when we show `a` and `b` as they change with time, we can see that it might take many fewer steps:
 
-![Modulus-based Euclidean algorithm](res/modulus.png)
+<p align="center">
+    <img src="res/modulus.png" width="500" height="500" />
+</p>
 
 The Euclidean Algorithm is truly fundamental to many other algorithms throughout the history of computer science and will definitely be used again later. At least to me, it's amazing how such an ancient algorithm can still have modern use and appeal. That said, there are still other algorithms out there that can find the greatest common divisor of two numbers that are arguably better in certain cases than the Euclidean algorithm, but the fact that we are discussing Euclid two millennia after his death shows how timeless and universal mathematics truly is. I think that's pretty cool.
```
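To make the difference between the two variants concrete, here is a minimal Python sketch of both (an illustration only; the chapter's actual implementations are imported from its per-language code samples):

```python
def gcd_sub(a, b):
    # Subtraction-based Euclidean algorithm: repeatedly subtract the
    # smaller value from the larger until the two values are equal.
    a, b = abs(a), abs(b)
    while a != b:
        if a > b:
            a -= b
        else:
            b -= a
    return a

def gcd_mod(a, b):
    # Modulus-based variant: b becomes a % b, and a becomes whatever
    # b was on the previous step, terminating when b reaches zero.
    while b != 0:
        a, b = b, a % b
    return abs(a)

print(gcd_sub(462, 1071), gcd_mod(462, 1071))  # prints: 21 21
```

Both calls return the same GCD, but the modulus version reaches it in far fewer iterations when the inputs differ greatly in size.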

chapters/sorting_searching/bogo/bogo_sort.md

Lines changed: 2 additions & 8 deletions
````diff
@@ -36,14 +36,8 @@ In the best case, this algorithm runs with a complexity of $$\Omega(1)$$, and in
 In code, it looks something like this:
 
 {% method %}
-{% sample lang="pseudo" %}
-```julia
-function bogo_sort(Vector{Type} a)
-    while(!is_sorted(a))
-        shuffle(a)
-    end
-end
-```
+{% sample lang="jl" %}
+[import:1-14, lang:"julia"](code/julia/bogo.jl)
 {% sample lang="cs" %}
 [import:29-35, lang:"csharp"](code/cs/bogo.cs)
 {% sample lang="clj" %}
````
code/julia/bogo.jl

Lines changed: 22 additions & 0 deletions
```diff
@@ -0,0 +1,22 @@
+function is_sorted(a::Vector{Float64})
+    for i = 1:length(a)-1
+        if (a[i] > a[i+1])
+            return false
+        end
+    end
+    return true
+end
+
+function bogo_sort(a::Vector{Float64})
+    while(!is_sorted(a))
+        shuffle!(a)
+    end
+end
+
+function main()
+    a = [1., 3, 2, 4]
+    bogo_sort(a)
+    println(a)
+end
+
+main()
```
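For readers not following the Julia sample, the same routine reads naturally in Python (an illustrative sketch, not the repository's code; only tiny inputs are practical given the factorial expected runtime):

```python
import random

def is_sorted(a):
    # True when every element is <= its successor (ascending order).
    return all(a[i] <= a[i + 1] for i in range(len(a) - 1))

def bogo_sort(a):
    # Shuffle the list in place until it happens to be sorted.
    while not is_sorted(a):
        random.shuffle(a)

a = [1.0, 3.0, 2.0, 4.0]
bogo_sort(a)
print(a)  # [1.0, 2.0, 3.0, 4.0]
```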

chapters/sorting_searching/bubble/bubble_sort.md

Lines changed: 2 additions & 13 deletions
@@ -31,19 +31,8 @@ In this way, we sweep through the array $$n$$ times for each element and continu
3131
This means that we need to go through the vector $$\mathcal{O}(n^2)$$ times with code similar to the following:
3232

3333
{% method %}
34-
{% sample lang="pseudo" %}
35-
```julia
36-
function bubble_sort(Vector{Type} a)
37-
n = length(a)
38-
for i = 0:n
39-
for j = 1:n
40-
if(a[j-1] > a[j])
41-
swap(a[j-1], a[j])
42-
end
43-
end
44-
end
45-
end
46-
```
34+
{% sample lang="jl" %}
35+
[import:1-10, lang:"julia"](code/julia/bubble.jl)
4736
{% sample lang="cs" %}
4837
[import:9-27, lang:"csharp"](code/cs/Sorting.cs)
4938
{% endmethod %}
code/julia/bubble.jl

Lines changed: 19 additions & 0 deletions
```diff
@@ -0,0 +1,19 @@
+function bubble_sort(a::Vector{Float64})
+    n = length(a)
+    for i = 1:n
+        for j = 1:n-1
+            if (a[j] > a[j+1])
+                a[j], a[j+1] = a[j+1], a[j]
+            end
+        end
+    end
+end
+
+
+function main()
+    a = [1., 3, 2, 4, 5, 10, 50, 7, 1.5, 0.3]
+    bubble_sort(a)
+    println(a)
+end
+
+main()
```
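The same double-sweep structure can be sketched in Python (an illustration only, not the repository's code):

```python
def bubble_sort(a):
    # Sweep the list n times; on each sweep, swap any adjacent pair
    # that is out of order, bubbling large values toward the end.
    n = len(a)
    for _ in range(n):
        for j in range(n - 1):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]

a = [1.0, 3, 2, 4, 5, 10, 50, 7, 1.5, 0.3]
bubble_sort(a)
print(a)  # [0.3, 1.0, 1.5, 2, 3, 4, 5, 7, 10, 50]
```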

chapters/tree_traversal/tree_traversal.md

Lines changed: 13 additions & 4 deletions
```diff
@@ -71,7 +71,9 @@ Because of this, the most straightforward way to traverse the tree might be recu
 
 At least to me, this makes a lot of sense. We fight recursion with recursion! First, we output the node we are on, and then we call `DFS_recursive(...)` on each of its children nodes. This method of tree traversal does what its name implies: it goes to the depths of the tree first before going through the rest of the branches. In this case, the ordering looks like:
 
-![DFS ordering pre](res/DFS_pre.png)
+<p align="center">
+    <img src="res/DFS_pre.png" width="500" height="500" />
+</p>
 
 Note that in the code above, we are missing a crucial step: *checking to see if the node we are using actually exists!* Because we are using a vector to store all the nodes, we will be careful not to run into a case where we call `DFS_recursive(...)` on a node that has yet to be initialized; however, depending on the language we are using, we might need to be careful of this to avoid recursion errors!
```
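The recursive pre-order scheme described in that chapter can be sketched in Python as follows (a hypothetical `Node` class with a `children` list stands in for the chapter's tree type, and the traversal returns a list rather than printing, purely for illustration):

```python
class Node:
    def __init__(self, value, children=None):
        self.value = value
        self.children = children or []

def dfs_recursive(node):
    # Pre-order traversal: record the current node first, then
    # recurse into each of its children in turn, guarding against
    # nodes that do not exist.
    if node is None:
        return []
    order = [node.value]
    for child in node.children:
        order += dfs_recursive(child)
    return order

root = Node(1, [Node(2, [Node(4), Node(5)]), Node(3)])
print(dfs_recursive(root))  # [1, 2, 4, 5, 3]
```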

```diff
@@ -99,7 +101,9 @@ Now, in this case the first element searched through is still the root of the tr
 [import:20-29, lang:"julia"](code/julia/Tree.jl)
 {% endmethod %}
 
-![DFS ordering post](res/DFS_post.png)
+<p align="center">
+    <img src="res/DFS_post.png" width="500" height="500" />
+</p>
 
 In this case, the first node visited is at the bottom of the tree and moves up the tree branch by branch. In addition to these two types, binary trees have an *in-order* traversal scheme that looks something like this:
```
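The post-order variant simply swaps the order of the two steps from the pre-order traversal: all children are visited before the node itself is recorded, so output begins at the leaves. A Python sketch under the same hypothetical `Node` class:

```python
class Node:
    def __init__(self, value, children=None):
        self.value = value
        self.children = children or []

def dfs_recursive_postorder(node):
    # Post-order traversal: collect every child subtree first, then
    # append the current node, so output moves from the leaves up.
    if node is None:
        return []
    order = []
    for child in node.children:
        order += dfs_recursive_postorder(child)
    return order + [node.value]

root = Node(1, [Node(2, [Node(4), Node(5)]), Node(3)])
print(dfs_recursive_postorder(root))  # [4, 5, 2, 3, 1]
```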

```diff
@@ -124,7 +128,10 @@ In this case, the first node visited is at the bottom of the tree and moves up t
 [import:31-47, lang:"julia"](code/julia/Tree.jl)
 {% endmethod %}
 
-![DFS ordering in](res/DFS_in.png)
+<p align="center">
+    <img src="res/DFS_in.png" width="500" height="500" />
+</p>
+
 
 The order here seems to be some mix of the other two methods and works through the binary tree from left to right.
```
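The in-order scheme only makes sense for binary trees, where each node has at most a left and a right child: visit the left subtree, then the node, then the right subtree. A Python sketch with a hypothetical `BinaryNode` class:

```python
class BinaryNode:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def dfs_recursive_inorder(node):
    # In-order traversal: left subtree, then the node itself, then
    # the right subtree, reading the tree from left to right.
    if node is None:
        return []
    return (dfs_recursive_inorder(node.left)
            + [node.value]
            + dfs_recursive_inorder(node.right))

root = BinaryNode(2, BinaryNode(1), BinaryNode(3))
print(dfs_recursive_inorder(root))  # [1, 2, 3]
```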

```diff
@@ -162,7 +169,9 @@ In code, it looks like this:
 
 All this said, there are a few details about DFS that might not be ideal, depending on the situation. For example, if we use DFS on an incredibly long tree, we will spend a lot of time going further and further down a single branch without searching the rest of the data structure. In addition, it is not the natural way humans would order a tree if asked to number all the nodes from top to bottom. I would argue a more natural traversal order would look something like this:
 
-![BFS ordering](res/BFS_simple.png)
+<p align="center">
+    <img src="res/BFS_simple.png" width="500" height="500" />
+</p>
 
 And this is exactly what Breadth-First Search (BFS) does! On top of that, it can be implemented in the same way as the `DFS_stack(...)` function above, simply by swapping the `stack` for a `queue`, which is similar to a stack, except that it only allows you to interact with the very first element instead of the last. In code, this looks something like:
```
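That stack-to-queue swap can be sketched in Python (again with a hypothetical `Node` class; `collections.deque` provides the FIFO queue):

```python
from collections import deque

class Node:
    def __init__(self, value, children=None):
        self.value = value
        self.children = children or []

def bfs_queue(root):
    # Same shape as a stack-based DFS, but a FIFO queue replaces the
    # stack, so the tree is visited level by level, top to bottom.
    order = []
    queue = deque([root])
    while queue:
        node = queue.popleft()   # take the FRONT element (a stack would pop the back)
        order.append(node.value)
        queue.extend(node.children)
    return order

root = Node(1, [Node(2, [Node(4), Node(5)]), Node(3)])
print(bfs_queue(root))  # [1, 2, 3, 4, 5]
```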
