
So, the title sums up everything. When I heard ThisWriter declare me a shapes dude, this sudden wave of impostor syndrome came over me. How could I possibly be a shapes dude? I don’t even know what shapes are! I’m just some guy who literally thinks that tesseracts are cubes that turn themselves inside out.

The only way to fix this is to, of course, learn all of topology. My estimates suggest that, because topology is easy, it should take me about five minutes. Alright, let’s hop right in. I have some lecture notes right here, so I should just be able to get cracking.

So, chapter 1, section 1. Easy.

Um.

## I Teach Myself Metric Spaces And Drag You Along For The Ride Because I Hate You, Specifically, Dear Reader

Alright. Back to basics. What is a metric space? I don’t really know. I imagine that it’s a space, and the space presumably has a metric of some sort. I don’t really know what those actually mean, especially at the level that I’d need to start doing topology proofs.

Conveniently, I also have a set of lecture notes on that very topic that I’ve just downloaded. This should take me about four minutes, because Metric Spaces are easier than Topology.

STOP ASSUMING MY MATHEMATICAL BACKGROUND!

Now, this sounds unnerving, but Analysis is just what mathematicians call calculus for some reason. I don’t know why they do that. Mathematicians invented calculus. I guess Analysis is to signal that you’re gonna be really, really serious. There’s no playing around here. Fucking limits for days.

I don’t know what this has to do with metric spaces yet, but hopefully the notes are getting there. I scrolled down, and Section 2 is called Mᴇᴛʀɪᴄ Sᴘᴀᴄᴇs, which is always a good sign.

So, we start off with the epsilon-delta definition for whether a function is continuous. Intuitively, continuity means that there are no sudden jumps in the function.

Rigorously, the definition is that, for any $\epsilon > 0$, no matter how small, you can find a $\delta > 0$ such that $\left| x - x_0 \right| < \delta$ implies $\left| f(x) - f(x_0) \right| < \epsilon$. This means that you can make the distance between $f(x)$ and $f(x_0)$ as small as you want just by making the distance between $x$ and $x_0$ small enough.
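To make the ε–δ game concrete, here's a quick spot check in Python. The example function and the choice of δ are mine, not from the notes: for $f(x) = 2x$, taking $\delta = \epsilon/2$ works at any $x_0$, because $|2x - 2x_0| = 2|x - x_0| < 2\delta = \epsilon$.

```python
# Spot-check epsilon-delta continuity of f(x) = 2x at x0 = 1:
# the claim is that delta = epsilon / 2 always wins the game.
f = lambda x: 2 * x
x0 = 1.0

for eps in [0.5, 0.05, 0.005]:
    delta = eps / 2
    # Sample points strictly within delta of x0.
    xs = [x0 + t * delta * 0.999 for t in (-1, -0.5, 0, 0.5, 1)]
    assert all(abs(f(x) - f(x0)) < eps for x in xs)
print("delta = eps/2 works for f(x) = 2x")
```

This isn't a proof, of course — it only samples a few points — but it's the definition played out as an actual game of "you pick ε, I pick δ".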

And, heck, let’s also crack open a can of convergence, because converging sequences get useful later on apparently. Intuitively, convergence means that a sequence of numbers gets closer and closer to a single point.

Rigorously, the definition is that a sequence $v^1, v^2, v^3, \dots$ converges to the value $w$ if, for any $\epsilon > 0$, there is a value $N$ for which, for any $k > N$, $\left| v^k - w \right| < \epsilon$. This means that you can make the distance between $v^k$ and $w$ as small as you want, and keep it that small no matter how far along the sequence you go, just by picking a large enough $N$ to start counting from.
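Same deal, playable as a game. A minimal sketch, using my own example sequence $v^k = 1/k$, which converges to $0$: given any ε, we can name an $N$ that works, since $1/k < \epsilon$ whenever $k > 1/\epsilon$.

```python
# For the sequence v_k = 1/k converging to w = 0:
# given epsilon, produce an N past which every term is within epsilon.
def find_N(epsilon):
    # 1/k < epsilon whenever k > 1/epsilon, so this N works.
    return int(1 / epsilon) + 1

for eps in [0.1, 0.01, 0.001]:
    N = find_N(eps)
    # Check the definition over a long stretch of terms past N.
    assert all(abs(1 / k - 0) < eps for k in range(N + 1, N + 1000))
    print(f"eps={eps}: N={N} works")
```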

Now, activate the VSauce voice. What is a small distance? For the real numbers, a small distance is easy – you just take the difference between the two numbers and check whether it’s close to zero. But we’re not always going to be on the real numbers, because you can do functions over basically anything. My lecture notes call this a notion of distance, which sounds terrible, but mostly because the word “notion” has been poisoned by weird descriptions of verse. I don’t even know what notion means.

notion (n) - a conception of or belief about something.

No, that’s terrible. I am not calling it a notion of distance. Let’s call it something nice. It’s used for measurement, so let’s give it a nice name from Greek. Say, something derived from métron, to measure. Metric?

We’re calling it a metric.

Now, there are some things a metric should do. This is how mathematicians work out everything, by the way – they come up with a generalisation, they write down some things that it should do, and then ping every proof they can think of off that. For a metric over a set X, written as d(x,y) where x and y are two points in X, we should have:

• Positivity: d(x,y) = 0 if, and only if, x=y. Distance between two different points is always positive.
• Symmetry: d(x,y) = d(y,x). The distance between x and y is the same as the distance between y and x.
• Triangle inequality: d(x,z) ≤ d(x,y) + d(y,z).
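The three rules are easy to spot-check by brute force. Here's a sketch of my own, using the ordinary distance $|x - y|$ on the reals — again not a proof, just checking the axioms on a handful of points:

```python
# Spot-check the three metric axioms for d(x, y) = |x - y|
# over a few sample real numbers.
def d(x, y):
    return abs(x - y)

points = [-2.5, 0, 1, 3.75]

for x in points:
    for y in points:
        # Positivity: distance is zero exactly when the points coincide.
        assert (d(x, y) == 0) == (x == y)
        # Symmetry: order of arguments doesn't matter.
        assert d(x, y) == d(y, x)
        for z in points:
            # Triangle inequality: no shortcut through a third point.
            assert d(x, z) <= d(x, y) + d(y, z)

print("all three axioms hold on the sample points")
```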

Wait a minute, that last one doesn’t look primitive at all! Hold on, let’s draw a diagram:

Ah, that makes more sense. Basically, the shortest distance between x and z is always going directly from x to z, without taking a detour via y first. Or equal, if y is between x and z.

So, now to define what it means for something to be continuous, and what it means for a sequence to converge, but again, now with our new powerful mathematical tool of saying "distance, but not that distance".

Continuity: A function $f$ from the set $X$, with a metric $d_X$, to the set $Y$, with a metric $d_Y$, is continuous if, for any $\epsilon > 0$, no matter how small, you can find a $\delta > 0$ such that $d_X (x,a) < \delta$ implies $d_Y (f(x), f(a)) < \epsilon$, for every $a$.

This is identical to the normal definition of continuity, except now you’ve got your general metrics dX and dY instead of some specific distance that only works over a certain domain. Because this is also between two metric spaces, this means you can now say things about the continuity of functions between, say, numbers and vectors, provided you define a valid metric. That’s very nice.

Convergence: for the sequence $x_n$ over the set $X$ with a metric $d_X$, $x_n$ converges to $a$ if, for any $\epsilon > 0$, no matter how small, you can find an $N$ such that for all $n > N$ we have $d_X (x_n, a) < \epsilon$.

This is also identical to the normal definition, but again, with our very nice metric space definition added on.

So, what kinds of things are metrics? There are some general rules of what can count as a metric, but not much about what these metrics actually look like.

Let’s say we have some points in $\mathbb{R}^n$, n-dimensional vectors of real numbers, v and w, with real-valued components indexed by i or something. Some fun metrics that we could do are

$d_1(v,w) = \sum { \left| v_i - w_i \right| }$

This is called the taxicab metric, and sums up the differences in each individual component. In this metric, (0, 0) and (3, 4) have a distance of 7.

$d_2(v,w) = \sqrt{ \sum { { \left| v_i - w_i \right| }^2 } }$

And here is your ordinary Euclidean metric! (You can also write it with the dot product, since $d_2(v,w)^2 = (v - w) \cdot (v - w)$.) This satisfies all of the rules for a metric, which is reassuring – it means that the abstract notion of distance can actually correspond to distance in physical reality. In this metric, (0, 0) and (3, 4) have a distance of 5.

$d_\infty(v,w) = \max_{i \in \{ 1, \dots, n \}} \left| v_i - w_i \right|$

And here’s another metric that you can do over vectors of real numbers! This one is the greatest difference between two components of the vector. In this metric, (0, 0) and (3, 4) have a distance of 4.
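All three vector metrics fit in a few lines of Python each (function names are mine), and we can run them on that same pair of points, (0, 0) and (3, 4):

```python
import math

def d1(v, w):
    # Taxicab metric: sum of the componentwise differences.
    return sum(abs(a - b) for a, b in zip(v, w))

def d2(v, w):
    # Euclidean metric: square root of the summed squared differences.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(v, w)))

def d_inf(v, w):
    # Max metric: the single biggest componentwise difference.
    return max(abs(a - b) for a, b in zip(v, w))

v, w = (0, 0), (3, 4)
print(d1(v, w), d2(v, w), d_inf(v, w))  # 7 5.0 4
```

Same two points, three different distances — which is the whole point of "distance, but not that distance".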

There’s also a thing called the discrete metric, which works on any set X. It looks like this:

$d(x,y) = \begin{cases} 0 &\text{if } x = y \\ 1 &\text{if } x \neq y \\ \end{cases}$

This also follows the rules for metrics! You can apply this to basically any set, which is nice.
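And the discrete metric is about as short as a metric can get — it only needs the elements to support an equality check, so it genuinely works on anything:

```python
# Discrete metric: distance 0 to yourself, distance 1 to everything else.
# Works on any set whose elements can be compared for equality.
def d_discrete(x, y):
    return 0 if x == y else 1

print(d_discrete("cat", "cat"))   # 0
print(d_discrete("cat", "dog"))   # 1
print(d_discrete((1, 2), 7))      # 1 -- mixed types, any set at all
```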

Anyway, that's all for this one. Not much actual stuff got done, but there's plenty of content left. Then: topology!

## Next Up!

• Proving that these things actually are metrics
• Maybe other stuff
• People really fucking hype over open sets maybe I'll learn what the hell they are
• Unknown unknowns