# Explaining my Fall23 classes

college, learning

23 Dec 2023

8 minute read

I’m gonna go through some of the classes I took with some photos and explain the kinds of things I learned, in a very simplified manner, as a sketch.

## Theory of computation - 18.404

The basic setup is that we want formal models of computation so we can compare the difficulty of problems, establish equivalences between them, and reason about what is computable in general. What types of problems have solutions for any instance of the problem?

We need something concrete that acts as a foundation for that theory. The answer is to think in terms of *languages*: a string (usually a bitstring) $x$ belonging to the language means it has some property.

For example:

\[L = \{(a, b, c) \mid a + b = c\}\] is a language of triplets where the first two numbers add to the third,

\[L = \{ww^R \mid w \in \{0, 1\}^*\}\] is the language of even-length bitstring palindromes ($\{0, 1\}^*$ is just the set of all bitstrings), etc…
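To make the "membership" idea concrete, here's a toy sketch of deciders for the two example languages above (the function names are my own):

```python
def in_sum_language(triple):
    """(a, b, c) is in L iff a + b == c."""
    a, b, c = triple
    return a + b == c

def in_palindrome_language(s):
    """s is in {ww^R} iff it is an even-length palindrome over {0, 1}."""
    return len(s) % 2 == 0 and set(s) <= {"0", "1"} and s == s[::-1]

print(in_sum_language((2, 3, 5)))      # True
print(in_palindrome_language("0110"))  # True
print(in_palindrome_language("010"))   # False (odd length)
```

Of course, the interesting question in the class isn't whether Python can check these, but which *restricted* machine models can.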

Now that we have languages, we want to think about how, given a string, we would determine whether it belongs to a language or not. Formally, we will create different models of computation with different amounts of computational *power*, and see what types of models can solve what kinds of problems. In effect this allows us to reason about the intrinsic difficulty of certain problems and ground our theory.

I won’t go over the details (take the class!), but we basically define and investigate more and more complex models of computation, adding a simple form of memory, looping, etc… until we reach the Turing machine, a universal kind of machine that can implement anything we know how to implement.

Turing machines capture what computers and programming languages can actually do.

With that setup, we can now start asking: is every problem solvable in general? That is, for any language, is there a single procedure that decides, for every possible input, whether or not it belongs to the language? And the answer is no.

Things get a bit more complicated, but this is related to the idea that if you have an algorithm that merely *recognizes* a language, it can halt and accept when the input is in the language, but on inputs outside the language it can potentially run forever, and then you can never be sure what's going on (it could still accept at some later point).
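A toy illustration of that asymmetry (my own example, not from the class): a recognizer that accepts numbers whose Collatz sequence reaches 1. If it accepts, you know the answer; while it's still running, you can't distinguish "slow" from "never".

```python
def collatz_recognizer(n):
    """Accept (return True) if the Collatz sequence from n reaches 1;
    otherwise loop forever. This is recognition, not decision: there is
    no point at which a non-answer lets you conclude 'not in the language'."""
    while n != 1:
        n = 3 * n + 1 if n % 2 else n // 2
    return True

print(collatz_recognizer(27))  # True, after 111 steps of wandering
```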

There are a lot of fun tricks and ways of showing these things if you take the class.

Then we move on to complexity, considering not whether problems are solvable in general but whether they are solvable within some limited amount of resources, whether that be space or time, or even randomness! (Indeed, randomness and random numbers can sometimes give you faster algorithms almost for free.)
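A classic example of randomness giving you speed almost for free (a standard result, not specific to the class) is Freivalds' algorithm: checking a claimed matrix product $AB = C$ with a few random matrix–vector multiplies, $O(n^2)$ per trial, instead of recomputing the product in $O(n^3)$.

```python
import random

def freivalds(A, B, C, trials=20):
    """Probabilistically check A @ B == C. Each trial tests A(Br) == Cr
    for a random 0/1 vector r; a wrong C is caught with probability
    >= 1/2 per trial, so 20 trials make errors vanishingly unlikely."""
    n = len(A)
    def matvec(M, v):
        return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    for _ in range(trials):
        r = [random.randint(0, 1) for _ in range(n)]
        if matvec(A, matvec(B, r)) != matvec(C, r):
            return False  # a witness: definitely wrong
    return True  # no witness found: almost certainly right

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(freivalds(A, B, [[19, 22], [43, 50]]))  # True (correct product)
print(freivalds(A, B, [[0, 0], [0, 0]]))      # False with overwhelming probability
```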

This class covers a lot of fundamental ideas and has a lot of good problems, so I’d recommend it. I think it helps you understand the ideas underlying a lot of things in CS.

## Analysis - 18.100B

This is the harder intro analysis course. It was somewhat unconventional this semester, as the professor brought in a lot of new content from measure theory and functional analysis that is not usually covered.

The basic idea is to develop the machinery to formally build up and prove the things we use in calculus, and in a lot of math in general. This means being very careful about details and edge cases.

In this class you start by defining the basic objects, like the integers, then construct the rationals, establish their properties, and see how they relate to the reals. Once you have this setup, you abstract away the real numbers into the idea of a metric space – a set with a well-behaved distance function.

With metric spaces, we study the idea of open and closed sets and formally describe limits (what happens as you get infinitely large, or infinitely close to something) and other nice properties of functions. These give you fundamental tools for studying sequences of points within sets, and limits, from a topological perspective. *Open sets* have the property that around every point in the set there is a ball that is also contained in the set, if you make it small enough. E.g. $I = (a, b)$ with $a, b \in \mathbb{R}$ is open: take any point, no matter how close to the endpoints, and there are still points in between, so a small enough ball fits inside $I$.

Closed sets have the property that if your set $A$ has a sequence of points $a_n$ getting closer and closer to a point $c$, infinitesimally so (as close as you want), then $c \in A$. So $[a, b]$ is closed, but $(a, b)$ is not, because you can have a sequence that gets as close as you want to $a$, yet $a \notin (a, b)$.
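That open-interval example can be played with numerically (a toy sketch of my own, with $(a, b) = (0, 1)$): a sequence inside the interval can approach the endpoint as closely as you like, yet the endpoint itself is excluded.

```python
# The sequence 1/2, 1/4, 1/8, ... lives inside (0, 1) and converges to 0,
# but its limit 0 is NOT in (0, 1) -- which is exactly why (0, 1) isn't closed.
a_n = [1 / 2**n for n in range(1, 20)]
assert all(0 < x < 1 for x in a_n)  # every term is in the open interval
print(min(a_n))  # as small as we like, but never reaches the limit 0
```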

These are useful/fundamental starting points in analysis and they tie in with another property called compactness, which you should take the course to understand.

From here we can look at sequences of numbers and use the tools we’ve built to understand, for example, when they diverge (go off to infinity) or converge.

Intro analysis teaches you to be really careful: when you’re proving something, exactly which property allows you to conclude exactly which consequence.

For example, continuity is what allows you to say that if you have a function $f$ and a sequence $a_n$ approaching a point $x$, then the limit of $f(a_n)$ along that sequence is the value of the function at that point, i.e. $f(x)$. This abstraction then lets us build a whole edifice around integration, for example.
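Here's a small numerical sketch of that (my own example): for a continuous $f$ like $\sin$, $f(a_n)$ hugs $f(x)$ as $a_n$ hugs $x$; for a function with a jump, the sequential limit can miss the function's value entirely.

```python
import math

# Continuous case: f = sin, a_n -> x = 1 from above, so f(a_n) -> f(1).
f, x = math.sin, 1.0
a_n = [x + 1 / 2**n for n in range(1, 30)]
gaps = [abs(f(a) - f(x)) for a in a_n]
print(gaps[-1])  # tiny: the gap shrinks with the sequence

# Discontinuous case: a step at x = 1. Approaching from below,
# step(a_n) is stuck at 0 even though step(1) = 1.
step = lambda t: 0.0 if t < 1 else 1.0
b_n = [x - 1 / 2**n for n in range(1, 30)]
print(abs(step(b_n[-1]) - step(x)))  # stuck at 1.0: the limit argument fails
```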

A lot of things might seem obvious but thinking about why they’re not and why they can break down can lead you to a lot of interesting counterexamples/reflection.

Then we did integration and measure theory, where the key idea was measuring a function by looking at the range of values it takes: pick subsets of its range, then add up the *measure* of the inputs $x$ that give an $f(x)$ in that range. As the output ranges shrink (lengths going to 0), this lets you measure the area underneath the curve of a function, but it requires a lot of careful theory, called *Lebesgue integration*.
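A crude numerical caricature of that "slice the range, not the domain" idea (my own discretization, nothing like the actual rigor of the class): for each output level $t$, estimate the measure of $\{x : f(x) > t\}$ and sum up measure × level-width, here for $f(x) = x^2$ on $[0, 1]$.

```python
# Layer-cake sketch: area under f = integral over levels t of
# measure({x : f(x) > t}). For f(x) = x^2 on [0, 1] the true answer is 1/3.
f = lambda x: x * x
xs = [i / 2000 for i in range(2000)]      # sample points in the domain [0, 1]
levels = [k / 500 for k in range(500)]    # output levels slicing the range [0, 1]
dt = 1 / 500
area = sum(
    sum(1 for x in xs if f(x) > t) / len(xs) * dt  # measure of the level set, times dt
    for t in levels
)
print(area)  # close to 1/3
```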

We finished with functional analysis, where you treat functions themselves as elements of a space with a norm, just as we treat numbers as elements of the reals, whose norm is the absolute value. This lets you think about the convergence of functions and other neat properties covered in 18.102.

## Electronics for mechanical systems - 2.678

This class speedruns basic electronics! It’s a 6-unit class (half of a full class) that packs a punch, split between lectures/psets and labs. You start by reviewing the fundamentals of electricity and magnetism, then move on to the fundamental electronic components that let us do digital electronics and build complex mechanical systems.

In the labs you learn to play around with the components to reverse engineer what they’re doing, and in the first lab you actually reverse engineer and modify the behavior of a mouse. This class gets you pretty far in terms of understanding how electronic systems work if you’re starting from scratch, although it doesn’t go too deep into the theoretical/solid-state science behind some of the electronics.

We went through transistors, semiconductors, op-amps, and MOSFETs, which are all ways of getting desirable circuit properties that let you implement digital logic (the building blocks of computers). We also did work with motors, and worked with Arduino, which lets you program the logic for your circuit directly in C. We finished with a project where you use Arduino to code up a control algorithm for a motor on a track.
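To give a flavor of what "a control algorithm for a motor on a track" means, here's a toy simulation in Python rather than Arduino C; the physics model, gains, and damping are all made up for illustration, not the actual project code.

```python
# Proportional control sketch: drive a simulated cart toward a target
# position. Each step: measure error, command an acceleration
# proportional to it (plus damping so it doesn't oscillate forever).
target = 1.0          # desired position (arbitrary units)
pos, vel = 0.0, 0.0   # cart starts at rest at the origin
kp, dt = 4.0, 0.01    # proportional gain and timestep (invented values)

for _ in range(2000):            # 20 simulated seconds
    error = target - pos
    accel = kp * error - 1.0 * vel  # P term + simple velocity damping
    vel += accel * dt               # Euler integration of the dynamics
    pos += vel * dt

print(round(pos, 3))  # the cart settles near the target
```

On the real hardware, `error` would come from a sensor reading and `accel` would become a PWM command to the motor driver, but the loop structure is the same.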

## Advanced Poetry Workshop - 21W.771

This is a different type of class. Each week you come together with a poem you wrote, and then you share it and get feedback from the professor and classmates. This was maybe my favorite class. It taught me to write more evocative and compelling poems, and get a better sense for what I want my poetry to be.

For more info here, I’d recommend [blog post].

## Deep Learning - 6.S898

To be added – I need to sleep, but I’ll add this in a few days if you’re curious; not trying to be perfectionist with this post.

Have a good break!
