Day 7 at RC

Summary

Working

It feels great to finally be back in a productive groove! Even though my cough continues to hack away at my lungs, with enough water and rest I can function more or less as usual. The one limiting factor is eye strain: my eyes are always very tired by the end of the day, and I want to look into more ways to reduce it beyond just using f.lux.

Deep Learning

I really got into the groove here. The material is fairly hefty, partly because I spend a lot of time trying to rederive the equations Professor Ng states for the partial derivatives used in gradient descent, as well as his results for the shapes of the matrices involved in the neural network.
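One habit that helps me while rederiving is encoding the stated shapes and update equations in NumPy and asserting them on random data. Here's a minimal sketch of that, assuming the course's column-per-example convention for a one-hidden-layer network; the layer sizes are made up for illustration.

```python
import numpy as np

# Made-up sizes: n_x input features, n_h hidden units, n_y outputs, m examples.
n_x, n_h, n_y, m = 4, 3, 1, 5

rng = np.random.default_rng(0)
X = rng.normal(size=(n_x, m))             # one column per training example
Y = rng.integers(0, 2, size=(n_y, m))     # binary labels

W1, b1 = rng.normal(size=(n_h, n_x)), np.zeros((n_h, 1))
W2, b2 = rng.normal(size=(n_y, n_h)), np.zeros((n_y, 1))

# Forward pass: the shapes follow from the matrix products.
Z1 = W1 @ X + b1                          # (n_h, m)
A1 = np.tanh(Z1)
Z2 = W2 @ A1 + b2                         # (n_y, m)
A2 = 1 / (1 + np.exp(-Z2))                # sigmoid output

# Backward pass, as stated in the course (still on my list to rederive).
dZ2 = A2 - Y                              # (n_y, m)
dW2 = (dZ2 @ A1.T) / m                    # matches W2: (n_y, n_h)
db2 = dZ2.sum(axis=1, keepdims=True) / m
dZ1 = (W2.T @ dZ2) * (1 - A1**2)          # tanh'(Z1) = 1 - A1^2
dW1 = (dZ1 @ X.T) / m                     # matches W1: (n_h, n_x)
db1 = dZ1.sum(axis=1, keepdims=True) / m

# Every gradient must have the same shape as the parameter it updates.
assert dW1.shape == W1.shape and dW2.shape == W2.shape
assert db1.shape == b1.shape and db2.shape == b2.shape
```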

I still haven’t fully derived the equations for backpropagation — that may take a few days, since I’ve deprioritized it for now.

Linear Algebra

On that note, I've become motivated and interested in linear algebra once more. To that end, I've picked up Ivan Savov's No Bullshit Guide to Linear Algebra, which really does express the general idea and motivation of linear algebra in more concrete terms than my university courses ever did.

From what I understand so far, linear algebra lets us approximate complex real-world systems with systems of linear equations. More importantly, we can "probe" the workings of such a system by feeding it input. The example Savov uses is a giant projector screen running an application mapped to your handheld tablet: you can ostensibly draw images on the tablet, but you are given no instructions about how the projected screen is oriented relative to it.

Linear algebra provides a systematic approach to figuring out the linear transformation from points on your tablet to points on the screen.
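Here's a toy sketch of that probing idea. The hidden 2x2 matrix standing in for the projector mapping is entirely made up; the point is that a few known input points are enough to recover the transformation by solving a linear system.

```python
import numpy as np

# Hypothetical hidden transform: how the "projector" maps tablet
# coordinates to screen coordinates (here, a rotation plus a scaling).
# In the book's scenario this matrix is unknown to us.
theta = np.pi / 6
T_hidden = 2.0 * np.array([[np.cos(theta), -np.sin(theta)],
                           [np.sin(theta),  np.cos(theta)]])

def project(points):
    """The system we can probe: tablet points in, screen points out."""
    return points @ T_hidden.T

# Probe the system with a few known tablet points...
tablet_pts = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
screen_pts = project(tablet_pts)

# ...then solve the linear system  tablet_pts @ T.T = screen_pts  for T.
T_recovered, *_ = np.linalg.lstsq(tablet_pts, screen_pts, rcond=None)
print(np.allclose(T_recovered.T, T_hidden))  # True: transform recovered
```

Probing with the standard basis vectors is the cleanest version of this: the outputs you observe are literally the columns of the unknown matrix.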

And to provide a more theoretical grounding, I’ve also picked up Sheldon Axler’s Linear Algebra Done Right, a concise book that lays out the theoretical framework for understanding linear algebra in a more abstract fashion.

Nand2Tetris

My fellow Recurser Adam Palay's bottom-up foray into understanding computation, and how it gets translated from theory into hardware and then software, has inspired me to follow suit and start studying Nand2Tetris, a course created by Professors Noam Nisan and Shimon Schocken that leads students through the entire vertical stack of computing systems.

You start by simulating the basic gates that implement Boolean logic, then work your way up the hardware stack (using simulation software). Eventually you implement your own assembly language, operating system, and programming language, all running on the hardware you've "built".
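The course's hardware projects are written in its own HDL, but the core idea of the first project translates into a few lines of any language: take NAND as the only primitive and compose every other gate from it. A toy Python sketch of that composition:

```python
def nand(a: bool, b: bool) -> bool:
    # The one primitive gate; every gate below is composed from it.
    return not (a and b)

def not_(a: bool) -> bool:
    return nand(a, a)

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))

def or_(a: bool, b: bool) -> bool:
    # De Morgan: a OR b == NAND(NOT a, NOT b)
    return nand(not_(a), not_(b))

def xor(a: bool, b: bool) -> bool:
    return or_(and_(a, not_(b)), and_(not_(a), b))

# Exhaustive truth-table check, in the spirit of the course's test scripts.
for a in (False, True):
    for b in (False, True):
        print(int(a), int(b), "->", int(and_(a, b)), int(or_(a, b)), int(xor(a, b)))
```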

Incredibly cool concept, and I’m excited (and a little nervous — will I ever finish?) to see where it takes me.

Next steps

I am satisfied with what I'm working on, except for one thing: I need to write more actual code. So tomorrow, I plan to spend time actually architecting and coding out parts of the Subreddit classifier, mainly the webapp portion.