
0. Introduction

Hi, my name is Matthew, and I took this class in the Summer 2022 semester, though I have seen the subject of linear algebra 3 times before (Linear Algebra at a Community College, Physics 89, and Math 110), and the subject of circuits 2 times before (AP Physics 1/C, Physics 5B/5BL). I wanted to create a set of notes that summarize the course content in an intuitive and concise way, while also allowing depth into the subject of linear algebra.

This course comes in two main parts: linear algebra and circuits. As the course is run in 2022, it feels like it is broken into three parts, but the last module (correlation and trilateration) is simply an application of linear algebra.

One thing that I wished for when I took this class was a deeper understanding of linear algebra. At times the linear algebra felt procedural rather than intuitive; in my opinion linear algebra needs to be intuitive, which means seeing how it works visually, computationally, and abstractly. I hope to present linear algebra concepts in this manner, which does mean I will be reordering the topics from the way they appear in the class as it is now to the order I find the most logical.

The second part of the course, circuits, can get complicated really quickly. I personally find the circuits part really interesting, especially the applications of the devices we will analyze. I will approach it the way 16A does, but with many more examples: you understand circuits better by first understanding the principles and then working through a bunch of examples.

Overview

With all my notes, I like to give a quick overview of what will be covered, and why we need to learn it in the first place.

[[1. Matrices and Vectors]]: Matrices and vectors are just like numbers you learned in arithmetic: you can add them, subtract them, multiply them, but you cannot divide them. You can think of vectors as multidimensional numbers, and you can think of matrices as multiple vectors. We will get a good grasp on how to handle algebra dealing with matrices and vectors.
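
For a quick taste (my own toy numbers, not an example from the course): adding vectors works entry by entry, and multiplying a matrix by a vector combines the rows of the matrix with the entries of the vector:

\[
\begin{bmatrix} 1 \\ 2 \end{bmatrix} + \begin{bmatrix} 3 \\ 4 \end{bmatrix} = \begin{bmatrix} 4 \\ 6 \end{bmatrix},
\qquad
\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \begin{bmatrix} 5 \\ 6 \end{bmatrix}
= \begin{bmatrix} 1\cdot 5 + 2\cdot 6 \\ 3\cdot 5 + 4\cdot 6 \end{bmatrix}
= \begin{bmatrix} 17 \\ 39 \end{bmatrix}.
\]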

[[2. Linear Systems and Gaussian Elimination]]: With our newfound knowledge of matrices and vectors, we will see how to use them to solve systems of linear equations in a systematic way. You'll see how row operations on augmented matrices lead you to the solution, and how to row reduce all the way until you know whether a system has no solutions, one solution, or infinitely many solutions. We will also see how to write a linear system in matrix-vector form.
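
As a small preview (a toy system of my own, not one from the course), the system \(x + 2y = 5\) and \(3x + 4y = 6\) can be written as an augmented matrix, or equivalently in matrix-vector form:

\[
\left[\begin{array}{cc|c} 1 & 2 & 5 \\ 3 & 4 & 6 \end{array}\right]
\qquad\Longleftrightarrow\qquad
\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 5 \\ 6 \end{bmatrix}.
\]

Row operations on the left object and algebra on the right object are two views of the same process.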

[[3. Invertibility and Determinants]]: Now we will see what unique solutions to linear equations might mean for us, and how this idea leads into the topic of invertibility. If you know a matrix is invertible, you'll know so much about the linear system. Afterwards, we will look into determinants, and find ways to compute \(n\times n\) determinants fairly quickly. Then we will see how to solve a system with inverses, and then solve systems using determinants. Finally, I will show that determinants and inverses are related to each other.
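
To hint at that relationship, here is the \(2\times 2\) case (a standard formula, stated here just as a preview):

\[
A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}, \qquad
\det(A) = ad - bc, \qquad
A^{-1} = \frac{1}{ad - bc}\begin{bmatrix} d & -b \\ -c & a \end{bmatrix}
\quad \text{when } ad - bc \neq 0.
\]

The inverse exists exactly when the determinant is nonzero, which is the connection we will build up to.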

[[4. Vector Spaces]]: A vector space is a specific type of mathematical structure that you've probably used before, but never really knew you were using. It hinges on the idea of vector addition and scalar multiplication, along with the commutative, associative, distributive, identity, and inverse properties. The vector as a magnitude-direction object creates a vector space, but now it is time to see how this definition might apply to other mathematical objects. In fact, you'll see that polynomials are vectors, matrices are vectors, and more. You can actually create your own object, and as long as it satisfies the definition of a vector space, you can call the object a vector.
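
For example (a quick illustration of my own), polynomials already behave like vectors: adding two of them or scaling one of them gives you another polynomial, which is exactly the kind of closure a vector space requires:

\[
(1 + 2x + 3x^2) + (4 - x^2) = 5 + 2x + 2x^2,
\qquad
3\,(1 + 2x + 3x^2) = 3 + 6x + 9x^2.
\]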

[[5. Linear Independence, Span, Bases]]: One can characterize a vector space through a set of vectors that can reach every vector in the space. Is it always true that you can find such a set? If you do have a list of vectors that does this, can you find the minimum number that does the same thing? Is that minimum set necessarily unique? This is what this section will talk about.
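
As a tiny example of "reaching every vector" (standard, not course-specific): every vector in \(\mathbb{R}^2\) can be built from just two vectors,

\[
\begin{bmatrix} a \\ b \end{bmatrix}
= a \begin{bmatrix} 1 \\ 0 \end{bmatrix} + b \begin{bmatrix} 0 \\ 1 \end{bmatrix},
\]

so this pair spans \(\mathbb{R}^2\), and no single vector can do the same job on its own.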

[[6. Matrices as Linear Transformations]]: Now we have the tools to talk about linear transformations, and we will use matrices to see these transformations visually.
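
To give one concrete picture (a standard example, not specific to how 16A presents it): rotating the plane by \(90^\circ\) counterclockwise is a linear transformation, and it is captured entirely by a matrix:

\[
R = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix},
\qquad
R \begin{bmatrix} 1 \\ 0 \end{bmatrix} = \begin{bmatrix} 0 \\ 1 \end{bmatrix},
\]

which sends the point \((1, 0)\) to \((0, 1)\), exactly what a quarter turn should do.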

[[7. Eigenvalues and Eigenvectors]]: