In Section VO:Vector Operations we defined vector addition and scalar multiplication. These two operations combine nicely to give us a construction known as a linear combination, one that we will work with throughout this course.

## Linear Combinations

Definition LCCV (Linear Combination of Column Vectors) Given $n$ vectors $\vectorlist{u}{n}$ from $\complex{m}$ and $n$ scalars $\alpha_1,\,\alpha_2,\,\alpha_3,\,\ldots,\,\alpha_n$, their linear combination is the vector \begin{equation*} \lincombo{\alpha}{u}{n} \end{equation*}

So this definition takes an equal number of scalars and vectors, combines them using our two new operations (scalar multiplication and vector addition) and creates a single brand-new vector, of the same size as the original vectors. When a definition or theorem employs a linear combination, think about the nature of the objects that go into its creation (lists of scalars and vectors), and the type of object that results (a single vector). Computationally, a linear combination is pretty easy.
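Computations like this are easy to experiment with in software. Here is a minimal sketch using NumPy (the vectors and scalars are made-up values, not taken from any example in the text), forming a linear combination of three vectors from $\complex{4}$:

```python
import numpy as np

# Three vectors from C^4 (hypothetical values) and three scalars.
u1 = np.array([2, 4, -3, 1], dtype=complex)
u2 = np.array([6, 3, 0, -2], dtype=complex)
u3 = np.array([-5, 2, 1, 1], dtype=complex)
a1, a2, a3 = 1, -4, 2

# The linear combination a1*u1 + a2*u2 + a3*u3 is again a vector in C^4,
# the same size as the original vectors.
v = a1 * u1 + a2 * u2 + a3 * u3
print(v)
```

Notice that three vectors and three scalars go in, and a single new vector of the same size comes out.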

Example TLC: Two linear combinations in $\complex{6}$.

Example ABLC: Archetype B as a linear combination.

With any discussion of Archetype A or Archetype B, we should be sure to contrast it with the other.

Example AALC: Archetype A as a linear combination.

There's a lot going on in the last two examples. Come back to them in a while and make some connections with the intervening material. For now, we will summarize and explain some of this behavior with a theorem.

Theorem SLSLC (Solutions to Linear Systems are Linear Combinations) Denote the columns of the $m\times n$ matrix $A$ as the vectors $\vectorlist{A}{n}$. Then $\vect{x}$ is a solution to the linear system of equations $\linearsystem{A}{\vect{b}}$ if and only if $\vect{b}$ equals the linear combination of the columns of $A$ formed with the entries of $\vect{x}$, \begin{equation*} \vectorentry{\vect{x}}{1}\vect{A}_1+ \vectorentry{\vect{x}}{2}\vect{A}_2+ \vectorentry{\vect{x}}{3}\vect{A}_3+ \cdots+ \vectorentry{\vect{x}}{n}\vect{A}_n = \vect{b} \end{equation*}

In other words, this theorem tells us that solutions to systems of equations are linear combinations of the $n$ column vectors of the coefficient matrix ($\vect{A}_j$) which yield the constant vector $\vect{b}$. Or said another way, a solution to a system of equations $\linearsystem{A}{\vect{b}}$ is an answer to the question "How can I form the vector $\vect{b}$ as a linear combination of the columns of $A$?" Look through the archetypes that are systems of equations and examine a few of the advertised solutions. In each case use the solution to form a linear combination of the columns of the coefficient matrix and verify that the result equals the constant vector (see exercise LC.C21).
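This correspondence is easy to check numerically. The sketch below uses NumPy with a made-up $3\times 3$ coefficient matrix (not one of the archetypes): we build $\vect{b}$ from a chosen solution $\vect{x}$, then confirm that the linear combination of the columns of $A$, with the entries of $\vect{x}$ as scalars, reproduces $\vect{b}$.

```python
import numpy as np

# A hypothetical 3x3 coefficient matrix and a chosen solution vector x.
A = np.array([[1.0, 2.0, 0.0],
              [3.0, -1.0, 2.0],
              [0.0, 1.0, 1.0]])
x = np.array([-3.0, 5.0, 2.0])
b = A @ x  # the vector of constants that this x solves for

# Theorem SLSLC: b equals the linear combination of the columns of A
# formed with the entries of x as the scalars.
combo = x[0] * A[:, 0] + x[1] * A[:, 1] + x[2] * A[:, 2]
assert np.allclose(combo, b)
```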

## Vector Form of Solution Sets

We have written solutions to systems of equations as column vectors. For example Archetype B has the solution $x_1 = -3,\,x_2 = 5,\,x_3 = 2$ which we now write as \begin{equation*} \vect{x}=\colvector{x_1\\x_2\\x_3}=\colvector{-3\\5\\2} \end{equation*} Now, we will use column vectors and linear combinations to express all of the solutions to a linear system of equations in a compact and understandable way. First, here are two examples that will motivate our next theorem. This is a valuable technique, almost the equal of row-reducing a matrix, so be sure you get comfortable with it over the course of this section.

Example VFSAD: Vector form of solutions for Archetype D.

This is such an important and fundamental technique, we'll do another example.

Example VFS: Vector form of solutions.

Did you think a few weeks ago that you could so quickly and easily list all the solutions to a linear system of 5 equations in 7 variables?

We'll now formalize the last two (important) examples as a theorem.

Theorem VFSLS (Vector Form of Solutions to Linear Systems) Suppose that $\augmented{A}{\vect{b}}$ is the augmented matrix for a consistent linear system $\linearsystem{A}{\vect{b}}$ of $m$ equations in $n$ variables. Let $B$ be a row-equivalent $m\times (n+1)$ matrix in reduced row-echelon form. Suppose that $B$ has $r$ nonzero rows, columns without leading 1's with indices $F=\set{f_1,\,f_2,\,f_3,\,\ldots,\,f_{n-r},\,n+1}$, and columns with leading 1's (pivot columns) having indices $D=\set{d_1,\,d_2,\,d_3,\,\ldots,\,d_r}$. Define vectors $\vect{c}$, $\vect{u}_j$, $1\leq j\leq n-r$ of size $n$ by

\begin{align*} \vectorentry{\vect{c}}{i}&= \begin{cases} 0&\text{if $i\in F$}\\ \matrixentry{B}{k,n+1}&\text{if $i\in D$, $i=d_k$} \end{cases}\\ \vectorentry{\vect{u}_j}{i}&= \begin{cases} 1&\text{if $i\in F$, $i=f_j$}\\ 0&\text{if $i\in F$, $i\neq f_j$}\\ -\matrixentry{B}{k,f_j}&\text{if $i\in D$, $i=d_k$} \end{cases}. \end{align*}

Then the set of solutions to the system of equations $\linearsystem{A}{\vect{b}}$ is \begin{equation*} S=\setparts{ \vect{c}+\alpha_1\vect{u}_1+\alpha_2\vect{u}_2+\alpha_3\vect{u}_3+\cdots+\alpha_{n-r}\vect{u}_{n-r} }{ \alpha_1,\,\alpha_2,\,\alpha_3,\,\ldots,\,\alpha_{n-r}\in\complex{\null} } \end{equation*}
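To see the construction of Theorem VFSLS at work on a small, made-up example (not one of the archetypes), suppose row-reducing a consistent augmented matrix yields
\begin{equation*}
B=\begin{bmatrix}
1 & 3 & 0 & 2 & 5\\
0 & 0 & 1 & 4 & 6
\end{bmatrix}
\end{equation*}
so that $n=4$, $r=2$, $D=\set{1,\,3}$ and $F=\set{2,\,4,\,5}$. The recipe of the theorem produces
\begin{equation*}
\vect{c}=\colvector{5\\0\\6\\0}\qquad
\vect{u}_1=\colvector{-3\\1\\0\\0}\qquad
\vect{u}_2=\colvector{-2\\0\\-4\\1}
\end{equation*}
and the solution set is $\setparts{\vect{c}+\alpha_1\vect{u}_1+\alpha_2\vect{u}_2}{\alpha_1,\,\alpha_2\in\complex{\null}}$. Substituting into the two equations $x_1+3x_2+2x_4=5$ and $x_3+4x_4=6$ confirms that every such vector is a solution, no matter the choice of $\alpha_1$ and $\alpha_2$.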

Note that both halves of the proof of Theorem VFSLS indicate that $\alpha_i=\vectorentry{\vect{x}}{f_i}$. In other words, the arbitrary scalars, $\alpha_i$, in the description of the set $S$ actually have more meaning --- they are the values of the free variables $\vectorentry{\vect{x}}{f_i}$, $1\leq i\leq n-r$. So we will often exploit this observation in our descriptions of solution sets.

Theorem VFSLS formalizes what happened in the three steps of Example VFSAD. The theorem will be useful in proving other theorems, and it is useful since it tells us an exact procedure for simply describing an infinite solution set. We could program a computer to implement it, once we have the augmented matrix row-reduced and have checked that the system is consistent. By Knuth's definition, this completes our conversion of linear equation solving from art into science. Notice that it even applies (but is overkill) in the case of a unique solution. However, as a practical matter, I prefer the three-step process of Example VFSAD when I need to describe an infinite solution set. So let's practice some more, but with a bigger example.
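As a sketch of such a program, here is one possible NumPy implementation (the function name, the use of 0-based indices for $D$ and $F$, and the example matrix are our own conventions, not from the text; the input is assumed to be the reduced row-echelon form of a consistent augmented matrix):

```python
import numpy as np

def vector_form(B, D, F):
    """Build the vectors c and u_j of Theorem VFSLS from a matrix B in
    reduced row-echelon form (assumed to come from a consistent system).
    D: 0-based indices of the pivot columns; F: 0-based indices of the
    free-variable columns (excluding the final column of constants)."""
    n = B.shape[1] - 1
    c = np.zeros(n)
    for k, d in enumerate(D):
        c[d] = B[k, n]                 # c picks up the constants column
    us = []
    for f in F:
        u = np.zeros(n)
        u[f] = 1.0                     # free variable f itself gets a 1
        for k, d in enumerate(D):
            u[d] = -B[k, f]            # pivot entries: negated column f
        us.append(u)
    return c, us

# Hypothetical RREF of a consistent augmented matrix, n = 4 variables.
B = np.array([[1.0, 3.0, 0.0, 2.0, 5.0],
              [0.0, 0.0, 1.0, 4.0, 6.0]])
c, (u1, u2) = vector_form(B, D=[0, 2], F=[1, 3])

# Any choice of the scalars produces a solution of the system.
x = c + 7 * u1 - 2 * u2
A, b = B[:, :4], B[:, 4]
assert np.allclose(A @ x, b)
```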

Example VFSAI: Vector form of solutions for Archetype I.

This technique is so important that we will do one more example. However, an important distinction will be that this system is homogeneous.

Example VFSAL: Vector form of solutions for Archetype L.

## Particular Solutions, Homogeneous Solutions

The next theorem tells us that in order to find all of the solutions to a linear system of equations, it is sufficient to find just one solution, and then find all of the solutions to the corresponding homogeneous system. This explains part of our interest in the null space, the set of all solutions to a homogeneous system.

Theorem PSPHS (Particular Solution Plus Homogeneous Solutions) Suppose that $\vect{w}$ is one solution to the linear system of equations $\linearsystem{A}{\vect{b}}$. Then $\vect{y}$ is a solution to $\linearsystem{A}{\vect{b}}$ if and only if $\vect{y}=\vect{w}+\vect{z}$ for some vector $\vect{z}\in\nsp{A}$.

After proving Theorem NMUS we commented (insufficiently) on the negation of one half of the theorem. Nonsingular coefficient matrices lead to unique solutions for every choice of the vector of constants. What does this say about singular matrices? A singular matrix $A$ has a nontrivial null space (Theorem NMTNS). For a given vector of constants, $\vect{b}$, the system $\linearsystem{A}{\vect{b}}$ could be inconsistent, meaning there are no solutions. But if there is at least one solution ($\vect{w}$), then Theorem PSPHS tells us there will be infinitely many solutions, one for each of the infinitely many vectors in the null space of the singular matrix. So a system of equations with a singular coefficient matrix never has a unique solution. Either there are no solutions, or infinitely many solutions, depending on the choice of the vector of constants ($\vect{b}$).
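Theorem PSPHS is easy to test numerically. Below is a small NumPy sketch with a made-up singular matrix (its third column is the sum of the first two, so the null space is nontrivial): one known solution $\vect{w}$, plus any multiple of a null space vector $\vect{z}$, is again a solution.

```python
import numpy as np

# A singular 3x3 matrix (hypothetical): third column = first + second.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 0.0, 2.0],
              [1.0, 1.0, 2.0]])
# z lies in the null space of A, since A @ z = 0.
z = np.array([1.0, 1.0, -1.0])
assert np.allclose(A @ z, 0)

# One particular solution w to LS(A, b), with b built from w.
w = np.array([2.0, -1.0, 3.0])
b = A @ w

# Theorem PSPHS: w plus any null space vector is again a solution,
# so the system has infinitely many solutions.
for t in (1.0, -4.0, 2.5):
    assert np.allclose(A @ (w + t * z), b)
```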

Example PSHS: Particular solutions, homogeneous solutions, Archetype D.

The ideas of this subsection will appear again in Chapter LT:Linear Transformations when we discuss pre-images of linear transformations (Definition PI).