This tutorial is dedicated to the concept of linear independence of vectors and its connection to the solution of systems of linear equations.

**What is the concept of vectors' linear independence, and why should I care about it?**

One simple example: let's say you want to find the solution of the linear system Ax = b. One may say we can simply find the answer by taking the inverse of A, as x = A⁻¹b. NOT SO FAST! There is more to it. What if A does not have an inverse, or is not even square? Do you just let it go? I guess not.

That was just one example. *In Machine Learning, it frequently happens that you want to explore the correlation between vectors to analyze them better*. For example, you may be dealing with a data matrix which, in essence, is formed by vectors.
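To make the "not so fast" concrete, here is a small sketch of my own (the matrix below is a made-up example with linearly dependent rows, not from the original article): blindly inverting A fails, while a least-squares solution still exists.

```python
# A square but singular matrix: the second row is twice the first,
# so the rows are linearly dependent and A has no inverse.
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
b = np.array([1.0, 2.0])

try:
    np.linalg.solve(A, b)  # attempts to solve Ax = b directly
except np.linalg.LinAlgError:
    print('A is singular: x = inv(A) @ b is not an option.')

# A least-squares solution still exists, even when A is singular or non-square
x, residuals, rank, singular_values = np.linalg.lstsq(A, b, rcond=None)
print('Least-squares solution:', x)
print('Rank of A:', rank)
```

The reported rank of A is 1, which is exactly the kind of information linear independence gives us, as we will see below.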

*What would you be missing if you are not familiar with the concept of* **linear independence**?

In this tutorial, you will learn the following:

- The concept of linear *independence*/dependence
- Examples of the linear dependence of vectors
- How to check it in Python using Numpy
- Matrix ranks
- The solutions to a system of linear equations

**Before You Move On**

You may find the following resources helpful to better understand the concept of this article:

- **Python Tutorials – A FREE Video Course:** You will become familiar with Python and its syntax.
- **Numpy – An Introduction to a Great Package for Linear Algebra:** *Numpy is one of the best scientific computing packages for Linear Algebra*! You always need it for Machine Learning, as you always need Linear Algebra for Machine Learning!
- **The Remarkable Importance of Linear Algebra in Machine Learning:** This article talks about why you should care about Linear Algebra if you want to master Machine Learning.
- **Basic Linear Algebra Definitions that You Hear Every Day:** Covers the primary and most frequently used Linear Algebra definitions in Machine Learning.

### The Concept of Linear Independence

Assume we have a set of vectors {x_1, x_2, ..., x_n}, which are column vectors of size m. We call this set **linearly independent** if no vector in it can be represented as a linear combination of the other vectors in the set. Perhaps it is easier to define **linearly dependent** first: a vector x_j is linearly dependent on the others if we can express it as a linear combination of the other vectors in the set, as below:

x_j = a_1 x_1 + ... + a_{j-1} x_{j-1} + a_{j+1} x_{j+1} + ... + a_n x_n, for some scalars a_i

In that case, we say the set of vectors is **linearly dependent**!

### Example

Consider the three vectors below:

v = [1, -1, 2]ᵀ, u = [0, 3, 1]ᵀ, w = [2, 1, 5]ᵀ

The above set is **linearly dependent**. Why? It is simple: because w = 2v + u. Let's verify the above with Python and Numpy:

```python
# Import Numpy library
import numpy as np

# Define three column vectors
v = np.array([1, -1, 2]).reshape(-1, 1)
u = np.array([0, 3, 1]).reshape(-1, 1)
w = np.array([2, 1, 5]).reshape(-1, 1)

# Check the linear dependency by testing the equality elementwise
print('Does the equality w = 2v + u hold? Answer:', np.all(w == 2*v + u))
```

Run the above code and see if Numpy confirms that or not!
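The equality check above works because we already guessed the combination (w = 2v + u). A more general test, sketched below, is to stack the vectors as columns of a matrix and compare its rank with the number of vectors (using `numpy.linalg.matrix_rank`, which also appears later in this article):

```python
import numpy as np
from numpy.linalg import matrix_rank

# The same three vectors, stacked as the columns of a 3x3 matrix
v = np.array([1, -1, 2])
u = np.array([0, 3, 1])
w = np.array([2, 1, 5])
A = np.column_stack([v, u, w])

# If the rank is smaller than the number of columns, the vectors
# are linearly dependent (here rank(A) = 2 while we have 3 vectors)
n_vectors = A.shape[1]
print('Rank of A:', matrix_rank(A))
print('Linearly dependent?', matrix_rank(A) < n_vectors)
```

This rank-based test requires no guessing, which previews the connection between linear dependence and matrix rank in the next section.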

### The Relationship With Matrix Rank

So far, I have talked about the linear dependence of vectors. Now assume we have a matrix A with **m** rows and **n** columns. Let's focus on the columns. The **n** columns form **n** vectors; I denote the i-th column by a_i, so we have the set {a_1, ..., a_n} of column vectors. The size of the largest subset of columns that is linearly independent is called the **column rank** of the matrix A. Similarly, considering the rows of A, the size of the largest subset of rows that forms a linearly independent set is called the **row rank** of the matrix A. A fundamental fact is that the column rank always equals the row rank, so we simply speak of the **rank** of A.

We have the following properties for matrix ranks:

- For A with m rows and n columns, rank(A) ≤ min(m, n). If rank(A) = min(m, n), the matrix A is **full rank**.
- For A of size m×n and B of size n×p, rank(AB) ≤ min(rank(A), rank(B)).
- For A and B both of size m×n, rank(A + B) ≤ rank(A) + rank(B).

Check first and second property above with the following code:

```python
# Import Numpy library
import numpy as np
from numpy.linalg import matrix_rank

# Define random 3x4 and 4x3 integer matrices
# Ref: https://docs.scipy.org/doc/numpy-1.15.1/reference/generated/numpy.random.randint.html
M = np.random.randint(10, size=(3, 4))
N = np.random.randint(10, size=(4, 3))

# Property (1): rank(M) <= min(m, n)
if matrix_rank(M) <= min(M.shape[0], M.shape[1]):
    print('Property rank(M) <= min(M.shape[0],M.shape[1]) is confirmed!')

# Property (2): rank(MN) <= min(rank(M), rank(N))
if matrix_rank(np.matmul(M, N)) <= min(matrix_rank(M), matrix_rank(N)):
    print('Property rank(MN) <= min(rank(M),rank(N)) is confirmed!')
```

**Practice:** Modify the above code and check property **(3)**.
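If you want to compare your answer afterwards, here is one possible modification of my own (assuming property (3) refers to the subadditivity of rank, rank(M + N) ≤ rank(M) + rank(N); note both matrices must have the same shape for the sum to be defined):

```python
import numpy as np
from numpy.linalg import matrix_rank

# Two random matrices of the SAME shape, so the sum M + N is defined
M = np.random.randint(10, size=(3, 4))
N = np.random.randint(10, size=(3, 4))

# Property (3): rank(M + N) <= rank(M) + rank(N)
if matrix_rank(M + N) <= matrix_rank(M) + matrix_rank(N):
    print('Property rank(M+N) <= rank(M) + rank(N) is confirmed!')
```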

### Linear Equations

I talked about **linear dependency** and **matrix ranks**. Now I would like to discuss their application in finding the solution of linear equations, which is of great importance. Consider the following equality, which sets up a system of linear equations:

Ax = b

Above, the matrix A (with m rows and n columns) is multiplied by the vector x (of size n) and forms the vector b (of size m). The equality creates a set of **m** linear equations. Let's write the i-th line, for example:

a_{i1} x_1 + a_{i2} x_2 + ... + a_{in} x_n = b_i

**So the question is:** how many solutions exist for the system of equations Ax = b? By solution, I mean the possible values for the variables x_1, ..., x_n.

**The answer is one of the following**:

- There is **NO** solution.
- The system has **one unique** solution.
- We have an **infinite** number of solutions.

As you observed, *having more than one but fewer than infinitely many solutions is off the table*!

**Theorem:** If x and y are two solutions of the equation Ax = b, then for any scalar α, the linear combination of them below is a solution as well:

z = α x + (1 − α) y

**PROOF:** Look at the equations below to see how z is also a solution:

A z = A(α x + (1 − α) y) = α A x + (1 − α) A y = α b + (1 − α) b = b

So the above proof shows that if we have **more than one solution**, then *we have an infinite number of solutions*!
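We can also verify this theorem numerically. The system below is a made-up example of mine with linearly dependent rows, so it admits more than one solution; every combination αx + (1 − α)y then solves it as well:

```python
import numpy as np

# A made-up system with dependent rows, so Ax = b has many solutions
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
b = np.array([3.0, 6.0])

# Two distinct solutions (you can check that A @ x and A @ y both equal b)
x = np.array([3.0, 0.0])
y = np.array([1.0, 1.0])

# Every combination alpha*x + (1 - alpha)*y is also a solution
for alpha in [0.0, 0.25, 0.5, 2.0, -1.0]:
    z = alpha * x + (1 - alpha) * y
    print(alpha, np.allclose(A @ z, b))  # True for every alpha
```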

The following two conditions determine the number of solutions when there is at least one! **Try to prove them** based on what you learned so far:

- Ax = b has **at least one solution** for every b if n ≥ m and Rank(A) = m.
- Ax = b has **exactly one solution** if n = m and Rank(A) = m.
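These rank conditions can be turned into a small classifier for the number of solutions. The helper below is my own sketch (the name `count_solutions` is not from this article); it compares rank(A) with the rank of the augmented matrix [A | b], which detects inconsistency, and with the number of unknowns, which detects uniqueness:

```python
import numpy as np
from numpy.linalg import matrix_rank

def count_solutions(A, b):
    """Classify the solutions of Ax = b by comparing rank(A)
    with the rank of the augmented matrix [A | b]."""
    rank_A = matrix_rank(A)
    rank_aug = matrix_rank(np.column_stack([A, b]))
    n = A.shape[1]  # number of unknowns
    if rank_A < rank_aug:
        return 'no solution'
    if rank_A == n:
        return 'exactly one solution'
    return 'infinitely many solutions'

# Full-rank square system
print(count_solutions(np.array([[1, 0], [0, 1]]), np.array([3, 4])))
# Dependent rows, consistent right-hand side
print(count_solutions(np.array([[1, 2], [2, 4]]), np.array([3, 6])))
# Dependent rows, inconsistent right-hand side
print(count_solutions(np.array([[1, 2], [2, 4]]), np.array([3, 7])))
```

Note how each of the three possible answers from the list above appears exactly once.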

### Conclusion

In this tutorial, I **discussed the concept of linear independence of vectors** and its *association with systems of linear equations*. This concept is crucial, especially in *Machine Learning* and *optimization theory*, in which we are dealing with all sorts of mathematical proofs necessary to justify why a method should work!

*For the majority of what you may work on in Machine Learning, you may not need to use what I talked about here directly. BUT, you need to know it if you would like to stand out!*

*Do you see any ambiguous concept here? Anything missing or wrong?* Feel free to ask, as it will help me, yourself, and others to learn better, and I can further improve this article.
