This tutorial is dedicated to explaining the concept of Singular Value Decomposition (SVD) and its applications. You will learn how you can decompose a matrix, even a non-square one, into its constituent elements.

I previously talked about matrix decomposition and its importance. You previously learned what eigendecomposition is and how you can decompose a matrix using its eigenvalues and eigenvectors. *There was one issue though*: for matrix eigendecomposition, the matrix MUST be square! **The Singular Value Decomposition (SVD) does NOT have this limitation, which makes it even more useful and powerful than eigendecomposition.**

**Here, you will learn the following:**

- The definition of Singular Value Decomposition
- The benefits of decomposing a matrix using Singular Value Decomposition
- How to do it in Python and Numpy
- Some of its important applications

**Before You Move On**

You may find the following resources helpful to better understand the concept of this article:

- **Python Tutorials – A FREE Video Course:** You will become familiar with Python and its syntax.
- **Numpy – An Introduction to a Great Package for Linear Algebra:** Numpy is one of the best scientific computing packages for Linear Algebra! You always need it for Machine Learning as you always need Linear Algebra for Machine Learning!
- **The Remarkable Importance of Linear Algebra in Machine Learning:** This article talks about why you should care about Linear Algebra if you want to master Machine Learning.
- **Basic Linear Algebra Definitions that You Hear Every Day:** Covers the primary and most frequently used Linear Algebra definitions in Machine Learning.
- **Matrix Operations In Practice:** Everything you need to know about matrices and their operations.

### Definitions

#### Preliminary Definitions

Let’s have a preliminary definition before we move on.

**Conjugate Transpose of a Matrix:** The conjugate transpose or Hermitian transpose of a matrix $A \in \mathbb{C}^{m \times n}$ is obtained by taking the transpose of $A$ and then taking the complex conjugate of each entry. The resulting matrix is denoted by $A^H$ or $A^*$.

The space $\mathbb{C}$ is the field of the complex numbers.
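As a quick illustration, here is a minimal NumPy sketch of the conjugate transpose; the example matrix is arbitrary and chosen only for demonstration:

```python
# A minimal sketch of the conjugate (Hermitian) transpose with NumPy.
import numpy as np

# An arbitrary 2x3 complex matrix, just for illustration.
A = np.array([[1 + 2j, 3 - 1j, 0 + 0j],
              [4 + 0j, 5 + 5j, 6 - 2j]])

# Conjugate transpose: transpose, then take the complex conjugate of each entry.
A_H = A.conj().T
print(A_H)
```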

Now, let’s define a special kind of matrix.

**Unitary Matrix:** A complex square matrix $U$ is **unitary** if its conjugate transpose is also its inverse. It can be shown as below:

$$U^H U = U U^H = I, \qquad U \in \mathbb{C}^{n \times n}$$

where $\mathbb{C}$ is the field of the complex numbers.

**NOTE:** In the above definition, if $U$ belongs to the field of real numbers $\mathbb{R}$, the matrix will be orthogonal and $U^T U = U U^T = I$.
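Here is a small sketch of checking this property numerically, using a 2x2 rotation matrix as an example of a real orthogonal matrix (the choice of matrix is mine, purely for illustration):

```python
# A minimal sketch verifying the orthogonal (real unitary) property.
import numpy as np

theta = np.pi / 4
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # a real rotation matrix

# For a real orthogonal matrix, Q^T Q = Q Q^T = I.
print(np.allclose(Q.T @ Q, np.eye(2)))  # True
print(np.allclose(Q @ Q.T, np.eye(2)))  # True
```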

#### Singular Value Decomposition

Now, let’s define the main concept, *Singular Value Decomposition (SVD)*.

**Singular Value Decomposition:** Assume we have the matrix $A \in \mathbb{C}^{m \times n}$. Then, we can factorize matrix $A$ as below:

$$A = U \Sigma V^H$$

where $U$ is an $m \times m$ matrix, $V$ is an $n \times n$ matrix, and both are **unitary**. The matrix $\Sigma$ is a rectangular diagonal matrix with non-negative real numbers on the diagonal.

**NOTE:** The matrix $\Sigma$ is a diagonal matrix of size $m \times n$. So it does not have to be square!

The **diagonal elements** of $\Sigma$ are known as the **singular values** of $A$. The **columns** of $U$ are known as the **left-singular vectors** and are the eigenvectors of $A A^H$. The **columns** of $V$ are known as the **right-singular vectors** and are the eigenvectors of $A^H A$.

The diagonal matrix $\Sigma$ is **uniquely determined** by $A$. The nonzero singular values of $A$ are the square roots of the eigenvalues of $A^H A$. That’s why they are non-negative!
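The following sketch checks this relationship numerically, assuming a small random real matrix (so the conjugate transpose reduces to the ordinary transpose):

```python
# A sketch verifying that the singular values of A are the square roots
# of the eigenvalues of A^T A, assuming a small random real matrix.
import numpy as np

A = np.random.randn(3, 4)

# Singular values from SVD (returned in descending order).
s = np.linalg.svd(A, compute_uv=False)

# Eigenvalues of A^T A (symmetric, so eigvalsh applies), sorted descending.
eig_vals = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]

# The nonzero singular values should be the square roots of these eigenvalues.
print(np.allclose(s, np.sqrt(np.clip(eig_vals[:len(s)], 0, None))))  # True
```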

### A Practical Implementation

Let’s compute the singular value decomposition of a matrix using Python and Numpy.

```python
# Import Numpy library
import numpy as np

# Define a random 3x4 matrix using np.random.randint
# Ref: https://docs.scipy.org/doc/numpy-1.15.1/reference/generated/numpy.random.randint.html
M = 3
N = 4
A = np.random.randint(10, size=(M, N))
print('A:\n', A)

# Singular Value Decomposition
# With full_matrices=True, U and Vt matrices will be square.
U, S, Vt = np.linalg.svd(A, full_matrices=True)

# Show the shape of outputs
print('The shape of U:', U.shape)
print('The shape of S:', S.shape)
print('The shape of V^{T}:', Vt.shape)

# Create the diagonal matrix Sigma from the vector of singular values
Sigma = np.diag(S)

# Matrix A is fat (M < N)! Sigma must be padded with zero columns.
if Sigma.shape[1] != Vt.shape[0]:
    zero_columns_count = Vt.shape[0] - Sigma.shape[1]
    additive = np.zeros((Sigma.shape[0], zero_columns_count), dtype=np.float32)
    Sigma = np.concatenate((Sigma, additive), axis=1)

# Matrix A is tall (M > N)! Sigma must be padded with zero rows.
if Sigma.shape[0] != U.shape[1]:
    zero_rows_count = U.shape[1] - Sigma.shape[0]
    additive = np.zeros((zero_rows_count, Sigma.shape[1]), dtype=np.float32)
    Sigma = np.concatenate((Sigma, additive), axis=0)

# Check and confirm the decomposition: A should equal U @ Sigma @ Vt
A_ = np.matmul(np.matmul(U, Sigma), Vt)
print('Computing A with decomposed elements:\n', A_)
```

In the above code, I *just print the matrices’ shapes*. Try to change the code above to see the outputs. In the final block, I reconstruct matrix $A$ from the elements decomposed by SVD to confirm the computation!
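As a side note, one way to avoid padding $\Sigma$ with zeros as done above is the reduced ("economy") SVD via `full_matrices=False`. The following is a minimal sketch of that variant, not part of the original code:

```python
# A sketch of the reduced (economy) SVD, assuming the same kind of 3x4 matrix as above.
import numpy as np

M, N = 3, 4
A = np.random.randint(10, size=(M, N))

# With full_matrices=False, U is M x k and Vt is k x N, where k = min(M, N),
# so np.diag(S) already has the right shape and no zero-padding is needed.
U, S, Vt = np.linalg.svd(A, full_matrices=False)
A_ = U @ np.diag(S) @ Vt
print(np.allclose(A, A_))  # True
```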

### Applications of SVD

#### Moore–Penrose Pseudoinverse

Before digging into how SVD can help us with pseudoinverse calculations, let’s talk about **what a pseudoinverse is**.

**Moore–Penrose Pseudoinverse:** Assume we have the matrix $A \in \mathbb{C}^{m \times n}$. Then, the pseudoinverse of $A$ is the matrix $A^+ \in \mathbb{C}^{n \times m}$ that satisfies all of the following criteria:

$$A A^+ A = A, \qquad A^+ A A^+ = A^+, \qquad (A A^+)^H = A A^+, \qquad (A^+ A)^H = A^+ A$$

Now, let’s get back to business! How can we calculate the **pseudoinverse** of a matrix using SVD? If we have the SVD of a matrix as $A = U \Sigma V^H$, then $A^+$ can be calculated as below:

$$A^+ = V \Sigma^+ U^H$$

where $\Sigma^+$ is the pseudoinverse of $\Sigma$, and we create it by replacing each non-zero diagonal entry in $\Sigma$ by its reciprocal and transposing the resulting matrix.
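The following is a minimal sketch of this recipe in NumPy, assuming a real matrix whose singular values are all non-zero, compared against `np.linalg.pinv` as a sanity check:

```python
# A sketch of computing the Moore-Penrose pseudoinverse from the SVD,
# assuming a real matrix A with all singular values non-zero.
import numpy as np

A = np.random.randn(3, 4)
U, S, Vt = np.linalg.svd(A, full_matrices=True)

# Build Sigma^+ : invert the non-zero singular values and place them
# in an N x M matrix (the transpose of Sigma's shape).
Sigma_plus = np.zeros((A.shape[1], A.shape[0]))
Sigma_plus[:len(S), :len(S)] = np.diag(1.0 / S)

# A^+ = V Sigma^+ U^T  (Hermitian transpose reduces to transpose for real matrices)
A_plus = Vt.T @ Sigma_plus @ U.T

# Compare against NumPy's built-in pseudoinverse.
print(np.allclose(A_plus, np.linalg.pinv(A)))  # True
```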

#### Matrix Rank Determination

In a previous post, I talked about linear dependence and matrix rank. Now that we have SVD, a more general approach compared to eigendecomposition, we can draw the following conclusion.

**Matrix Rank:** Assume we have the SVD of a matrix as $A = U \Sigma V^H$. The rank of $A$ equals the number of non-zero singular values, which is the number of non-zero diagonal elements in $\Sigma$.
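A short sketch of this idea, using a deliberately rank-deficient example matrix of my own choosing and a small tolerance for treating tiny singular values as zero:

```python
# A sketch of determining matrix rank from the singular values.
import numpy as np

# A rank-deficient 3x4 matrix: the third row is the sum of the first two.
A = np.array([[1., 2., 3., 4.],
              [2., 0., 1., 1.],
              [3., 2., 4., 5.]])

s = np.linalg.svd(A, compute_uv=False)
rank_from_svd = np.sum(s > 1e-10)   # count singular values above a small tolerance

print(rank_from_svd)                # 2
print(np.linalg.matrix_rank(A))     # 2 (NumPy also uses SVD internally)
```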

### Conclusion

In this article, I talked about Singular Value Decomposition and what makes it essential. You saw some of its applications as well. One important aspect that you need to be aware of is the close relationship between Singular Value Decomposition and eigendecomposition. There are many different aspects of SVD that I did not talk about here; **I tried to address what you may need in Machine Learning**. Do NOT hesitate to ask if you have any questions. Please share your thoughts, so we can all learn better together.