Shlok Kumar

Singular Value Decomposition (SVD)

Singular Value Decomposition (SVD) is a key concept in linear algebra, particularly useful in machine learning, data compression, and dimensionality reduction. It allows us to express a matrix in a way that reveals its underlying structure and properties.

What is SVD?

For any m × n matrix A, SVD breaks it down into three matrices:

A = U Σ V^T

Where:

  • U is an m × m orthogonal matrix whose columns are the left singular vectors.
  • Σ is an m × n diagonal matrix containing the singular values.
  • V is an n × n orthogonal matrix whose columns are the right singular vectors.

Understanding how to calculate these matrices is vital for performing SVD.
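
Before working through the calculation by hand, here is a minimal NumPy sketch (an illustration added here, not part of the original derivation) that computes the three factors for a small example matrix and multiplies them back together:

import numpy as np

# Small example matrix (the same 2 x 3 matrix used in the worked example below)
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])

# full_matrices=True returns U as m x m and V^T as n x n
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Embed the singular values into an m x n diagonal matrix
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)

print(np.allclose(A, U @ Sigma @ Vt))  # True: U Σ V^T reconstructs A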

Steps to Calculate SVD

Step 1: Normalize Eigenvectors

To begin, we calculate the eigenvectors of a square matrix X. For each eigenvector x_i, we find the normalized eigenvector v_i by dividing by its magnitude:

Let x = [1, 2, 4]
=> mag(x) = √(1² + 2² + 4²) = √21
=> v = [(1/√21), (2/√21), (4/√21)]
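
A one-line version of this normalization step (assuming NumPy):

import numpy as np

x = np.array([1.0, 2.0, 4.0])
v = x / np.linalg.norm(x)   # divide by the magnitude sqrt(21)
print(v)                    # approximately [0.2182, 0.4364, 0.8729]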

Step 2: Calculate Matrix V

For the matrix A, we compute A^T A, which results in a symmetric matrix. We can find its eigenvectors v_i through:

(A^T A) v_i = σ_i² v_i = λ_i v_i

Where:

  • λ_i are the corresponding eigenvalues,
  • σ_i are the singular values.

The matrix V is formed by arranging the normalized eigenvectors as columns:

V = [v₁, v₂, ..., v_n]
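
A sketch of this step using NumPy's symmetric eigensolver (np.linalg.eigh returns eigenvalues in ascending order, so they are reversed here to follow the usual descending convention; the example matrix is again the 2 × 3 matrix from the worked example):

import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])

# Eigendecomposition of the symmetric matrix A^T A
eigvals, eigvecs = np.linalg.eigh(A.T @ A)

# Sort by descending eigenvalue; the columns of V are the eigenvectors
order = np.argsort(eigvals)[::-1]
eigvals = eigvals[order]
V = eigvecs[:, order]

singular_values = np.sqrt(np.clip(eigvals, 0, None))
print(singular_values)   # approximately [1.732, 1.0, 0.0]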

Step 3: Calculate Matrix U

Similarly, we compute A A^T to find its eigenvectors x_i:

(A A^T) x_i = σ_i² x_i = λ_i x_i

The columns of matrix U are computed using (for each nonzero singular value σ_i):

u_i = Av_i / σ_i

Thus, we form:

U = [u₁, u₂, ..., u_m]
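
A sketch of this step in NumPy, continuing from the V and singular values computed above (an eigensolver may flip the sign of an eigenvector, so the columns of U can differ in sign from a hand calculation, which is equally valid):

import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])

eigvals, eigvecs = np.linalg.eigh(A.T @ A)
order = np.argsort(eigvals)[::-1]
V = eigvecs[:, order]
sigma = np.sqrt(np.clip(eigvals[order], 0, None))

# u_i = A v_i / sigma_i for each nonzero singular value
U = np.column_stack([A @ V[:, i] / sigma[i]
                     for i in range(len(sigma)) if sigma[i] > 1e-12])
print(U)   # a 2 x 2 orthogonal matrix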

Step 4: Create Diagonal Matrix Σ

The matrix Σ has the same dimensions as A (m × n), with the singular values on its main diagonal and zeros elsewhere. Its shape therefore depends on the dimensions of A:

  • If m < n (for example, m = 2 and n = 3):
Σ = [σ₁, 0, 0]
    [0, σ₂, 0]
  • If m > n (for example, m = 4 and n = 3):
Σ = [σ₁, 0, 0]
    [0, σ₂, 0]
    [0, 0, σ₃]
    [0, 0, 0]

The number of nonzero singular values σ_i equals the rank of the matrix.
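
A small sketch of how Σ can be assembled from the singular values for an arbitrary m × n shape (the helper name build_sigma is just for illustration):

import numpy as np

def build_sigma(singular_values, m, n):
    """Place the singular values on the main diagonal of an m x n zero matrix."""
    Sigma = np.zeros((m, n))
    k = min(m, n, len(singular_values))
    Sigma[:k, :k] = np.diag(singular_values[:k])
    return Sigma

print(build_sigma(np.array([np.sqrt(3), 1.0]), 2, 3))
# [[1.732 0.    0.   ]
#  [0.    1.    0.   ]]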

Example Problem

Let’s consider the following matrix A:

A = [1, 1, 0]
    [0, 1, 1]

Step 1: Find A^T and Compute A^T A

A^T = [1, 0]
      [1, 1]
      [0, 1]

A^T A = [1, 1, 0]
        [1, 2, 1]
        [0, 1, 1]
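
A quick NumPy check of this product (note the middle row is [1, 2, 1]):

import numpy as np

A = np.array([[1, 1, 0],
              [0, 1, 1]])
print(A.T @ A)
# [[1 1 0]
#  [1 2 1]
#  [0 1 1]]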

Step 2: Find Eigenvalues of A^T A

The eigenvalues of A^T A are λ = 3, 1, and 0.

Step 3: Calculate Singular Values

Using the formula:

σ_i = √λ_i

We get:

  • σ₁ = √3
  • σ₂ = 1
  • σ₃ = 0
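
These values can be confirmed numerically (assuming NumPy; np.linalg.eigvalsh returns the eigenvalues in ascending order, so they are reversed before taking square roots):

import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])

eigvals = np.linalg.eigvalsh(A.T @ A)        # approximately [0, 1, 3]
sigma = np.sqrt(np.clip(eigvals[::-1], 0, None))
print(sigma)                                 # approximately [1.732, 1.0, 0.0]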

Step 4: Form Diagonal Matrix Σ

Since m < n (here m = 2 and n = 3):

Σ = [√3, 0, 0]
    [0, 1, 0]

Step 5: Find Eigenvectors and Normalize

For the eigenvalues of A^T A:

  • For λ₁ = 3, the eigenvector x₁ = [1, 2, 1] is normalized to v₁ = [(1/√6), (2/√6), (1/√6)].
  • For λ₂ = 1, the eigenvector x₂ = [-1, 0, 1] is normalized to v₂ = [(-1/√2), 0, (1/√2)].
  • For λ₃ = 0, the eigenvector x₃ = [1, -1, 1] is normalized to v₃ = [(1/√3), (-1/√3), (1/√3)].

Step 6: Construct Matrix V

V = [(1/√6), (-1/√2), (1/√3)]
    [(2/√6), 0, (-1/√3)]
    [(1/√6), (1/√2), (1/√3)]
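
A short check that these columns are indeed unit eigenvectors of A^T A (an eigensolver may return them with opposite signs, which is equally valid):

import numpy as np

AtA = np.array([[1, 1, 0],
                [1, 2, 1],
                [0, 1, 1]])

v1 = np.array([1, 2, 1]) / np.sqrt(6)
v2 = np.array([-1, 0, 1]) / np.sqrt(2)
v3 = np.array([1, -1, 1]) / np.sqrt(3)

print(np.allclose(AtA @ v1, 3 * v1))   # True
print(np.allclose(AtA @ v2, 1 * v2))   # True
print(np.allclose(AtA @ v3, 0 * v3))   # True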

Step 7: Compute Matrix U

Using the normalized eigenvectors:

u_1 = Av_1 / σ₁
u_2 = Av_2 / σ₂

Resulting in:

U = [(1/√2), (-1/√2)]
    [(1/√2), (1/√2)]
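
The same computation in NumPy, using v₁ and v₂ from Step 5:

import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
v1 = np.array([1, 2, 1]) / np.sqrt(6)
v2 = np.array([-1, 0, 1]) / np.sqrt(2)

u1 = A @ v1 / np.sqrt(3)   # sigma_1 = sqrt(3)
u2 = A @ v2 / 1.0          # sigma_2 = 1
U = np.column_stack([u1, u2])
print(U)
# [[ 0.7071 -0.7071]
#  [ 0.7071  0.7071]]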

Step 8: Final SVD Representation

Finally, we can express A using SVD:

A = U Σ V^T

Multiplying U, Σ, and V^T together recovers the original matrix A, completing the decomposition.
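
Putting the pieces together, a final numerical check (assuming NumPy) that U Σ V^T reproduces A:

import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])

U = np.array([[1, -1],
              [1,  1]]) / np.sqrt(2)
Sigma = np.array([[np.sqrt(3), 0, 0],
                  [0,          1, 0]])
V = np.column_stack([np.array([1, 2, 1]) / np.sqrt(6),
                     np.array([-1, 0, 1]) / np.sqrt(2),
                     np.array([1, -1, 1]) / np.sqrt(3)])

print(np.allclose(A, U @ Sigma @ V.T))   # True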

Conclusion

Singular Value Decomposition is a powerful tool that simplifies complex matrices into more manageable forms, making it essential for many applications in machine learning and data analysis. Understanding SVD allows us to leverage the intrinsic properties of data for better insights and performance.

FAQs

What is SVD used for?

SVD is widely used in data compression, noise reduction, and dimensionality reduction techniques such as Principal Component Analysis (PCA).

Can all matrices be decomposed using SVD?

Yes, every matrix can be decomposed into its singular values and vectors, regardless of its shape or rank.

What do the singular values indicate?

Singular values provide insight into the importance of corresponding dimensions in the data, helping identify the most significant features.

For more content, follow me at —  https://linktr.ee/shlokkumar2303
