
## Stupid SVD Tricks Part 1

Of all the ways to decompose a numerical (floating point) matrix, my favorite is the singular value decomposition (SVD).  There are a lot of applications of the SVD (see my dissertation for one related to polynomial algebra), but my favorites are probably two applications from image processing.

The first one I want to talk about comes from the cover of James Demmel's book "Applied Numerical Linear Algebra": image compression.  This example gives a really cool intuitive understanding of the rank of a matrix and is also a nice excuse to play with Maple's ImageTools package.

So, the first thing you need is a test image. I used the classic image compression benchmark of a Mandrill.

Reading it in with ImageTools gives a 512x512x3 Array (assigned to mandrill below).  In order to do something with this, we need to make it into a matrix, so call:

manmat:=convert(ArrayTools:-Reshape(mandrill, 512*3, 512), Matrix);
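For readers following along without Maple, here is a rough NumPy sketch of the same reshape.  The array `image` is a random stand-in for the mandrill data; Maple Arrays are column-major, so `order="F"` mirrors the ArrayTools:-Reshape call:

```python
import numpy as np

# A random 512x512x3 array stands in for the mandrill image data.
image = np.random.rand(512, 512, 3)

# Maple Arrays are stored column-major, so order="F" mirrors
# ArrayTools:-Reshape; the result is a 1536x512 matrix.
matrix = image.reshape(512 * 3, 512, order="F")

print(matrix.shape)  # (1536, 512)
```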

Now we can compute a singular value decomposition of the image:

(U, S, V) := LinearAlgebra:-SingularValues(manmat, output = ['U', 'S', 'Vt']);

(Note that with the 'Vt' output option, the third result, here named V, is already the transpose V^t, so we can multiply U, the diagonal matrix built from S, and V back together without any further transposing.)
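The same decomposition can be sketched in NumPy: `np.linalg.svd` with `full_matrices=False` returns the thin factors, and its third output is already the transpose, matching the 'Vt' option above.  Here `A` is a random stand-in for the reshaped image matrix:

```python
import numpy as np

# Random stand-in for the 1536x512 image matrix.
A = np.random.rand(1536, 512)

# full_matrices=False gives the thin SVD: U is 1536x512, s has the
# 512 singular values, and Vt is the 512x512 transpose of V.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Multiplying the factors back together recovers A
# (up to floating-point error).
print(np.allclose(U @ np.diag(s) @ Vt, A))  # True
```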

Now we can zero-out small singular values and multiply things back together to create low-rank approximations of the matrix that are also compressed versions of the image.
Rank 32 will give us about 1/12 of the data (32 columns of U, 32 rows of V, and the 32 corresponding singular values: 32×(1536 + 512 + 1) numbers versus the original 1536×512) but still a pretty good image:

rank32approx := LinearAlgebra:-MatrixMatrixMultiply(U . LinearAlgebra:-DiagonalMatrix(S[1 .. 32], 3*512, 512), V, outputoptions = [order = C_order]);

Then reshape this back into an image and display it:

ImageTools:-Preview(ArrayTools:-Reshape(rank32approx, 512, 512, 3));
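The zero-out-and-multiply step can likewise be sketched in NumPy (again on a random stand-in matrix), which also makes the storage count explicit:

```python
import numpy as np

# Random stand-in for the 1536x512 image matrix.
A = np.random.rand(1536, 512)
U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 32
# Keep only the k largest singular values and their vectors.
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# The truncated factors take k*(1536 + 512 + 1) numbers
# instead of the full 1536*512.
stored = k * (1536 + 512 + 1)
print(stored / (1536 * 512))          # ~0.083, about 1/12
print(np.linalg.matrix_rank(A_k))     # 32
```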

Taking things down to rank 8 leaves only about 1/48 of the data, but it is amazing how much what remains still resembles the original image:

rank8approx := LinearAlgebra:-MatrixMatrixMultiply(U . LinearAlgebra:-DiagonalMatrix(S[1 .. 8], 3*512, 512), V, outputoptions = [order = C_order]);
ImageTools:-Preview(ArrayTools:-Reshape(rank8approx, 512, 512, 3));

To see more approximations in order of descending rank, take a look at my worksheet:

Next time: eigenfaces
