What is the largest linear system that Maple can solve? You might be surprised to find out. In this article we present strategies for solving sparse linear systems over the rationals. An example implementation is provided, but first we present a bit of background. Sparse linear systems arise naturally from problems in mathematics, science, and engineering. Typically many quantities are related, but because of an underlying structure only a small subset of the elements appear in most equations. Consider networks, finite element models, structural analysis problems, and linear programming problems.

In all of these cases each element is connected to only a handful of neighbours, making the resulting linear system sparse. Many mathematical algorithms also produce sparse linear algebra problems. For example, Berlekamp's algorithm for factoring polynomials, the quadratic sieve algorithm for integer factorization, and the index calculus algorithm for computing discrete logarithms all use sparse linear algebra mod p or p^k - 1. We could modify the code below to run mod p, but that's another article.


This article focuses on direct methods for solving linear systems, i.e., Gaussian elimination. Iterative methods are not considered, although we could implement them - see this post. The main problem encountered when using a direct method is "fill-in": a poor choice of pivot increases the number of non-zero entries in each row, and with it the amount of work required to solve the rest of the system. Repeatedly choosing bad pivots will fill in the matrix entirely, producing a dense system that can be very hard to solve. Good pivot choices preserve sparsity and keep the elimination process from getting bogged down.
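To see how much the pivot order matters, here is a small Python sketch (illustrative only, not part of the Maple code in this article) that tracks just the non-zero structure of an "arrowhead" matrix during elimination. Pivoting on the dense corner first fills in the whole matrix, while saving it for last causes no fill-in at all:

```python
def arrowhead(n):
    """Non-zero pattern of an n x n arrowhead matrix:
    the diagonal plus a full first row and first column."""
    rows = {i: {0, i} for i in range(n)}
    rows[0] = set(range(n))
    return rows

def eliminate(rows, order):
    """Symbolic Gaussian elimination: pivot on (k, k) in the given order,
    and union the pivot row's pattern into every row with a non-zero in
    column k.  Returns the number of new structural non-zeros created."""
    rows = {i: set(r) for i, r in rows.items()}
    fill = 0
    for k in order:
        for i in rows:
            if i != k and k in rows[i]:
                new = rows[k] - rows[i] - {k}   # entries created in row i
                fill += len(new)
                rows[i] |= new
                rows[i].discard(k)
    return fill

n = 10
print(eliminate(arrowhead(n), range(n)))          # dense corner first: (n-1)(n-2) = 72
print(eliminate(arrowhead(n), range(n-1, -1, -1)))  # dense corner last: 0
```

The same matrix, eliminated in two different orders, goes from completely dense to no fill-in at all.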

Structured Gaussian Elimination

This algorithm was developed to solve very large systems coming from integer factorization and discrete logarithm computations. You can read about it in more detail here. The algorithm assumes that the non-zero elements are not evenly distributed among the columns. It first solves for variables that appear in only one equation, repeatedly, until all triangular blocks are removed from the system. Next it declares a small number of the densest columns "heavy" and uses equations with one "light" variable for elimination. This ensures that the light part of the matrix never fills in. At some point one can also drop a few of the densest rows and recurse on the rest of the system to quickly solve for most of the variables. Afterwards the dense equations are evaluated at the solutions to obtain a smaller system in the unsolved variables. Experience shows that extremely sparse systems will often collapse.
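The first phase - repeatedly solving for variables that appear in only one equation - can be sketched in a few lines. The Python below is an illustrative sketch (not the article's Maple implementation): it works only with the non-zero pattern and peels off (equation, variable) pairs, which form a triangular block that can be back-substituted later.

```python
from collections import defaultdict

def peel_singletons(rows):
    """rows: list of sets of variable indices (the non-zero pattern).
    Repeatedly find a variable occurring in exactly one active equation,
    pair it with that equation, and retire the equation.
    Returns the list of (equation, variable) pivots peeled off."""
    rows = [set(r) for r in rows]
    count = defaultdict(int)    # variable -> number of active equations using it
    where = defaultdict(set)    # variable -> set of active equations using it
    for i, r in enumerate(rows):
        for j in r:
            count[j] += 1
            where[j].add(i)
    pivots = []
    queue = [j for j in count if count[j] == 1]
    while queue:
        j = queue.pop()
        if count.get(j) != 1:       # count changed since j was queued
            continue
        (i,) = where[j]             # the unique equation containing j
        pivots.append((i, j))
        for k in rows[i]:           # retiring equation i lowers the counts
            count[k] -= 1
            where[k].discard(i)
            if count[k] == 1:       # a new singleton appears: keep peeling
                queue.append(k)
    return pivots
```

On a triangular pattern such as `[{0,1}, {1,2}, {2}]` the whole system peels away, while a dense pattern such as `[{0,1}, {0,1}]` yields no pivots and is left for the later phases.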

Markowitz Pivoting

Markowitz pivoting is a simple and popular heuristic for reducing fill-in. It selects the pivot that minimizes the product of the number of other non-zero entries in its row and column. That is, each non-zero A[i,j] is given weight (#row[i] - 1)(#column[j] - 1) and a pivot with minimal weight is selected. Subtracting one from the row and column counts forces the selection of singleton elements in a row or column when they exist, since eliminating with them produces no fill-in at all. A common problem with this strategy is that the overhead of selecting a pivot and updating the row and column weights after each elimination step is O(|A|). To make our implementation efficient we select blocks of pivots at a time: the equations containing the pivots are triangularized and a block of variables is eliminated from the system in each step of the algorithm.
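The pivot selection itself is straightforward; here is a Python sketch (illustrative, not the article's implementation) that computes the Markowitz weight (#row[i] - 1)(#column[j] - 1) over the structural non-zeros and returns a minimal-weight pivot:

```python
def markowitz_pivot(rows):
    """rows: list of sets of column indices with non-zero entries.
    Returns ((i, j), weight) minimizing (#row[i] - 1) * (#column[j] - 1)."""
    colcount = {}                        # column -> number of non-zeros
    for r in rows:
        for j in r:
            colcount[j] = colcount.get(j, 0) + 1
    best, best_weight = None, None
    for i, r in enumerate(rows):
        for j in r:
            w = (len(r) - 1) * (colcount[j] - 1)
            if best_weight is None or w < best_weight:
                best, best_weight = (i, j), w
    return best, best_weight

# A column singleton gets weight 0 and is preferred over interior entries:
rows = [{0, 1, 2}, {1, 2}, {2, 3}]
print(markowitz_pivot(rows))   # ((0, 0), 0): column 0 appears only in row 0
```

Entry (0,0) wins because column 0 occurs in a single equation, so eliminating with it creates no fill-in; an interior entry like (1,2), with weight (2-1)(3-1) = 2, would create some.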

Dense Modular Methods

Eventually, after running these methods, one obtains a small dense linear system. To solve it we use the LinearAlgebra:-Modular:-IntegerLinearSolve command, which uses hardware arithmetic to solve the system modulo primes, then Chinese remaindering and rational reconstruction to recover the solutions. One advantage of this routine is that it is output sensitive - the amount of computation it does depends mostly on the size of the result rather than the size of the input. The tradeoff is a very small probability of error, but the benefits are huge for structured systems where the solutions are often small.
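The rational reconstruction step can be illustrated on a single residue. The Python sketch below (illustrative only - Maple's internals may differ) runs the extended Euclidean algorithm on the modulus m and the residue u, stopping once the remainder drops below sqrt(m/2); this recovers the fraction n/d with n/d ≡ u (mod m) whenever both |n| and d are small relative to m:

```python
from math import isqrt

def rational_reconstruct(u, m):
    """Return (n, d) with n/d ≡ u (mod m) and |n|, d <= sqrt(m/2),
    or None if no such small fraction exists."""
    bound = isqrt(m // 2)
    r0, r1 = m, u % m
    t0, t1 = 0, 1
    while r1 > bound:               # half-extended Euclidean algorithm:
        q = r0 // r1                # invariant r1 ≡ t1 * u (mod m)
        r0, r1 = r1, r0 - q * r1
        t0, t1 = t1, t0 - q * t1
    if t1 == 0 or abs(t1) > bound:  # denominator too large: reconstruction fails
        return None
    if t1 < 0:                      # normalize to a positive denominator
        r1, t1 = -r1, -t1
    return r1, t1

# Recover 3/7 from its image mod 101*103 = 10403 (3/7 ≡ 5945 mod 10403):
print(rational_reconstruct(5945, 10403))   # (3, 7)
```

In the output-sensitive solver this is what lets the modulus stop growing once it exceeds roughly the square of the largest numerator or denominator in the answer, rather than a bound derived from the input.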

Maple Code and Examples

Download some preliminary Maple code from my homepage here. You may also want to download some example systems: small (900 x 924), medium (3211 x 3211), random (21952 x 21929), and large (99779 x 99619). I have intentionally made these systems easy so that you can see how the algorithms work on a slow computer. The code requires Maple 11. Here is an example:

> read "sge.mpl":                     # load code
> read "medsys":                      # assigns the system to 'eqns'
> sol := RationalSparse(eqns):        # solves system
> subs(sol, eqns);                    # check solution (if small!)
> map(lprint, sort_by_length(sol)):   # print solution

I'm hoping that you'll want to try hard problems though. The University of Florida Sparse Matrix Collection is the best place to start. They have a lot of matrices there, and you can sort them by type to find ones with integer or binary entries (sorry, no floats or complex numbers in this code - that's another article). Once you've found a matrix you like, download it in MatrixMarket format (abbreviated MM) and decompress the file. I have included a MatrixMarket import command for you to use, with two examples below.

# bcspwr06 with randomly generated rhs
> eqns := MatrixMarket("bcspwr06.mtx"):
> r := rand(-10..10): # random number generator
> eqns := eqns - [seq(r(), i=1..nops(eqns))]:
> sol := RationalSparse(eqns):

# lpi_ceria3d with included rhs
> eqns := MatrixMarket("lpi_ceria3d.mtx", rhs="lpi_ceria3d_b.mtx"):
> eqns := convert(eqns, 'rational'): # important
> sol := RationalSparse(eqns):

You can also get a sparse Maple Matrix out of the MatrixMarket command by specifying output=matrix; however, be careful with those, since many routines in Maple are not optimized for sparse matrices. For example, a large sparse matrix (50000 x 50000) will hang the matrix browser. If you find any bugs, or if the code above performs poorly on some systems, please let me know - I'm still tweaking it. Either a private message here or an email to the address at the bottom of my homepage would be appreciated.