# Question: interpolate and plot larger data sets

Maple needs commands to interpolate and plot large data sets. Consider the following:
```
# generate data
data := convert(LinearAlgebra[RandomMatrix](10000,2,generator=-100000..100000),listlist):
# deduplicate the x-values via a table, then sort them numerically
data := sort(map(lhs, [op(op(table(map(`=`@op, data))))]), (a,b)->evalb(a < b)):
# rebuild the points as [x, x/200.] so y is a simple function of x
data := Matrix(map(proc(a) [a,a/200.] end, data), datatype=float):

# now suppose I take this data and...
f := CurveFitting[BSplineCurve](data, x);
plot(f, 0..100);
```
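As a workaround in current Maple, `CurveFitting[ArrayInterpolation]` interpolates purely numeric data at an entire Array of points in a single call, which avoids building (and then repeatedly searching) a huge symbolic piecewise expression. A sketch, assuming the sorted `data` Matrix from above; the number and spacing of evaluation points are arbitrary:
```
# evaluate a spline through all 10000 points at 1001 sample points in one call
xs := Vector(1001, i -> -100000 + 200.*(i-1), datatype=float):
ys := CurveFitting[ArrayInterpolation](data, xs, method=spline):
plots[pointplot](xs, ys);
```
This sidesteps the symbolic `f` entirely, but it only helps when you know all the evaluation points up front.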
Someone trying to do this with real data (and 10000 points is small) will run into the following problems:

1. `BSplineCurve` takes forever; even 200 points takes seconds?!
2. The worksheet will probably crash if `f` is printed.
3. Evaluating `f` for the plot is O(n^2): each evaluation of `f` does an O(n) linear search through the branches.

I did not even realize how slow 1) was when I started writing this post. Let's assume you reimplement this command. The second problem should be easy to fix, if it is not already fixed in the GUI; just test printing a large piecewise function. The real problem is 3). Maple needs a more efficient piecewise data structure that does a binary search over the breakpoints. Even constructing a piecewise function is slow. Here is a benchmark:
```n := 100:
f := unapply(piecewise(seq(op([x < i, i]), i=1..n), n+1),x):
time(seq(f(i),i=1..n));
```
Notice that doubling n quadruples the time, demonstrating the point about O(n^2). You cannot do this! This is bad! Also, just constructing f seems to take forever when n is large. Why is that? All of this should be O(n*log(n)), so that 1000 data points is 15 times slower than 100 data points, 10000 data points is 13.33 times slower than 1000 data points, and the time for all of them should be less than a tenth of a second, or MATLAB users will scoff at you.
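To make the point concrete, here is a sketch of what an O(log n) piecewise evaluation could look like: keep the breakpoints in a sorted Array and binary-search for the matching branch. The name `FastPiecewise` and the representation (one Array of breakpoints, one of values) are made up for illustration; it mirrors the benchmark above, where branch i is `x < i` returning i, with default n+1:
```
# evaluate a piecewise-constant function by binary search: O(log n) per call
# breaks[i] is the breakpoint of branch i (x < breaks[i] gives vals[i]);
# vals[n+1] is the default value
FastPiecewise := proc(breaks::Array, vals::Array, x)
    local lo, hi, mid;
    hi := ArrayNumElems(breaks);
    if x >= breaks[hi] then return vals[hi+1] end if;  # default branch
    lo := 1;
    while lo < hi do  # find the smallest index with x < breaks[index]
        mid := iquo(lo + hi, 2);
        if x < breaks[mid] then hi := mid else lo := mid + 1 end if;
    end do;
    return vals[lo];
end proc:

n := 100:
breaks := Array(1..n, i -> i):
vals := Array(1..n+1, i -> i):
time(seq(FastPiecewise(breaks, vals, i), i = 1..n));
```
Doubling n now adds one comparison per evaluation instead of doubling the work per evaluation, so the total cost of n evaluations grows like n*log(n) rather than n^2.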