Linear least squares in SciPy: accuracy of QR factorization vs other methods


I have tried solving the linear least-squares problem Ax = b in SciPy using the following methods:

x = numpy.linalg.inv(a.T.dot(a)).dot(a.T).dot(b)  # normal equations; usually not recommended

and

x = numpy.linalg.lstsq(a, b)[0]

Both give identical results. I also tried applying the QR algorithm manually, i.e.:

qmat, rmat = la.qr(a)
bpr = np.dot(qmat.T, b)
n = len(bpr)
x = np.zeros(n)
for i in xrange(n - 1, -1, -1):
    x[i] = bpr[i]
    for j in xrange(i + 1, n):
        x[i] -= rmat[i, j] * x[j]
    x[i] /= rmat[i, i]

This method, however, gives inaccurate results (errors on the order of 1e-2). Have I made a noob mistake in the code or the maths? Or is the issue with the method itself, or with SciPy?
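For reference, here is a self-contained version of the QR-plus-back-substitution approach from the question, rewritten as a sketch for a modern NumPy/SciPy and Python 3 (`range` instead of `xrange`, `rcond=None` for `lstsq`), checked against the library solver on a random square matrix:

```python
import numpy as np
from scipy import linalg as la

np.random.seed(0)
a = np.random.rand(50, 50)  # random square matrix, well-conditioned with high probability
b = np.random.rand(50)

# QR factorization followed by explicit back-substitution on the upper-triangular R
qmat, rmat = la.qr(a)
bpr = qmat.T.dot(b)
n = len(bpr)
x = np.zeros(n)
for i in range(n - 1, -1, -1):
    x[i] = bpr[i]
    for j in range(i + 1, n):
        x[i] -= rmat[i, j] * x[j]
    x[i] /= rmat[i, i]

# Compare against the library least-squares solver
x_ref = np.linalg.lstsq(a, b, rcond=None)[0]
print(np.max(np.abs(x - x_ref)))  # small for a well-conditioned matrix
```

On a well-conditioned matrix like this, the two solutions should agree to roughly machine precision, which suggests the discrepancy in the question comes from the test matrix rather than the algorithm itself.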

My NumPy version is 1.6.1 (the MKL-compiled build from http://www.lfd.uci.edu/~gohlke/pythonlibs/), with Python 2.7.3 on x86_64.

If you are using those binaries, the QR factorization is computed by Intel MKL and should be correct.

For me, the above code gives solutions within 1e-12 of the correct result, for random matrices. Which matrices did you test with, and how did you measure the error?
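How you measure the error matters: two common measures are the residual norm ||Ax - b|| and the condition number of A, which bounds how much rounding error gets amplified. A minimal sketch (matrix size and seed chosen arbitrarily here):

```python
import numpy as np

np.random.seed(42)
a = np.random.rand(30, 30)
b = np.random.rand(30)
x = np.linalg.solve(a, b)

# Residual norm: how well the computed x satisfies the system
print(np.linalg.norm(a.dot(x) - b))

# Condition number: a large value means rounding errors are amplified
print(np.linalg.cond(a))
```

A small residual with a large condition number can still mean a large error in x itself, which is the situation the rank-deficient example below illustrates.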

There are cases in which the least-squares problem is ill-conditioned. For instance, for a matrix with a large null space, rounding errors can strongly affect the result. Consider this rank-1 matrix:

np.random.seed(1234)
v = np.random.rand(100)
a = v[:, None] * v[None, :]
b = np.random.randn(100)

x = scipy.linalg.lstsq(a, b)[0]
print(np.linalg.norm(a.dot(x) - b))
# -> 9.63612833771

# xp obtained using the code above
print(np.linalg.norm(a.dot(xp) - b))
# -> 3262.61161684

Your home-brewed triangular solve is more susceptible to rounding error than the carefully written LAPACK routine used in lstsq, and is therefore less accurate.

