
How to calculate the time complexity of a given algorithm (ridge regression)?

I have the following expression and I need to calculate the time complexity of this algorithm. Could anybody help me find the correct time complexity of this algorithm?

% save a matrix-vector multiply
Atb = A'*b;

% cache the factorization (using Cholesky factorization)
[L, U] = factor(A, a);

for k = 0:maxiter-1
    x^(k+1) = (A^T*A + a*I)^(-1) * (A^T*b + a*(z^k - u^k))^T
end

where A is an m×n matrix with n >> m; b, u, and z are n×1 vectors; I is the identity matrix; and a = 0.001.
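
(For reference, factor is not shown above; the following is only a minimal sketch of what such a helper typically caches, assuming it follows the usual Cholesky pattern, and is not the exact code used here:)

function [L, U] = factor(A, a)
    [~, n] = size(A);
    L = chol(A'*A + a*eye(n), 'lower');   % O(m*n^2) to form A'*A, O(n^3) for the factorization
    U = L';                               % so that L*U = A'*A + a*I
end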

The most computationally intensive operation here is the matrix inversion, so the complexity depends on how you implement that operation. If we assume it is implemented with the Gauss–Jordan algorithm, which takes O(n^3), then the overall complexity is O(maxiter * n^3). Here I take into account that n is bigger than m (forming A^T*A takes O(m*n^2)).
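
To make that count concrete, here is a minimal sketch of the loop as posted (I drop the trailing transpose, which does not change the operation count, and assume b is m×1 so that A'*b is defined; the sizes below are just an example):

m = 50; n = 500; a = 0.001; maxiter = 100;     % example sizes with n >> m
A = randn(m, n); b = randn(m, 1);              % assuming b is m-by-1 so that A'*b exists
z = zeros(n, 1); u = zeros(n, 1);

Atb = A'*b;                                    % O(m*n), done once
for k = 1:maxiter
    % O(m*n^2) to form A'*A, O(n^3) to invert, O(n^2) for the final matrix-vector product
    x = inv(A'*A + a*eye(n)) * (Atb + a*(z - u));
    % ... the z- and u-updates of the surrounding algorithm would go here ...
end
% overall: O(maxiter * (m*n^2 + n^3)) = O(maxiter * n^3), since n >> m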

If you calculate (A^T*A + a*I)^-1 and A^T*b outside the loop, then inside the loop you are left with

Inv * (Atb + a*(z^k - u^k))^T

which is O(n^2), because you need to multiply an n×n matrix by an n×1 vector, while the additions and subtractions take O(n).
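
Here is a sketch of that hoisted variant, with the same example sizes and the same assumption about b as in the sketch above:

m = 50; n = 500; a = 0.001; maxiter = 100;     % example sizes with n >> m
A = randn(m, n); b = randn(m, 1);              % assuming b is m-by-1 so that A'*b exists
z = zeros(n, 1); u = zeros(n, 1);

Inv = inv(A'*A + a*eye(n));                    % O(m*n^2 + n^3), done once
Atb = A'*b;                                    % O(m*n), done once
for k = 1:maxiter
    x = Inv * (Atb + a*(z - u));               % O(n^2): n-by-n matrix times n-by-1 vector
    % ... z- and u-updates would go here ...
end
% overall: O(n^3 + maxiter * n^2), since n >> m

In practice you would keep the cached Cholesky factors from factor and do two triangular solves, x = U \ (L \ (Atb + a*(z - u))), instead of forming the explicit inverse; that is also O(n^2) per iteration and is numerically safer.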

Still, you have some inconsistencies in the sizes, which I described in the comments on the question.
