
How I invert a matrix without inverting it ^^


Introduction

In this article, we show how to invert a matrix without going through an explicit elimination procedure such as Gauss–Jordan, which requires \mathcal{O}(n^3) operations. Herein, a Newton method is adopted for matrix inversion: each iteration consists only of matrix multiplications, which vectorize and parallelize well.

The approach is via Newton’s method: instead of computing \pmb{A}^{-1} explicitly, we solve \pmb{F}( \pmb{X} ) = \pmb{I} - \pmb{A} \pmb{X} = \pmb{0}, whose solution is obviously \pmb{A}^{-1}.

Approach

Given the matrix \pmb{A} \in \mathbb{R}^{n \times n} to invert and an unknown \pmb{X} \in \mathbb{R}^{n \times n}, define

(1)   \begin{equation*} \pmb{F}( \pmb{X} ) = \pmb{I} - \pmb{A} \pmb{X} \end{equation*}

Note that the solution of the above problem is \pmb{X} = \pmb{A}^{-1}, since

(2)   \begin{equation*} \pmb{F}( \pmb{X} ) = \pmb{F}( \pmb{A}^{-1} ) = \pmb{I} - \pmb{A} \pmb{A}^{-1} = \pmb{I} - \pmb{I} = \pmb{0} \end{equation*}

Using Newton’s method, the updates that solve for \pmb{X} are

(3)   \begin{equation*} \pmb{X}_k = \pmb{X}_{k-1} - [\pmb{F}'(\pmb{X}_{k-1} )]^{-1}\pmb{F}(\pmb{X}_{k-1}) \end{equation*}

But \pmb{F}'(\pmb{X}_{k-1} ) = -\pmb{A} (since \pmb{F} is affine in \pmb{X}, its derivative is the constant linear map -\pmb{A}), hence

(4)   \begin{equation*} \pmb{X}_k = \pmb{X}_{k-1} + \pmb{A}^{-1} \big( \pmb{I} - \pmb{A} \pmb{X}_{k-1} \big) \end{equation*}

But \pmb{A}^{-1} is precisely what we are trying to estimate by the iterations, so we replace it by the most recent estimate of \pmb{A}^{-1}, namely \pmb{X}_{k-1}:

(5)   \begin{equation*} \pmb{X}_k = \pmb{X}_{k-1} +\pmb{X}_{k-1} \big( \pmb{I} - \pmb{A} \pmb{X}_{k-1} \big) \end{equation*}
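Before analyzing convergence, it helps to see iteration (5) in action on a tiny example. The following is a minimal pure-Python sketch (the helper functions and the example matrix are our own choices for illustration, not part of the article; the MATLAB implementation comes later), using the same initial guess \pmb{X}_0 = \pmb{A}^T / (\|\pmb{A}\|_1 \|\pmb{A}\|_\infty) as the MATLAB code below:

```python
# Minimal sketch of iteration (5): X <- X + X (I - A X), on a 2x2 example.
# Helper names (matmul, matadd, ...) are ours, chosen for illustration.

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

def matadd(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def matsub(A, B):
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def eye(n):
    return [[float(i == j) for j in range(n)] for i in range(n)]

A = [[4.0, 1.0],
     [2.0, 3.0]]

# Initial guess X0 = A' / (||A||_1 * ||A||_inf), which guarantees convergence
norm1 = max(sum(abs(A[i][j]) for i in range(2)) for j in range(2))   # max column sum
norminf = max(sum(abs(v) for v in row) for row in A)                 # max row sum
X = [[A[j][i] / (norm1 * norminf) for j in range(2)] for i in range(2)]

for _ in range(20):
    R = matsub(eye(2), matmul(A, X))   # residual R = I - A X
    X = matadd(X, matmul(X, R))        # X <- X + X R   (equation (5))

# X now approximates [[0.3, -0.1], [-0.2, 0.4]], the exact inverse (det(A) = 10)
```

Note that no inverse is ever computed: each step is just two matrix multiplications and an addition.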

Defining a residual matrix

(6)   \begin{equation*} \pmb{R}_k = \pmb{I} - \pmb{A}\pmb{X}_k \end{equation*}

and error matrix

(7)   \begin{equation*} \pmb{E}_k = \pmb{A}^{-1} - \pmb{X}_k \end{equation*}

Let us compute \pmb{R}_{k+1} using equation (6), then plug in equation (5) at k+1:

(8)   \begin{equation*} \begin{split} \pmb{R}_{k+1} = \pmb{I} - \pmb{A}\pmb{X}_{k+1} &= \pmb{I} - \pmb{A}\big[ \pmb{X}_{k} +\pmb{X}_{k} \big( \pmb{I} - \pmb{A} \pmb{X}_{k} \big) \big] \\ &= \pmb{I} - \pmb{A}\pmb{X}_k - \pmb{A} \pmb{X}_{k} \big( \pmb{I} - \pmb{A} \pmb{X}_{k} \big) \\ &= \pmb{I} - \pmb{A}\pmb{X}_k - \pmb{A}\pmb{X}_k + \pmb{A}\pmb{X}_k \pmb{A}\pmb{X}_k \\ &= \pmb{I} - 2\pmb{A}\pmb{X}_k + \pmb{A}\pmb{X}_k \pmb{A}\pmb{X}_k \end{split} \end{equation*}

(9)   \begin{equation*} \begin{split} \pmb{R}_k^2 = \pmb{R}_k\pmb{R}_k = \big( \pmb{I} - \pmb{A}\pmb{X}_k \big) \big( \pmb{I} - \pmb{A}\pmb{X}_k \big) &= \pmb{I} - \pmb{A}\pmb{X}_k - \pmb{A}\pmb{X}_k + \pmb{A}\pmb{X}_k \pmb{A}\pmb{X}_k \\ &= \pmb{I} - 2\pmb{A}\pmb{X}_k + \pmb{A}\pmb{X}_k \pmb{A}\pmb{X}_k \\ &= \pmb{R}_{k+1} \end{split} \end{equation*}
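Equation (9) is the heart of the method: applying it repeatedly shows that the residual squares at every step,

    \begin{equation*} \pmb{R}_k = \pmb{R}_{k-1}^2 = \pmb{R}_{k-2}^4 = \dots = \pmb{R}_0^{2^k} \quad \Longrightarrow \quad \| \pmb{R}_k \| \le \| \pmb{R}_0 \|^{2^k} \end{equation*}

so the iteration converges quadratically (the number of correct digits roughly doubles per iteration) provided the initial guess satisfies \| \pmb{R}_0 \| < 1.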

Now

(10)   \begin{equation*} \begin{split} \pmb{E}_{k+1} = \pmb{A}^{-1} - \pmb{X}_{k+1} &= \pmb{A}^{-1} - \big[ \pmb{X}_{k} +\pmb{X}_{k} \big( \pmb{I} - \pmb{A} \pmb{X}_{k} \big) \big] \\ &= \pmb{A}^{-1} - \pmb{X}_{k} -\pmb{X}_{k} \big( \pmb{I} - \pmb{A} \pmb{X}_{k} \big) \\ &= \pmb{A}^{-1} - \pmb{X}_{k} - \pmb{X}_{k} + \pmb{X}_{k} \pmb{A} \pmb{X}_{k} \\ &= \pmb{A}^{-1} - 2 \pmb{X}_{k} + \pmb{X}_{k} \pmb{A} \pmb{X}_{k} \end{split} \end{equation*}

Also,

(11)   \begin{equation*} \begin{split} \pmb{E}_k\pmb{A}\pmb{E}_k = \big( \pmb{A}^{-1} - \pmb{X}_k \big) \pmb{A} \big( \pmb{A}^{-1} - \pmb{X}_k \big) &= \big( \pmb{I} - \pmb{X}_k \pmb{A} \big) \big( \pmb{A}^{-1} - \pmb{X}_k \big) \\ &= \pmb{A}^{-1} - \pmb{X}_k - \pmb{X}_k \pmb{A}\pmb{A}^{-1} + \pmb{X}_k \pmb{A}\pmb{X}_k \\ &= \pmb{A}^{-1} - 2\pmb{X}_k + \pmb{X}_k \pmb{A}\pmb{X}_k \\ &= \pmb{E}_{k+1} \end{split} \end{equation*}

which matches equation (10). In other words, the error is also squared at every iteration, up to the fixed factor \pmb{A}.
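Both identities, \pmb{R}_{k+1} = \pmb{R}_k^2 and \pmb{E}_{k+1} = \pmb{E}_k \pmb{A} \pmb{E}_k, are exact algebraic identities, so they can be checked numerically on a single step. Below is a small pure-Python sketch (the helper functions, the test matrix, and the current estimate Xk are our own choices for illustration):

```python
# One step of iteration (5) on a 2x2 example, checking the residual and
# error identities R_{k+1} = R_k * R_k and E_{k+1} = E_k * A * E_k.

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

def matadd(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def matsub(A, B):
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def eye(n):
    return [[float(i == j) for j in range(n)] for i in range(n)]

A    = [[4.0, 1.0], [2.0, 3.0]]
Ainv = [[0.3, -0.1], [-0.2, 0.4]]        # exact inverse (det(A) = 10)
Xk   = [[0.25, -0.05], [-0.15, 0.35]]    # some current estimate of A^{-1}

# One Newton step, equation (5): X_{k+1} = X_k + X_k (I - A X_k)
Xk1 = matadd(Xk, matmul(Xk, matsub(eye(2), matmul(A, Xk))))

Rk    = matsub(eye(2), matmul(A, Xk))    # R_k
Rk1   = matsub(eye(2), matmul(A, Xk1))   # R_{k+1}
Rk_sq = matmul(Rk, Rk)                   # should equal R_{k+1}

Ek  = matsub(Ainv, Xk)                   # E_k
Ek1 = matsub(Ainv, Xk1)                  # E_{k+1}
EAE = matmul(matmul(Ek, A), Ek)          # should equal E_{k+1}
```

The two pairs of matrices agree to rounding error, as the derivations above predict.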

MATLAB implementation

So, let’s start by building a function called NewtonInverse, which computes the inverse based on equation (5). This is simply attained as follows:

function [X] = NewtonInverse(A,N,Niter)

%% NewtonInverse: computes the inverse of A iteratively
%input
%A     - matrix to invert
%N     - matrix dimension
%Niter - maximum number of iterations

I = eye(N);
%the true inverse, used only to measure the error
A_trueinverse = inv(A);
%initial guess, which guarantees norm(I - A*X0,2) < 1
X(:,:,1) = A'/(norm(A,1) * norm(A,inf));

%iterations of equation (5)
err = zeros(Niter-1,1);
for n = 2:Niter
   X(:,:,n) = X(:,:,n-1) + X(:,:,n-1)*(I - A*X(:,:,n-1));
   err(n-1) = norm(X(:,:,n) - A_trueinverse,2);
end

For the sake of completeness, we will see how the iterations converge, comparing against an inverse based on MATLAB's LU decomposition (which is not iterative). So we add the following block:

%using LU decomposition for comparison
[L1,U] = lu(A);
A_inverse_LU = inv(U)*inv(L1);
error_LU = norm(A_inverse_LU - A_trueinverse,2);
figure
plot(log10(err),'k','LineWidth',2)
hold on
plot(log10(error_LU*ones(Niter,1)),'--*r','LineWidth',2)
xlabel('n iteration')
ylabel('error (log scale)')
legend('Using Newton','MATLAB''s LU decomposition')
grid on
grid minor

Now our function looks like this

function [X] = NewtonInverse(A,N,Niter)

%% NewtonInverse: computes the inverse of A iteratively
%input
%A     - matrix to invert
%N     - matrix dimension
%Niter - maximum number of iterations

I = eye(N);
%the true inverse, used only to measure the error
A_trueinverse = inv(A);
%initial guess, which guarantees norm(I - A*X0,2) < 1
X(:,:,1) = A'/(norm(A,1) * norm(A,inf));

%iterations of equation (5)
err = zeros(Niter-1,1);
for n = 2:Niter
   X(:,:,n) = X(:,:,n-1) + X(:,:,n-1)*(I - A*X(:,:,n-1));
   err(n-1) = norm(X(:,:,n) - A_trueinverse,2);
end
%using LU decomposition for comparison
[L1,U] = lu(A);
A_inverse_LU = inv(U)*inv(L1);
error_LU = norm(A_inverse_LU - A_trueinverse,2);
figure
plot(log10(err),'k','LineWidth',2)
hold on
plot(log10(error_LU*ones(Niter,1)),'--*r','LineWidth',2)
xlabel('n iteration')
ylabel('error (log scale)')
legend('Using Newton','MATLAB''s LU decomposition')
grid on
grid minor
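A word on the initial guess X(:,:,1) = A'/(norm(A,1)*norm(A,inf)) used in NewtonInverse: this is a standard safe starting point for this iteration (often called the Newton–Schulz iteration). Since \|\pmb{A}\|_2^2 \le \|\pmb{A}\|_1 \|\pmb{A}\|_\infty, every eigenvalue of \pmb{A}\pmb{A}^T / (\|\pmb{A}\|_1 \|\pmb{A}\|_\infty) lies in (0,1], and therefore

    \begin{equation*} \| \pmb{R}_0 \|_2 = \Big\| \pmb{I} - \frac{\pmb{A}\pmb{A}^T}{\|\pmb{A}\|_1 \|\pmb{A}\|_\infty} \Big\|_2 = 1 - \frac{\sigma_{\min}^2(\pmb{A})}{\|\pmb{A}\|_1 \|\pmb{A}\|_\infty} < 1 \end{equation*}

for any invertible \pmb{A}, where \sigma_{\min}(\pmb{A}) is the smallest singular value. Convergence is thus guaranteed, although the early iterations can be slow when \pmb{A} is ill-conditioned.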

Our main script then tests the above by generating a random matrix and passing it to the function:

A = 50*randn(3,3);
N = 3;
Niter = 30;

[X] = NewtonInverse(A,N,Niter);
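The quadratic convergence visible in the resulting plot can also be checked directly by tracking the residual norm per iteration. Here is a small pure-Python sketch (the helper functions and the example matrix are our own choices for illustration) showing that each residual norm is roughly the square of the previous one:

```python
# Track ||R_k||_F for iteration (5) on a 2x2 example; by equation (9),
# R_{k+1} = R_k^2, so the norms should roughly square at every step.

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

def matadd(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def matsub(A, B):
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def eye(n):
    return [[float(i == j) for j in range(n)] for i in range(n)]

def frob(M):
    return sum(v * v for row in M for v in row) ** 0.5

A = [[4.0, 1.0],
     [2.0, 3.0]]
norm1 = max(sum(abs(A[i][j]) for i in range(2)) for j in range(2))   # max column sum
norminf = max(sum(abs(v) for v in row) for row in A)                 # max row sum
# X0 = A' / (||A||_1 * ||A||_inf)
X = [[A[j][i] / (norm1 * norminf) for j in range(2)] for i in range(2)]

norms = []
for _ in range(8):
    R = matsub(eye(2), matmul(A, X))   # R_k = I - A X_k
    norms.append(frob(R))
    X = matadd(X, matmul(X, R))        # equation (5)
```

On this example the norms decay roughly as \| \pmb{R}_0 \|^{2^k}: the number of correct digits doubles at every iteration, which is why the Newton curve in the plot plunges so steeply.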

(Figure: convergence of the Newton inverse)


Summary

We have presented a method of matrix inversion via Newton's method, obtained by iterating on an appropriate matrix-valued residual function, along with a detailed MATLAB implementation.
w/ love. The algorithmic chef – Ahmad Bazzi.