Sunday 1 November 2020

Introduction to Linear Algebra - 2. Solving Linear Equations


This article is a summary of and notes on the book 'Introduction to Linear Algebra' by Gilbert Strang. 

https://math.mit.edu/~gs/linearalgebra/


2.1 Vectors and Linear Equations 


The central problem of Linear Algebra is to solve a system of equations. 


There are two pictures of a system of linear equations in three-dimensional space. 


Rows.

The row picture shows the planes (one for each equation) meeting at a single point, the solution.


Cols.

The column picture combines the column vectors on the left side to produce the vector b on the right side. 
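As a small illustration (my own Python sketch, not from the book), take the system x - 2y = 1, 3x + 2y = 11. The row picture is two lines meeting at (3, 1); the column picture combines the two columns with weights x = 3 and y = 1 to produce b:

```python
# Column picture: b is a combination of the columns of A.
col1 = [1, 3]    # coefficients of x in the two equations
col2 = [-2, 2]   # coefficients of y
b = [1, 11]

x, y = 3, 1      # the solution (where the two lines meet)
combo = [x * c1 + y * c2 for c1, c2 in zip(col1, col2)]
print(combo == b)   # True: 3*col1 + 1*col2 reproduces b
```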


2.2 The Idea of Elimination 


A linear system Ax=b becomes upper triangular Ux=c after elimination.


The upper triangular system is solved by back substitution (starting at the bottom). 
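A minimal Python sketch of back substitution, using a small upper triangular system (the example numbers are my own choice):

```python
# Back substitution on Ux = c, starting from the bottom row.
U = [[2.0, 4.0, -2.0],
     [0.0, 1.0,  1.0],
     [0.0, 0.0,  4.0]]
c = [2.0, 4.0, 8.0]

n = len(U)
x = [0.0] * n
for i in range(n - 1, -1, -1):   # bottom row first
    s = sum(U[i][j] * x[j] for j in range(i + 1, n))
    x[i] = (c[i] - s) / U[i][i]
print(x)   # [-1.0, 2.0, 2.0]
```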


When a breakdown during elimination is permanent, the system has no solution or infinitely many solutions. 



2.3 Elimination Using Matrices


We now combine two ideas for solving linear equations: elimination and matrices.


Identity matrix, Elimination matrix, Permutation matrix.


Augmented matrix.

We can include b as an extra column and follow it through elimination. 



2.4 Rules for Matrix Operations


  • Matrix multiplication


The entry in row i and column j of AB: (row i of A) dot (col j of B)
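This rule can be written directly as code (a plain-Python sketch, not a library routine):

```python
# (AB)[i][j] = (row i of A) dot (column j of B)
def matmul(A, B):
    rows, inner, cols = len(A), len(B), len(B[0])
    return [[sum(A[i][p] * B[p][j] for p in range(inner))
             for j in range(cols)] for i in range(rows)]

A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]
print(matmul(A, B))   # [[19, 22], [43, 50]]
```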


  • The laws for matrix operations


Commutative law.

Distributive law.

Associative law.


  • Block multiplication 



2.5. Inverse Matrices


Inverse matrix

Def.

The matrix A is invertible if there exists a matrix A^-1 such that A^-1A = I and AA^-1 = I. 


The inverse of a product AB


(AB)^-1 = B^-1A^-1
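A quick numerical check of the reversed order (a Python sketch using the explicit 2-by-2 inverse formula; the matrices are my own examples):

```python
# Check (AB)^-1 = B^-1 A^-1 (note the reversed order) on 2-by-2 matrices.
def mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(M):                     # explicit 2-by-2 inverse formula
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2.0, 1.0], [1.0, 1.0]]
B = [[1.0, 2.0], [0.0, 1.0]]
lhs = inv2(mul(A, B))            # (AB)^-1
rhs = mul(inv2(B), inv2(A))      # B^-1 A^-1
print(lhs == rhs)   # True
```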


Calculating A^-1 by Gauss-Jordan Elimination

Computing A^-1 explicitly is rarely necessary and is inefficient. 


The book introduces Gauss-Jordan elimination for finding A^-1: think of multiplying the augmented matrix [A I] by A^-1, and reduce it to [I A^-1] through elimination. 
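A sketch of the idea in Python (my own implementation, with partial pivoting added as a safeguard; it is not the book's code):

```python
# Gauss-Jordan: reduce the augmented matrix [A | I] to [I | A^-1].
def inverse(A):
    n = len(A)
    M = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(A)]
    for i in range(n):
        # Partial pivoting (a safeguard, not essential to the idea).
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        piv = M[i][i]
        M[i] = [v / piv for v in M[i]]            # make the pivot 1
        for r in range(n):                        # clear the rest of column i
            if r != i:
                f = M[r][i]
                M[r] = [a - f * b for a, b in zip(M[r], M[i])]
    return [row[n:] for row in M]                 # right half is A^-1

A = [[2.0, 1.0],
     [5.0, 3.0]]
Ainv = inverse(A)
print([[round(v) for v in row] for row in Ainv])  # [[3, -1], [-5, 2]]
```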


Singular and invertible (important)

A^-1 exists exactly when the n by n matrix A is square and has a full set of n pivots.


Proof.

  • Right-inverse

With n pivots, elimination solves every system Axi = ei. The columns xi form A^-1, so AA^-1 = I.


  • Left-inverse

Elimination is really a sequence of matrices E, P, and D^-1, so (D^-1…E…P…E)A = I. This product of matrices is, by that equation itself, a left-inverse of A. 


The converse also holds, and its proof is straightforward. 


=> Elimination gives a complete test for invertibility of a square matrix. A^-1 exists (and Gauss-Jordan finds it) exactly when A has n pivots. The argument above shows more: if AC = I then CA = I and C = A^-1.



2.6. Elimination = Factorization: A = LU


Elimination = Factorization: A = LU


The elimination process used to solve a given system of linear equations is the same as factoring the matrix A into A = LU.


(E32E31E21)A = U becomes A = (E21^-1E31^-1E32^-1)U which is A = LU.


If we assume no row exchanges during the elimination steps, the eliminated matrix is obtained using only the elimination matrices Ei, as above.


These elimination matrices Ei are

  • invertible, and 

  • not mixed up with one another even when composed, because in L the elimination steps apply upside down: a pivot row, once finished, never changes again. This is expressed by the formulas below. 


row 3 of U = (row 3 of A) - l31(row 1 of U) - l32(row 2 of U)

=> row 3 of A = l31(row 1 of U) + l32(row 2 of U) + 1(row 3 of U)
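The factorization can be sketched in Python by recording each multiplier l_ij in L while it eliminates a row of U (assuming no row exchanges are needed; the sample matrix is my own choice):

```python
# A = LU by elimination: each multiplier l_ij lands in L and is never touched again.
def lu(A):
    n = len(A)
    U = [row[:] for row in A]
    L = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for j in range(n):
        for i in range(j + 1, n):
            l = U[i][j] / U[j][j]          # multiplier l_ij (pivot assumed nonzero)
            L[i][j] = l
            U[i] = [a - l * b for a, b in zip(U[i], U[j])]
    return L, U

A = [[2.0, 4.0, -2.0],
     [4.0, 9.0, -3.0],
     [-2.0, -3.0, 7.0]]
L, U = lu(A)
print(L)   # [[1.0, 0.0, 0.0], [2.0, 1.0, 0.0], [-1.0, 1.0, 1.0]]
print(U)   # [[2.0, 4.0, -2.0], [0.0, 1.0, 1.0], [0.0, 0.0, 4.0]]
```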


U has the pivots on its diagonal, in contrast to L (whose diagonal is all 1s). Splitting a diagonal matrix D out of U gives A = LDU instead of A = LU, a slightly more symmetric form. 


One square system = Two triangular systems

A given system Ax = b is solved by the following two steps:


  1. Factor (into L and U, by elimination on the left side matrix A)

  2. Solve (forward elimination on b using L, then back substitution for x using U.)
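The two triangular solves can be sketched as follows, using an L and U taken from a sample 3-by-3 factorization (the numbers are my own example):

```python
# Solve Ax = b as two triangular systems: Lc = b (forward), then Ux = c (back).
L = [[1.0, 0.0, 0.0],
     [2.0, 1.0, 0.0],
     [-1.0, 1.0, 1.0]]
U = [[2.0, 4.0, -2.0],
     [0.0, 1.0, 1.0],
     [0.0, 0.0, 4.0]]
b = [2.0, 8.0, 10.0]
n = 3

c = [0.0] * n
for i in range(n):                 # forward substitution, top to bottom
    c[i] = b[i] - sum(L[i][j] * c[j] for j in range(i))

x = [0.0] * n
for i in range(n - 1, -1, -1):     # back substitution, bottom to top
    x[i] = (c[i] - sum(U[i][j] * x[j] for j in range(i + 1, n))) / U[i][i]
print(x)   # [-1.0, 2.0, 2.0]
```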


The cost of elimination 


A practical question is cost - or computing time.


Elimination on A requires ⅓ n^3 multiplications and subtractions.

Each right side needs n^2 multiplications and subtractions. 


However, most matrices in practice are sparse (or banded). In these cases, A = LU is much faster. 


2.7. Transposes and Permutations


Transpose


(A^T)ij = Aji

A^T is invertible iff A is invertible.



Symmetric matrix 

If A^T = A, A is a symmetric matrix. 


Symmetric matrix in elimination


When A is symmetric, the usual form A = LDU becomes A = LDL^T.


Proof. 

https://math.stackexchange.com/questions/385414/prove-that-if-a-is-symmetric-and-has-a-lu-decomposition-then-a-ldu-rightarr

=> By the uniqueness of the LDU factorization, a symmetric A satisfies A = LDU = LDL^T
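A numerical check of this identity (a Python sketch; the L and U here come from eliminating the sample symmetric matrix shown in the comments):

```python
# For symmetric A, A = LU refines to A = L D L^T: D holds the pivots,
# and dividing each row of U by its pivot leaves exactly L^T.
L = [[1.0, 0.0, 0.0],      # from eliminating the symmetric matrix
     [2.0, 1.0, 0.0],      # A = [[2, 4, -2], [4, 9, -3], [-2, -3, 7]]
     [-1.0, 1.0, 1.0]]
U = [[2.0, 4.0, -2.0],
     [0.0, 1.0, 1.0],
     [0.0, 0.0, 4.0]]
D = [U[i][i] for i in range(3)]                       # pivots 2, 1, 4
Usym = [[U[i][j] / D[i] for j in range(3)] for i in range(3)]
LT = [[L[j][i] for j in range(3)] for i in range(3)]
print(Usym == LT)   # True, so A = LU = L D L^T
```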


Permutation


Def.

A permutation matrix has the rows of the identity I in any order.


  • P^-1 is also a permutation matrix.

  • P^-1 is always the same as P^T.


Proof.

Let P be a product of elementary exchange matrices Pij (each exchanges only rows i and j). Each Pij is its own inverse and its own transpose, so inverting the product reverses the order and transposes each factor, giving P^-1 = P^T.
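A quick check of P^-1 = P^T on a sample 3-by-3 permutation (a Python sketch):

```python
# A permutation matrix has the rows of I in some order; check P^T P = I.
P = [[0, 1, 0],     # rows of I in the order 2, 3, 1
     [0, 0, 1],
     [1, 0, 0]]
PT = [[P[j][i] for j in range(3)] for i in range(3)]
prod = [[sum(PT[i][k] * P[k][j] for k in range(3)) for j in range(3)]
        for i in range(3)]
print(prod == [[1, 0, 0], [0, 1, 0], [0, 0, 1]])   # True: P^T is P^-1
```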


The PA = LU factorization with Row Exchanges


More generally, with the row exchanges collected into a permutation matrix P, the factorization becomes PA = LU.




