Linear algebra

From Conservapedia
Latest revision as of 15:28, September 11, 2017

Linear algebra is the branch of mathematics that deals with vectors, vector spaces, linear maps, and systems of linear equations. Topics studied in linear algebra include linear equations, matrices, matrix decompositions, computations, vectors, vector spaces, multilinear algebra, affine spaces, and projective spaces.[1] Linear algebra has numerous applications in engineering, chemistry, and physics. Matrices can be used for data-fitting tasks such as "curving" student grades on exams and for finding correlations between two sets of data. Linear algebra is particularly useful for organizing and simplifying data.

The three most vexing computations in linear algebra are these:

  1. linear equations
  2. least squares (data fitting)
  3. finding the eigenvalues of an n x n matrix for n > 3.
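
All three computations can be sketched numerically; a minimal illustration, assuming NumPy and made-up data:

```python
import numpy as np

# 1. Linear equations: solve Ax = b for x
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])
x = np.linalg.solve(A, b)           # x = [0.8, 1.4]

# 2. Least squares (data fitting): best-fit line y = m*t + c
t = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 1.9, 3.2, 3.8])
M = np.column_stack([t, np.ones_like(t)])   # columns: slope term, intercept term
coef, *_ = np.linalg.lstsq(M, y, rcond=None)

# 3. Eigenvalues of an n x n matrix with n > 3: in general done numerically,
#    since no closed-form root formula exists beyond the quartic
C = np.diag([4.0, 3.0, 2.0, 1.0])
eigvals = np.linalg.eigvals(C)
```

For n > 3 the eigenvalue step cannot, in general, be done with exact root formulas, which is why iterative numerical methods are used.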

Basic Concepts

  • coefficient matrices and Gauss-Jordan elimination
  • rank (row rank = column rank)
  • geometric representations, especially vectors and systems with more variables than equations
  • transformations, inverses and matrix products
  • Subspaces
      ◦ image and kernel
      ◦ basis and span
      ◦ dimension
  • Sarrus's rule
  • geometrical interpretation
      ◦ classical adjoint, expansion factor, application to parallelepipeds
  • determinant of similar matrix, inverse matrix, product of matrices
  • Cramer's rule (with and without product rule)
  • minor of a matrix
  • Laplace expansion (cofactors)
  • orthogonal (perpendicular) vectors
  • orthonormal vectors
  • orthogonal projections
  • orthogonal matrix
  • projections
  • Gram-Schmidt Process and QR factorization
  • orthogonal matrices, orthogonal transformations
  • data fitting, especially least squares
  • spectral theorem
  • Quadratic forms
  • Linear dynamical systems
  • Euler's Formula
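
Several of the orthogonality ideas above (projections, the Gram-Schmidt process, QR factorization) can be sketched in a few lines; a simplified classical Gram-Schmidt, assuming NumPy and an example matrix chosen for illustration:

```python
import numpy as np

def gram_schmidt(A):
    """Orthonormalize the columns of A (classical Gram-Schmidt)."""
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        v = A[:, j].astype(float)
        for i in range(j):
            # subtract the projection of column j onto earlier orthonormal vectors
            v -= (Q[:, i] @ A[:, j]) * Q[:, i]
        Q[:, j] = v / np.linalg.norm(v)
    return Q

A = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [1.0, 0.0]])
Q = gram_schmidt(A)
R = Q.T @ A          # upper triangular, so A = Q R (a QR factorization)
```

The columns of Q are orthonormal, and R is upper triangular, which is exactly the QR factorization used later for least squares.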

Notation

  • matrices are commonly represented by A and B
  • diagonal matrices are represented by D
  • an upper triangular matrix is represented by R, as in QR factorization
  • when A is similar to B, an invertible matrix S expresses the relation AS = SB (equivalently, A = S B S^(-1))
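
The similarity notation can be checked numerically; a small sketch with made-up matrices, assuming NumPy:

```python
import numpy as np

S = np.array([[1.0, 1.0],
              [0.0, 1.0]])          # an invertible matrix
B = np.diag([2.0, 3.0])            # here B happens to be diagonal, like D
A = S @ B @ np.linalg.inv(S)       # A is similar to B by construction

# similarity A = S B S^(-1) is the same statement as AS = SB
lhs = A @ S
rhs = S @ B
```

Similar matrices share their eigenvalues, which is why similarity is the key notion behind diagonalization.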

More advanced topics include

  • the conditions of a vector space
  • isomorphisms
  • Nth dimensional spaces and subspaces
  • Inner spaces
  • inner product spaces
  • Determinants
  • cofactor
  • adjugate (useful in finding the inverse of a matrix)
  • Stability
  • Hermitian Matrices
  • Linear differential equations
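
One item in the list above, the adjugate, gives a direct formula for the inverse of a small matrix; a sketch for the 2x2 case, assuming NumPy and an example matrix chosen for illustration:

```python
import numpy as np

def adjugate_2x2(A):
    """Adjugate (classical adjoint) of a 2x2 matrix:
    swap the diagonal entries and negate the off-diagonal ones."""
    a, b = A[0]
    c, d = A[1]
    return np.array([[d, -b],
                     [-c, a]])

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])
det = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]   # 4*6 - 7*2 = 10
A_inv = adjugate_2x2(A) / det                  # inverse = adj(A) / det(A)
```

The same formula, inverse = adj(A)/det(A), holds for any square size, though for large matrices elimination-based methods are far cheaper.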

Common problems

Common problems in linear algebra include:

  • simplifying or reducing matrices
  • Gauss-Jordan elimination
  • matrix multiplication
  • finding inverses and transposes of matrices

  1. find the characteristic polynomial
  2. solve for the eigenvalues
  3. solve for the eigenvectors
  • finding the orthogonal projection of a vector in a vector space
  • diagonalize a matrix
  • find the geometric equivalent of a matrix
  • finding the determinant of a 2x2 matrix (easy) and a 3x3 matrix (hard)
  • finding the inverse of a matrix
  • decomposition or factorization of a matrix: representing a given matrix as a product of simpler matrices
  • QR factorization
  • find the least squares solution for a set of data
  • use known characteristics of symmetric or diagonalizable matrices to find solutions
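
The eigenvalue workflow listed above (characteristic polynomial, then eigenvalues, then eigenvectors, then diagonalization) can be sketched for a 2x2 example, assuming NumPy:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# 1. the characteristic polynomial: det(A - t*I) = t^2 - 4t + 3
char_poly = np.poly(A)                # coefficients [1, -4, 3]

# 2. the eigenvalues are its roots
eigenvalues = np.roots(char_poly)     # 3 and 1 (in some order)

# 3. eigenvectors, then diagonalization A = S D S^(-1)
eigvals, S = np.linalg.eig(A)         # columns of S are eigenvectors
D = np.diag(eigvals)
A_rebuilt = S @ D @ np.linalg.inv(S)
```

Because A here is symmetric with distinct eigenvalues, it is guaranteed to be diagonalizable, which is one of the "known characteristics" the last bullet refers to.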

Application: Analyzing Liberal style on Wikipedia®

An n x m matrix can be developed using observed incidents of liberal style in Wikipedia entries, and that data can then be simplified to draw conclusions about how liberal style can mislead viewers. The n rows can represent different elements of liberal style, while the m columns can represent different types of entries on Wikipedia.

The prevalence of certain types of liberal style may reflect their perceived effectiveness, and a linear algebra-based approach to modeling it may be useful in debunking the flawed reasoning and claims.
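
A minimal sketch of the tally described above, with entirely hypothetical labels and counts (rows = style elements, columns = entry types; NumPy assumed):

```python
import numpy as np

# hypothetical labels and counts, for illustration only
rows = ["element_1", "element_2", "element_3"]   # n = 3 style elements
cols = ["type_A", "type_B"]                      # m = 2 entry types
counts = np.array([[5.0, 2.0],
                   [1.0, 4.0],
                   [3.0, 3.0]])                  # counts[i, j]: element i seen in entry type j

# normalize each column so entry types with different totals are comparable
frequencies = counts / counts.sum(axis=0)

# most prevalent style element within each entry type
most_common = [rows[i] for i in counts.argmax(axis=0)]
```

Column-normalizing and taking per-column maxima is one simple way to "simplify the data to draw conclusions," as the paragraph above describes.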

References

  1. Merriam Webster definition of Linear Algebra, http://www.m-w.com/cgi-bin/dictionary?book=Dictionary&va=linear+algebra