Programming: Matrix Algebra

LIMDEP’s matrix algebra provides a wide range of techniques and the full set of operations needed to construct new estimators or manipulate any program results. The matrix algebra package is fully integrated with the rest of the program; for example, it can be an integral part of any estimator you design.

Matrix Algebra Features

  • Complicated extended expressions
  • Matrix operators +, -, *, ' (transpose), ^ (power)
  • Several forms of matrix to scalar or matrix power
  • Conditional matrix commands
  • Results displayed in exportable windows (objects)
  • Formatted or unformatted display
  • Statistical display of a vector and companion covariance matrix
  • Display all internal digits (18)
  • Descriptive statistics for elements of a matrix
  • Plot elements of one matrix against another
  • Matrix elements accessible - subscripting in expressions
  • Data matrices of unlimited length
  • Overlapping data matrices (matrices may share variables)

All Algebraic Operations Included Plus Dozens of Matrix Functions

  • Matrix dimensions, number of rows or columns
  • Partitioned matrices - matrix of matrices
  • Identity, band matrices
  • Random matrices from specified normal distribution
  • Multivariate normal probabilities
  • Extract submatrices
  • Inject vectors into rows or columns of matrices
  • Vectorize matrix
  • Vector of diagonal elements
  • Diagonal matrix from vector
  • Square root, inverse square root, orthonormalized matrix
  • Projection matrix
  • Characteristic roots and vectors
  • Complex characteristic roots for possibly nonsymmetric matrix
  • Characteristic roots for dynamic equations
  • Cholesky decomposition
  • Singular value decomposition
  • Element by element log or exponent
  • Hadamard (direct) product
  • Inverse matrix: ordinary, generalized, Moore-Penrose, G2
  • Inverse of sum of matrices
  • Determinant
  • Log determinant
  • Trace of matrix
  • Rank of matrix
  • Norm of a vector
  • Kronecker products
  • Column vector of quadratic forms (diagonals of hat matrix)
  • Weighted sums in product matrices
  • BHHH style outer products summation with weight variables
  • Covariance and correlation matrix
  • Cross correlation matrix
  • Matrix of mean squared residuals
  • Least squares computations by matrix
  • Least absolute deviations vector result
  • Sums for panel data: group sizes, group means, maxima, etc.
  • Scalar multiple of a set of variables
  • Principal components
  • Linear combination of a set of variables
  • Orthonormalized data matrix

Example: Ordinary Least Squares

OLS is the standard application used to illustrate matrix algebra. With LIMDEP’s matrix calculator, the computation is trivial.

	NAMELIST	; x = the names of the variables in the model $
	CREATE	; y = the name of the dependent variable $
	CALC	; df = n - Col(x) $
	MATRIX	; bols = <x'x> * x'y			? coefficients
		; xy = x'y				? moments
		; v = {(y'y - bols'xy)/df} * <x'x>	? cov. matrix
		; Stat(bols,v,x) $			? display results
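
As a cross-check outside LIMDEP, the same algebra can be written in a few lines of NumPy. This is only an illustrative sketch: the simulated data and the Python names (X, y, bols, and so on) are assumptions made for the example, not part of the LIMDEP program above.

# Illustrative NumPy sketch of the OLS matrix algebra shown above (not LIMDEP code).
# The simulated X and y are placeholders; substitute your own data.
import numpy as np

rng = np.random.default_rng(0)
n, k = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])   # n x k regressor matrix
y = X @ np.array([1.0, 0.5, -0.2]) + rng.normal(size=n)          # dependent variable

xxi  = np.linalg.inv(X.T @ X)            # <x'x> in the LIMDEP notation
bols = xxi @ (X.T @ y)                   # coefficients: (X'X)^-1 X'y
df   = n - X.shape[1]                    # n - Col(x)
s2   = (y @ y - bols @ (X.T @ y)) / df   # (y'y - bols'x'y)/df
v    = s2 * xxi                          # estimated covariance matrix

print(bols)
print(np.sqrt(np.diag(v)))               # standard errors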

Example: Restricted least squares

In the linear regression model, y = Xb + e, the linear least squares coefficient vector, b*, and its asymptotic covariance matrix, computed subject to the set of linear restrictions Rb* = q, are (using angle brackets, <A>, to denote the matrix inverse, as in the LIMDEP code below)

b* = b - <X'X>*R'*<R<X'X>R'>*(Rb - q)

where

b = <X'X>*X'y

and

Est.Asy.Var[b*] = s2*<X'X> - s2*<X'X>*R'*<R<X'X>R'>*R*<X'X>

?============================================================
?  First define the X matrix.  Columns then Rows.
?  We assume the dependent variable is y.
?  Define R and q.  Varies by the application.
?============================================================
	NAMELIST	; x = ... $
	SAMPLE	; ... as appropriate  $
	MATRIX	; r = ... 	; q = ...  $
?============================================================
?  Unrestricted least squares and the discrepancy vector then
?  Restricted least squares.
?============================================================
	MATRIX	; xxi = <x'x> ; bu = xxi * x'y ; d = r*bu - q
		; br = bu - xxi * r' * Iprd(r, xxi, r') * d $
?============================================================
?  Sum of squared deviations and disturbance variance estimator.
?============================================================
	CREATE	; u = y - x'br $
	CALC	; s2 = (1/(n-Col(x)+Row(r))) * u'u $
?============================================================
?  Covariance matrix, then display results
?============================================================
	MATRIX	; vr = s2*xxi - s2*xxi*r'*Iprd(r,xxi,r')*r*xxi
		; Stat (br,vr,X) $
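
The restricted least squares formulas above can also be checked outside LIMDEP. The NumPy sketch below mirrors the program step by step; the simulated data and the single restriction R, q are hypothetical choices made only for this illustration.

# Illustrative NumPy sketch of restricted least squares (not LIMDEP code).
# X, y, R and q are placeholders; substitute the data and restrictions of your application.
import numpy as np

rng = np.random.default_rng(1)
n, k = 200, 4
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
y = X @ np.array([1.0, 0.3, 0.3, -0.5]) + rng.normal(size=n)
R = np.array([[0.0, 1.0, -1.0, 0.0]])          # one restriction: b2 = b3
q = np.array([0.0])

xxi = np.linalg.inv(X.T @ X)                   # <x'x>
bu  = xxi @ (X.T @ y)                          # unrestricted least squares
d   = R @ bu - q                               # discrepancy vector
rxr = np.linalg.inv(R @ xxi @ R.T)             # Iprd(r,<x'x>,r') = [R(X'X)^-1 R']^-1
br  = bu - xxi @ R.T @ rxr @ d                 # restricted coefficients b*

u  = y - X @ br                                # restricted residuals
s2 = (u @ u) / (n - X.shape[1] + R.shape[0])   # disturbance variance estimator
vr = s2 * xxi - s2 * xxi @ R.T @ rxr @ R @ xxi # Est.Asy.Var[b*]

print(br)
print(np.sqrt(np.diag(vr)))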

Example: Canonical correlations

Variables y1, ..., yL and x1, ..., xK are arranged in n×L and n×K data matrices Y and X.  The canonical variates (y*,x*) are the M = Min(L,K) pairs of linear functions of Y and X that have maximum correlation, chosen so that variates with unequal subscripts are uncorrelated.  The canonical correlations are their pairwise correlation coefficients, r1*, ..., rM*, ordered from largest to smallest.  There are several ways to compute canonical correlations and canonical variates.  The following approach has the useful virtue that it involves only symmetric matrices; this simplifies the computation because characteristic roots are required, and symmetric matrices are the simpler case to decompose.  Define the matrix product

R = sqr<Ryy>*Ryx*<Rxx>*Rxy*sqr<Ryy>

where Rij, i,j = x,y, is a sample correlation matrix.  The characteristic roots of R are the squared canonical correlation coefficients.  The ordered canonical variates are contained in

y* = Y*sqr<Ryy>*C = YQ

where the mth column of C is the characteristic vector of R corresponding to the mth largest nonzero root, and

x* = X*<Rxx>*Rxy*Q = XV

The columns of Q are normalized to have unit length.  The following program computes the canonical correlations and the coefficients of the canonical variates.  It is assumed that Y and X are namelists defining the two sets of variables and that Y does not have more variables than X, so that M is the number of columns in Y.  Using matrix functions compresses a large amount of computation into a small number of commands; here, the entire set of calculations takes only a few short lines.

?==========================================================
?  Compute and display the sample canonical correlations.
?==========================================================
	NAMELIST ; x = list of variables ; y = list of variables $
MATRIX	; List
	; rxx = Xcor(x)
	; ryy = Xcor(y)
	; rxy = Xcor(x,y)  $
?==========================================================
?  Column i of Q is the coefficients of variate y*(i).
?  Column i of V is the coefficients of variate x*(i).
?  Squared canonical correlations are the diagonals of R.
?==========================================================
MATRIX	; List ; rr	= Isqr(ryy) * rxy' * <rxx> * rxy * Isqr(ryy) $
MATRIX	; List ; r 	= Root(rr)  $
MATRIX	; List ; r 	= Diag(r) $
MATRIX	; List ; q 	= Isqr(ryy) * Cvec(rr)  $
MATRIX	; List ; norm	= Diag(q'q)  $
MATRIX	; List ; q 	= q * Isqr(norm) $
MATRIX	; List ; v 	= <rxx> * rxy * q $
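
As with the other examples, the computation can be verified outside LIMDEP. The NumPy sketch below follows the same symmetric-matrix route; the simulated data matrices and all Python names are assumptions made for the illustration only.

# Illustrative NumPy sketch of the canonical correlation computation above (not LIMDEP code).
# Ydat and Xdat are placeholder n x L and n x K data arrays.
import numpy as np

def inv_sqrt(a):
    # Inverse square root of a symmetric positive definite matrix.
    vals, vecs = np.linalg.eigh(a)
    return vecs @ np.diag(vals ** -0.5) @ vecs.T

rng = np.random.default_rng(2)
n, L, K = 500, 2, 3
Ydat = rng.normal(size=(n, L))
Xdat = rng.normal(size=(n, K)) + 0.5 * Ydat @ rng.normal(size=(L, K))

corr = np.corrcoef(np.column_stack([Ydat, Xdat]), rowvar=False)
ryy, rxx = corr[:L, :L], corr[L:, L:]          # Ryy, Rxx
ryx = corr[:L, L:]                             # Ryx = Rxy'

rr = inv_sqrt(ryy) @ ryx @ np.linalg.inv(rxx) @ ryx.T @ inv_sqrt(ryy)
roots, c = np.linalg.eigh(rr)                  # roots are the squared canonical correlations
order = np.argsort(roots)[::-1]
canon_r = np.sqrt(roots[order])                # r1* >= ... >= rM*

q = inv_sqrt(ryy) @ c[:, order]                # y* coefficients, columns of Q
q = q / np.linalg.norm(q, axis=0)              # normalize columns to unit length
v = np.linalg.inv(rxx) @ ryx.T @ q             # x* coefficients, columns of V

print(canon_r)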