(15 points) In class we saw that access to any element of a 2-dimensional array takes constant time. If each item of an array with M rows and N columns occupies b bytes, and L is the memory address of A[0][0], then the address of A[i][j] is L + b*(i*N + j), so two multiplications and two additions suffice regardless of the dimensions of the array.
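As a minimal sketch of that calculation in C (the helper name address_of and the sample 3 x 4 array are illustrative assumptions, not part of the problem statement):

    #include <stdio.h>
    #include <stddef.h>

    /* Row-major address calculation described above: L is the byte address
       of A[0][0], b is the size of one element in bytes, and N is the number
       of columns.  The helper name address_of is illustrative only. */
    static unsigned char *address_of(unsigned char *L, size_t b, size_t N,
                                     size_t i, size_t j) {
        return L + b * (i * N + j);   /* two multiplications, two additions */
    }

    int main(void) {
        int A[3][4] = {{0}};                      /* M = 3 rows, N = 4 columns */
        unsigned char *L = (unsigned char *)&A[0][0];

        /* The computed address matches the compiler's own &A[2][1]. */
        printf("%d\n", address_of(L, sizeof(int), 4, 2, 1) ==
                       (unsigned char *)&A[2][1]);   /* prints 1 */
        return 0;
    }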
A lower-triangular matrix is (we’ll restrict it to) a square matrix (i.e., N x N for some N) in which every element above the main diagonal is zero. I drew a diagram of this on the board in the Day 17 class. By “above the main diagonal”, we mean an element whose column number is larger than its row number. The internal representation of a triangular matrix should be a one-dimensional array. Since almost half of the positions in the matrix are guaranteed to be zero, there is no need to store them explicitly in the representation array, so we can save almost half of the space needed for a “normal” square matrix with the same number of rows and columns. Note that the sum, difference, or product of two lower-triangular square matrices is lower triangular, so these operations can (and should!) do considerably less work than the corresponding operations for “normal” matrices.
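For instance, here is a small C check of the closure claim for products (the particular entries of a and b below are arbitrary illustrative values, not part of the problem):

    #include <stdio.h>

    #define N 4

    /* Multiply two lower-triangular N x N matrices with the ordinary triple
       loop and confirm that every element above the main diagonal (j > i)
       of the result is zero. */
    int main(void) {
        int a[N][N] = {{1,0,0,0},{2,3,0,0},{4,5,6,0},{7,8,9,1}};
        int b[N][N] = {{2,0,0,0},{1,1,0,0},{3,2,4,0},{5,6,7,8}};
        int c[N][N] = {{0}};

        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                for (int k = 0; k < N; k++)
                    c[i][j] += a[i][k] * b[k][j];

        int ok = 1;
        for (int i = 0; i < N; i++)
            for (int j = i + 1; j < N; j++)
                if (c[i][j] != 0) ok = 0;
        printf("%s\n", ok ? "product is lower triangular"
                          : "product is not lower triangular");
        return 0;
    }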
(a) (5 points) Show the order in which the elements of an N x N lower-triangular matrix can be stored in the computer's (one-dimensional) memory.
(b) (10 points) Given your storage scheme, give the formula for calculating the address in memory of A[i][j] where j ≤ i. Use A, L, and b to represent the same things as in the formula from the first paragraph of this problem. Your formula should be computable in constant time.