MatCreateMPIBDiag

Creates a sparse parallel matrix in MPIBDiag format.

Synopsis

int MatCreateMPIBDiag(MPI_Comm comm,int m,int M,int N,int nd,int bs,int *diag,Scalar **diagv,Mat *A)
Collective on MPI_Comm

Input Parameters

comm - MPI communicator
m - number of local rows (or PETSC_DECIDE to have it calculated if M is given)
M - number of global rows (or PETSC_DECIDE to have it calculated if m is given)
N - number of columns (local and global)
nd - number of block diagonals (global) (optional)
bs - each element of a diagonal is a bs x bs dense matrix
diag - optional array of block diagonal numbers (length nd). For a matrix element A[i,j], where i=row and j=column, the diagonal number is
    diag = i/bs - j/bs  (integer division)
Set diag=PETSC_NULL on input for PETSc to dynamically allocate memory as needed (expensive).
diagv - pointer to actual diagonals (in same order as diag array), if allocated by user. Otherwise, set diagv=PETSC_NULL on input for PETSc to control memory allocation.
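
For example, with bs=2 the element A[5,1] lies on block diagonal 5/2 - 1/2 = 2 - 0 = 2; with bs=1 the formula reduces to the familiar convention diag = i - j, so the main diagonal is 0, the first superdiagonal is -1, and the first subdiagonal is +1.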

Output Parameter

A - the matrix

Options Database Keys

-mat_block_size <bs> - Sets the block size
-mat_bdiag_diags <s1,s2,s3,...> - Sets the diagonal numbers
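
For example, a program whose matrix is created with MatCreateMPIBDiag() might be run as follows (the executable name ex1 is only illustrative):

mpirun -np 4 ex1 -mat_block_size 2 -mat_bdiag_diags -1,0,1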

Notes

The parallel matrix is partitioned across the processors by rows, where each local rectangular matrix is stored in the uniprocessor block diagonal format. See the users manual for further details.

The user MUST specify either the local or global numbers of rows (possibly both).

The case bs=1 (conventional diagonal storage) is implemented as a special case.
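
Example

A minimal calling sketch (illustrative only, assuming a square global matrix of order M, scalar storage with bs=1, and PETSc's usual error checking; adapt as needed):

Mat    A;
Scalar **diagv = PETSC_NULL;    /* let PETSc allocate storage for the diagonals */
int    ierr, M = 100, bs = 1, nd = 3;
int    diag[3] = {-1,0,1};      /* super-, main, and subdiagonal (diag = i - j when bs=1) */

ierr = MatCreateMPIBDiag(PETSC_COMM_WORLD,PETSC_DECIDE,M,M,nd,bs,diag,
                         diagv,&A); CHKERRQ(ierr);
/* ... insert entries with MatSetValues(), then MatAssemblyBegin()/MatAssemblyEnd() ... */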

Fortran Notes

Fortran programmers cannot set diagv; this variable is ignored.

Keywords

matrix, block, diagonal, parallel, sparse

See Also

MatCreate(), MatCreateSeqBDiag(), MatSetValues()

Location: src/mat/impls/bdiag/mpi/mpibdiag.c