int MatCreateMPIBDiag(MPI_Comm comm,int m,int M,int N,int nd,int bs,int *diag,Scalar **diagv,Mat *A)

Collective on MPI_Comm
Input Parameters
comm  - MPI communicator
m     - number of local rows (or PETSC_DECIDE to have it calculated if M is given)
M     - number of global rows (or PETSC_DECIDE to have it calculated if m is given)
N     - number of columns (local and global)
nd    - number of block diagonals (global) (optional)
bs    - each element of a diagonal is a bs x bs dense matrix
diag  - optional array of block diagonal numbers (length nd). For a matrix element A[i,j], where i=row and j=column, the diagonal number is diag = i/bs - j/bs (integer division)
diagv - pointer to the actual diagonals (in the same order as the diag array), if allocated by the user. Otherwise, set diagv=PETSC_NULL on input for PETSc to control memory allocation.

Output Parameter
A     - the matrix
Notes
The user MUST specify either the local or global number of rows (possibly both).

The case bs=1 (conventional diagonal storage) is implemented as a special case.
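Example
A minimal calling sketch (not part of the original page; the matrix sizes, block size, and diagonal numbers are illustrative assumptions). It creates a 100 x 100 matrix with 2 x 2 dense blocks on three block diagonals, letting PETSc allocate the diagonal storage by passing diagv = PETSC_NULL:

   Mat A;
   int ierr;
   int diag[3] = {-1, 0, 1};  /* three block diagonals: the main one and its two neighbours */

   ierr = MatCreateMPIBDiag(PETSC_COMM_WORLD,
                            PETSC_DECIDE,  /* m: let PETSc compute the local rows */
                            100,           /* M: global rows */
                            100,           /* N: columns */
                            3,             /* nd: number of block diagonals */
                            2,             /* bs: 2 x 2 dense blocks */
                            diag,
                            PETSC_NULL,    /* diagv: PETSc allocates the diagonals */
                            &A); CHKERRQ(ierr);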
Location: src/mat/impls/bdiag/mpi/mpibdiag.c