int MatCreateMPIDense(MPI_Comm comm,int m,int n,int M,int N,Scalar *data,Mat *A)

Collective on MPI_Comm
comm - MPI communicator
m    - number of local rows (or PETSC_DECIDE to have it calculated if M is given)
n    - number of local columns (or PETSC_DECIDE to have it calculated if N is given)
M    - number of global rows (or PETSC_DECIDE to have it calculated if m is given)
N    - number of global columns (or PETSC_DECIDE to have it calculated if n is given)
data - optional location of matrix data; set data=PETSC_NULL for PETSc to control all matrix memory allocation
The data input variable is intended primarily for Fortran programmers who wish to allocate their own matrix memory space. Most users should set data=PETSC_NULL.
The user MUST specify either the local or global matrix dimensions (possibly both).
Currently, the only parallel dense matrix decomposition is by rows, so that n=N and each submatrix owns all of the global columns.
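A minimal usage sketch follows. It assumes the legacy PETSc-2.x-era API that this page documents (int error codes checked with CHKERRQ(), PETSC_NULL, and MatDestroy() taking the matrix directly); the header name and matrix sizes here are illustrative, and PETSc manages all storage since data=PETSC_NULL.

   /* Sketch: create an M x N parallel dense matrix, letting PETSc choose
      the local row split and manage all storage (data = PETSC_NULL). */
   #include "petscmat.h"   /* named "mat.h" in some older releases */

   int main(int argc,char **argv)
   {
     Mat A;
     int ierr, M = 100, N = 100;   /* illustrative global dimensions */

     ierr = PetscInitialize(&argc,&argv,PETSC_NULL,PETSC_NULL); CHKERRQ(ierr);

     /* Rows are split across processes; each process owns all N columns */
     ierr = MatCreateMPIDense(PETSC_COMM_WORLD,PETSC_DECIDE,PETSC_DECIDE,
                              M,N,PETSC_NULL,&A); CHKERRQ(ierr);

     /* ... insert entries with MatSetValues(), then assemble ... */
     ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
     ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);

     ierr = MatDestroy(A); CHKERRQ(ierr);
     ierr = PetscFinalize(); CHKERRQ(ierr);
     return 0;
   }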
Location: src/mat/impls/dense/mpi/mpidense.c