Machines Presently Supported by PETSc 2.0
This file lists the machines and compilers that we currently support for PETSc 2.0. Our main limitation is easy and frequent access to each architecture. Since PETSc relies on some system details, porting to a new machine is not always trivial.
Details about machines currently supported by PETSc 2.0
Machine: Sun Sparc Workstations running SunOS 4.1.3/4 (PETSC_ARCH=sun4)
C compiler: gcc (version 2.7.2)
C++ compiler: g++ (version 2.7.2)
Complex version: g++
Fortran interface: working
This machine is our main development workhorse, and support is very good. Some of our users employ the Sun ANSI C compilers, which required a couple of trivial changes. Unfortunately, we do not have access to those compilers and thus cannot make a perfect release for them.
Machine: IBM RS6000 running AIX 4.2, including the IBM SP (PETSC_ARCH=rs6000). Also AIX 3.2; see http://www.mcs.anl.gov/petsc/petsc-patches.html for several minor fixes required for AIX 4.1.
C compiler: xlc (version 1.3.0.33 -- at least that's what the man page says)
C++ compiler: xlC (version unknown)
Complex version: xlC
Fortran interface: working
This machine is our parallel workhorse. Support is good.
Known Problems: Cannot use gdb debugger if using IBM's compiler.
Machine: SGI running IRIX 5.3 and 6.x (PETSC_ARCH=IRIX)
C compiler: cc (version unknown)
C++ compiler: CC ok; g++ (version 2.5.8) NOT WORKING due to g++ bugs
Complex version: CC ok
Fortran interface: ok (This is a 32 bit SGI machine.)
Known Problems: Automated debugger attachment (such as with the option -start_in_debugger) may not work on SGI machines with the gdb debugger; it does work with dbx.
Machine: 64 bit SGI including Origin and PowerChallenge running IRIX 6.1+ (PETSC_ARCH=IRIX64)
C compiler: cc (Version unknown)
C++ compiler: CC (Version unknown.)
Complex version: ok
Fortran interface: ok.
MPI version: Tested successfully with both MPICH and SGI's implementation of MPI. See http://www.mcs.anl.gov/petsc/petsc-patches.html for patches that may be needed for the outdated versions of MPI still installed on some IRIX64 machines.
Known Problems:
1) Trouble compiling the Fortran interface on some versions of the Origin 2000; see the file troubleshooting.html for a work-around.
2) We do not know how to trap floating point exceptions; this does not seem to be supported by the OS yet.
3) See the known problems for PETSC_ARCH=IRIX above.
4) The optimizing compiler on the Power Challenge has been known to generate incorrect code, so always develop with BOPT=g and switch to BOPT=O only when you are sure the code is running correctly.
Significant performance degradation can result when using all processors of a Power Challenge system to run MPI programs. You should use only p-1 processors of a p-processor system; this has nothing to do with PETSc.
Machine: Convex Exemplar running HP-UX version 10.01 and others (PETSC_ARCH=hpux)
C compiler: /bin/cc (version unknown)
C++ compiler: /usr/bin/CC (version unknown)
Complex version: see C++ above.
Fortran interface: ok (very little testing has been done)
To compile PETSc on the Convex, use PETSC_ARCH=hpux and use only the HP compilers. Assuming that the HP compilers are in /bin and the Convex compilers are in /usr/convex/bin, edit the bmake/hpux/base* files and replace each occurrence of the following:
    cc  -> /bin/cc
    CC  -> /bin/CC
    f77 -> /bin/f77
    ld  -> /bin/ld
Also, add the following line to bmake/hpux/base:
    LD = /bin/ld
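The substitutions above can be scripted with sed. The sketch below demonstrates them on a scratch copy of a file; the variable names in the scratch file are illustrative only, and your actual bmake/hpux/base* files may lay out these settings differently, so inspect them before editing.

```shell
# Demonstrate the cc/CC/f77/ld substitutions on a scratch file.
# NOTE: the file contents below are illustrative, not the real bmake layout.
mkdir -p /tmp/petsc-hpux-demo
cat > /tmp/petsc-hpux-demo/base <<'EOF'
CC = cc
CXX = CC
FC = f77
LD = ld
EOF
# Point each tool at the HP version in /bin, as described above:
sed -e 's|= cc$|= /bin/cc|' \
    -e 's|= CC$|= /bin/CC|' \
    -e 's|= f77$|= /bin/f77|' \
    -e 's|= ld$|= /bin/ld|' \
    /tmp/petsc-hpux-demo/base > /tmp/petsc-hpux-demo/base.new
cat /tmp/petsc-hpux-demo/base.new
```

Repeating the sed command over each bmake/hpux/base* file (and writing the results back) accomplishes the same thing as editing the files by hand.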
Machine: HP running HP-UX version 10.10 (PETSC_ARCH=hpux)
C compiler: cc
C++ compiler: CC
Complex version: see C++ above.
Fortran interface: ok
You should use GNU make rather than the native make on this machine; we have been unable to generate clean installs using the native make. To switch to GNU make, edit the file ${PETSC_DIR}/bmake/hpux/base and define OMAKE to be the path and name of GNU make. Our machine is a 9000/770.
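As a sketch, the added line in ${PETSC_DIR}/bmake/hpux/base might look like the following; the path is an assumption, so substitute the output of "which gmake" on your own system.

```make
# Use GNU make instead of the native HP make (path is illustrative):
OMAKE = /usr/local/bin/gmake
```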
Known Problems:
Profiling version (BOPT=Opg) doesn't work. The HPUX libraries required for profiling are buggy.
Machine: Sun Sparcstations running Solaris 5.5 (PETSC_ARCH=solaris). Solaris 5.3 requires minor patches listed at http://www.mcs.anl.gov/petsc/patches.html
C compiler: cc (version 3.01, we think)
C++ compiler: ok
Complex version: ok
Fortran interface: ok
The software builds fine and passes our tests.
Machine: Cray T3E (PETSC_ARCH=t3e) also T3D (PETSC_ARCH=t3d)
C compiler: cc
C++ compiler: CC
Fortran interface: ok
The software builds fine and passes our tests.
Known Problems:
Cannot use automated debugger attachment via -attach_debugger_on_error or -start_in_debugger. With a little juggling you can run MPI programs under the TotalView debugger. To use TotalView, link with the option -Xn, where n is the number of processors to use while debugging.
Machine: DEC alpha running OSF of unknown version (PETSC_ARCH=alpha)
C compiler: cc (version unknown)
C++ compiler: not tested, as we do not have access to a C++ compiler.
Complex version: not tested
Fortran interface: ok
Known Problems:
We have not been able to get a code with a C main program that uses Fortran write statements to link successfully.
When compiling the complex version with g++ one gets the warning messages "Warning: clog defined as GLOBAL DATA but is defined in a shared lib as a GLOBAL FUNC". It seems this can be safely ignored.
Machine: PCs running Linux, Version 1.2.13 and others (PETSC_ARCH=linux)
C compiler: gcc (version unknown)
C++ compiler: ok
Complex version: ok
Fortran interface: ok
Known Problems:
When compiling and linking Fortran code we got the error message "make [filename.o] Error 4 (ignored)", but the compile seems ok, so this message can be safely ignored.
When using the Fortran interface, routines that return pointers to data allocated by PETSc (e.g., VecGetArray() and MatGetArray()) do not work, due to the gcc/f2c combination not double-aligning the doubles. If you know how to fix this, please let us know.
Machine: PC running Windows NT 4.0/95 (PETSC_ARCH=nt)
C compiler: MS Visual C++ 5.0
C++ compiler: MS Visual C++ 5.0
Complex version: not done
Fortran Compiler: Digital Visual Fortran (with 5.0C update)
Fortran interface: Working
This port is little tested, but all the source compiles and a few examples have been run successfully.
Known Problems:
Currently only console applications are supported since PETSc makes extensive use of STDOUT/STDERR.
Some of the SYS functions may not work. This could be due to the absence of corresponding UNIX functionality on NT.
Machine: PC running Windows NT 4.0/95 (PETSC_ARCH=nt_gnu)
C compiler: gcc version egcs-1.01 for gnu-win32 b-19
C++ compiler: g++ version egcs-1.01 for gnu-win32 b-19
Fortran interface: the source all compiles and examples have been run successfully.
Known Problems:
Bash from gnu-win32 b-19 messes up the machine when you type C-c; hence this mode of development is unstable.
We no longer actively support FreeBSD or the Intel Paragon. If you really need to use PETSc on these machines and it is important to you, please send mail to petsc-maint@mcs.anl.gov and we will provide you with the bmake files. If you currently use FreeBSD, we highly recommend switching to Linux; we plan to provide strong Linux support in the future.
Machine: PCs running FreeBSD 1.1.5.1 and others (PETSC_ARCH=freebsd)
C compiler: gcc (version 2.4.5)
C++ compiler: g++ (version 2.4.5)
Complex version: g++ ok
Fortran interface: ok, but see the note about VecGetArray() and MatGetArray() under Known Problems below.
Known Problems:
One must use GNU make, not the FreeBSD make. The FreeBSD make was heavily modified from real make and is essentially worthless; it should not even be called make!
When using the Fortran interface, routines that return pointers to data allocated by PETSc (e.g., VecGetArray() and MatGetArray()) do not work, due to the gcc/f2c combination not double-aligning the doubles. If you know how to fix this, please let us know.
Machine: Paragon Running Release 1.0.4 Server 1.3 R1_3 (PETSC_ARCH=paragon)
C compiler: cc (version unknown)
C++ compiler: not done
Complex version: not done
Fortran interface: not tested
We have had occasional trouble compiling on the Paragon. We sometimes get errors of the form:
    crabcake% make BOPT=g all
    rm -f /ump/hpcc/home05/bsmith/petsc/lib/libg/paragon/*.a
    sh: 2555969 Memory fault - core dumped
    *** Exit 139 (ignored)
If you encounter messages of this type please talk to your system support staff.
More strange problems:
    crabcake% cd src/vec/interface
    crabcake% make BOPT=g
    Segmentation fault
    crabcake% exit
If you encounter this error, try running unsetenv on most environment variables unrelated to PETSc (in one case "unsetenv TERM" worked for us). The problem may go away; if not, give up on Intel.
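The suggestion above uses csh's unsetenv one variable at a time. An alternative sketch is to launch the build from a nearly empty environment with env -i, passing only the variables the build needs; the PETSC_DIR value below is a placeholder, and the echo stands in for the real "make BOPT=g" invocation.

```shell
# Start a shell with an almost-empty environment; TERM and other unrelated
# variables are gone, while the PETSc-related ones are passed explicitly.
# (PETSC_DIR value is a placeholder; 'echo' stands in for 'make BOPT=g'.)
env -i PATH=/bin:/usr/bin \
    PETSC_DIR=/home/me/petsc PETSC_ARCH=paragon \
    sh -c 'echo "TERM is now: ${TERM:-unset}"'
```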
We will never port to some machines since they are very much out-of-date and not worth the work involved. Such machines include:
BBN
Alliant
Titan
Sequent
Intel i860
Intel DELTA
Thinking Machines CM-2, CM-5
Sun3
NeXT
DEC3000, DEC5000