Newsgroups: comp.parallel
From: Michael Harm
Subject: newbie questions
Organization: University of Southern California
Date: 2 Jul 1997 16:48:06 GMT
Message-ID: <5pe0o6$1f0@server1.ctc.com>

Hi. I've got a project I'd like to implement on parallel hardware. At my
university I have access to a variety of MP machines, including a
28-processor IBM SP2, a 16-processor SGI Power Challenge, and a
16-processor Convex Exemplar. There are also a few 2-processor Sparc 20s
here and there.

How should I go about this? The SGI has some nice, easy tools and good
documentation for parallelizing loops and such, but those pragmas only
work on SGIs. Ditto the Convex. I could come up with parallel pragmas for
each platform, but that seems like a pain.

The project is a parallel implementation of a neural net simulator. Each
processor would take a chunk of the problem, access shared read-only
data, and generate a private set of deltas. Once every chunk has done its
thing, the deltas get summed up and scattered back to the shared data,
and then it iterates again. Communication overhead isn't much of an
issue; processing one chunk of the problem should take on the order of a
dozen or more CPU seconds.

So, big naive question number one: do I want to use pthreads, or PVM? I
know our SP2 supports PVM in some way; I don't know whether the others
do, or even how I'd find out. I also don't know whether any of them have
the pthreads library installed -- how does one figure that out? And in
general, which is better/easier: pthreads, PVM, or native-machine stuff
with lots of #ifdefs? It would be nice if I only had to modify my
simulator for one parallel environment.

A smaller question: does anyone have a simple parallel demo program that
runs on an SP2? I'll be submitting jobs through LoadLeveler, and I'm
mystified as to whether I want to use POE, PVM, or something else.
I checked rtfm.mit.edu for a FAQ for comp.parallel and comp.parallel.pvm
and didn't find one; are there any out there?

Many thanks for any help I can get.

cheers,
--
Mike Harm  mharm@gizmo.usc.edu
Psycholinguistics Lab, Univ. of Southern California
http://www-rcf.usc.edu/~mwharm/
---------------------------------------------------------------
Big Science. Every man, every man for himself.
Big Science. Hallelujah. Yodellayehoo.
    -Laurie Anderson
--
Articles to parallel@ctc.com  (Administrative: bigrigg@ctc.com)
Archive: http://www.hensa.ac.uk/parallel/internet/usenet/comp.parallel