
Not So Simple MPI

The four programs we have looked at so far were all implemented using a minimal set of MPI calls, plus broadcast and reduce.

In this section we are going to look at two more involved examples that will illustrate the following:

The first example will also illustrate how to do explicitly in MPI what HPF does for you automatically. It provides a very good comparison, and it illustrates very succinctly the power of HPF, for those problems, of course, that are tractable using data parallelism. If your problems do not fall into this category, then you may have little choice but to grit your teeth and grapple with MPI.


Zdzislaw Meglicki