Parallel Computing '26: Exercise Sheet 2
(DRAFT) Message Passing Interface (MPI)

Submission Deadline:

Word of the week:
Briefly, adv. Concisely; in few words.

Situating Ourselves

From a programmer's perspective, briefly explain the differences between OpenMP and MPI. In your own words, why is MPI interesting if we have already seen OpenMP?

Terminology: Ranks?

Explain what a rank is in the MPI world. How does it differ from processes, threads, and nodes?
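As a starting point, it may help to see where ranks show up in code. The following is a minimal sketch (assuming an MPI installation, compiled with mpicc and launched with e.g. `mpirun -np 4`):

```c
/* Minimal sketch: each process reports its rank and the total
 * number of ranks in the MPI_COMM_WORLD communicator. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* this process's rank */
    MPI_Comm_size(MPI_COMM_WORLD, &size);  /* total number of ranks */

    printf("Hello from rank %d of %d\n", rank, size);

    MPI_Finalize();
    return 0;
}
```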

MPI: Reduction I

Explain why it is important for all processes to call the same collective operation (e.g., MPI_Reduce). What happens when one process does something else instead?
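To make the failure mode concrete, here is a sketch of what not to do (an assumed example, requiring at least two ranks): rank 0 posts one collective while the other ranks post a different one.

```c
/* Sketch of a mismatched collective: rank 0 calls MPI_Reduce while
 * every other rank calls MPI_Bcast. All ranks on a communicator must
 * make the same collective call; this mismatch is erroneous and will
 * typically deadlock or crash rather than report a clean error. */
#include <mpi.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    int value = rank, result = 0;
    if (rank == 0) {
        MPI_Reduce(&value, &result, 1, MPI_INT, MPI_SUM, 0, MPI_COMM_WORLD);
    } else {
        /* wrong: a different collective than the one rank 0 posted */
        MPI_Bcast(&value, 1, MPI_INT, 0, MPI_COMM_WORLD);
    }

    MPI_Finalize();
    return 0;
}
```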

MPI: Reduction II

Consider the following sequential code, where arr is an array of size N and elements of type int:

int max = -1; size_t idx = -1;
for (size_t i = 0; i < N; ++i) {
  if (max < arr[i]) {
    max = arr[i];
    idx = i;
  }
}

Assume a friend implements this reduction manually, by sending all values to process 0 and computing the result there. What could they have done better, and why?
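One commonly suggested improvement is to let MPI combine per-rank (value, index) pairs with the built-in MPI_MAXLOC operation on the MPI_2INT pair type, so partial results are reduced in a tree instead of all being funneled to rank 0. A hedged sketch, with toy local data standing in for each rank's real chunk:

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Toy local data: in a real program this would be the maximum of
     * this rank's chunk and that element's global index. */
    struct { int value; int index; } in, out;
    in.value = 10 * rank + 3;  /* pretend chunk maximum */
    in.index = 4 * rank;       /* pretend global index of it */

    /* MPI combines the per-rank (value, index) pairs internally;
     * no single rank ever has to receive all N values. */
    MPI_Reduce(&in, &out, 1, MPI_2INT, MPI_MAXLOC, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("max = %d at index %d\n", out.value, out.index);

    MPI_Finalize();
    return 0;
}
```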

MPI: Tags

What role do tags play in point-to-point and collective communication?
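A small sketch may help frame the question (assuming at least two ranks): rank 0 sends two values with different tags, and rank 1 receives them by tag rather than by arrival order. Note that collective operations take no tag argument at all.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        int a = 1, b = 2;
        MPI_Send(&a, 1, MPI_INT, 1, 100, MPI_COMM_WORLD);  /* tag 100 */
        MPI_Send(&b, 1, MPI_INT, 1, 200, MPI_COMM_WORLD);  /* tag 200 */
    } else if (rank == 1) {
        int a, b;
        /* Receive in the opposite order: matching is by tag. (This
         * relies on small messages being buffered eagerly; strictly,
         * MPI_Send is allowed to block until a receive is posted.) */
        MPI_Recv(&b, 1, MPI_INT, 0, 200, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        MPI_Recv(&a, 1, MPI_INT, 0, 100, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf("a = %d, b = %d\n", a, b);
    }

    MPI_Finalize();
    return 0;
}
```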

MPI: Primitives & Derivatives

Show how you can implement your own version of MPI_Scatter using only MPI_Recv, MPI_Send, MPI_Comm_size, and MPI_Comm_rank.
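One possible shape of an answer is sketched below; the function name my_scatter and its signature are our own illustration, not part of MPI. The root sends chunk r (of `count` ints) to rank r and copies its own chunk locally; every other rank posts a matching receive.

```c
#include <mpi.h>
#include <stdio.h>
#include <string.h>

/* my_scatter: a hand-rolled, sketch-level scatter for int data. */
void my_scatter(const int *sendbuf, int count, int *recvbuf,
                int root, MPI_Comm comm) {
    int rank, size;
    MPI_Comm_rank(comm, &rank);
    MPI_Comm_size(comm, &size);

    if (rank == root) {
        for (int r = 0; r < size; ++r) {
            if (r == root)  /* root copies its own chunk locally */
                memcpy(recvbuf, sendbuf + (size_t)r * count,
                       count * sizeof(int));
            else
                MPI_Send(sendbuf + (size_t)r * count, count, MPI_INT,
                         r, 0, comm);
        }
    } else {
        MPI_Recv(recvbuf, count, MPI_INT, root, 0, comm,
                 MPI_STATUS_IGNORE);
    }
}

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Root prepares 2 values per rank; each rank receives 2 of them.
     * Fixed-size buffer assumes at most 64 ranks for this demo. */
    int data[2 * 64] = {0};
    if (rank == 0)
        for (int i = 0; i < 2 * size; ++i) data[i] = i;

    int chunk[2];
    my_scatter(data, 2, chunk, 0, MPI_COMM_WORLD);
    printf("rank %d got %d %d\n", rank, chunk[0], chunk[1]);

    MPI_Finalize();
    return 0;
}
```

A follow-up worth considering: unlike the real MPI_Scatter, this version serializes all sends through the root, which hints at why the library collective can do better.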


Remember to submit your answers in groups of two via Brightspace! If you cannot find a group on your own, please reach out and we will try to pair you up.