Minimising Synchronisation with Task-Aware Communication Libraries
Description: Implementing pure task-based programming on distributed-memory systems is very challenging. Instead, we propose a hybrid model that uses task-based programming inside a node and traditional message passing between nodes. To minimise synchronisation and expose as much parallelism as possible, experience has shown that communication, as well as computation, should be expressed as tasks. However, standard communication libraries are difficult to use in this way without risking deadlock, for example when all threads are executing tasks that contain blocking communication calls. We will describe task-aware versions of the MPI and GASPI libraries (called TAMPI and TAGASPI), which are integrated with the OmpSs and OpenMP runtimes. In TAMPI and TAGASPI, tasks blocked on communication calls are paused, freeing their executing threads to process other tasks until the communications complete and the paused tasks can be resumed. We will present some details of the implementations of these libraries and report on our experience porting a number of mini-apps to this hybrid model.
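To illustrate the pattern the abstract describes, the following is a minimal sketch (not code from the talk) of a halo-style exchange in which communication is expressed as OpenMP tasks alongside computation tasks. It uses only standard MPI and OpenMP calls; the buffer size, tags, and neighbour ranks are illustrative assumptions. With plain MPI, the blocking MPI_Recv inside a task can occupy a thread and, if all threads end up in such tasks, deadlock; under a task-aware library such as TAMPI (as characterised in the abstract), the blocked task would instead be paused and its thread freed to run other tasks.

```c
/* Hedged sketch: task-based communication pattern with standard MPI + OpenMP.
 * TAMPI-specific initialisation (e.g. a task-aware threading level) is not
 * shown; consult the TAMPI documentation for the exact interface. */
#include <mpi.h>
#include <omp.h>
#include <stdio.h>

#define N 1024  /* illustrative message size */

int main(int argc, char **argv)
{
    int provided;
    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    double send[N], recv[N];
    for (int i = 0; i < N; i++) send[i] = (double)rank;

    int right = (rank + 1) % size;        /* neighbour to send to */
    int left  = (rank - 1 + size) % size; /* neighbour to receive from */

    #pragma omp parallel
    #pragma omp single
    {
        /* Communication expressed as tasks; the data dependence on `recv`
         * ensures the compute task below runs only after the receive. */
        #pragma omp task depend(out: recv)
        MPI_Recv(recv, N, MPI_DOUBLE, left, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);

        #pragma omp task depend(in: send)
        MPI_Send(send, N, MPI_DOUBLE, right, 0, MPI_COMM_WORLD);

        /* Computation task consuming the received data. */
        #pragma omp task depend(in: recv)
        {
            double sum = 0.0;
            for (int i = 0; i < N; i++) sum += recv[i];
            printf("rank %d: sum of data from rank %d = %.1f\n",
                   rank, left, sum);
        }
    }

    MPI_Finalize();
    return 0;
}
```

If the receive task blocks the only available thread before the send task has been scheduled, this pattern can deadlock with plain MPI; that is precisely the situation the talk's task-aware libraries are designed to avoid by pausing the blocked task rather than the thread.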
Time: Monday, 5 July 2021, 16:00 - 16:30 CEST
Location: Mère Royaume
Event Type: Minisymposium
Domain: CS and Math