Modelling parallel metaheuristics and hyperheuristics for auto-tuning

José-Matías Cutillas-Lozano, Domingo Giménez Cánovas


This paper studies the auto-tuning of parallel metaheuristics and hyperheuristics. The modelling of the shared-memory scheme is considered for both types of algorithms, and a first study of message-passing metaheuristic schemes is introduced. A theoretical model of the execution time of a parametrized metaheuristic scheme is adapted empirically to a particular metaheuristic through experimentation. The shared-memory scheme is parallelized by independently parallelizing the basic functions of the metaheuristic scheme. The model is used to decide at running time the number of threads with which a reduced execution time is obtained. This number of threads differs between the basic functions of the scheme, and depends on the problem to be solved, the metaheuristic or hyperheuristic scheme, the implementation of the basic functions, and the computational system on which the problem is solved. The auto-tuning methodology for shared-memory parametrized metaheuristic schemes can in turn be applied to shared-memory hyperheuristics developed on top of them. For the message-passing scheme, an island model implemented with a master-slave scheme is used, and new metaheuristic-parallelism parameters are introduced to represent the migration frequency, the migration size and the number of processes. The applicability of the proposal is shown with the problem of minimizing electricity consumption in the exploitation of wells, and with the problem of obtaining satisfactory metaheuristics for that problem. Experimental results for these two problems show that satisfactory execution times can be achieved in metaheuristics with auto-tuning techniques based on theoretical-empirical models of the execution time.
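The abstract's auto-tuning idea, selecting a possibly different number of threads for each basic function from a theoretical-empirical model of the execution time, can be sketched as follows. The model form T(t) = s + p/t + o·t (a sequential part, a parallelizable part divided among t threads, and a per-thread overhead), the coefficient values, and the function names are illustrative assumptions, not the paper's exact model:

```python
def modelled_time(s, p, o, t):
    """Theoretical-empirical execution time with t threads:
    sequential part s, parallel part p shared by t threads,
    per-thread management overhead o (all hypothetical)."""
    return s + p / t + o * t

def best_threads(s, p, o, max_threads):
    """Select at running time the thread count minimizing the modelled time."""
    return min(range(1, max_threads + 1),
               key=lambda t: modelled_time(s, p, o, t))

# Hypothetical coefficients fitted empirically for each basic function
# of the metaheuristic scheme (initialize / improve / combine):
functions = {
    "initialize": (0.5, 8.0, 0.05),
    "improve":    (0.2, 20.0, 0.10),
    "combine":    (0.1, 4.0, 0.20),
}
# Each basic function gets its own thread count, as the abstract describes.
threads = {f: best_threads(s, p, o, 16) for f, (s, p, o) in functions.items()}
```

Because the coefficients differ per function, the selected thread counts differ too, which mirrors the abstract's observation that the best number of threads depends on the basic function, the problem and the system.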


Keywords: Parametrized metaheuristic schemes; Parallel metaheuristics; Parallel hyperheuristics; Shared-memory; Message-passing; Auto-tuning



This work is licensed under a Creative Commons Attribution 3.0 License.