moocore: Core Algorithms for Multi-Objective Optimization
Version: 0.1.9.dev0 (What’s new)
Date: Aug 28, 2025
Useful links: Install | Source Repository | Issue Tracker
This webpage documents the moocore Python package. There is also a moocore R package.
The goal of the moocore project (multi-objective/moocore) is to collect fast implementations of core mathematical functions and algorithms for multi-objective optimization and make them available to different programming languages via similar interfaces. These functions include:
Quality metrics such as (weighted) hypervolume, epsilon, IGD+, etc.
Computation of the Empirical Attainment Function. The empirical attainment function (EAF) describes the probabilistic distribution of the outcomes obtained by a stochastic algorithm in the objective space.
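To make the EAF concrete, here is a minimal, illustrative sketch (not moocore's C implementation): for minimization, the EAF value at a point z is the fraction of independent runs whose output set contains at least one point weakly dominating z. The runs and points below are invented for illustration.

```python
# Illustrative sketch of the empirical attainment function (EAF),
# assuming minimization. This is NOT moocore's implementation, which
# is written in C and far more efficient.

def weakly_dominates(a, b):
    """True if point a is no worse than point b in every objective."""
    return all(ai <= bi for ai, bi in zip(a, b))

def eaf_value(runs, z):
    """Fraction of runs with at least one point weakly dominating z."""
    attained = sum(
        any(weakly_dominates(p, z) for p in run) for run in runs
    )
    return attained / len(runs)

# Three runs of a hypothetical stochastic bi-objective optimizer.
runs = [
    [(1.0, 4.0), (2.0, 2.0)],
    [(3.0, 3.0)],
    [(0.5, 5.0), (4.0, 1.0)],
]
print(eaf_value(runs, (2.0, 2.0)))  # attained by the first run only
```

Evaluating this function over a grid of points in the objective space yields the attainment surfaces that moocore computes efficiently.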
Most critical functionality is implemented in C, with the R and Python packages providing convenient interfaces to the C code.
Keywords: empirical attainment function, summary attainment surfaces, EAF differences, multi-objective optimization, bi-objective optimization, performance measures, performance assessment
The reference guide contains a detailed description of the functions, modules, and objects.
Detailed examples and tutorials.
Benchmarks
The following plots compare the performance of moocore, pymoo, BoTorch, and jMetalPy. Other optimization packages are not included in the comparison because they rely on one of these packages for the functionality benchmarked here, so they are at least as slow. For example, Xopt uses BoTorch, pysamoo is an extension of pymoo, DESDEO already uses moocore for hypervolume and other quality metrics, and most of the multi-objective functionality of DEAP is shared with pymoo. We do not compare with the Bayesian optimization toolbox trieste because it is much slower than BoTorch and too slow to run the benchmarks in a reasonable time.
Not all packages provide the same functionality. For example, pymoo does not provide the epsilon indicator whereas jMetalPy does not provide the IGD+ indicator. BoTorch provides neither of them.
The source code for the benchmarks below can be found at multi-objective/moocore.
Exact computation of hypervolume
The following plots compare the speed of computing the hypervolume indicator in 3D, 4D, 5D and 6D. As the plots show, moocore is 100 times faster than the other packages and 1000 times faster than BoTorch and, by extension, Xopt. BoTorch is not included for more than 4 objectives because it is tens of thousands of times slower than moocore.
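For readers unfamiliar with the indicator, here is a minimal 2D sketch of what is being computed (minimization): the hypervolume of a nondominated set is the area it dominates up to a reference point. This is the textbook sweep, not moocore's C implementation, and the front below is invented for illustration.

```python
# Exact 2D hypervolume (minimization) by a simple sweep.
# Assumes `points` is a nondominated set; sorting ascending on the
# first objective then makes the second objective descending.

def hypervolume_2d(points, ref):
    """Area dominated by `points` and bounded above by `ref`."""
    total = 0.0
    prev_f2 = ref[1]
    for f1, f2 in sorted(points):
        total += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return total

front = [(1.0, 3.0), (2.0, 2.0), (3.0, 1.0)]
print(hypervolume_2d(front, ref=(4.0, 4.0)))  # 6.0
```

In higher dimensions the problem becomes much harder, which is where moocore's optimized C algorithms pay off.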
Approximation of the hypervolume
The following plots compare the accuracy and speed of approximating the hypervolume with the various methods provided by moocore.hv_approx(). The plots show that method DZ2019-HW consistently produces the lowest approximation error, but it is also slower than method DZ2019-MC. When the number of points increases, both methods are significantly faster than pymoo.
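To illustrate the idea behind sampling-based approximation, here is a hedged Monte-Carlo sketch: sample uniformly in the box between a lower corner and the reference point and count the fraction of samples dominated by the front. The actual estimators behind DZ2019-MC and DZ2019-HW are more sophisticated (they sample directions rather than points); the function and its arguments below are illustrative only.

```python
# Naive Monte-Carlo hypervolume approximation (minimization).
# Illustrative only; moocore.hv_approx() uses different, better
# estimators. `lower` is an assumed lower corner of the sampling box.
import random

def weakly_dominates(a, b):
    return all(ai <= bi for ai, bi in zip(a, b))

def hv_mc(points, ref, lower, n_samples=100_000, seed=42):
    rng = random.Random(seed)
    dim = len(ref)
    hits = 0
    for _ in range(n_samples):
        z = tuple(rng.uniform(lower[d], ref[d]) for d in range(dim))
        if any(weakly_dominates(p, z) for p in points):
            hits += 1
    box_volume = 1.0
    for d in range(dim):
        box_volume *= ref[d] - lower[d]
    return box_volume * hits / n_samples

front = [(1.0, 3.0), (2.0, 2.0), (3.0, 1.0)]
est = hv_mc(front, ref=(4.0, 4.0), lower=(0.0, 0.0))
print(est)  # close to the exact value 6.0
```

The error of such estimators shrinks as the number of samples grows, trading accuracy for speed, which is exactly the trade-off the plots above measure.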
Identifying nondominated points
The following plots compare the speed of finding nondominated solutions, equivalent to moocore.is_nondominated(), in 2D and 3D. We test both keep_weakly=True and keep_weakly=False (the latter is not supported by pymoo nor DESDEO). Interestingly, DESDEO is significantly faster than pymoo, despite the former using a naive Python implementation jit-compiled by Numba and the latter using CPython. Nevertheless, as the plots show, moocore is 10 times faster than DESDEO and 100 times faster than the other packages.
Epsilon and IGD+ indicators
The following plots compare the speed of computing the epsilon indicator and the IGD+ indicator. Although the algorithms for computing these metrics are relatively simple and easy to vectorize in Python, the moocore implementation is still 10 to 100 times faster.
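To show why these metrics are simple to express, here are pure-Python sketches following the standard definitions of the additive epsilon indicator and IGD+ for minimization. The function and argument names are illustrative, not moocore's exact signatures, and the sets below are invented.

```python
# Standard definitions (minimization), written naively for clarity.
# moocore computes the same quantities in optimized C.

def epsilon_additive(data, ref_set):
    """Smallest eps such that data, shifted by -eps in every
    objective, weakly dominates every point of ref_set."""
    return max(
        min(max(a - r for a, r in zip(point, ref_point)) for point in data)
        for ref_point in ref_set
    )

def igd_plus(data, ref_set):
    """Average over ref_set of the modified distance d+ to data,
    which only counts objectives where data is worse."""
    def dplus(point, ref_point):
        return sum(max(a - r, 0.0) ** 2
                   for a, r in zip(point, ref_point)) ** 0.5
    return sum(
        min(dplus(point, rp) for point in data) for rp in ref_set
    ) / len(ref_set)

ref_set = [(1.0, 2.0), (2.0, 1.0)]
data = [(1.5, 2.5), (2.5, 1.5)]
print(epsilon_additive(data, ref_set))  # 0.5
print(igd_plus(data, ref_set))          # ~0.7071
```

Both are simple nested min/max loops, which vectorize easily, yet as the plots show, a compiled implementation still wins by one to two orders of magnitude.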