A GPU-Enabled Compact Genetic Algorithm for Very Large-Scale Optimization Problems

The ever-increasing complexity of industrial and engineering problems nowadays poses a number of optimization problems characterized by thousands, if not millions, of variables. For instance, very large-scale problems can be found in chemical and material engineering, networked systems, logistics, and scheduling. Recently, Deb and Myburgh proposed an evolutionary algorithm capable of handling a scheduling optimization problem with a staggering number of variables: one billion. However, one important limitation of this algorithm is its memory consumption, which is on the order of 120 GB. Here, we follow up on this research by applying to the same problem a GPU-enabled “compact” Genetic Algorithm, i.e., an Estimation of Distribution Algorithm that, instead of using an actual population of candidate solutions, only requires and adapts a probabilistic model of their distribution in the search space. We also introduce a smart initialization technique and custom operators to guide the search towards feasible solutions. Leveraging the compact optimization concept, we show how such an algorithm can efficiently optimize very large-scale problems with millions of variables, with limited memory and processing power. To complete our analysis, we report the results of the algorithm on very large-scale instances of the OneMax problem.
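The abstract above describes a compact Genetic Algorithm (cGA): an Estimation of Distribution Algorithm that replaces the population with a probability vector, one entry per decision variable. The following is a minimal CPU-only sketch of that concept on the OneMax problem mentioned in the abstract; it is an illustration, not the authors' GPU implementation, and the function name, the virtual population size, and the evaluation budget are illustrative choices:

```python
import random

def compact_ga_onemax(n=64, virtual_pop=50, max_evals=20000, seed=0):
    """Compact GA sketch on OneMax: instead of storing a population,
    keep only a probability vector p, where p[i] = P(bit i == 1)."""
    rng = random.Random(seed)
    p = [0.5] * n                  # probabilistic model of the "population"
    step = 1.0 / virtual_pop       # update step mimicking a population of that size

    def sample():
        # Draw one candidate solution from the current model
        return [1 if rng.random() < pi else 0 for pi in p]

    best = None
    for _ in range(max_evals // 2):
        a, b = sample(), sample()
        fa, fb = sum(a), sum(b)    # OneMax fitness: number of ones
        winner, loser = (a, b) if fa >= fb else (b, a)
        # Shift the model towards the winner on every bit where they disagree
        for i in range(n):
            if winner[i] != loser[i]:
                p[i] += step if winner[i] == 1 else -step
                p[i] = min(1.0, max(0.0, p[i]))
        if best is None or sum(winner) > sum(best):
            best = winner
    return best

best = compact_ga_onemax()
print(sum(best))
```

The memory footprint is O(n) floats, one probability per variable, rather than O(population × n) bits; this is the property that lets compact optimization scale to millions (or, with the paper's GPU parallelization, a billion) of variables.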


Bibliographic Details
Main Authors: Andrea Ferigo, Giovanni Iacca
Format: Article
Language: English
Published: MDPI AG 2020-05-01
Series: Mathematics
Subjects: compact optimization; discrete optimization; large-scale optimization; one billion variables; evolutionary algorithms; estimation of distribution algorithms
Online Access: https://www.mdpi.com/2227-7390/8/5/758
id doaj-52d30e5340f044fb961271812168ad10
record_format Article
spelling doaj-52d30e5340f044fb961271812168ad10
2020-11-25T02:04:34Z
eng
MDPI AG
Mathematics, ISSN 2227-7390
2020-05-01, vol. 8, no. 5, article 758
10.3390/math8050758
A GPU-Enabled Compact Genetic Algorithm for Very Large-Scale Optimization Problems
Andrea Ferigo
Giovanni Iacca
Department of Information Engineering and Computer Science, University of Trento, 38122 Trento, Italy
https://www.mdpi.com/2227-7390/8/5/758
compact optimization
discrete optimization
large-scale optimization
one billion variables
evolutionary algorithms
estimation of distribution algorithms
collection DOAJ
language English
format Article
sources DOAJ
author Andrea Ferigo
Giovanni Iacca
spellingShingle Andrea Ferigo
Giovanni Iacca
A GPU-Enabled Compact Genetic Algorithm for Very Large-Scale Optimization Problems
Mathematics
compact optimization
discrete optimization
large-scale optimization
one billion variables
evolutionary algorithms
estimation of distribution algorithms
author_facet Andrea Ferigo
Giovanni Iacca
author_sort Andrea Ferigo
title A GPU-Enabled Compact Genetic Algorithm for Very Large-Scale Optimization Problems
title_short A GPU-Enabled Compact Genetic Algorithm for Very Large-Scale Optimization Problems
title_full A GPU-Enabled Compact Genetic Algorithm for Very Large-Scale Optimization Problems
title_fullStr A GPU-Enabled Compact Genetic Algorithm for Very Large-Scale Optimization Problems
title_full_unstemmed A GPU-Enabled Compact Genetic Algorithm for Very Large-Scale Optimization Problems
title_sort gpu-enabled compact genetic algorithm for very large-scale optimization problems
publisher MDPI AG
series Mathematics
issn 2227-7390
publishDate 2020-05-01
description The ever-increasing complexity of industrial and engineering problems nowadays poses a number of optimization problems characterized by thousands, if not millions, of variables. For instance, very large-scale problems can be found in chemical and material engineering, networked systems, logistics, and scheduling. Recently, Deb and Myburgh proposed an evolutionary algorithm capable of handling a scheduling optimization problem with a staggering number of variables: one billion. However, one important limitation of this algorithm is its memory consumption, which is on the order of 120 GB. Here, we follow up on this research by applying to the same problem a GPU-enabled “compact” Genetic Algorithm, i.e., an Estimation of Distribution Algorithm that, instead of using an actual population of candidate solutions, only requires and adapts a probabilistic model of their distribution in the search space. We also introduce a smart initialization technique and custom operators to guide the search towards feasible solutions. Leveraging the compact optimization concept, we show how such an algorithm can efficiently optimize very large-scale problems with millions of variables, with limited memory and processing power. To complete our analysis, we report the results of the algorithm on very large-scale instances of the OneMax problem.
topic compact optimization
discrete optimization
large-scale optimization
one billion variables
evolutionary algorithms
estimation of distribution algorithms
url https://www.mdpi.com/2227-7390/8/5/758
work_keys_str_mv AT andreaferigo agpuenabledcompactgeneticalgorithmforverylargescaleoptimizationproblems
AT giovanniiacca agpuenabledcompactgeneticalgorithmforverylargescaleoptimizationproblems
AT andreaferigo gpuenabledcompactgeneticalgorithmforverylargescaleoptimizationproblems
AT giovanniiacca gpuenabledcompactgeneticalgorithmforverylargescaleoptimizationproblems
_version_ 1724942514558861312