Authors: Rahnamayan, Shahryar; Makrehchi, Masoud; Zanjani Miyandoab, Sevil
Dates: 2024-06-25; 2024-06-25; 2024-04-01
URI: https://hdl.handle.net/10155/1804

Feature selection is a persistent challenge aimed at minimizing the number of features while maximizing the accuracy of classification, or of any other machine learning and data mining task, thereby mitigating the curse of dimensionality. We frame feature selection as a multi-objective binary optimization task with the objectives of enhancing accuracy and reducing the feature count. Feature selection, like binary optimization problems in general, is NP-hard: the size of the search space grows exponentially with the number of features, which makes the problem especially challenging in high-dimensional spaces. We propose three innovative approaches to tackle large-scale multi-objective feature selection. The first technique augments the diversity of the population in the well-established multi-objective genetic algorithm NSGA-II by substituting, in each generation, the worst individuals with new randomly generated individuals that have a limited number of features. As the second method, a binary compact NSGA-II (CNSGA-II) algorithm is introduced for feature selection for the first time; it represents the population as a probability distribution, which is not only more memory-efficient but also accelerates finding better candidate solutions. Additionally, we present a novel binary multi-objective coordinate search (MOCS) algorithm which, to the best of our knowledge, is the first of its kind, and demonstrate its effectiveness in solving multi-objective binary optimization problems. A comparative study of the proposed methods and NSGA-II shows the promising performance of our methods on feature selection tasks involving large-scale datasets, surpassing the renowned NSGA-II algorithm.
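The diversity-injection step of the first method can be sketched as follows. This is a hypothetical illustration, not the thesis implementation: the function name, the scalar `rank` comparator (the thesis ranks individuals by Pareto dominance within NSGA-II), and all parameter names are assumptions.

```python
import random

def inject_sparse_randoms(population, rank, n_replace, n_features, max_active):
    """Replace the n_replace worst individuals with random binary
    individuals that activate at most max_active features.

    Hypothetical sketch: `rank` maps an individual to a scalar where
    lower is better; the thesis instead uses NSGA-II's Pareto-based
    ranking over (accuracy, feature count)."""
    # Sort worst-first so the individuals to be replaced come first.
    population.sort(key=rank, reverse=True)
    for i in range(n_replace):
        individual = [0] * n_features
        # Activate only a small random subset of features,
        # biasing the new individuals toward sparse solutions.
        k = random.randint(1, max_active)
        for j in random.sample(range(n_features), k):
            individual[j] = 1
        population[i] = individual
    return population
```

Injecting sparse individuals each generation counteracts the tendency of the population to converge on dense feature subsets in high-dimensional problems.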
These methods are not limited to feature selection but can be applied to various binary optimization domains, including the Knapsack problem and training binary deep neural networks.

Language: en
Keywords: Large-scale feature selection; Multi-objective feature selection; Multi-objective optimization; NSGA-II; Multi-objective coordinate search
Title: New frontiers in population-based multi-objective feature selection
Type: Thesis
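The compact representation behind CNSGA-II follows the general compact genetic algorithm idea: instead of storing a population, the algorithm keeps a probability vector, samples candidates from it, and shifts the vector toward the winner of each comparison. The single-objective sketch below is a hypothetical illustration under that assumption; the thesis's CNSGA-II compares candidates by Pareto dominance over both objectives rather than the scalar `evaluate` used here, and all names are invented for this example.

```python
import random

def compact_ga_step(prob, evaluate, step=0.05):
    """One step of a binary compact GA (single-objective sketch).

    `prob[i]` is the probability that bit i is 1; `evaluate` scores a
    candidate (lower is better). The thesis's CNSGA-II replaces this
    scalar comparison with Pareto-based selection."""
    # Sample two candidate solutions from the probability vector.
    a = [1 if random.random() < p else 0 for p in prob]
    b = [1 if random.random() < p else 0 for p in prob]
    winner, loser = (a, b) if evaluate(a) <= evaluate(b) else (b, a)
    # Shift probabilities toward the winner's bits, but only at
    # positions where the two candidates disagree.
    for i, (w, l) in enumerate(zip(winner, loser)):
        if w != l:
            prob[i] = min(1.0, max(0.0, prob[i] + (step if w else -step)))
    return prob, winner
```

Because only the probability vector is stored, memory use is linear in the number of features regardless of the (implicit) population size, which is what makes the compact scheme attractive for large-scale feature selection.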