MO-NAS: Multi-Objective Neural Architecture Search Using NSGA-II


Authors: Rayapudi Gautam Kumar

Abstract: This paper presents MO-NAS, a production-ready framework for automatically discovering optimal neural network architectures using the Non-dominated Sorting Genetic Algorithm II (NSGA-II) multi-objective optimization approach. Unlike traditional Neural Architecture Search (NAS) methods that optimize for a single objective (typically accuracy), MO-NAS simultaneously optimizes multiple competing objectives, including accuracy, computational cost (FLOPs), model size (parameters), inference latency, and memory footprint. The framework supports multiple data modalities, including image, text, sequence, and tabular data, making it a universal NAS solution. We incorporate advanced techniques such as zero-cost proxies for rapid evaluation, Bayesian guidance for search efficiency, and weight sharing to reduce training costs. Our approach produces a Pareto-optimal front of architectures, allowing practitioners to select the best trade-off for their specific deployment constraints.
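To illustrate the central idea of a Pareto-optimal front of architectures, the following minimal sketch (not the authors' code; the candidate architectures and their scores are made up for illustration) filters a set of candidates down to the non-dominated subset, as the first front of NSGA-II's non-dominated sorting would. Here only two of the paper's objectives are used: accuracy (maximized) and FLOPs (minimized).

```python
# Illustrative sketch of Pareto-front extraction over candidate
# architectures; hypothetical data, two objectives: accuracy up, FLOPs down.

def dominates(a, b):
    """True if architecture a dominates b: no worse on every objective
    and strictly better on at least one."""
    no_worse = a["acc"] >= b["acc"] and a["flops"] <= b["flops"]
    strictly_better = a["acc"] > b["acc"] or a["flops"] < b["flops"]
    return no_worse and strictly_better

def pareto_front(archs):
    """Return the non-dominated subset (the first NSGA-II front)."""
    return [a for a in archs if not any(dominates(b, a) for b in archs)]

# Hypothetical candidate architectures from a search run.
candidates = [
    {"name": "A", "acc": 0.92, "flops": 600},  # accurate but costly
    {"name": "B", "acc": 0.90, "flops": 300},  # balanced trade-off
    {"name": "C", "acc": 0.88, "flops": 350},  # dominated by B
    {"name": "D", "acc": 0.85, "flops": 150},  # cheap, less accurate
]

front = pareto_front(candidates)
print([a["name"] for a in front])  # → ['A', 'B', 'D']
```

A practitioner then picks one point on this front to match deployment constraints, e.g. the cheapest architecture whose accuracy clears a target threshold.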

