Kernel-Level Energy-Efficient Neural Architecture Search for Tabular Dataset

Bibliographic Details
Title: Kernel-Level Energy-Efficient Neural Architecture Search for Tabular Dataset
Authors: La, Hoang-Loc; Ha, Phuong Hoai
Publication Year: 2025
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning, Computer Science - Artificial Intelligence
More Details: Many studies estimate energy consumption through proxy metrics such as memory usage, FLOPs, and inference latency, on the assumption that reducing these metrics will also lower a neural network's energy consumption. This paper instead introduces an energy-efficient Neural Architecture Search (NAS) method that directly searches for architectures minimizing energy consumption while maintaining acceptable accuracy. Unlike previous methods, which primarily target vision and language tasks, the proposed approach specifically addresses tabular datasets. Remarkably, the optimal architecture suggested by this method can reduce energy consumption by up to 92% compared to architectures recommended by conventional NAS.
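The record describes the method only at a high level, so the following is a hypothetical sketch (not the paper's actual algorithm) of what an energy-aware NAS loop for tabular models could look like: a random search over small MLP configurations that keeps the lowest-energy candidate whose accuracy stays above a floor. The measure_energy_joules and evaluate_accuracy hooks are assumed placeholders; a real implementation would read hardware energy counters (e.g., RAPL or NVML) around the candidate's kernels and train/validate on the actual tabular task.

```python
# Hypothetical sketch of energy-aware NAS for tabular data.
# This is an illustrative assumption, not the method from the paper.
import random

def sample_architecture():
    """Sample a candidate MLP config from a small tabular-friendly space."""
    return {
        "num_layers": random.choice([1, 2, 3, 4]),
        "hidden_units": random.choice([16, 32, 64, 128, 256]),
        "activation": random.choice(["relu", "gelu"]),
    }

def measure_energy_joules(arch):
    """Placeholder for a direct, kernel-level energy measurement.
    A real implementation would read hardware counters (e.g., RAPL/NVML)
    around the candidate's kernels; this stub only makes the sketch run."""
    return arch["num_layers"] * arch["hidden_units"] * 0.01

def evaluate_accuracy(arch):
    """Placeholder for training/validating the candidate on the tabular
    task; a real run would return held-out accuracy."""
    return 0.80 + 0.0001 * arch["hidden_units"]

def energy_aware_search(budget=50, accuracy_floor=0.82):
    """Keep the lowest-measured-energy architecture that meets the
    accuracy constraint, instead of optimizing proxies like FLOPs."""
    best = None  # (arch, energy_joules, accuracy)
    for _ in range(budget):
        arch = sample_architecture()
        acc = evaluate_accuracy(arch)
        if acc < accuracy_floor:
            continue  # reject candidates that fail the accuracy floor
        energy = measure_energy_joules(arch)
        if best is None or energy < best[1]:
            best = (arch, energy, acc)
    return best

if __name__ == "__main__":
    result = energy_aware_search()
    if result:
        arch, energy, acc = result
        print(f"lowest-energy candidate: {arch}, {energy:.2f} J, acc={acc:.3f}")
```

The design choice this sketch illustrates is the one the abstract emphasizes: energy is the objective being minimized directly, with accuracy demoted to a constraint, rather than minimizing proxy metrics and hoping energy falls as a side effect.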
Comment: ACIIDS 2025 Conference
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2504.08359
Accession Number: edsarx.2504.08359
Database: arXiv