Optimizing Program Efficiency by Predicting Loop Unroll Factors Using Ensemble Learning
Abstract:
Loop unrolling is a well-known code transformation that can improve program efficiency at runtime. Its fundamental advantage is that the unrolled loop frequently executes in less time than the original loop. Choosing a large unroll factor may initially reduce execution time by lowering loop overhead and improving parallelism, but excessive unrolling can increase cache misses, register pressure, and memory inefficiency, ultimately slowing the program down. Identifying the optimal unroll factor is therefore essential. This paper introduces three ensemble-learning techniques (XGBoost, Random Forest (RF), and Bagging) for predicting the efficient unroll factor for specific programs. The dataset comprises programs drawn from the Polybench and Shootout benchmark suites, along with other programs. More than 220 examples, taken from 20 benchmark programs with different loop iteration counts, were used to train the three ensemble-learning methods. For each program, the unroll factor yielding the largest reduction in execution time is recorded in the dataset and ultimately serves as the candidate prediction for unseen programs. Our empirical results show that the XGBoost and RF methods outperform the Bagging algorithm, achieving a final accuracy of 99.56% in identifying the optimal unroll factor.
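The sketch below illustrates, under stated assumptions, how the three ensemble models named in the abstract could be trained to predict a loop's best unroll factor. The loop features, the synthetic data, and the candidate factor set {1, 2, 4, 8} are purely illustrative assumptions, not the paper's actual dataset, which is built from Polybench, Shootout, and other benchmark programs.

```python
# Minimal sketch (not the authors' exact pipeline): train XGBoost, Random
# Forest, and Bagging classifiers to predict a loop's best unroll factor.
# All features and data here are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

rng = np.random.default_rng(0)

# Hypothetical loop features: trip count, body size (instructions),
# number of array accesses, and a loop-carried-dependence flag.
X = rng.integers(1, 1000, size=(220, 4)).astype(float)

# Candidate unroll factors treated as class labels (assumed set: 1, 2, 4, 8).
factors = np.array([1, 2, 4, 8])
y = rng.integers(0, len(factors), size=220)  # index of the best factor

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

models = {
    "XGBoost": XGBClassifier(n_estimators=100, eval_metric="mlogloss"),
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "Bagging": BaggingClassifier(n_estimators=100, random_state=0),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: accuracy = {acc:.3f}")

# Predicting the unroll factor for one unseen loop:
best = models["XGBoost"].predict(X_test[:1])[0]
print("Predicted unroll factor:", factors[best])
```

In the paper's setting, the label for each training example is the unroll factor that produced the largest measured reduction in execution time; the random labels above merely stand in for those measurements.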