An input-aware ensemble learning method, built on a multi-output Random Forest, for dynamic memory configuration of serverless functions. Achieves 57–87% run-time cost savings and 54–82% memory allocation reduction versus static baselines on AWS Lambda. Evaluated against COSE, Parrotfish, and AWS Power Tuning.
57–87%
Run-time cost savings
54–82%
Memory allocation reduction
R² = 0.98
Execution time prediction accuracy
R² = 0.96
Memory utilisation prediction accuracy
Evaluated against COSE, Parrotfish, and AWS Power Tuning across diverse serverless function benchmarks.
Profiles serverless functions across diverse input sizes to build a labelled dataset of input characteristics, execution times, and memory consumption. Data stored in AWS DynamoDB and S3.
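The profiling step can be sketched as a small wrapper that runs a function on inputs of varying size and records input characteristics, execution time, and peak memory as labelled rows. In the paper's pipeline these records are persisted to DynamoDB and S3; this sketch just collects them in a list, and the toy `workload` function and record fields are illustrative assumptions.

```python
import time
import tracemalloc

def profile(fn, payload):
    """Run fn(payload) once and return a labelled profiling record."""
    tracemalloc.start()
    t0 = time.perf_counter()
    fn(payload)
    elapsed_ms = (time.perf_counter() - t0) * 1000
    _, peak_bytes = tracemalloc.get_traced_memory()  # (current, peak)
    tracemalloc.stop()
    return {
        "input_size": len(payload),          # input characteristic
        "exec_ms": elapsed_ms,               # label 1: execution time
        "peak_kb": peak_bytes / 1024,        # label 2: memory consumption
    }

def workload(payload):
    # Toy stand-in for a serverless function body.
    return sorted(payload)

# Profile across diverse input sizes to build the labelled dataset.
dataset = [profile(workload, list(range(n))) for n in (1_000, 10_000, 100_000)]
```

In a real deployment each record would additionally carry the function name and configured memory size, and would be written to DynamoDB (and raw payloads to S3) instead of an in-memory list.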
Multi-output Random Forest Regressor learns input-to-resource correlations, jointly predicting execution time and memory utilisation for any unseen input payload.
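A minimal sketch of the modelling step, using scikit-learn: `RandomForestRegressor` handles multi-output targets natively, so one model jointly predicts execution time and memory utilisation from input features. The synthetic payload-size feature and the linear relationships below stand in for the profiled dataset and are assumptions, not the paper's exact setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# One input characteristic: payload size in KB, 500 profiled invocations.
payload_kb = rng.uniform(1, 1000, size=(500, 1))

# Synthetic labels: execution time (ms) and memory use (MB) grow with input size.
exec_ms = 50 + 2.0 * payload_kb[:, 0] + rng.normal(0, 5, 500)
mem_mb = 128 + 0.5 * payload_kb[:, 0] + rng.normal(0, 2, 500)
y = np.column_stack([exec_ms, mem_mb])  # shape (500, 2): two outputs per sample

# A single forest is fit on both targets at once (multi-output regression).
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(payload_kb, y)

# Jointly predict both resource metrics for an unseen input payload.
pred_ms, pred_mb = model.predict([[400.0]])[0]
```

The joint model lets the optimiser query one prediction per candidate input rather than maintaining two separately trained regressors.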
Given model predictions, solves a constraint optimisation problem to select the minimum-cost memory configuration that satisfies execution time and SLA constraints.
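The selection step can be sketched as a search over candidate memory tiers: given a predicted execution time for each tier, keep only tiers that meet the SLA deadline and return the one with the lowest GB-second cost. The price constant and the per-tier prediction table below are illustrative, not the paper's values.

```python
GB_SECOND_PRICE = 0.0000166667  # USD per GB-second (indicative x86 Lambda rate)

def cheapest_config(predicted_ms_by_memory, sla_ms):
    """predicted_ms_by_memory: {memory_mb: predicted execution time in ms}.

    Returns (memory_mb, cost) for the minimum-cost configuration that
    satisfies the SLA, or None if no candidate meets the deadline.
    """
    best = None
    for memory_mb, duration_ms in predicted_ms_by_memory.items():
        if duration_ms > sla_ms:
            continue  # violates the execution-time / SLA constraint
        cost = (memory_mb / 1024) * (duration_ms / 1000) * GB_SECOND_PRICE
        if best is None or cost < best[1]:
            best = (memory_mb, cost)
    return best

# Hypothetical per-tier predictions for one input payload: more memory means
# more CPU on Lambda, so predicted duration shrinks as the tier grows.
preds = {512: 4200, 1024: 2100, 2048: 1150, 3072: 900}
choice = cheapest_config(preds, sla_ms=3000)
```

With a 3000 ms SLA, the 512 MB tier is infeasible and 1024 MB wins on cost even though the larger tiers finish faster, which is the trade-off the constraint formulation captures.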
Reconfigures AWS Lambda memory allocation per invocation via the AWS API before function execution. Periodic model retraining via AWS Step Functions maintains accuracy under workload drift.
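The per-invocation reconfiguration can be sketched as turning a memory-utilisation prediction into a valid Lambda memory setting; AWS accepts 128–10240 MB. The 20% headroom is an illustrative safety margin, not the paper's value, and the actual API call is shown only in a comment.

```python
def target_memory_mb(predicted_mb, headroom=0.2):
    """Add a safety margin to the predicted peak memory, then clamp to
    the range AWS Lambda accepts for MemorySize (128-10240 MB)."""
    wanted = round(predicted_mb * (1 + headroom))
    return max(128, min(10240, wanted))

# Applying the setting before the invocation would use the AWS SDK, e.g.
# boto3.client("lambda").update_function_configuration(
#     FunctionName="my-function", MemorySize=target_memory_mb(pred))
```

Clamping matters at both ends: small predictions fall back to Lambda's 128 MB floor, and predictions whose headroom overshoots 10240 MB are capped at the platform maximum.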
Matrix multiplication, linear algebra (Linpack), and cryptographic operations (pyaes). Up to 73% resource savings.
Minimum spanning tree, breadth-first search, and PageRank. Up to 87% cost efficiency.
Dynamic HTML generation and template rendering workloads with significant performance improvements.
Input-Based Ensemble-Learning Method for Dynamic Memory Configuration of Serverless Computing Functions
Agarwal, S. et al. · IEEE CS Press · Sharjah, UAE · DOI: 10.1109/UCC62667.2024
View on IEEE Xplore