In the ever-evolving landscape of cloud computing, AWS Lambda has emerged as a game-changer, enabling developers to build and deploy serverless applications with ease. With its robust features and scalable architecture, Lambda has become an indispensable tool for businesses seeking to optimize performance, reduce costs, and enhance productivity.
One of the key use cases for AWS Lambda revolves around calling DynamoDB, Amazon’s fully managed NoSQL database service. This powerful combination allows developers to create efficient, event-driven architectures by leveraging Lambda’s event-driven capabilities and DynamoDB’s seamless scalability. Whether it’s processing real-time data streams, automating data transformations, or executing complex business logic, AWS Lambda and DynamoDB provide a winning solution.
AWS Lambda pricing
When it comes to pricing, AWS Lambda offers a unique model that aligns with the pay-as-you-go philosophy. With Lambda, you pay only for the compute time your functions actually consume and the memory resources you assign for that time. This granular pricing structure lets you optimize costs by paying only for the execution time your code requires, making it an attractive choice for organizations of all sizes.
Lambda for Java developers
As a Java developer, you might be wondering how Java performs in comparison to GraalVM native image builds and Rust. Why did we choose GraalVM and Rust as benchmark competitors? Well, GraalVM offers the advantage of allowing you to continue using the Java language while compiling it into native code. On the other hand, Rust is a powerful language with its own syntax, known for its exceptionally low memory consumption and fast execution times.
In the context of AWS Lambda pricing and high-load use cases, the combination of memory consumption and execution time plays a crucial role. Let's assume you have a Lambda function configured with 256 MB that runs for 10 ms per invocation. You can reduce the costs by 50% if you manage to run your function within 5 ms per invocation, but this is usually not the easiest way to achieve savings. Alternatively, you can reduce the costs by 50% if you manage to run your function with 128 MB at 10 ms, which is usually the easier way to go.
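To make this tradeoff concrete, here is a minimal sketch of the cost arithmetic. The per-GB-second price used below is an illustrative assumption, not necessarily the current AWS rate; the point is that compute cost scales linearly with both allocated memory and billed duration, so halving either one halves the cost:

```java
public class LambdaCostSketch {
    // Illustrative price per GB-second; check the AWS Lambda pricing page for current rates.
    static final double PRICE_PER_GB_SECOND = 0.0000166667;

    // Cost per invocation = allocated memory (GB) x billed duration (s) x price.
    static double costPerInvocation(int memoryMb, double durationMs) {
        return (memoryMb / 1024.0) * (durationMs / 1000.0) * PRICE_PER_GB_SECOND;
    }

    public static void main(String[] args) {
        double baseline   = costPerInvocation(256, 10); // 256 MB, 10 ms
        double fasterCode = costPerInvocation(256, 5);  // same memory, half the duration
        double lessMemory = costPerInvocation(128, 10); // half the memory, same duration
        System.out.printf("baseline: %.12f%n", baseline);
        System.out.printf("faster:   %.12f (%.0f%% of baseline)%n",
                fasterCode, 100 * fasterCode / baseline);
        System.out.printf("smaller:  %.12f (%.0f%% of baseline)%n",
                lessMemory, 100 * lessMemory / baseline);
    }
}
```

Both optimizations land at 50% of the baseline cost; shrinking the memory allocation usually requires less engineering effort than halving the duration.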
By optimizing these factors, you can effectively control the costs associated with AWS Lambda. That’s why we’ve selected GraalVM and Rust as benchmark competitors, as they both have features that can potentially address the performance and cost considerations for such use cases.
We have developed Lambda functions in Rust, Java, and GraalVM that interact with a DynamoDB table. To conduct a load test, we have created a dedicated Lambda function named "Load Test Lambda" that invokes each of these functions 3000 times. We leverage CloudWatch Insights to gather the execution times for each invocation, enabling us to analyze and compare the results effectively.
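The driver pattern behind such a load test can be sketched as follows. This is not our actual implementation: the `invokeFunction` method is a stub standing in for the AWS SDK's Lambda Invoke call, and the function names are placeholders:

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class LoadTestSketch {
    static final int INVOCATIONS = 3000;

    // Placeholder: in a real driver this would call the AWS SDK's Lambda Invoke API.
    static void invokeFunction(String functionName) {
        // e.g. lambdaClient.invoke(...) with the given function name
    }

    // Fire INVOCATIONS calls per function from a thread pool and count completions.
    static Map<String, AtomicInteger> runLoadTest(List<String> functionNames)
            throws InterruptedException {
        Map<String, AtomicInteger> calls = new ConcurrentHashMap<>();
        ExecutorService pool = Executors.newFixedThreadPool(32);
        for (String name : functionNames) {
            calls.put(name, new AtomicInteger());
            for (int i = 0; i < INVOCATIONS; i++) {
                pool.submit(() -> {
                    invokeFunction(name);
                    calls.get(name).incrementAndGet();
                });
            }
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
        return calls;
    }

    public static void main(String[] args) throws InterruptedException {
        Map<String, AtomicInteger> calls =
                runLoadTest(List.of("java-fn", "graalvm-fn", "rust-fn"));
        calls.forEach((fn, n) -> System.out.println(fn + ": " + n.get() + " invocations"));
    }
}
```

Driving the invocations from a Lambda function rather than a local machine keeps network latency to the target functions low and consistent across the three runtimes.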
This query was used to summarize the execution times for AWS Lambda:
filter @type = "REPORT"
| parse @log /\d+:\/aws\/lambda\/(?<function>.*)/
| stats count(*) as calls,
    avg(@duration+coalesce(@initDuration,0)) as avg_duration,
    pct(@duration+coalesce(@initDuration,0), 0) as p0,
    pct(@duration+coalesce(@initDuration,0), 25) as p25,
    pct(@duration+coalesce(@initDuration,0), 50) as p50,
    pct(@duration+coalesce(@initDuration,0), 75) as p75,
    pct(@duration+coalesce(@initDuration,0), 90) as p90,
    pct(@duration+coalesce(@initDuration,0), 95) as p95,
    pct(@duration+coalesce(@initDuration,0), 100) as p100
    group by function, ispresent(@initDuration) as coldstart
| sort by coldstart, function
Compare cold start times
The result table includes a column called "coldstart," where a value of 1 indicates a cold start. In general, Java is known for its higher memory usage and slower startup times. AWS has introduced several measures to mitigate these concerns, including the SnapStart feature. However, for our specific test case, we opted to compare raw startup times without applying this optimization.
When observing the startup times, it is evident that Java's performance is influenced by the allocated memory: the more memory (and, indirectly, CPU power) provided, the faster the startup. For instance, at 256 MB, Rust demonstrates an impressive startup time of approximately 0.1 seconds, whereas Java takes around 5.8 seconds. GraalVM falls in between at 0.5 seconds, which is generally acceptable for most use cases.
In certain asynchronous use cases, a startup time of 5.8 seconds might still be acceptable. However, if you choose to proceed with Java, it is recommended to explore the SnapStart feature mentioned earlier to optimize your startup times.
For more information about the SnapStart feature, refer to: https://docs.aws.amazon.com/lambda/latest/dg/snapstart.html
Compare hot Lambda performance
With a memory allocation of 256 MB, Java exhibits an average execution time of approximately 19.5 ms, while GraalVM is significantly faster at 4.5 ms. Consequently, using Java instead of GraalVM can lead to Lambda execution costs up to four times higher.
Introducing Rust into the equation, we observe that even with a memory allocation of only 128 MB, Rust performs admirably with an average execution time of 5.8 ms. At the same 128 MB, GraalVM struggles with the tight memory limit, resulting in slower and less stable response times, most notably a p100 of 304.6 ms. Our results indicate that when comparing Rust to GraalVM, cost reductions of up to 50% can be achieved, depending on the specific use case.
Arm64 vs x86 runtime architecture
Our benchmark results indicate a significant improvement in execution times, up to 10% faster, for Java runtimes on the arm64 architecture compared to x86. This translates to a superior price-performance ratio for arm64. In line with these findings, an AWS blog post also confirms the enhanced price-performance of arm64 over x86, with the arm64 runtime priced lower per unit of compute.
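The combined effect can be estimated with a quick back-of-the-envelope calculation. Both factors below are assumptions for illustration: a 10% shorter duration, as in our benchmark, and a roughly 20% lower per-GB-second price on arm64 (verify against the current AWS pricing page):

```java
public class ArmSavingsSketch {
    // Assumed factors for illustration; verify against current AWS Lambda pricing.
    static final double DURATION_FACTOR = 0.90; // ~10% faster execution on arm64
    static final double PRICE_FACTOR = 0.80;    // ~20% lower per-GB-second price on arm64

    // Relative cost of the same workload on arm64 versus x86 (x86 = 1.0).
    static double relativeArmCost() {
        return DURATION_FACTOR * PRICE_FACTOR;
    }

    public static void main(String[] args) {
        System.out.printf("arm64 relative cost: %.2f (about %.0f%% savings)%n",
                relativeArmCost(), (1 - relativeArmCost()) * 100);
    }
}
```

Under these assumptions, the shorter duration and the lower price multiply to roughly 0.72, i.e. savings in the ballpark of a quarter of the x86 cost.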
Considering that Java artifacts can run on both arm64 and x86 architectures using the same artifact, we generally recommend selecting arm64 as the preferred Lambda execution architecture for Java runtimes. By leveraging the advantages of arm64 architecture, you can maximize the performance and cost efficiency of your Java-based Lambda functions.
As a Java developer, for low-usage scenarios Java remains a straightforward and convenient choice. However, it is advisable to minimize dependencies on external libraries whenever possible to keep performance up and costs down.
On the other hand, for high load use cases, our benchmark results indicate that Rust surpasses other options in terms of both performance and costs. Rust offers exceptional performance while ensuring cost-effectiveness, making it a compelling choice for resource-intensive scenarios.
In the middle ground, GraalVM with native builds presents itself as an alternative with certain trade-offs. It allows you to continue writing Java code and leverage your existing expertise. By adopting GraalVM, you can potentially reduce costs by up to 75% compared to traditional Java implementations. However, it's important to note that building Docker images for GraalVM native functions can be more complex, and there may be limitations, such as restrictions on reflection usage. Additionally, if you prefer using the arm64 architecture instead of x86, you will need to compile a separate native artifact for each target architecture, whereas Java allows the use of the same artifact across architectures.
Furthermore, it’s worth mentioning that Java still enjoys broader support, including Maven archetypes and extensive documentation, which can be advantageous when it comes to development workflows and community resources.
In summary, your choice of language for AWS Lambda depends on the specific requirements of your use case. For low usage scenarios, Java remains a convenient option, while Rust excels in high load situations where performance and cost efficiency are paramount. GraalVM offers a middle ground with potential cost savings, but it comes with complexities and certain limitations. Consider your project’s needs and trade-offs carefully to make an informed decision.