LLRT (Low Latency Runtime), created by AWS, is the newest of the 14+ popular JavaScript runtimes. So, why do we need another? Unlike more general-purpose runtimes such as Node.js, LLRT is built specifically for serverless applications—especially AWS Lambda. This focus lets LLRT meaningfully reduce cold start times and the costs that come with them.
Note—LLRT is still an experimental package and is therefore not yet recommended for production workloads.
The Cold Start Problem in Lambda and JavaScript Runtimes
The cold start problem in Lambda is something AWS has been trying to improve for a while with measures like SnapStart and Provisioned Concurrency. In short, a cold start is the delay that occurs when a Lambda function is invoked after a period of inactivity: AWS must spin up a new execution environment and initialize the function before the handler can run. This latency not only hurts user experience and application responsiveness but also increases costs, as AWS bills for the time it takes to start the execution environment.
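To make this concrete, here's a minimal sketch of a Lambda handler in TypeScript (the names and setup work are illustrative): the module-scope code runs once, during the cold start, while the handler body runs on every invocation.

```typescript
// Module scope: runs once per cold start, while Lambda initializes a
// fresh execution environment. Expensive setup here delays the first
// invocation after a period of inactivity.
const initializedAt = new Date().toISOString();
const config = { greeting: "hello" }; // stand-in for real setup work

// Handler: runs on every invocation. Warm invocations reuse the
// environment, so the module-scope code above is skipped.
export const handler = async () => ({
  statusCode: 200,
  body: `${config.greeting} (environment initialized at ${initializedAt})`,
});
```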
Choosing a lightweight runtime is another way to reduce cold start times. JavaScript runtimes are generally faster to initialize than heavier alternatives like Java runtimes: they load quickly and execute code efficiently.
LLRT Benefits
As noted, LLRT is AWS's purpose-built runtime for serverless applications, unlike its more general-purpose predecessors. According to AWS benchmarks, it delivers 10x faster startup times and 2x lower overall costs on Lambda compared to other JavaScript runtimes.
Several design choices make LLRT particularly well suited to addressing Lambda cold starts:
- Built with Rust: Rust is a compiled language known for its efficiency and memory safety. This translates into a smaller executable for LLRT, reducing download time at startup.
- Uses QuickJS as its JavaScript engine: QuickJS is lightweight and capable while needing few resources, which again contributes to a quicker start time. See the benchmarks here.
- Includes AWS SDK clients built into the executable: Because the SDK doesn't need to be loaded separately at startup, this minimizes the initial download size and avoids potential dependency management problems (see the sketch after this list).
- Does not use JIT (Just-In-Time) compilation: JIT compilation involves translating code from JavaScript to machine code at runtime, which adds to the cold start time.
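To illustrate the bundled-SDK point from the list above, here's a hedged sketch of an LLRT handler in TypeScript. Because LLRT ships select AWS SDK for JavaScript v3 clients inside its executable, an import like `@aws-sdk/client-dynamodb` can resolve without packaging the SDK in your deployment zip (the table name and key shape are made up for this example):

```typescript
// LLRT bundles select AWS SDK v3 clients into its executable, so this
// import resolves without shipping the SDK in the deployment package.
import { DynamoDBClient, GetItemCommand } from "@aws-sdk/client-dynamodb";

const client = new DynamoDBClient({}); // created once, during cold start

export const handler = async (event: { id: string }) => {
  // "orders" and the key shape are hypothetical, for illustration only.
  const result = await client.send(
    new GetItemCommand({
      TableName: "orders",
      Key: { pk: { S: event.id } },
    })
  );
  return { statusCode: 200, body: JSON.stringify(result.Item ?? null) };
};
```

Deploying this typically means packaging the LLRT bootstrap binary alongside your bundled handler and selecting an OS-only custom runtime such as provided.al2023, per the LLRT README.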
LLRT Limitations
Developing LLRT required certain tradeoffs, which bring inherent limitations. The biggest limitation is that, because it was built for serverless use cases, it won't perform as well in scenarios requiring heavy computation, and it lacks many of the extensive APIs commonly found in general-purpose runtimes like Node.js.
This is due, in part, to the decision not to use JIT: you get faster starts, but compute-heavy tasks run slower. Because of this tradeoff, LLRT is not intended for tasks like “large data processing, Monte Carlo simulations or performing tasks with hundreds of thousands or millions of iterations.”
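To see why, consider a quick sketch of exactly the kind of workload the maintainers warn against. Without a JIT, a hot loop like this Monte Carlo estimate of pi is interpreted on every iteration, so it will run noticeably slower under QuickJS than under a JIT-enabled engine like V8 (the sample count is arbitrary):

```typescript
// Monte Carlo estimate of pi: millions of iterations of pure math.
// A JIT-enabled engine (V8, JavaScriptCore) compiles this hot loop to
// machine code; QuickJS interprets it, paying the cost on every pass.
function estimatePi(samples: number): number {
  let inside = 0;
  for (let i = 0; i < samples; i++) {
    const x = Math.random();
    const y = Math.random();
    if (x * x + y * y <= 1) inside++;
  }
  return (4 * inside) / samples;
}

console.log(estimatePi(10_000_000)); // ~3.14159
```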
There are also many APIs that are not supported, or not yet supported (see the list here). However, on Yan Cui's podcast, LLRT's creator, Richard Davison, said the goal is to become WinterCG compliant. That would keep LLRT interoperable with Node.js, so you can switch back to Node.js if you need APIs that LLRT doesn't support.
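In practice, that interoperability goal means favoring standard, WinterCG-style APIs over Node.js-specific ones. Here's a hedged sketch of a handler that sticks to globals like fetch and URL, which both LLRT and Node.js 18+ provide, so switching runtimes becomes a configuration change rather than a rewrite (the endpoint is a placeholder):

```typescript
// Sticking to WinterCG-style globals (fetch, URL, JSON) keeps the
// handler portable: it runs unchanged on LLRT and on Node.js 18+.
export const handler = async (event: { city: string }) => {
  // Placeholder endpoint, for illustration only.
  const url = new URL("https://api.example.com/weather");
  url.searchParams.set("city", event.city);

  const response = await fetch(url); // no Node-specific HTTP client
  const data = await response.json();

  return { statusCode: 200, body: JSON.stringify(data) };
};
```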
Another important consideration: while you may save money thanks to shorter runtimes, LLRT is technically a “custom runtime,” so you are charged while the runtime spins up (something AWS does not charge for with its own managed runtimes). Finally, to reiterate, LLRT is still an experimental package that is subject to change and is therefore not yet recommended for production workloads.
LLRT vs Other Runtimes (Bun, Deno, Node.js, Workerd, etc.)
Node.js was developed in 2009 and contributed significantly to the rise of full-stack JavaScript development by bringing JavaScript to the server, a capability that was unique at the time. It has since matured into a robust platform with a vast ecosystem of tools, APIs, and frameworks, and it remains a leader among JavaScript runtimes.
However, carrying such an extensive toolset, along with the backward-compatibility obligations that come with being so widely used, can mean slower execution and slower adoption of new use cases. Modern JavaScript runtimes like Bun and Deno were created to improve performance, reduce complexity, and add modern functionality.
There is a huge difference in intended use between comprehensive runtimes like Bun, Deno, and Node.js and LLRT. LLRT's contributors emphasize that while runtimes like Bun position themselves as replacements, or even drop-in replacements, for Node.js, LLRT is not intended as a “drop-in replacement for Node.js, nor will it ever be.”
As we mentioned, LLRT stands apart through its focus on serverless (mostly Lambda) use cases. That focus is evident in its design choices, such as the language, engine, and available APIs:
| Runtime | Language | Engine |
|---|---|---|
| LLRT | Rust | QuickJS |
| Bun | Zig | JavaScriptCore |
| Deno | Primarily Rust | V8 |
| Node.js | Primarily C++ | V8 |
| Workerd | Primarily C++ | V8 |
LLRT is actually more comparable to JavaScript runtimes that are highly optimized for specific use cases than to the general-purpose runtimes. The most relevant comparison is Workerd (also in beta), which Cloudflare created specifically for Cloudflare Workers.
Potential Impact of LLRT on AWS Lambda vs Cloudflare Workers
We’ve gone in-depth with a pricing comparison of Lambda vs. Workers before and found Workers to be more cost-efficient in certain scenarios, due in large part to Workers not charging for duration and not being impacted by cold starts. By significantly shrinking cold start times, LLRT can reduce the cost of running serverless functions on Lambda, which could drive Lambda adoption in use cases where cold starts previously made it prohibitive.