What are we trying to achieve?
Let's say you have a JavaScript application that runs in Bun and you've identified a bottleneck you'd like to optimize.
Rewriting it in a more performant language may be just the solution you need.
As a modern JS runtime, Bun supports a Foreign Function Interface (FFI) for calling libraries written in languages that can expose C ABIs, such as C, C++, Rust and Zig.
In this post, we'll go over how to use it, and conclude whether you can actually benefit from it.
How to link the library to JavaScript
This example uses Rust. Creating a shared library with C bindings looks different in other languages, but the idea remains the same.
From JS side
Bun exposes its FFI API through the bun:ffi module.
The entrypoint is the dlopen function. It takes a path to the library file (the build output, with a .so extension on Linux, .dylib on macOS or .dll on Windows), either absolute or relative to the current working directory, and an object with the signatures of the functions you want to import.
It returns an object with a close method, which you can use to close the library once it's no longer needed, and a symbols property, which is an object containing the functions you chose.
```ts
import {
  dlopen,
  FFIType,
  read,
  suffix,
  toArrayBuffer,
  type Pointer,
} from "bun:ffi";

// Both your script and your library don't typically change their locations
// Use `import.meta.dirname` to make your script independent from the cwd
const DLL_PATH =
  import.meta.dirname + `/../../rust-lib/target/release/library.${suffix}`;

function main() {
  // Deconstruct object to get functions
  // but collect `close` method into object
  // to avoid using `this` in a wrong scope
  const {
    symbols: { do_work },
    ...dll
  } = dlopen(DLL_PATH, {
    do_work: {
      args: [FFIType.ptr, FFIType.ptr, "usize", "usize"],
      returns: FFIType.void,
    },
  });

  /* ... */

  // It is unclear whether it is required or recommended to call `close`;
  // an example says `JSCallback` instances specifically need to be closed.
  // Note that using `symbols` after calling `close` is undefined behaviour
  dll.close();
}

main();
```
Passing data through the FFI boundary
As you may notice, the types Bun accepts through FFI are limited to numbers, pointers included.
Notably, size_t or usize is missing from the list of supported types, even though the code for it exists as of Bun version 1.1.34.
Bun doesn't offer any help in passing data more complex than a C string. That means you'll have to work with pointers yourself.
Let's see how to pass a pointer from JavaScript to Rust ...
```ts
{
  reconstruct_slice: {
    args: [FFIType.ptr, "usize"],
    returns: FFIType.void,
  },
}

const array = new BigInt64Array([0n, 1n, 3n]);
// Bun automatically converts `TypedArray`s into pointers
reconstruct_slice(array, array.length);
```
```rust
/// Reconstruct a `slice` that was initialized in JavaScript
unsafe fn reconstruct_slice<'a>(
    array_ptr: *const i64,
    length: libc::size_t,
) -> &'a [i64] {
    // Even though here it's not null, it's good practice to check
    assert!(!array_ptr.is_null());
    // Unaligned pointer can lead to undefined behaviour
    assert!(array_ptr.is_aligned());
    // Check that the array doesn't "wrap around" the address space
    assert!(length <= isize::MAX as usize / std::mem::size_of::<i64>());

    std::slice::from_raw_parts(array_ptr, length)
}
```

... and how to return a pointer from Rust to JavaScript.

```ts
{
  allocate_buffer: {
    args: [],
    returns: FFIType.ptr,
  },
  as_pointer: {
    args: ["usize"],
    returns: FFIType.ptr,
  },
}

// Hardcoding this value for 64-bit systems
const BYTES_IN_PTR = 8;

const box: Pointer = allocate_buffer()!;

const ptr: number = read.ptr(box);
// Reading the value next to `ptr`
const length: number = read.ptr(box, BYTES_IN_PTR);

// Hardcoding `byteOffset` to be 0 because Rust guarantees that
// Buffer holds `i32` values which take 4 bytes
// Note how we need to call a no-op function `as_pointer` because
// `toArrayBuffer` takes a `Pointer` but `read.ptr` returns a `number`
const _buffer = toArrayBuffer(as_pointer(ptr)!, 0, length * 4);
```
#[no_mangle] pub extern "C" fn allocate_buffer() -> Box { let buffer: Vec<i32> = vec![0; 10]; let memory: ManuallyDrop<vec>> = ManuallyDrop::new(buffer); let ptr: *const i32 = memory.as_ptr(); let length: usize = memory.len(); // Unlike a `Vec`, `Box` is FFI compatible and will not drop // its data when crossing the FFI // Additionally, a `Box<t>` where `T` is `Sized` will be a thin pointer Box::new([ptr as usize, length]) } #[no_mangle] pub const extern "C" fn as_pointer(ptr: usize) -> usize { ptr } </t></vec></i32>
Rust doesn't know that JS is taking ownership of the data on the other side, so you have to explicitly tell it not to deallocate the heap data, using ManuallyDrop. Other languages that manage their own memory will have to do something similar.
Memory management
As we can see, it's possible to allocate memory in both JS and Rust, but neither side can safely manage the other's memory.
Let's figure out where you should allocate your memory, and how.
Allocate in Rust
There are three ways to delegate memory cleanup to Rust from JS, each with its own pros and cons.
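Whichever method you pick, the actual freeing has to happen in Rust: the drop_buffer export below takes back the box produced by allocate_buffer and reclaims both the box and the buffer it describes. Each of the approaches that follow calls into it.

```rust
/// # Safety
///
/// This call assumes neither the box nor the buffer have been mutated in JS
#[no_mangle]
pub unsafe extern "C" fn drop_buffer(raw: *mut [usize; 2]) {
    let box_: Box<[usize; 2]> = unsafe { Box::from_raw(raw) };
    let ptr: *mut i32 = box_[0] as *mut i32;
    let length: usize = box_[1];
    let buffer: Vec<i32> = unsafe { Vec::from_raw_parts(ptr, length, length) };
    drop(buffer);
}
```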
Use FinalizationRegistry
Use a FinalizationRegistry to track the object in JavaScript and request a cleanup callback when it gets garbage collected.
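On the JS side, register the buffer with a FinalizationRegistry whose cleanup callback hands the box back to drop_buffer:

```ts
{
  drop_buffer: {
    args: [FFIType.ptr],
    returns: FFIType.void,
  },
}

const registry = new FinalizationRegistry((box: Pointer): void => {
  drop_buffer(box);
});
registry.register(buffer, box);
```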
Pros
- It's simple
Cons
- Garbage collection is engine specific and non-deterministic
- Cleanup callback is not guaranteed to be called at all
Use toArrayBuffer's finalizationCallback parameter
Delegate the garbage collection tracking to Bun and let it call a cleanup callback for you.
When you pass 4 parameters to toArrayBuffer, the 4th must be a C function to be called on cleanup.
However, when you pass 5 parameters, the 5th is that function and the 4th must be a context pointer that gets passed to it.
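Combined with the allocate_buffer example from earlier, that looks like this (box_value is an extra export that boxes a usize so it can serve as the context pointer, and drop_box frees that box):

```ts
{
  box_value: {
    args: ["usize"],
    returns: FFIType.ptr,
  },
  drop_box: {
    args: [FFIType.ptr],
    returns: FFIType.void,
  },
  drop_buffer: {
    args: [FFIType.ptr, FFIType.ptr],
    returns: FFIType.void,
  },
}

// Bun expects the context to specifically be a pointer
const finalizationCtx: Pointer = box_value(length)!;

// Note that despite the presence of these extra parameters in the docs,
// they're absent from `@types/bun`
//@ts-expect-error see above
const buffer = toArrayBuffer(
  as_pointer(ptr)!,
  0,
  length * 4,
  //@ts-expect-error see above
  finalizationCtx,
  drop_buffer,
);

// Don't leak the box used to pass buffer through FFI
drop_box(box);
```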
Pros
- Delegate logic out of JavaScript
Cons
- A lot of boilerplate and chances to leak memory
- Missing type annotation for toArrayBuffer
- Garbage collection is engine specific and non-deterministic
- Cleanup callback is not guaranteed to be called at all
Manage memory manually
Just drop the memory yourself once you no longer need it.
Luckily, TypeScript has a very helpful Disposable interface for this, along with the using keyword.
It's the equivalent of Python's with or C#'s using.
See the docs for it:
- TypeScript 5.2 changelog
- Pull request for using
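As a minimal sketch (assuming the allocate_buffer, as_pointer and drop_buffer symbols plus the BYTES_IN_PTR constant from the snippets above are in scope), a disposable wrapper could look like this:

```ts
// Sketch of a `Disposable` wrapper around a Rust-allocated buffer
class RustBuffer implements Disposable {
  readonly view: ArrayBuffer;
  readonly #box: Pointer;

  constructor() {
    this.#box = allocate_buffer()!;
    const ptr = read.ptr(this.#box);
    const length = read.ptr(this.#box, BYTES_IN_PTR);
    this.view = toArrayBuffer(as_pointer(ptr)!, 0, length * 4);
  }

  [Symbol.dispose](): void {
    // Hand the box back to Rust so it frees both the box and the buffer
    drop_buffer(this.#box);
  }
}

{
  using buffer = new RustBuffer();
  // ... work with `buffer.view` while it is alive ...
} // `Symbol.dispose` runs here, even if the block threw
```

The using declaration guarantees the dispose call runs when the block exits, which is where the "cleanup is guaranteed to run" pro below comes from.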
Pros
- Cleanup is guaranteed to run
- You have control of when you want to drop the memory
Cons
- Boilerplate object for Disposable interface
- Manually dropping memory is slower than using the garbage collector
- If you want to give away the ownership of the buffer you have to make a copy and drop the original
Allocate in JS
This is much simpler and safer as deallocating is handled for you.
However, there is a significant drawback.
Since you can't manage JavaScript's memory from Rust, you can't go over the buffer's capacity, as that would cause a deallocation. That means you have to know the buffer size before passing it to Rust.
Not knowing how many buffers you need beforehand will also incur a lot of overhead, as you'll be going back and forth across the FFI boundary just to allocate.
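For illustration, here's a sketch of the allocate-in-JS flow with a hypothetical fill_buffer export that writes into a caller-provided buffer whose size was decided up front (it reuses the DLL_PATH constant from earlier):

```ts
// Hypothetical binding: on the Rust side, `fill_buffer` would take
// (*mut i32, usize) and must never reallocate the buffer it's given
const { symbols: { fill_buffer } } = dlopen(DLL_PATH, {
  fill_buffer: {
    args: [FFIType.ptr, "usize"],
    returns: FFIType.void,
  },
});

// The capacity is fixed in JS before crossing the FFI boundary
const output = new Int32Array(1024);
fill_buffer(output, output.length);
// `output` stays owned by JS and gets garbage collected as usual
```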
A sidenote on strings
If the output you're expecting from the library is a string, you may have considered the micro-optimization of returning a vector of u16 rather than a string, since JavaScript engines typically use UTF-16 under the hood.
However, that would be a mistake: converting your string to a C string and using Bun's cstring type is mildly faster.
This was benchmarked with mitata, a nice benchmarking library.
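A sketch of how such a comparison can be set up; the get_cstring and get_utf16_box exports here are hypothetical stand-ins, with get_utf16_box returning a (pointer, length) box like allocate_buffer above, and DLL_PATH reused from earlier:

```ts
import { bench, run } from "mitata";
import { dlopen, FFIType, read, toArrayBuffer } from "bun:ffi";

// Hypothetical exports: `get_cstring` returns a NUL-terminated copy of the
// string, `get_utf16_box` returns a Box<[usize; 2]> of (pointer, length)
// pointing at UTF-16 code units
const { symbols: { get_cstring, get_utf16_box, as_pointer } } = dlopen(DLL_PATH, {
  get_cstring: { args: [], returns: FFIType.cstring },
  get_utf16_box: { args: [], returns: FFIType.ptr },
  as_pointer: { args: ["usize"], returns: FFIType.ptr },
});

bench("cstring", () => {
  // Bun hands `FFIType.cstring` returns to JS as a string-like `CString`
  const _text = get_cstring();
});

bench("utf-16 buffer", () => {
  const box = get_utf16_box()!;
  const ptr = read.ptr(box);
  const length = read.ptr(box, 8);
  // 2 bytes per UTF-16 code unit; the box is intentionally leaked in this sketch
  const bytes = toArrayBuffer(as_pointer(ptr)!, 0, length * 2);
  const _text = Buffer.from(bytes).toString("utf16le");
});

await run();
```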
What about WebAssembly?
It's time to address the elephant in the room that is WebAssembly.
Should you choose the nice existing WASM bindings over dealing with the C ABI yourself?
The answer is probably neither.
Is it actually worth it?
Introducing another language to your codebase takes more than a single bottleneck to be worth it, both DX-wise and performance-wise.
Here is a benchmark for a simple range function in JS, WASM and Rust.
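A sketch of how that measurement can be wired up with mitata, with a hypothetical make_range export standing in for the Rust version (the WASM variant is set up the same way and omitted here):

```ts
import { bench, run } from "mitata";
import { dlopen, FFIType } from "bun:ffi";

// Hypothetical Rust export that builds the same range natively
const { symbols: { make_range } } = dlopen(DLL_PATH, {
  make_range: { args: ["usize"], returns: FFIType.ptr },
});

// Pure TypeScript implementation to compare against
function range(length: number): Int32Array {
  const output = new Int32Array(length);
  for (let i = 0; i < length; i++) output[i] = i;
  return output;
}

bench("range: TypeScript", () => range(10_000));
bench("range: Rust over FFI", () => make_range(10_000));

await run();
```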
The native library barely beats out WASM and consistently loses to the pure TypeScript implementation.
And that's it for this tutorial-slash-exploration of the bun:ffi module. Hopefully we've all walked away a little more educated.
Feel free to share your thoughts and questions in the comments.