I’m currently exploring two interesting topics for Memphis, my Python interpreter in Rust: building for WebAssembly and embedding CPython. With no major milestones to report this week, I thought I’d share some in-progress thoughts. For me, Memphis has been a project for expanding my conceptual understanding through practical experiments—hopefully, this post can do the same for you as we walk through some of the design decisions I'm exploring.
Python in the browser
Compiling Memphis to a WebAssembly target had been in the back of my mind for some time, and two Saturdays ago, I finally gave it a go. With a lukewarm cup of drip coffee on my coaster, I cracked my knuckles and began.
WebAssembly is a sandboxed execution environment inside modern web browsers which complements the traditional JavaScript environment. The Wasm environment is closer to native code and can be used for tasks which benefit from a more performant CPU context; think number crunching or silly busy loops. I was interested in it less from a performance perspective and more because it was possible at all. One of Rust’s selling points (out of literally bajillions) is that it can target Wasm. How, one might ask? This is possible because Rust uses LLVM as its compiler backend. The Rust compiler frontend produces LLVM Intermediate Representation (IR) code, and LLVM can compile this to native code for dozens of targets, WebAssembly among them.
That’s a pretty massive benefit and I was curious if it would Just Work for Memphis. I had given literally zero thought to running Python in the browser before, so this seemed like a perfect opportunity to test out the Wasm learning curve.
Setting Up wasm-pack and Building for WebAssembly
I fired up my AI assistant and asked for the launch sequence. It went beep boop beep boop. Below are the steps, annotated with my learnings along the way.
# wasm-pack helps compile our Rust code to WebAssembly and bundle it
# with JavaScript bindings we can call from our HTML/JavaScript page.
cargo install wasm-pack
# wasm-pack also downloads the wasm32-unknown-unknown target via
# rustup for us. If for whatever reason it does not, you can use this:
# rustup target add wasm32-unknown-unknown
# We must specify a feature flag because our wasm_bindgen interface is
# behind the wasm feature flag.
wasm-pack build --target web --out-dir wasm_ui/pkg -- --features wasm
The build succeeded on my first try! However, because we haven’t marked any functions in our Rust binary as callable from JavaScript, it doesn’t do much.
We can install the wasm-bindgen crate to do this, which I put behind a feature flag. I added this to my Cargo.toml.
[dependencies]
wasm-bindgen = { version = "0.2", optional = true }
[features]
wasm = ["wasm-bindgen"]
Here’s a small piece of code I added to my src/lib.rs file, behind the wasm feature flag. The greet function is decorated with #[wasm_bindgen] to make this symbol available in JavaScript.
#[cfg(feature = "wasm")]
mod wasm {
    use wasm_bindgen::prelude::wasm_bindgen;

    // Export a function to JavaScript
    #[wasm_bindgen]
    pub fn greet() -> String {
        "Hello from WebAssembly!".to_string()
    }
}
Creating a JavaScript Interface
I also asked my AI assistant for the smallest possible piece of JavaScript I could use to test my Wasm interface. When we call init(), the browser loads the .wasm file, performs a JIT compilation step to convert the portable WebAssembly binary into native code, and initializes memory for the WebAssembly runtime.
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8">
    <title>Wasm Test</title>
  </head>
  <body>
    <script type="module">
      import init, { greet } from './pkg/memphis.js';

      async function run() {
        await init();
        console.log(greet());
      }

      run();
    </script>
  </body>
</html>
Like a miracle among miracles, it Just Worked. Granted, I wasn’t running any Python code in the browser, but interfacing with my binary was a HUGE step that younger-me-who-could-barely-install-java did not want to undervalue.
The next step was to give it a Python expression defined in JavaScript and have the Wasm binary crunch the numbers. As I mentioned in my REPL post, every entry point in a software project is an opportunity to improve my abstractions, and that would certainly be the case again here. As I thumbed through my Memphis repo, I realized: wow, I should really have a better interface to pass a string and evaluate it as Python. Like I said, I LOVE new entry points.
For the time being, I would use my crosscheck adapter. Crosscheck is my work-in-progress testing framework to validate that the treewalk interpreter and the bytecode VM produce the same behavior for a given Python input. It’s named after the thing flight attendants do.
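Conceptually, the crosscheck contract boils down to something like the sketch below: both backends implement a shared trait, and a test runs the same Python source through each and asserts they agree. The names and signatures here are simplified for illustration rather than copied from Memphis.

// Simplified sketch of the crosscheck idea; not Memphis's real API.
trait InterpreterTest {
    fn execute(&self, code: &str) -> String;
}

// Run one Python snippet through two backends and insist they match.
fn assert_same_behavior(a: &dyn InterpreterTest, b: &dyn InterpreterTest, code: &str) {
    assert_eq!(a.execute(code), b.execute(code), "backends disagree on: {code}");
}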
Here is my updated Rust code.
#[cfg(feature = "wasm")]
mod wasm {
    use wasm_bindgen::prelude::wasm_bindgen;
    use crosscheck::{InterpreterTest, TreewalkAdapter};

    // Export a function to JavaScript
    #[wasm_bindgen]
    pub fn greet() -> String {
        "Hello from WebAssembly!".to_string()
    }

    #[wasm_bindgen]
    pub fn evaluate(code: String) -> String {
        let result = TreewalkAdapter.execute(&code);
        format!("{}", result)
    }
}
Here is my updated JavaScript code, which invokes the new Rust evaluate function.
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8">
    <title>Wasm Test</title>
  </head>
  <body>
    <script type="module">
      import init, { greet, evaluate } from './pkg/memphis.js';

      async function run() {
        await init();
        console.log(greet());

        const expr = "[ 2 * i for i in range(5) if i % 2 == 0 ]";
        console.log(expr, "=", evaluate(expr));
      }

      run();
    </script>
  </body>
</html>
Debugging WebAssembly Errors
Now when I ran it I got……… a console error. It crashed with an unimplemented error.
I poked around a bit and it was not clear what was causing this. You can click into the source but for a Wasm build that is just a block of assembly without references to the original Rust functions.
I did some AI chatting/Googling and found two helpful approaches. One is console_log for use in Wasm builds, which displays log statements from your Rust code in your browser console. This helped some, but what I was really looking for was a stack trace. Enter console_error_panic_hook. It gave me the Rust stack trace immediately, which was CLUTCH. If you are doing your own Wasm build, stop reading this now and add this crate. I don’t even mind if you never finish reading this post. Ferris would want you to use this crate 🦀. Here’s how I added it to my Wasm interface.
#[cfg(feature = "wasm")]
mod wasm {
    use console_error_panic_hook::set_once;
    use wasm_bindgen::prelude::wasm_bindgen;
    use crosscheck::{InterpreterTest, TreewalkAdapter};

    #[wasm_bindgen]
    pub fn evaluate(code: String) -> String {
        // Set the panic hook for better error messages in the
        // browser console
        set_once();

        let result = TreewalkAdapter.execute(&code);
        format!("{}", result)
    }
}
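Like wasm-bindgen, the new crate has to be declared in Cargo.toml. I won’t pretend this is Memphis’s exact manifest, but wiring console_error_panic_hook into the same wasm feature looks roughly like this:

[dependencies]
wasm-bindgen = { version = "0.2", optional = true }
console_error_panic_hook = { version = "0.1", optional = true }

[features]
wasm = ["wasm-bindgen", "console_error_panic_hook"]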
My stack trace pointed me to my culprit: I was using std::env to request some OS resources, which are not allowed in a Wasm runtime (that’s the sandboxed part). I put these calls behind a feature flag (they are related to how I hack-ily determine the location of the Python standard lib on the host machine) and fired up my build again.
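For illustration, putting an OS-dependent call behind the flag looks something like this sketch (the helper name and environment variable are invented for this example, not pulled from Memphis):

use std::path::PathBuf;

// Native builds can ask the OS where the Python standard lib lives.
#[cfg(not(feature = "wasm"))]
fn stdlib_path() -> Option<PathBuf> {
    std::env::var("PYTHON_STDLIB_PATH").ok().map(PathBuf::from)
}

// The Wasm sandbox offers no environment variables or filesystem, so the
// same function compiles to a stub.
#[cfg(feature = "wasm")]
fn stdlib_path() -> Option<PathBuf> {
    None
}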
After a few small failures related to properly displaying my return types…. IT WORKED. Here’s what I now see in my browser console.
wasm_ui/:13: Hello from WebAssembly!
wasm_ui/:15: [ 2 * i for i in range(5) if i % 2 == 0 ] = [0, 4, 8]
tldr I can run Python in the browser. (To their credit, RustPython does this too: https://rustpython.github.io/demo/. I haven’t looked deeply at their project but it seems comprehensive.) The Python list comprehension is defined as a string in JavaScript, evaluated by the Rust code compiled to Wasm, and the resulting list is converted back into a string that JavaScript can display.
This setup only supports expressions at the moment. To evaluate statements (and later read back their results), I will need to keep state on the Rust side. I also dream of building a JavaScript REPL. That sounds like a problem for future-me (and a boring dream tbh).
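If I go down that road, wasm-bindgen can export a whole struct to JavaScript, which would let a session object own the interpreter state between calls. Here is a rough sketch with placeholder names; none of it is Memphis’s real API.

#[cfg(feature = "wasm")]
mod wasm_repl {
    use wasm_bindgen::prelude::wasm_bindgen;

    // Hypothetical session object exported to JavaScript. It would own
    // interpreter state across calls; the Vec is a stand-in for that state.
    #[wasm_bindgen]
    pub struct Session {
        history: Vec<String>,
    }

    #[wasm_bindgen]
    impl Session {
        #[wasm_bindgen(constructor)]
        pub fn new() -> Session {
            Session { history: Vec::new() }
        }

        // A real implementation would execute `code` against the stored
        // state; this stub just records what it was given.
        pub fn run_statement(&mut self, code: String) -> String {
            self.history.push(code);
            format!("{} statements so far", self.history.len())
        }
    }
}

On the JavaScript side that surfaces as new Session() and session.run_statement(...), so the browser holds a handle to Rust-owned state between calls instead of sending everything through a single stateless function.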
The End
I’ve been talking long enough, so I’m going to hold off on discussing embedded Python until next Monday.
Apologies for the bait and switch. The content calendar waits for no one.
To be clear, by embedded Python, I mean embedding a CPython interpreter inside of Memphis, not running Python in an “embedded systems” environment. That would be hard for no reason. Unlike Memphis, which is hard for FUN.
If you’d like to get more posts like this directly to your inbox, you can subscribe here!
Elsewhere
In addition to mentoring software engineers, I also write about my experience as an adult-diagnosed autistic person. Less code and the same number of jokes.
- Why do I crave recognition? - From Scratch dot org