Show HN: jax-js, an array library in JavaScript targeting WebGPU
I’m excited to release jax-js, a machine learning library for the web.
You can think of it as a reimplementation of Google DeepMind’s JAX framework (similar to PyTorch) in pure JavaScript.
jax-js runs completely in the browser by generating fast WebGPU and Wasm kernels.
Starting in February this year, I spent nights and weekends working on a new ML library for the browser. I wanted a cross-platform way to run numerical programs on the frontend web, so machine learning can run directly in users' browsers.
Python and JavaScript are two of the most popular programming languages in the world. Python is simple and expressive, and it is now ubiquitous in ML thanks to frameworks like PyTorch and JAX.

JavaScript, on the other hand, runs in every browser, yet most developers would balk at doing any serious number crunching in it. Modern JavaScript JITs are impressive, but they aren't optimized for tight numerical loops, and JavaScript doesn't even have a fast, native integer type: every number is a 64-bit float. So how can you run fast numerical code on the web?
The answer is to rely on newer browser technologies, WebAssembly and WebGPU, which let you run programs at near-native speed. WebAssembly is a portable, low-level bytecode; WebGPU exposes GPU compute shaders on the web.
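To make the WebAssembly side concrete, here is a minimal, self-contained example (not taken from jax-js) that instantiates a hand-assembled Wasm module exporting a single integer `add` function. The byte layout follows the standard Wasm binary format, and the exported function then runs as compiled native code rather than interpreted JavaScript.

```javascript
// A hand-assembled WebAssembly module: (func (export "add") (param i32 i32) (result i32))
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00, // magic "\0asm" + version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00, // function section: one function of type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export "add" = func 0
  0x0a, 0x09, 0x01, 0x07, 0x00, // code section: one body, no locals
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b, // local.get 0, local.get 1, i32.add, end
]);

const { add } = new WebAssembly.Instance(new WebAssembly.Module(bytes)).exports;
console.log(add(2, 3)); // 5
```

In a real library the bytecode would be generated programmatically from a traced computation, but the instantiation path is exactly this simple.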
These native runtimes lend themselves to a programming model similar to JAX's: trace the user's program, then JIT-compile it into kernels. Instead of emitting Nvidia CUDA, jax-js uses pure JavaScript to generate WebAssembly and WebGPU kernels, then executes them at near-native speed, sidestepping the JavaScript interpreter bottleneck.
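To illustrate trace-then-compile (a toy sketch under my own assumptions, not jax-js's actual internals), the following traces a scalar function into an op graph using stand-in tracer objects, then compiles that graph to a flat JavaScript function via `new Function`. In jax-js the same idea would emit a Wasm or WebGPU kernel instead of JavaScript source.

```javascript
// Tracers record operations instead of computing values.
class Tracer {
  constructor(id) {
    this.id = id; // index of this value in the generated code
  }
}

function trace(fn, nargs) {
  const ops = [];
  let next = 0;
  const fresh = () => new Tracer(next++);
  const args = Array.from({ length: nargs }, fresh);
  // Each primitive appends an op node to the graph and returns a new tracer.
  const prim = (name) => (a, b) => {
    const out = fresh();
    ops.push({ name, a: a.id, b: b.id, out: out.id });
    return out;
  };
  const lib = { add: prim("add"), mul: prim("mul") };
  const result = fn(lib, ...args);
  return { ops, nargs, out: result.id };
}

function compile({ ops, nargs, out }) {
  // Emit straight-line JavaScript; a real library would emit a
  // Wasm module or WebGPU shader at this step instead.
  const params = Array.from({ length: nargs }, (_, i) => `v${i}`);
  const lines = ops.map(
    (op) => `const v${op.out} = v${op.a} ${op.name === "add" ? "+" : "*"} v${op.b};`
  );
  lines.push(`return v${out};`);
  return new Function(...params, lines.join("\n"));
}

// Trace f(x, y) = x * x + y once, then run the compiled kernel many times.
const graph = trace((np, x, y) => np.add(np.mul(x, x), y), 2);
const kernel = compile(graph);
console.log(kernel(3, 4)); // 13
```

The key property is that the traced function runs only once, at trace time; every later call to `kernel` executes straight-line generated code with no interpretation of the original program.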
That is what I ended up doing in jax-js, and now it “just works”.
Source: Hacker News