Add: WASM Emscripten example #347
Conversation
I imagine an even more minimal example/test setup than YoloV8 that is first compiled and then run in CI within a headless Chromium. We could check the developer-console output to determine whether the model was inferred correctly. What do you think? Which model would be suited? Any good example of how to use headless Chromium in GitHub workflows? Would this become its own workflow, or rather a job in an existing workflow? PS: I guess we could use Puppeteer to execute JavaScript code without any …
YOLOv8 is already a pretty good example model: not too simple like the MNIST model, but not too large like GPT-2. I don't know off the top of my head whether either supports WASM threads, but we should give Node.js, or ideally Deno, a shot before jumping straight to headless Chromium for CI. Btw, great work here! Until now I didn't think it was even possible to get …
Sure, I can use Deno to test the …
I see that you have added mechanisms to provide a precompiled static `libonnxruntime` for the Emscripten environment. However, when I try to use the …
Like I said in #349, I couldn't get binaries for 1.20.2, but they should be available for 1.21. Keep the current binary mechanism, and don't worry about CI; I'll handle that when 1.21 rolls around 👍
FYI, …
I have pulled the latest changes from …
I have removed the … Another cause for the issue could be the upgrade to Emscripten v4.0.3. Any clue?
Those symbols are C++'s std, not Rust. It's definitely caused by the Emscripten upgrade. Was the ONNX Runtime binary compiled with Emscripten 4.0.3? |
It should have been, according to the log files. Update: Soon it must be Emscripten 4.0.4 :) |
Is there any progress? I tried to reproduce but failed. I am looking forward to the final merge. |
Pure WASI is a no, unfortunately. |
@raphaelmenges Did you ever get to the bottom of that linking issue? And do you need my help with anything here? Looks like ONNX Runtime v1.21 is …
I have not investigated the issue further yet. I can do so sometime next week, if that is early enough for you!
I have upgraded my code to include your latest changes and your precompiled …

I then removed linking to …
I have then checked the logs of your precompiled … In the release notes of v1.21 it is also mentioned that …
Dawn is indeed required for Emscripten too. Does the …? Let's get WASM32 working before looking into WASM64. Browser support doesn't seem too good anyway (though better than WebGPU, to be fair 😶)
I tried …

Anyway, I am fine if you decide not to include the WebGPU EP in the WASM example at the moment. I am also happy if you take over this pull request now and make any changes you want, e.g., remove the WebGPU EP code!
Hmm, those …
In another context I just realized that the YoloV8 model you provide now has a different URL: https://cdn.pyke.io/0/pyke:ort-rs/example-models@0.0.0/yolov8m.onnx Can you fix that link in the WASM example code?
I don't think I can push to your branch because it's an organization-owned fork, but I updated the model URL and made another minor tweak to fix the name of the output .js file, and it's all working! Next is WebGPU =)

EDIT: yeah, no WebGPU for now. It links fine and acknowledges the EP but complains of a Rust panic during the call to …

EDIT 2: seems like the WebGPU build is just broken in general, as removing the registration makes it panic at a different point. I still don't know where it would be panicking with an empty message. This type of weirdness only ever happens on WASM 🙃
This might be the pull request to track for the status of the WebGPU EP in WASM: microsoft/onnxruntime#23697 I am already using the WebGPU EP successfully on macOS, btw. :)
Opened #363 with my changes, thank you for all your hard work here! |
According to @fs-eire, the WebGPU EP should now work in WASM: microsoft/onnxruntime#23072 (comment)
Co-authored-by: r.menges <Raphael.Menges@alfatraining.de>
Hello 👋,
I noticed that support for the WASM (WASI?) target in this crate has been officially dropped. However, I recently encountered a use case that requires me to run `onnxruntime` on the Web. This led me to investigate whether `ort` could work within an Emscripten environment.

I discovered that this crate can be used as-is with the `wasm32-unknown-emscripten` Rust compilation target! While the setup can get a bit complex, especially when enabling multi-threading in `onnxruntime`, it works well. I would love to see this example merged into the official repository, but I also understand if it is considered too experimental to be officially endorsed.

Here's a `.gif` for motivation, showcasing my example using the YoloV8 model to classify objects in pictures on Chromium:

The Less Crazy Part
Microsoft does not provide precompiled static libraries for WASM, so I created a GitHub Actions workflow to handle this. The generated `libonnxruntime.a` can be linked with `ort` as usual, even when targeting `wasm32-unknown-emscripten`.

To expose Rust functions to JavaScript, the Rust main file must provide a C interface, which Emscripten can export. I use `rust-embed` to bundle the `.onnx` model into the `.wasm`. An `index.html` file then incorporates the `.js` and `.wasm` outputs, which include the compiled Rust code, `onnxruntime`, and the model.

Additionally, the Emscripten SDK version used by the Rust compiler must match the exact version used by `onnxruntime` for successful linking.

The Crazy Part
Things get trickier when enabling multi-threading in `onnxruntime`. Since v1.19.0, Microsoft recommends enabling multi-threading using the `--enable_wasm_threads` build flag. This links `libonnxruntime.a` to `pthread`, meaning all linked objects, including Rust's standard library, must also be compiled with `pthread` support.

However, Rust's standard library is not compiled this way by default, so you must switch to Rust nightly and compile the Rust standard library with `+atomics,+bulk-memory,+mutable-globals`.
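As a rough sketch, the nightly setup described above could be captured in a `.cargo/config.toml`; the file layout and the `build-std` crate list are my assumptions, only the target features come from the text:

```toml
# Hypothetical .cargo/config.toml for a multi-threaded Emscripten build.
# Requires a nightly toolchain (build-std is unstable); the target features
# match the +atomics,+bulk-memory,+mutable-globals set mentioned above.
[build]
target = "wasm32-unknown-emscripten"

[target.wasm32-unknown-emscripten]
rustflags = ["-C", "target-feature=+atomics,+bulk-memory,+mutable-globals"]

[unstable]
build-std = ["std", "panic_abort"]
```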
Additionally, the server must set specific CORS headers, as multi-threaded Emscripten uses `SharedArrayBuffer` in the Web browser, which requires cross-origin isolation (typically `Cross-Origin-Opener-Policy: same-origin` and `Cross-Origin-Embedder-Policy: require-corp`).

Verdict
I am already opening the pull request as a draft to get early feedback. However, at least the following ToDos are pending before a merge could happen:

- … `ort`'s precompiled static `libonnxruntime` mechanism.

In the future, I would love to see execution providers for the Web become available in `ort`:

- … Might be the best option for now?
- … Seems to be exposed to the JavaScript world only.
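To make "The Less Crazy Part" above more concrete, here is a minimal, hypothetical sketch of exposing a Rust function to JavaScript via Emscripten's C ABI. The function name is illustrative, and the embedding comment uses std's `include_bytes!` rather than the `rust-embed` crate the example actually uses:

```rust
// Minimal sketch: a Rust function exported through the C ABI so that
// Emscripten can list it in EXPORTED_FUNCTIONS and JavaScript can call
// it via Module.ccall / Module.cwrap. Names here are illustrative.

// A model could be embedded at compile time with std's include_bytes!
// (the example uses the rust-embed crate instead):
// static MODEL_BYTES: &[u8] = include_bytes!("yolov8m.onnx");

/// `#[no_mangle]` keeps the symbol name stable; `extern "C"` gives it
/// the C calling convention that Emscripten's JS glue expects.
#[no_mangle]
pub extern "C" fn add(a: i32, b: i32) -> i32 {
    a + b
}

fn main() {
    // Emscripten builds still want a main; calling the export here just
    // shows it behaves like a normal Rust function when run natively.
    println!("{}", add(2, 3));
}
```

On the JavaScript side one would then call something like `Module.ccall("add", "number", ["number", "number"], [2, 3])`, assuming `add` was listed in Emscripten's `EXPORTED_FUNCTIONS`.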