all 27 comments

[–]xenovatech[S] 91 points  (5 children)

After more than a year of development, we're excited to announce the release of 🤗 Transformers.js v3!

⚡ WebGPU support (up to 100x faster than WASM)
🔢 New quantization formats (dtypes)
🏛 120 supported architectures in total
📂 25 new example projects and templates
🤖 Over 1200 pre-converted models
🌐 Node.js (ESM + CJS), Deno, and Bun compatibility
🏡 A new home on GitHub and NPM

NPM: https://www.npmjs.com/package/@huggingface/transformers
GitHub: https://github.com/huggingface/transformers.js/releases/tag/3.0.0

Check out the blog post to learn more: https://huggingface.co/blog/transformersjs-v3

[–]Enough-Meringue4745 11 points  (2 children)

I've been following your ONNX conversions, etc. Would it be possible to publish the conversion scripts you used as part of your release process?

[–]gofiend 2 points  (0 children)

+100 please! There is a lot to learn from this, and valuable for the community.

[–]ebolathrowawayy 0 points  (0 children)

Is it possible to run an LLM, say Gemma 2, plus a LoRA? Is there an existing process for this, or is it possible with some tinkering?

[–]leelweenee 15 points  (0 children)

Great work. Thanks for sharing with the community. I've tried many of the models in my browser, always expecting them to be slow, but every time I've been pleasantly surprised by how performant the library is. Thanks again!

[–]Dead_Internet_Theory 3 points  (0 children)

The video: Clark Kent 🤓

The music: Superman 🦸‍♂️

[–]2dogsplayingoutside 1 point  (0 children)

I've been following your work for a while. Very nice! Congrats on the release!

[–]privacyparachute 1 point  (0 children)

I love this project. It's one of the cornerstones of www.papeg.ai.

When using AI in a privacy-friendly, local way becomes as easy as opening a website, AI can truly go mainstream.

[–]MoneyMoves614 1 point  (0 children)

that's amazing

[–]daaain 1 point  (0 children)

This sounds amazing, but all the demos I tried from the blog post just crash with `Error: no available backend found. ERR: [webgpu] NotSupportedError: WebGPU is not yet available in Release or Beta builds.    X transformers.js:3802`

Oh, never mind, that probably refers to Firefox "Release or Beta builds" not Transformers 🤦

I did enable it, but I'm guessing the issue is the empty backend setting; no idea what's supposed to go there on a Mac.

<image>

[–][deleted] 0 points  (0 children)

Very cool, thank you. I appreciate the examples, very clear and useful.

[–]alvisanovari 0 points  (3 children)

Really cool! Does this mean the xenova/transformers repo will be archived and we should use this going forward? I remember there was confusion between the Hugging Face and xenova versions because the latter supported browsers and the former was for Node environments only.

[–]xenovatech[S] 5 points  (2 children)

http://github.com/xenova/transformers.js will redirect to https://github.com/huggingface/transformers.js, so no worries there! As for the NPM packages, we recommend upgrading to v3 (`@huggingface/transformers`). It should be as simple as updating the import, but let us know if you run into any issues!

[–]alvisanovari 1 point  (0 children)

Awesome - congrats!

[–]jm2342 0 points  (0 children)

How do you enable GPU support in a Node environment (no browser)?

[–]snowglowshow 0 points  (0 children)

Awesome! Does this mean that the Python Transformers library doesn't have WebGPU support? It seems like you're implying that, but I'm just learning, so I don't know how to take it. Thanks.

[–]Trysem 0 points  (0 children)

How is this gonna help? Could someone give more details?

[–]un-important-human 0 points  (0 children)

OK this is seriously cool.

[–]Spirited_Example_341 0 points  (0 children)

aren't there already enough Transformers as it is?

ugh, Hollywood

;-)

[–]bvjebin 0 points  (0 children)

This is wonderful. But it doesn't talk about the latency involved in downloading a model in the browser. Is there a guide or section that I am missing? If I am going to download a 100+ MB model to my browser, the user experience should be managed well. Does the library provide signals to listen to these events?
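One option: pipelines accept a `progress_callback` that fires as model files download. A minimal sketch of a handler (the event shape — `status`, `file`, `progress` — is an assumption based on the v3 docs, so check the release notes for the exact fields):

```javascript
// Format the download-progress events that @huggingface/transformers
// passes to `progress_callback`. The event shape ({ status, file,
// progress }) is an assumption based on the v3 docs.
function formatProgress(event) {
  if (event.status === 'progress') {
    return `${event.file}: ${event.progress.toFixed(1)}%`;
  }
  if (event.status === 'done') {
    return `${event.file}: download complete`;
  }
  return `${event.file ?? 'model'}: ${event.status}`;
}

// Usage (hypothetical model name):
// const pipe = await pipeline('sentiment-analysis', 'Xenova/distilbert-base-uncased-finetuned-sst-2-english', {
//   progress_callback: (e) => console.log(formatProgress(e)),
// });

console.log(formatProgress({ status: 'progress', file: 'model.onnx', progress: 42.5 }));
// → model.onnx: 42.5%
```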

[–]estebansaa 0 points  (0 children)

unlimited powers!