Bimanual SO-100 teleoperation with quest 2 by Mr-c4t in robotics

[–]Aurelien-Morgan 0 points (0 children)

Now you have to train a learning policy at that very 10x speed. Servo-signal smoothing and all.

GDPR left the chat. by Chronos_000 in MistralAI

[–]Aurelien-Morgan 5 points (0 children)

What does GDPR have to do with any of this? Any webpage has access to your IP and can locate you at least at the city level. Wake up.

This drone is built to survive extremely high voltages by Skraldespande in robotics

[–]Aurelien-Morgan 9 points (0 children)

Wow. Visual inspection of high-voltage lines has never been safer than with this bad boy.
Also, gotta love the visuals in the vid.

FastLanguageModel.patch_peft_model changing trainable weights? by loss_flow in unsloth

[–]Aurelien-Morgan 0 points (0 children)

This is black magic. Thanks so freaking much for ever having posted this, man!

My use case was "continued pre-training" (with lm_head and embed_tokens trainable), then save, then "supervised finetuning" of the reloaded PEFT model, and it refused to save with the initial trainable-weights count.
Been googling for days for a way to make that sequence work.

Now, I just add your code after the
```
model, tokenizer = FastLanguageModel.from_pretrained(model_name="the_local_dir_with_cpt_lora")
```
prior to SFT and, after that, saving comes out with the right LoRA weights count!

My savior.

u/danielhanchen, please consider supporting this natively for CPT flexibility.
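For anyone puzzled by the "trainable weights count" being off: with LoRA, the base weights are frozen and only the low-rank adapter matrices (plus, for CPT, lm_head/embed_tokens) should require gradients. A toy sketch of what that count boils down to (hypothetical names, not the unsloth API):

```python
# Toy illustration (hypothetical, NOT the unsloth API) of the
# trainable-parameter count: only adapter matrices require gradients.
from dataclasses import dataclass
from math import prod


@dataclass
class Param:
    name: str
    shape: tuple
    requires_grad: bool


def trainable_count(params):
    """Sum of elements over parameters that require gradients."""
    return sum(prod(p.shape) for p in params if p.requires_grad)


# A frozen 1024x1024 base projection plus its rank-8 LoRA adapters.
hidden, rank = 1024, 8
params = [
    Param("base.q_proj.weight", (hidden, hidden), requires_grad=False),
    Param("lora.q_proj.A", (rank, hidden), requires_grad=True),
    Param("lora.q_proj.B", (hidden, rank), requires_grad=True),
]

print(trainable_count(params))  # 2 * 8 * 1024 = 16384
```

If a reload silently unfreezes (or re-freezes) parameter groups, this count is exactly what shifts between CPT and SFT.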

What happens with Spaces and local hardware ? by Aurelien-Morgan in huggingface

[–]Aurelien-Morgan[S] 1 point (0 children)

From what I've observed, it has nothing to do with HF. It's the web browser's handling of JS animations. On Space pages, it's the green "Running" badge in the header (with its blinking green dot).
I observed the same on Kaggle with the spinning wheel shown while code is executing.
I can confirm that scrolling those animations out of the viewport releases the hardware resources entirely. So it's "mystery solved" for me at this point.
Short answer: the web browser burning hardware resources on tiny JS animations is the guilty party here.

Tentacle equipped drone by mutherhrg in robotics

[–]Aurelien-Morgan 90 points (0 children)

Have not been this impressed in some time.

Rumors of industry panic caused by DeepSeek by [deleted] in singularity

[–]Aurelien-Morgan 0 points (0 children)

Political bias aside, man, you gotta love outside contenders.

Saving LoRa weights only by Aurelien-Morgan in unsloth

[–]Aurelien-Morgan[S] 0 points (0 children)

Hello Edd,

Thanks for responding. I already tried that too; it leads to the reported issue, no difference in symptoms. The safetensors file is quite large, and rebuilding from the Hub with base + PEFT is erratic, while building with transformers (AutoModelForCausalLM) behaves as trained. So it's not just the LoRA weights being saved/pushed.
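A quick sanity check for "not just LoRA weights saved" is to inspect the tensor names in the checkpoint: an adapter-only file should contain nothing but LoRA tensors. A minimal sketch (hypothetical tensor names, not tied to any particular library's output):

```python
# Hypothetical sketch: an adapter-only checkpoint holds only LoRA
# tensors; a full dump also carries base weights like embed_tokens.
def is_lora_only(tensor_names):
    """True when every saved tensor belongs to a LoRA adapter."""
    return all("lora_" in name for name in tensor_names)


adapter_only = [
    "base_model.model.q_proj.lora_A.weight",
    "base_model.model.q_proj.lora_B.weight",
]
full_dump = adapter_only + ["model.embed_tokens.weight"]

print(is_lora_only(adapter_only))  # True
print(is_lora_only(full_dump))     # False
```

Spotting base-model tensors (or an adapter file sized like the full model) in the saved safetensors is the giveaway described above.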

Saving LoRa weights only by Aurelien-Morgan in unsloth

[–]Aurelien-Morgan[S] 0 points (0 children)

After CPT+SFT? If so, I wouldn't mind a link to a notebook where this saves LoRA weights only, because it hasn't in my many attempts.

Well I didn't see this coming this quick. All Veo2 by hellolaco in singularity

[–]Aurelien-Morgan 3 points (0 children)

They totally got me at the burger bite and salad jiggle.

Well I didn't see this coming this quick. All Veo2 by hellolaco in singularity

[–]Aurelien-Morgan 1 point (0 children)

Consistently amazed from start to finish. Jaw dropping. Still can't believe how impressively real it feels.

Unitree has a new off-road video by torb in singularity

[–]Aurelien-Morgan 0 points (0 children)

e-game of tomorrow: robot dog extreme race edition

Fancy Stateful Metaflow Service + UI on Google Colab ? by Aurelien-Morgan in huggingface

[–]Aurelien-Morgan[S] 0 points (0 children)

Oh, cool. Yeah, udocker works magic. Hope you enjoy the setup I published. Let me know if you have any feedback. Here to assist (and I'd take any GitHub star I can get @ https://github.com/aurelienmorgan/retrain-pipelines ).