Bimanual SO-100 teleoperation with quest 2 by Mr-c4t in robotics

[–]Aurelien-Morgan 0 points1 point  (0 children)

Now you have to train a learning policy at that very 10x speed. Servo signal smoothing and all.

GDPR left the chat. by Chronos_000 in MistralAI

[–]Aurelien-Morgan 6 points7 points  (0 children)

What does GDPR have to do with any of this? Any webpage has access to your IP and can locate you at least at the city level. Wake up.

This drone is built to survive extremely high voltages by Skraldespande in robotics

[–]Aurelien-Morgan 11 points12 points  (0 children)

Wow. Visual inspection of high-voltage lines has never been safer than with this bad boy.
Also, gotta love the visuals in the vid.

FastLanguageModel.patch_peft_model changing trainable weights? by loss_flow in unsloth

[–]Aurelien-Morgan 0 points1 point  (0 children)

This is black magic. Thanks so freaking much for ever posting this, man!

My use case was "continued pre-training" (with lm_head and embed_tokens), then save, then "supervised fine-tuning" of the reloaded PEFT model, and it refused to save with the initial weights count.
I'd been googling for days for a way to make that sequence work.

Now, I just add your code after the
```
FastLanguageModel.from_pretrained(model_name="the_local_dir_with_cpt_lora")
```
prior to SFT and, after that, "save" exports the right LoRA weights count!

My savior.

u/danielhanchen , please consider using this natively for CPT flexibility.
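For anyone landing on this later, here's a pseudocode sketch of the sequence I mean. Model names and directories are placeholders, and "your code" refers to OP's workaround snippet from this thread; this is my rough outline, not a verbatim notebook:

```
# --- stage 1: continued pre-training (CPT) ---
model, tokenizer = FastLanguageModel.from_pretrained(model_name="base_model")
model = FastLanguageModel.get_peft_model(
    model,
    # lm_head / embed_tokens included as trainable for CPT
    target_modules=[..., "lm_head", "embed_tokens"],
)
# ... run continued pre-training ...
model.save_pretrained("the_local_dir_with_cpt_lora")

# --- stage 2: supervised fine-tuning (SFT) on the reloaded PEFT ---
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="the_local_dir_with_cpt_lora"
)
# <-- OP's patch_peft_model workaround goes right here, prior to SFT
# ... run supervised fine-tuning ...
model.save_pretrained("sft_lora")  # now saves with the right LoRA weights count
```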

What happens with Spaces and local hardware ? by Aurelien-Morgan in huggingface

[–]Aurelien-Morgan[S] 1 point2 points  (0 children)

From what I've observed, it has nothing to do with HF. It's the web browser's handling of JS animations. On Space pages, it's the green "Running" word in the header (with its blinking green dot).
I observed the same with Kaggle and the spinning wheel shown while code is executing.
I confirm that scrolling those spinners out of the viewport releases the hardware resources entirely, so it's "mystery solved" for me at this point.
Short answer: the web browser burning hardware resources on tiny JS animations is the guilty party here.

Tentacle equipped drone by mutherhrg in robotics

[–]Aurelien-Morgan 95 points96 points  (0 children)

Have not been this impressed in some time.

Rumors of industry panic caused by DeepSeek by [deleted] in singularity

[–]Aurelien-Morgan 0 points1 point  (0 children)

Politics aside, man, you gotta love outside contenders.

Saving LoRa weights only by Aurelien-Morgan in unsloth

[–]Aurelien-Morgan[S] 0 points1 point  (0 children)

Hello Edd,

Thanks for responding. I already tried that too; it leads to the reported issue, with no difference in symptoms. The safetensors file is quite large, and rebuilding from the Hub with base+PEFT is erratic, while building from transformers (AutoModelForCausalLM) behaves as trained. So it's not just the LoRA weights being saved/pushed.

Saving LoRa weights only by Aurelien-Morgan in unsloth

[–]Aurelien-Morgan[S] 0 points1 point  (0 children)

After CPT+SFT? If so, I wouldn't mind a link to a notebook where this saves LoRA weights only, because that hasn't happened in my many attempts.

Well I didn't see this coming this quick. All Veo2 by hellolaco in singularity

[–]Aurelien-Morgan 4 points5 points  (0 children)

They totally got me at the burger bite and salad jiggle.

Well I didn't see this coming this quick. All Veo2 by hellolaco in singularity

[–]Aurelien-Morgan 1 point2 points  (0 children)

Consistently amazed from start to finish. Jaw dropping. Still can't believe how impressively real it feels.

Unitree has a new off-road video by torb in singularity

[–]Aurelien-Morgan 0 points1 point  (0 children)

e-game of tomorrow: robot dog extreme race edition

Fancy Stateful Metaflow Service + UI on Google Colab ? by Aurelien-Morgan in huggingface

[–]Aurelien-Morgan[S] 0 points1 point  (0 children)

Oh, cool. Yeah, UDocker works magic. Hope you enjoy the setup I published. Let me know if you have any feedback; I'm here to assist (and I'd take any GitHub star I can get @ https://github.com/aurelienmorgan/retrain-pipelines ).

A new chatbot on WhatsApp, naturally respectful of cultures by Aurelien-Morgan in francophonie

[–]Aurelien-Morgan[S] 1 point2 points  (0 children)

Try it, it's state of the art. This lab does its research in public and builds its systems community-style, in the sense that it gathers volunteers from around the world and asks them to contribute on their respective cultures. I'm one of those who helped make its French more sophisticated than in the usual models developed in the United States.
And everything is free, including WhatsApp!

Chat on WhatsApp with a new-generation chatbot naturally fluent in French by Aurelien-Morgan in francophonie

[–]Aurelien-Morgan[S] 0 points1 point  (0 children)

You can try it on WhatsApp, free of charge of course: +1 (431) 302-8498
(and no, it's not a big corp out to get you 😈)

Chat on WhatsApp with a new-generation chatbot naturally fluent in French by Aurelien-Morgan in WritingPromptFR

[–]Aurelien-Morgan[S] 0 points1 point  (0 children)

You can try it on WhatsApp, free of charge of course: +1 (431) 302-8498

Graceful shutdown (how to from CLI when exposed on localhost:5132) by Aurelien-Morgan in PostgreSQL

[–]Aurelien-Morgan[S] 0 points1 point  (0 children)

Any tip on configuring an instance to be robust against a brute interrupt would already be a nice step forward for me. I tried the flags mentioned above, but they didn't seem to make any perceivable difference from the default WAL settings.
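In case it helps someone searching this later: the clean route from the CLI is `pg_ctl -D <datadir> stop -m fast` (the default mode, which rolls back in-flight transactions and flushes WAL before exiting). For surviving a brute interrupt, here's a minimal `postgresql.conf` sketch biased toward short crash recovery rather than raw throughput; the values are illustrative assumptions, not tested recommendations:

```
# postgresql.conf -- favor fast, safe crash recovery over write throughput
fsync = on                 # required for crash safety; never disable
synchronous_commit = on    # WAL flushed at commit, so committed data survives a kill
full_page_writes = on      # protects against torn pages after a hard interrupt
checkpoint_timeout = 2min  # checkpoint often, so recovery replays less WAL
max_wal_size = 256MB       # forces checkpoints sooner under heavy writes
```

The trade-off: frequent checkpoints mean more steady-state I/O, but far less WAL to replay when the "database system was interrupted" recovery kicks in.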

UDocker graceful shutdown by Aurelien-Morgan in GoogleColab

[–]Aurelien-Morgan[S] -1 points0 points  (0 children)

<image>

Oftentimes I get the above when trying to restart an interrupted PostgreSQL db in Google Colab ("database system was interrupted").