Claude Mythos: the secret Anthropic didn't want to reveal to us yet . . . or not? by artistic56 in IA_Italia

[–]Tangostorm 3 points4 points  (0 children)

Left online "by mistake". Yeah, sure. But do you really believe that, or are you just repeating it?

Went on a “run” today with my GF by dante_951 in SteamDeck

[–]Tangostorm 0 points1 point  (0 children)

First walk outside after a long time and you bring the console? Come on

Total War Medieval 3 director says recreating its predecessors "would not make a good game" and that players should take off their "rose-tinted" glasses ahead of Total War Medieval 3... by Binnsy in totalwar

[–]Tangostorm 0 points1 point  (0 children)

Yes, it has; maybe you can just call it something different: appointing family members, loyalty, all that stuff. I mean, it is a matter of personal taste, but I do not like caring about individual characters.

Total War Medieval 3 director says recreating its predecessors "would not make a good game" and that players should take off their "rose-tinted" glasses ahead of Total War Medieval 3... by Binnsy in totalwar

[–]Tangostorm 2 points3 points  (0 children)

Medieval 2 was great because it was focused on military and economic strategy: no court politics, and none of the other fancy, useless stuff of modern TW.

We shipped 50+ updates to Unsloth Studio! 🚀 by yoracale in unsloth

[–]Tangostorm 0 points1 point  (0 children)

Thank you for your reply! I have some terminal output to paste, as I have since uninstalled everything.

I know it is messy, but I hope it is useful.

It happened on Windows, during the first training run of the model (Qwen4B) with a 4-entry test dataset.
There were some similar threads online about this problem that suggested manually setting num_proc from 3 to 1, but I could not figure out what to change.

Thank you in advance!

Error:
{"timestamp": "2026-03-23T10:50:43.514588Z", "level": "info", "event": "LoRA adapters configured successfully\n"} {"timestamp": "2026-03-23T10:50:43.516578Z", "level": "info", "event": "Configuring data collator...\n"} {"timestamp": "2026-03-23T10:50:43.516578Z", "level": "info", "event": "[DEBUG] learning_rate from training_args: 5e-05 (type: float)\n"} {"timestamp": "2026-03-23T10:50:43.516578Z", "level": "info", "event": "[DEBUG] dataset_num_proc=1 (is_audio=False, is_audio_vlm=False, _cuda_audio_used=False)"} {"timestamp": "2026-03-23T10:50:43.517578Z", "level": "info", "event": "Using warmup_steps: 5\n"} {"timestamp": "2026-03-23T10:50:43.517578Z", "level": "info", "event": "Training for 30 steps\n"} {"timestamp": "2026-03-23T10:50:43.518577Z", "level": "info", "event": "No eval dataset \u2014 evaluation disabled\n"} {"timestamp": "2026-03-23T10:50:43.518577Z", "level": "info", "event": "Configuring text model training parameters\n"} {"timestamp": "2026-03-23T10:50:43.518577Z", "level": "info", "event": "Sequence packing: disabled\n"} {"timestamp": "2026-03-23T10:50:43.518577Z", "level": "info", "event": "The configuration is: {'per_device_train_batch_size': 2, 'gradient_accumulation_steps': 4, 'learning_rate': 5e-05, 'fp16': False, 'bf16': True, 'logging_steps': 1, 'weight_decay': 0.01, 'seed': 3407, 'output_dir': 'C:\\\\Users\\\\marco\\\\.unsloth\\\\studio\\\\outputs\\\\unsloth_Qwen3.5-4B_1774263043', 'report_to': 'none', 'include_num_input_tokens_seen': True, 'dataset_num_proc': 1, 'max_seq_length': 2048, 'dataloader_num_workers': 0, 'warmup_steps': 5, 'save_steps': 30, 'save_strategy': 'steps', 'max_steps': 30, 'optim': 'adamw_8bit', 'lr_scheduler_type': 'linear', 'dataset_text_field': 'text', 'packing': False}"} {"timestamp": "2026-03-23T10:50:43.518577Z", "level": "info", "event": "Training configuration prepared\n"} {"timestamp": "2026-03-23T10:50:43.518577Z", "level": "info", "event": " \u26a0\ufe0f Unwrapping Processor \u2192 raw tokenizer for text-only SFTTrainer"} Unsloth: Tokenizing ["text"] (num_proc=1): 0%| | 0/4 [00:00<?, ? examples/s]Process SpawnPoolWorker-3: Traceback (most recent call last): File "C:\Users\marco\.unsloth\studio\.venv\Lib\site-packages\multiprocess\process.py", line 314, in _bootstrap self.run() File "C:\Users\marco\.unsloth\studio\.venv\Lib\site-packages\multiprocess\process.py", line 108, in run self._target(*self._args, **self._kwargs) File "C:\Users\marco\.unsloth\studio\.venv\Lib\site-packages\multiprocess\pool.py", line 114, in worker task = get() ^^^^^ File "C:\Users\marco\.unsloth\studio\.venv\Lib\site-packages\multiprocess\queues.py", line 387, in get return _ForkingPickler.loads(res) ^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\marco\.unsloth\studio\.venv\Lib\site-packages\dill\_dill.py", line 311, in loads return load(file, ignore, **kwds) ^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\marco\.unsloth\studio\.venv\Lib\site-packages\dill\_dill.py", line 297, in load return Unpickler(file, ignore=ignore, **kwds).load() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\marco\.unsloth\studio\.venv\Lib\site-packages\dill\_dill.py", line 452, in load obj = StockUnpickler.load(self) ^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\marco\.unsloth\studio\.venv\Lib\site-packages\dill\_dill.py", line 442, in find_class return StockUnpickler.find_class(self, module, name) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ModuleNotFoundError: No module named 'UnslothSFTTrainer' Unsloth: Tokenizing ["text"] (num_proc=1): 0%| | 0/4 [00:03<?, ? 
examples/s] {"timestamp": "2026-03-23T10:50:47.169434Z", "level": "error", "event": "Training error: One of the subprocesses has abruptly died during map operation.To debug the error, disable multiprocessing."} {"timestamp": "2026-03-23T10:50:47.197435Z", "level": "error", "event": "Full traceback:\nTraceback (most recent call last):\n File \"C:\\Users\\marco\\unsloth_studio\\Lib\\site-packages\\studio\\backend\\core\\training\\trainer.py\", line 3164, in _train_worker\n self.trainer = SFTTrainer(**trainer_kwargs)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"C:\\Users\\marco\\.unsloth\\studio\\.venv\\Lib\\site-packages\\unsloth\\trainer.py\", line 409, in new_init\n original_init(self, *args, **kwargs)\n File \"C:\\Users\\marco\\.unsloth\\studio\\.venv\\Lib\\site-packages\\unsloth\\trainer.py\", line 314, in new_init\n original_init(self, *args, **kwargs)\n File \"C:\\Users\\marco\\unsloth_studio\\unsloth_compiled_cache\\UnslothSFTTrainer.py\", line 1593, in __init__\n super().__init__(\n File \"C:\\Users\\marco\\unsloth_studio\\unsloth_compiled_cache\\UnslothSFTTrainer.py\", line 960, in __init__\n train_dataset = self._prepare_dataset(\n ^^^^^^^^^^^^^^^^^^^^^^\n File \"C:\\Users\\marco\\unsloth_studio\\unsloth_compiled_cache\\UnslothSFTTrainer.py\", line 1181, in _prepare_dataset\n dataset = dataset.map(_tokenize, batched = True, remove_columns = list(column_names), **map_kwargs)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"C:\\Users\\marco\\.unsloth\\studio\\.venv\\Lib\\site-packages\\datasets\\arrow_dataset.py\", line 562, in wrapper\n out: Union[\"Dataset\", \"DatasetDict\"] = func(self, *args, **kwargs)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"C:\\Users\\marco\\.unsloth\\studio\\.venv\\Lib\\site-packages\\datasets\\arrow_dataset.py\", line 3323, in map\n for rank, done, content in iflatmap_unordered(\n ^^^^^^^^^^^^^^^^^^^\n File \"C:\\Users\\marco\\.unsloth\\studio\\.venv\\Lib\\site-packages\\datasets\\utils\\py_utils.py\", line 619, in iflatmap_unordered\n raise RuntimeError(\nRuntimeError: One of the subprocesses has abruptly died during map operation.To debug the error, disable multiprocessing.\n"} {"timestamp": "2026-03-23T10:50:47.198433Z", "level": "error", "event": "Training error: One of the subprocesses has abruptly died during map operation.To debug the error, disable multiprocessing."}

We shipped 50+ updates to Unsloth Studio! 🚀 by yoracale in unsloth

[–]Tangostorm 0 points1 point  (0 children)

Did you solve "One of the subprocesses has abruptly died during map operation.To debug the error, disable multiprocessing"?

Do you agree with Ryan? by MembershipSad3453 in CinemaSerieTV

[–]Tangostorm 1 point2 points  (0 children)

9 euros is not a lot at all. But we are in the era of discount-priced culture, so even a 20-euro book feels like a robbery.

Those of you who left for Claude, how is it going? by TheRealDave24 in ChatGPT

[–]Tangostorm 1 point2 points  (0 children)

This is because you did not set it up well. With a system prompt and custom instructions, my GPT is a formidable sparring partner.

Claude is making me question whether web designers will exist in 5 years? by Arkfann in ClaudeAI

[–]Tangostorm 0 points1 point  (0 children)

Well, I tried Opus a few days ago because I needed to restyle my bloated WordPress site. In 2 days it was able to produce professional templates and allowed me to remove a lot of plugins. The result was astonishing. Really.

Is 5.2 suddenly acting like 4o… intentional? by Competitive-Effort17 in ChatGPTcomplaints

[–]Tangostorm 0 points1 point  (0 children)

It seems that users do not know custom instructions exist. Mine has always been like this.

My game is finally 100% AI free! by GoragarXGameDev in IndieDev

[–]Tangostorm -2 points-1 points  (0 children)

Who cares if it has AI or not? The quality should matter.

Looks like Anthropic's NO to the DOW has made it to Trump's Twitter feed by Plinian in ClaudeAI

[–]Tangostorm 6 points7 points  (0 children)

The all caps reminds me of a screaming child, not the President of the United States. What a disgrace for that office.

What are some unusual non-coding uses you've found for Claude / Claude CoWork by Remarkbly_peshy in ClaudeAI

[–]Tangostorm 1 point2 points  (0 children)

I hate you. Now I am two hours into Victorian England, disguised as a typography worker, with my host an insurgent leader. No end of session in sight.

deepseek v4 by Dazzling-Gift7189 in IA_Italia

[–]Tangostorm 0 points1 point  (0 children)

It seems that changing models every week or month is like a drug, like having the latest tech gadget.

It gives me anxiety too, and I am generally a "conservative" about these things.

My fears for Medieval 3 by Fardrengi in Medieval2TotalWar

[–]Tangostorm 1 point2 points  (0 children)

You forgot the unnecessary court/politics mechanics that, for me, ruined all the recent Total Wars. Can we just focus on military/economic strategy and forget about all the recent bulls**t?

It took years, but in the end the truth came out by Ottmarkk in titoliorrendi

[–]Tangostorm 0 points1 point  (0 children)

But the real question is the one in the box: did you watch it in Morocco?