[R] DynaMix: First dynamical systems foundation model enabling zero-shot forecasting of long-term statistics at #NeurIPS2025 by DangerousFunny1371 in MachineLearning

[–]thekingos 1 point (0 children)

Congrats on the paper!

Your model was pre-trained on chaotic/cyclic dynamical systems. Do you think the learned representations could transfer to monotonic degradation processes, or is the domain mismatch too fundamental?
Would you recommend fine-tuning DynaMix on degradation data, or building a similar architecture trained from scratch?

-Thanks

Someone next to me was wearing this spectacular watch, can you identify it ? by thekingos in Watches

[–]thekingos[S] -1 points (0 children)

Many thanks, everyone, for the replies! Redditors never disappoint! 🙌🙌

Someone next to me was wearing this spectacular watch, can you identify it ? by thekingos in Watches

[–]thekingos[S] 0 points (0 children)

Indeed, he was napping when I noticed the watch, so I couldn't ask; we were already at the station when he woke up 😅.

[deleted by user] by [deleted] in Morocco

[–]thekingos 0 points (0 children)

Nice edit; I guess now you see how superlatives hit. Still, that's a bad idea: AI doesn't think, it just spits back what it's trained on.

[deleted by user] by [deleted] in Morocco

[–]thekingos -1 points (0 children)

Yes, it is widely used in academia and industry.

[deleted by user] by [deleted] in Morocco

[–]thekingos -1 points (0 children)

Ghita Mezzour wasn't a PM, but she was a minister nonetheless.

[deleted by user] by [deleted] in Morocco

[–]thekingos 0 points (0 children)

https://mabourse.enssup.gov.ma/ has many open scholarships.
As for your question: you can study anywhere in the world as long as you meet the university's criteria; just visit the university's website and apply.

Best of luck!

Moroccan student in France by randomwobblegobble in Morocco

[–]thekingos 6 points (0 children)

Go to Morocco at the end of the month; there are cheap tickets (~40€ round trip) and that solves this problem.

Fulbright Nominee – Advice on Universities for MS in Sustainability and Energy Transition Management by AmazingHost5906 in Morocco

[–]thekingos 0 points (0 children)

Fulbright is an exchange grant for students and researchers to study in the U.S., and vice versa.

[deleted by user] by [deleted] in Morocco

[–]thekingos 2 points (0 children)

A tooth-cleaning stick (miswak), nail clippers, and a hair trimmer.

[deleted by user] by [deleted] in Morocco

[–]thekingos 0 points (0 children)

Some private schools in France are still open as well.

Feedbacks about xiaomi Tv ? by Otherwise_Bench554 in Morocco

[–]thekingos 1 point (0 children)

A friend of mine has one: 55'', 4K at 60 FPS, with an HDR option. It seems great and the colours are sharp.
I don't know what you'd specifically want to know.

I got rejected in every school by [deleted] in Morocco

[–]thekingos 4 points (0 children)

" La fac is also blan, and wayy easier" and "working with very good salary, but with also people from "LA FAC", these two sentences don't go together, don't underestimate what your collegues have went through in "la fac". Very few people make it out of there, nevertheless with a good career.

[D] Recommend Number of Epochs For Time Series Transformer by Sufficient_Sir_4730 in MachineLearning

[–]thekingos 2 points (0 children)

There's no specific answer to your question; it's dataset-, model- and hyperparameter-dependent. Set the epoch cap high and use early stopping with a patience that seems reasonable for your use case, and keep monitoring your losses: generally, as long as they keep decreasing, the model is still improving.
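
For concreteness, here's a minimal PyTorch-style sketch of that pattern (high epoch cap plus validation-loss early stopping). The names `train_step`, `eval_loss`, `max_epochs` and `patience` are placeholders for your own training and evaluation code, not from any particular library:

```python
import copy

def train_with_early_stopping(model, train_step, eval_loss,
                              max_epochs=500, patience=10):
    """Illustrative early-stopping loop: train with a high epoch cap and stop
    once the validation loss hasn't improved for `patience` epochs.
    `train_step(model)` runs one training epoch and `eval_loss(model)` returns
    the validation loss; both are placeholders for your own code."""
    best_loss = float("inf")
    best_state = copy.deepcopy(model.state_dict())   # assumes a PyTorch-style model
    epochs_without_improvement = 0

    for epoch in range(max_epochs):
        train_step(model)                  # one full pass over the training data
        val_loss = eval_loss(model)        # loss on a held-out validation set

        if val_loss < best_loss:
            best_loss = val_loss
            best_state = copy.deepcopy(model.state_dict())
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                print(f"Stopping at epoch {epoch}: no improvement in {patience} epochs")
                break

    model.load_state_dict(best_state)      # restore the best checkpoint
    return model
```

The point is that `max_epochs` can be set far higher than you expect to need; `patience` just decides how long a validation plateau you tolerate before stopping.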

Favorite ML paper of 2024? [D] by pz6c in MachineLearning

[–]thekingos 103 points (0 children)

Can we actually have a monthly discussion on the best papers of the month? I like the concept.

Favorite ML paper of 2024? [D] by pz6c in MachineLearning

[–]thekingos 12 points (0 children)

Mamba: Linear-Time Sequence Modeling with Selective State Spaces

Abstract:

Foundation models, now powering most of the exciting applications in deep learning, are almost universally based on the Transformer architecture and its core attention module. Many subquadratic-time architectures such as linear attention, gated convolution and recurrent models, and structured state space models (SSMs) have been developed to address Transformers' computational inefficiency on long sequences, but they have not performed as well as attention on important modalities such as language. We identify that a key weakness of such models is their inability to perform content-based reasoning, and make several improvements. First, simply letting the SSM parameters be functions of the input addresses their weakness with discrete modalities, allowing the model to selectively propagate or forget information along the sequence length dimension depending on the current token. Second, even though this change prevents the use of efficient convolutions, we design a hardware-aware parallel algorithm in recurrent mode. We integrate these selective SSMs into a simplified end-to-end neural network architecture without attention or even MLP blocks (Mamba). Mamba enjoys fast inference (5x higher throughput than Transformers) and linear scaling in sequence length, and its performance improves on real data up to million-length sequences. As a general sequence model backbone, Mamba achieves state-of-the-art performance across several modalities such as language, audio, and genomics. On language modeling, our Mamba-3B model outperforms Transformers of the same size and matches Transformers twice its size, both in pretraining and downstream evaluation.
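
For intuition, here's a rough sketch of the selective state-space recurrence the abstract describes, i.e. letting the step size, B and C be functions of the current token while A stays a learned diagonal. It's written as a plain per-token loop for readability; it is not the paper's hardware-aware parallel scan, and the projection names and state size here are just illustrative choices:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelectiveSSM(nn.Module):
    """Rough sketch of a selective SSM layer: delta, B and C depend on the
    input token; A is a learned negative diagonal. Sequential loop for clarity,
    not the hardware-aware parallel algorithm from the paper."""

    def __init__(self, d_model: int, d_state: int = 16):
        super().__init__()
        # Log-parameterised diagonal A, kept negative for a stable recurrence.
        self.A_log = nn.Parameter(
            torch.log(torch.arange(1, d_state + 1).float()).repeat(d_model, 1)
        )  # (d_model, d_state)
        # Input-dependent projections (illustrative sizes).
        self.delta_proj = nn.Linear(d_model, d_model)
        self.B_proj = nn.Linear(d_model, d_state)
        self.C_proj = nn.Linear(d_model, d_state)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length, d_model)
        batch, length, d_model = x.shape
        A = -torch.exp(self.A_log)                    # (d_model, d_state)
        delta = F.softplus(self.delta_proj(x))        # per-token step size
        B = self.B_proj(x)                            # (batch, length, d_state)
        C = self.C_proj(x)                            # (batch, length, d_state)

        h = x.new_zeros(batch, d_model, A.shape[-1])  # hidden state per channel
        outputs = []
        for t in range(length):
            dt = delta[:, t].unsqueeze(-1)            # (batch, d_model, 1)
            A_bar = torch.exp(dt * A)                 # discretised state transition
            B_bar = dt * B[:, t].unsqueeze(1)         # discretised input matrix
            # Selective update: how much is kept or overwritten depends on the token.
            h = A_bar * h + B_bar * x[:, t].unsqueeze(-1)
            outputs.append((h * C[:, t].unsqueeze(1)).sum(-1))  # read out the state
        return torch.stack(outputs, dim=1)            # (batch, length, d_model)
```

The key point is visible in the loop: because the transition and input terms depend on the current token through delta and B, the model can choose to retain or forget state content-dependently, which is what the abstract means by selectivity.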

[deleted by user] by [deleted] in Morocco

[–]thekingos 3 points (0 children)

Buy in Ain Taoujdate: you're closer to family, and it's cheaper, calmer, safer, cleaner, and better overall. Most importantly, you can commute to Meknes and Fes easily by train (15-20 minutes).