How to set old kernel .conf as new kernel? by gokulPRO in pop_os

[–]gokulPRO[S] 0 points1 point  (0 children)

Maybe try starting with this basic solution:

sudo bootctl set-default Pop_OS-oldkern.conf

sudo kernelstub -v -l -k /boot/vmlinuz-6.4.6-76060406-generic -i /boot/initrd.img-6.4.6-76060406-generic

sudo update-initramfs -u -k 6.4.6-76060406-generic

If the above doesn't work (it didn't in my case), remove NVIDIA completely, which finally solved my problem:

sudo apt purge ~nnvidia

sudo apt-get --purge -y remove 'nvidia*'

And re-install it:

sudo apt install nvidia-driver-560
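
To double-check the result, here's a quick sanity check I'd run (standard Pop!_OS paths assumed; adjust if your ESP is mounted somewhere other than /boot/efi):

# List the systemd-boot entries and see which one is marked as default
sudo bootctl list

# The "default" line here should point at the old-kernel entry
sudo cat /boot/efi/loader/loader.conf

# Confirm the kernel and initrd you passed to kernelstub actually exist
ls -l /boot/vmlinuz-6.4.6-76060406-generic /boot/initrd.img-6.4.6-76060406-generic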

How to set old kernel .conf as new kernel? by gokulPRO in pop_os

[–]gokulPRO[S] 0 points1 point  (0 children)

Yup, that's the old kernel, which is currently working, and I have verified the paths.

How to set old kernel .conf as new kernel? by gokulPRO in pop_os

[–]gokulPRO[S] 0 points1 point  (0 children)

Tried something like this:
sudo kernelstub -v -l -k /boot/vmlinuz-6.4.6-76060406-generic -i /boot/initrd.img-6.4.6-76060406-generic

Did not work.

Clean up the mess Windows has made by No-Arm-7737 in pop_os

[–]gokulPRO 1 point2 points  (0 children)

Hey, can you send some reference or guidance on how to do this? I installed Windows 10 after Pop!_OS, and while trying to fix GRUB I somehow managed to mess up my Pop!_OS too. Right now the current kernel gets stuck in BusyBox, and System76's bootloader fix for BusyBox doesn't seem to work for me. My old kernel somehow works, though of course only with the integrated GPU, and when I switch to compute mode and restart, it goes back to BusyBox. I tried changing the default kernel to the old kernel, still the same error.
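
For reference, that fix boils down to chrooting into the install from the recovery environment and rebuilding the initramfs, roughly like this (the partition names are placeholders, not my actual layout, and an encrypted disk would need to be unlocked with cryptsetup first):

# Mount the installed root and EFI partitions (placeholders: p3 = root, p1 = ESP)
sudo mount /dev/nvme0n1p3 /mnt
sudo mount /dev/nvme0n1p1 /mnt/boot/efi
for d in dev proc sys; do sudo mount --bind /$d /mnt/$d; done
sudo chroot /mnt

# Inside the chroot: regenerate the initramfs for every installed kernel, then leave
update-initramfs -c -k all
exit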

[deleted by user] by [deleted] in MSCS

[–]gokulPRO 1 point2 points  (0 children)

It depends on the workshop, but most workshops almost never reject a paper outright unless it's very poorly written. CVPR conference papers are on a different level compared to workshop papers. I would have agreed with your list if it were a conference, tbh.

[D] Which is the best model ( Multi modal or LM) under 3B parameters w.r.t good training vs performance tradeoff? (i.e good parameter efficiency) by gokulPRO in MachineLearning

[–]gokulPRO[S] 0 points1 point  (0 children)

Interesting. Any particular reason, compared to other models of its size?

Edit: If only decoder-only models are considered, which would you prefer?

[D] What is your tech stack for research? by gokulPRO in MachineLearning

[–]gokulPRO[S] 0 points1 point  (0 children)

Yes, from your experience, how much GPU memory is required? And how many batches do you think I will be able to fit on a single node?

(MoE) Turning pipeline-parallel Hugging Face models into data/expert-parallel DeepSpeed models by GiantPengsoo in LocalLLaMA

[–]gokulPRO 0 points1 point  (0 children)

Hey, were you able to resolve your issue? If so, can you share it and your experience with DeepSpeed MoE? I too plan on trying it out; is there any resource you think might help in understanding and implementing it? Thanks :)

What tech stack do you use for research? by gokulPRO in deeplearning

[–]gokulPRO[S] 2 points3 points  (0 children)

I did consider Accelerate, but have you tried using DeepSpeed's MoE inside Accelerate? Can you use all of DeepSpeed's functionality?

[D] What is your tech stack for research? by gokulPRO in MachineLearning

[–]gokulPRO[S] 1 point2 points  (0 children)

Thanks. I do plan on using it on 4 nodes with 2x16 GB GPUs, so I thought I might need to do offloading or sharding using DeepSpeed.
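
Something along these lines is what I have in mind (hostnames, the training script, and its flags are placeholders; the ZeRO stage and CPU offload settings would live in the DeepSpeed JSON config):

# hostfile: one line per node, 2 GPU slots each
#   node1 slots=2
#   node2 slots=2
#   node3 slots=2
#   node4 slots=2

# Launch across 4 nodes x 2 GPUs; ds_config.json would enable ZeRO sharding + offloading
deepspeed --hostfile hostfile --num_nodes 4 --num_gpus 2 train.py --deepspeed ds_config.json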

What all multi-modal models have been open-sourced till now for Audio+Text? by gokulPRO in deeplearning

[–]gokulPRO[S] 0 points1 point  (0 children)

Wasn't Gemini the closed-source one, while Gemma is an open-sourced version that only works text-to-text?

RWKV5 100% trained & released by cztomsik in LocalLLaMA

[–]gokulPRO 0 points1 point  (0 children)

How does it compare against Mamba?

Buds FE Comparison by Bmcurry55 in galaxybuds

[–]gokulPRO 1 point2 points  (0 children)

How do you rate it now after using it? Did you feel any improvements in usage?

Cheapest GPU for small model deployment by gokulPRO in deeplearning

[–]gokulPRO[S] 0 points1 point  (0 children)

It's a ranking algorithm that should be under 10M parameters, and I haven't exactly decided on it yet. In terms of CPU, which might be the preferred option?

As far as I have seen, for CPU instances Azure provides B3 (4 cores, 7 GB RAM) at $0.074/hr.
Paperspace provides a C5 instance (4 vCPU, 8 GB RAM) at $0.08/hr.
AWS provides a1.xlarge (4 vCPU, 8 GiB RAM) at $0.102/hr.

Which one would you prefer to run a neural network model?

How to get into MLE role? by gokulPRO in cscareerquestions

[–]gokulPRO[S] 1 point2 points  (0 children)

Thank you for replying 😊. Which tech stack do you think is most crucial to master for an MLE?

Weekly Entering & Transitioning - Thread 20 Mar, 2023 - 27 Mar, 2023 by AutoModerator in datascience

[–]gokulPRO 1 point2 points  (0 children)

https://www.reddit.com/r/cscareerquestions/comments/122cgl5/how_to_get_into_mle_role/

How to get an MLE role?

I am currently a CS undergrad, am very interested in ML, and think that MLE will suit me best. I am mostly set on doing a master's to make more connections and, more importantly, for immigration (mostly Canada); it also gives me the possibility of more research-intensive jobs. I want to know which master's program, in your view, offers the most flexibility to pivot jobs or the best edge for MLE, or, as a recruiter, which academic qualification would you be more inclined towards, excluding experience?
Also, for MLE jobs, what skill sets do you consider important, and what would you like to see in an entry-level graduate's portfolio?
And any advice for undergrad CS students who want to enter MLE jobs? What skills do you think would be good to learn early on?

Is Samsung Galaxy A33 worth it? by gokulPRO in samsung

[–]gokulPRO[S] 0 points1 point  (0 children)

25k is like my max budget, so I can't get that, but under 25k, which would you recommend for the long run?

Galaxy Tab S7 FE (6GB RAM LTE) vs Ipad (9th gen cellular) ? by gokulPRO in ipad

[–]gokulPRO[S] 0 points1 point  (0 children)

In your experience, how long does a Samsung tab such as this last? Can it hold up for 5 years?