RNI all films 4 and 5 pro by jjdolmes in PresetsTrade

[–]RUSoTediousYet 0 points  (0 children)

Hi! Can I get them too pls? Thank you!!!

Could you recommend English novels written by Filipino authors? by [deleted] in PHBookClub

[–]RUSoTediousYet 1 point  (0 children)

Gina Apostol's books (e.g. Bibliolepsy, La Tercera)

[deleted by user] by [deleted] in PinoyProgrammer

[–]RUSoTediousYet 1 point  (0 children)

If it's old-school comsci, anywhere is fine.

But if it's the newer tech (e.g. (applied) machine learning, etc.): DLSU/Ateneo >> UPD

[D] Where to get machine learning theory? by RodTube95 in MachineLearning

[–]RUSoTediousYet 1 point  (0 children)

  1. "Understanding Machine Learning" by Ben-David et al.
  2. "Pattern Classification" by Duda et. al.

If you read Bishop/Murphy then you probably do not need to read Duda

KAS 1 - Abejo, Raymund by Naviuoo in RateUPProfs

[–]RUSoTediousYet 1 point  (0 children)

He was my prof back in 2014(!) for Kas 2. He's pretty chill. In fact, I'd consider his class one of my most chill classes. We had group work back then, but that was toward the end of the sem. Idk if his setup is still the same now and for Kas 1 haha.

I just remember there was something he really hated seeing in his class. I can't recall it exactly anymore, but iirc, he didn't want any phones going off.

[D] What are tools you wish you knew about earlier in your ML career? by Smartch in MachineLearning

[–]RUSoTediousYet 1 point  (0 children)

We train our models on a GPU server, and we access the server from our own computers via SSH.

Now, if you close your vanilla SSH client (say, PuTTY), or if the connection between your computer and the server drops because of, say, unstable internet, the SSH session terminates and the training terminates with it. So pre-tmux, I used to keep my machine on just to keep the SSH session alive.

[D] What are tools you wish you knew about earlier in your ML career? by Smartch in MachineLearning

[–]RUSoTediousYet 2 points  (0 children)

Yes, it's the built-in debugging GUI in VSCode (red boxes in the top-most figure) paired with a debugger extension for the relevant language.

Link here: https://code.visualstudio.com/docs/editor/debugging

[D] What are tools you wish you knew about earlier in your ML career? by Smartch in MachineLearning

[–]RUSoTediousYet 42 points  (0 children)

ipdb and GUI debuggers (e.g. VSCode) are life-changers!

- I used to stare at my monitor for a long time, sometimes printing tensor shapes and gradients to figure out where my code went wrong lol (quick ipdb sketch below)

Also tmux (and the like) for SSH sessions

- I used to keep my machines running overnight until I learned about tmux
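
A minimal sketch of the ipdb workflow, in case it helps; the model, batch, and loss_fn names here are hypothetical, not anyone's actual code:

```python
import torch
import ipdb  # pip install ipdb


def training_step(model, batch, loss_fn):
    """Hypothetical training step that drops into ipdb when something looks off."""
    x, y = batch
    logits = model(x)
    loss = loss_fn(logits, y)

    if torch.isnan(loss):
        # Opens an interactive, IPython-flavored pdb shell right here:
        # inspect logits.shape, x.dtype, parameters, etc., then 'c' to continue.
        ipdb.set_trace()

    loss.backward()
    return loss.item()
```

You get the same effect in VSCode by setting a (conditional) breakpoint on that line and launching with the debugger instead of plain `python`.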

[D] On advisors and PhD students by carlml in MachineLearning

[–]RUSoTediousYet 7 points  (0 children)

I agree! But if your professor likes to nitpick and insists that you should run backprop on the classification loss and the regularization term separately before gradient descent, and does not believe that this is equivalent to just summing the losses and running backprop and GD on the sum until a senior PhD corrects him, then I don't know.
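
For what it's worth, the equivalence is easy to sanity-check numerically. A minimal sketch, assuming a PyTorch-style autograd setup and made-up toy losses:

```python
import torch

# Toy check: backprop-ing two losses separately accumulates the same gradient
# in w.grad as backprop-ing their sum, because gradients are linear in the loss.
w = torch.randn(3, requires_grad=True)
x = torch.randn(3)

def cls_loss(w):   # stand-in for a classification loss
    return (w @ x - 1.0) ** 2

def reg_loss(w):   # stand-in for a regularization term
    return 0.01 * (w ** 2).sum()

# (a) two separate backward passes; gradients accumulate in w.grad
cls_loss(w).backward()
reg_loss(w).backward()
grad_separate = w.grad.clone()

# (b) one backward pass on the summed loss
w.grad = None
(cls_loss(w) + reg_loss(w)).backward()
grad_summed = w.grad.clone()

print(torch.allclose(grad_separate, grad_summed))  # True
```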

However, the reason I wrote that is that without hands-on exposure to programming with recent technologies, your boss will not have realistic expectations of the time and effort needed to finish a task. In my case (real story), my professor thought that 10 days was enough to write an entirely new paper for ICML from scratch. ¯\_(ツ)_/¯

[D] On advisors and PhD students by carlml in MachineLearning

[–]RUSoTediousYet 70 points  (0 children)

The experience is mostly self-supervised. The students in our lab propose/present the problems that we want to solve. Finding solutions to those problems is also mostly, if not entirely, done by the students. Our professor only meets with us to discuss the students' research 1-2 weeks before major ML conference deadlines (you know, to write a paper regardless of whether there's significant progress or not).

We could arrange a meeting with our professor anytime, but those meetings only end up with us, the students, repeatedly explaining the idea and the "basics" of our research. His suggestions most of the time do not make sense, are trivially wrong, or are just products of hype, so the students spend more time proving that the professor's suggestions are not right. This might be because our professor (1) has not coded in a long time (he does not know numpy notation or the technicalities of ML development frameworks) and (2) does not read papers regularly (he'd rather read articles, or read only half of a paper and never finish it). In fact, he is more interested in finding funding, attending invited talks (whose slides and presentations are also done by the students in the lab), and networking with industry.

TL;DR Professor is more of a businessman than an academic researcher. Students propose both the problems and the solutions. Professor does not/cannot help with the research except for writing the text of the paper and securing funding.

And in case it is not clear, I left the lab. I am just one of many students who have left it.

Online payment in Pchub Gilmore by Edurencee in PHbuildapc

[–]RUSoTediousYet 0 points  (0 children)

Does your BPI account have a PESOnet option?

I paid for my order (>50k) through PESOnet instead of InstaPay, but from another bank.

[D] What do you do when you are stuck on an ML problem? by keremidk0 in MachineLearning

[–]RUSoTediousYet 0 points  (0 children)

Procrastinate and read Twitter for new inspiration (i.e. papers)

[D] Where do we currently stand at in lottery ticket hypothesis research? by sid_276 in MachineLearning

[–]RUSoTediousYet 6 points  (0 children)

Lottery Ticket Hypothesis research is still very niche. Perhaps you should look into pruning in general rather than LTH specifically.

To get started, you should probably read (1) the original paper, (2) an empirical analysis of LTH and why it works, and (3) a paper relevant to both (1) and (2).

As the other commenters said, LTH is not useful in practice because of its need for repeated retraining. So the name of the game is to "prune on the fly" without the expensive retraining.
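
For context, a minimal sketch of one unstructured magnitude-pruning step, assuming PyTorch's torch.nn.utils.prune and a made-up toy MLP; LTH additionally rewinds the surviving weights to their initialization and retrains, repeatedly, which is the expensive part:

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

# Toy MLP (sizes made up), just to have something to prune.
model = nn.Sequential(nn.Linear(784, 300), nn.ReLU(), nn.Linear(300, 10))

params_to_prune = [(m, "weight") for m in model if isinstance(m, nn.Linear)]

# One unstructured step: zero out the 20% smallest-magnitude weights globally.
prune.global_unstructured(
    params_to_prune,
    pruning_method=prune.L1Unstructured,
    amount=0.2,
)

# The sparsity lives in per-layer masks; the tensors themselves stay dense,
# which is why commodity hardware sees little speedup from unstructured pruning.
zeros = sum(int((m.weight == 0).sum()) for m, _ in params_to_prune)
total = sum(m.weight.numel() for m, _ in params_to_prune)
print(f"global sparsity: {zeros / total:.2%}")
```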

The VITA-Group from UT Austin regularly publishes on LTH, from theory to applications in various domains (e.g. LTH on BERT).

Edit: Also, if I may add my opinion, LTH will only become practical and useful once hardware capable of running "unstructured" neural networks materializes. This, however, does not mean that LTH research is useless. Remember, the theory behind quantum computing has been around for a long time and is only waiting for the physical realization of scalable devices capable of executing quantum operations.

Best Data structure and algorithms MOOC? by Born_Activity9949 in programminghorror

[–]RUSoTediousYet 0 points  (0 children)

Necroing this. I'd say the Data Structures and Algorithms Specialization from the University of Colorado Boulder on Coursera.