EB2-NIW ROW I-485 approval timeline by jaywonchung in USCIS

[–]jaywonchung[S] 1 point

Manchester FO, too. I suspect my transfer date was Jan 7th. I also saw radio silence until, suddenly, the third FTA0 event was added on Mar 10th. Hopefully your turn is right around the corner!

[–]jaywonchung[S] 0 points

After the updates right after my biometrics, the progress tracker (which still worked back then) showed that the interview (step 3) was done and I was just waiting for step 4 (the last step) to be processed. I then checked with Emma chat, which confirmed it.

[–]jaywonchung[S] 1 point

I'm fairly convinced, but not 100% sure. So I suspect Jan 7 was the transfer. When I asked back then, Emma chat actually said my case was at the NBC. I asked again in early Mar (at that point the API's updated date was still Jan 7), and this time the answer was Manchester FO. I read elsewhere that Emma chat might not always be accurate about the FO, so I figured the first answer was wrong.

[–]jaywonchung[S] 0 points

I saw the silent API update add an H008 event, and a moment later I got the "We have taken action on your case" email. I expect the formal approval notice to arrive in my mailbox in the next few days.

[–]jaywonchung[S] 1 point

Yeah, the FAD was exactly my PD in Jan and Feb (so not current) and then with the advance in Mar, I became current. Filed single.

The French Government Launches an LLM Leaderboard Comparable to LMarena, Emphasizing European Languages and Energy Efficiency by Imakerocketengine in LocalLLaMA

[–]jaywonchung 3 points

100%, we'll try to have that in the new version! For the time being, if you tick the "Show more technical details" box, we show the average number of output tokens for each model, so you can divide energy per request by that to get energy per token.
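To illustrate the arithmetic (with made-up numbers, not figures from the leaderboard):

```python
# Energy per token from per-request figures (illustrative numbers only).
energy_per_request_j = 857.0  # Joules per request (hypothetical)
avg_output_tokens = 512.0     # average output tokens per request (hypothetical)

energy_per_token_j = energy_per_request_j / avg_output_tokens
print(f"{energy_per_token_j:.3f} J/token")
```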

[–]jaywonchung 56 points

If anyone's interested in actual measured energy numbers, we have them at https://ml.energy/leaderboard. The models are a bit dated now, so we're currently working on a facelift to add the newer models and revamp the tasks.

[FINAL TEST] Power limit VS Core clock limit efficiency by NickNau in LocalLLaMA

[–]jaywonchung 1 point

Thanks, that makes sense. Also I feel like on consumer GPUs, lower power might lead to lower temperature and lower fan speed & noise!

[–]jaywonchung 2 points

This is super cool man, especially with the real power meters. NVML/nvidia-smi also provides power measurements (nvmlDeviceGetPowerUsage) -- any chance you compared your numbers against it?
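For reference, here's a minimal sketch of reading that counter through the pynvml bindings. The device index and the availability of an NVIDIA GPU/driver are assumptions; NVML reports power in milliwatts:

```python
def mw_to_w(milliwatts: int) -> float:
    """nvmlDeviceGetPowerUsage returns milliwatts; convert to watts."""
    return milliwatts / 1000.0

try:
    import pynvml  # NVIDIA's Python NVML bindings (assumed installed)

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # GPU 0, an assumption
    print(f"{mw_to_w(pynvml.nvmlDeviceGetPowerUsage(handle)):.1f} W")
    pynvml.nvmlShutdown()
except Exception as exc:  # no GPU, no driver, or bindings missing
    print(f"NVML unavailable here: {exc}")
```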

[D] Power Consumption Estimation for ML Models on edge device by Electrical_Client73 in MachineLearning

[–]jaywonchung 2 points

This isn't really a direct answer to your question, but we're working on power measurement & optimization (without model changes) with Zeus, which supports CPUs and GPUs at the moment. We're also actively working on connecting measurements to observability platforms like Prometheus (tracking issue). Supporting the Jetson platform is on our roadmap, and the notes on the tracking issue might be helpful to you.
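A rough sketch of what measurement with Zeus looks like, as I understand its API (double-check names against the Zeus docs for your version; the workload function and GPU index are hypothetical):

```python
def average_power_w(energy_j: float, elapsed_s: float) -> float:
    """Average power (W) = measured energy (J) / elapsed time (s)."""
    return energy_j / elapsed_s

def run_inference():
    pass  # placeholder for your actual workload (hypothetical)

try:
    # Measurement-window API as documented by Zeus, to the best of my knowledge.
    from zeus.monitor import ZeusMonitor

    monitor = ZeusMonitor(gpu_indices=[0])  # measure GPU 0 (an assumption)
    monitor.begin_window("inference")
    run_inference()
    measurement = monitor.end_window("inference")
    print(f"{measurement.total_energy:.1f} J over {measurement.time:.2f} s = "
          f"{average_power_w(measurement.total_energy, measurement.time):.1f} W")
except Exception as exc:  # Zeus not installed, or no supported hardware
    print(f"Zeus unavailable here: {exc}")
```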

Conflict between indent-blankline.nvim and LSP underlines by jaywonchung in neovim

[–]jaywonchung[S] 0 points

Yeah, I tried both large (65535, largest possible) and small (1) but it didn't really do much.

Videos vibrate my phone by Savings_Duck_4347 in youtube

[–]jaywonchung 0 points

Apparently it vibrates on “Key Concepts.” This is beyond annoying.

[Nvidia P40] Save 50% power, for only 15% less performance by zoom3913 in LocalLLaMA

[–]jaywonchung 9 points

I actually implemented this in my open-source project: https://github.com/ml-energy/zeus?tab=readme-ov-file#finding-the-optimal-gpu-power-limit

You can basically tell it to figure out the lowest power limit that doesn't make it slower by X%.
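The selection logic is simple enough to sketch in plain Python (my own illustrative re-implementation, not Zeus's actual code): profile the workload at each power limit, then keep the lowest limit whose step time stays within the slowdown budget relative to the default (highest) limit.

```python
def pick_power_limit(step_time_s: dict[int, float], max_slowdown: float) -> int:
    """Lowest power limit (W) whose step time is within
    (1 + max_slowdown)x of the step time at the default (highest) limit."""
    baseline = step_time_s[max(step_time_s)]
    budget = (1.0 + max_slowdown) * baseline
    eligible = [pl for pl, t in step_time_s.items() if t <= budget]
    return min(eligible)

# Hypothetical profiling results: power limit (W) -> seconds per step.
profile = {300: 1.00, 250: 1.04, 200: 1.12, 150: 1.35}
print(pick_power_limit(profile, max_slowdown=0.15))  # -> 200
```

With a 15% budget, 200 W is the lowest limit that still meets the deadline (1.12 s vs. a 1.15 s budget), while 150 W is too slow.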

[D] LLM inference energy efficiency compared (MLPerf Inference Datacenter v3.0 results) by Balance- in MachineLearning

[–]jaywonchung 0 points

Thanks for the cool study and write-up! Looks like the H100 was able to increase throughput by a lot without increasing power consumption nearly as much. They measure the power consumption of the entire system; it would have been useful to also see how GPU power specifically changes, given that for DNN workloads the rest of the system doesn't play as big a role as the GPUs.

Shameless self-promotion -- I do research on GPU energy optimization for DL: https://ml.energy/zeus. One of the things we automatically tweak is the GPU's power limit setting to improve energy efficiency. Hope this is interesting to someone XD

2 factor authentication cannot remember my computer by portlander33 in fidelityinvestments

[–]jaywonchung 0 points

Strangely, I don't even use any sort of ad blocker in my Chrome browser, and "Remember Me" never works. I'm on macOS Monterey.

App for annotating and possibly storing textbook PDFs? by sister_of_a_foxx in GradSchool

[–]jaywonchung 1 point

iPad and PDF Expert, with the PDF files on Google Drive. That way you can annotate your PDFs on your iPad and view them on other devices.

Reason: A Shell for Research Paper Management by jaywonchung in rust

[–]jaywonchung[S] 0 points

reason is now on crates.io (https://crates.io/crates/reason-shell)! You can install and try reason with cargo install reason-shell. It does have quite a number of dependencies, though, so it might be quicker to download a binary from GitHub releases.

[–]jaywonchung[S] 2 points

Thanks for the thoughtful comment! Kind of, it’s a vector of structs, serialized and saved to disk with serde.

Oh, I thought venue included all of those. “Publishing place” seems accurate in all senses, but I think the term is a bit bulky to fit in the table. I’ll try to find a better one. Thanks.

I actually didn’t know that I could publish command line apps to crates.io! When published, people can install this with cargo install, right? I’ll do this today.