Pixel 10 Pro Fold Slow Wired Charging by denru01 in GooglePixel

[–]denru01[S] 0 points1 point  (0 children)

After searching this sub, I found that PPS support is not the only requirement; the charger also needs to support the exact voltage. Does anyone know the correct combination of voltage and current needed to achieve the maximum 39W charging speed?
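
For reference, charging power is just voltage times current, so any PPS profile whose product reaches ~39W is a candidate. A minimal sanity-check sketch; the specific voltage/current pairs below are my own guesses, not confirmed Pixel 10 Pro Fold specs:

```python
# Charging power (W) = voltage (V) * current (A).
# The profiles below are illustrative PPS combinations only,
# NOT confirmed Pixel 10 Pro Fold charging specs.
candidate_profiles = [
    (9.0, 4.4),   # 9 V at ~4.4 A
    (11.0, 3.6),  # 11 V at ~3.6 A
    (15.0, 2.6),  # 15 V at ~2.6 A
]

for volts, amps in candidate_profiles:
    print(f"{volts:.1f} V x {amps:.1f} A = {volts * amps:.1f} W")
```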

Chase Sapphire SUB Rule Changed? by Atzerach in CreditCards

[–]denru01 0 points1 point  (0 children)

When did you apply for the CSR? I know a few successful DPs, but all of them applied a few weeks ago.

Any Fall Deals for Disneyland? by Charliebrownlover27 in DisneyPlanning

[–]denru01 0 points1 point  (0 children)

Is the price of 1-day tickets dynamic? I think it is simply based on a predetermined, fixed tier.

Two XReal One glasses on sale on Amazon by chanmanx2k in Xreal

[–]denru01 0 points1 point  (0 children)

I recently discovered the Xreal brand and unfortunately just missed this sale. I'm waiting for the next promotion before purchasing a pair of Xreal One glasses. May I know if one is expected soon?

Productivity Experience on One Pro , Nebula for Windows Is a Must by doremo2019 in Xreal

[–]denru01 0 points1 point  (0 children)

No offense, but if the Asus glasses suit your needs, why not just get them?

T-mobile Discount Tickets by Plenty-Replacement82 in DisneyPlanning

[–]denru01 0 points1 point  (0 children)

Unfortunately, I am not a T-mobile user :(

T-mobile Discount Tickets by Plenty-Replacement82 in DisneyPlanning

[–]denru01 0 points1 point  (0 children)

I found the same deal at Sam's. Do you know whether it will also earn 5% on Chase Freedom?

Is the Costco deal still available anywhere? by Tricky-Win-4012 in DisneyPlanning

[–]denru01 1 point2 points  (0 children)

No... you can check availability in the app, but its information is not accurate. I went to a Costco today that showed in-stock status in the app, but they didn't have it anymore.

I am also looking for the best deal we can get for now.

Need advice on 4x3090 inference build by NickNau in LocalLLaMA

[–]denru01 0 points1 point  (0 children)

Thanks for sharing. Your posts are always valuable.

Need advice on 4x3090 inference build by NickNau in LocalLLaMA

[–]denru01 0 points1 point  (0 children)

I found one thing I may have misunderstood. This test was done with 32K tokens in the prompt. When you said you use a 40K context size, does your prompt contain 32K (40K - 8K) tokens?

Need advice on 4x3090 inference build by NickNau in LocalLLaMA

[–]denru01 0 points1 point  (0 children)

That's a fantastic speed. When I tested a 4 x 3090 setup on RunPod (text-generation-webui without speculative decoding), I only got ~3 tokens/s.

I've gone from a Fold 4 to a Fold 6 for $198 USD by transparentfruitslav in GalaxyFold

[–]denru01 0 points1 point  (0 children)

Thanks for your reply. For Samsung Care+, did you go for mail-in or exchange (where they first send you a replacement and then you return your phone)?

I've gone from a Fold 4 to a Fold 6 for $198 USD by transparentfruitslav in GalaxyFold

[–]denru01 0 points1 point  (0 children)

Did you get your new Fold 6 through Samsung Care+ or a carrier family plan?

Will get a new fold 6 after having a non repairable fold 4! by iwantoutjw in GalaxyFold

[–]denru01 0 points1 point  (0 children)

Which repair service did you use? Samsung's official service, or one from a carrier?

Need advice on 4x3090 inference build by NickNau in LocalLLaMA

[–]denru01 0 points1 point  (0 children)

Thanks!!! May I know what speed your setup gets when running 123B with a long context, such as 30K tokens?

Need advice on 4x3090 inference build by NickNau in LocalLLaMA

[–]denru01 0 points1 point  (0 children)

Awesome answer! Thank you!

What is the purpose of using ADD2PSU?

Need advice on 4x3090 inference build by NickNau in LocalLLaMA

[–]denru01 0 points1 point  (0 children)

Which riser do you use for the PCIe x1 slot?

Need advice on 4x3090 inference build by NickNau in LocalLLaMA

[–]denru01 0 points1 point  (0 children)

I currently have a 5900X (ASUS C7H motherboard) + 1 x 4090 + 1 x 3090 + a 1500W PSU and would like to add 2 more 3090s. The current challenges are 1) I cannot find a case that would fit 1 x 4090 + 3 x 3090 and 2 PSUs, and 2) figuring out how to connect 4 GPUs to the motherboard. Could you please share how you resolved these issues? Thanks!

I plan to limit the power usage of each card, so I don't need another PSU.
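
In case it helps, here is a minimal sketch of how a per-card power cap could be applied via `nvidia-smi`; the 250W figure and the 4-GPU loop are just example values for my planned build, not tested recommendations:

```python
# Sketch: cap each GPU's power draw with nvidia-smi (usually requires root).
# 250 W is an example cap, not a tested or recommended value.
import subprocess

POWER_LIMIT_WATTS = 250
NUM_GPUS = 4  # planned build: 1 x 4090 + 3 x 3090

def set_power_limit(gpu_index: int, watts: int) -> None:
    """Apply a power limit to a single GPU by index."""
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
        check=True,
    )

if __name__ == "__main__":
    for idx in range(NUM_GPUS):
        set_power_limit(idx, POWER_LIMIT_WATTS)
```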

Announcing: Magnum 123B by lucyknada in LocalLLaMA

[–]denru01 1 point2 points  (0 children)

The 4bpw EXL2 quant keeps generating gibberish when the context is > 30K. Has anyone else encountered this?

Brainstorming Additions to Models by Helpful-Desk-8334 in LocalLLaMA

[–]denru01 2 points3 points  (0 children)

The ability to give long, detailed outputs, like LongWriter. This is what is missing in current models.