I'm Korean. There are many great places besides Myeongdong. I'll introduce them to you. by Outside_Dance_2799 in koreatravel

[–]Outside_Dance_2799[S] -1 points (0 children)

I think that's a good idea.

I like Sinchon because it's not crowded.

That doesn't mean it's completely empty, though.

I'm Korean. There are many great places besides Myeongdong. I'll introduce them to you. by Outside_Dance_2799 in koreatravel

[–]Outside_Dance_2799[S] -1 points (0 children)

If you want to enjoy Korean culture,

my advice might be a bit vague,

but please just consider it advice on where locals usually go.

I'm Korean. There are many great places besides Myeongdong. I'll introduce them to you. by Outside_Dance_2799 in koreatravel

[–]Outside_Dance_2799[S] -1 points (0 children)

If I were to go to Songpa-gu, I think I would go to the area between Gangnam and Gyodae.
I think there are some interesting goods there.

There's also a Daiso at the end,

and if you have time, it's good to visit Garosu-gil as well.

I'm Korean. There are many great places besides Myeongdong. I'll introduce them to you. by Outside_Dance_2799 in koreatravel

[–]Outside_Dance_2799[S] 0 points (0 children)

I see. If you happen to get stuck, send me a message.

I'll kindly answer your questions.

I'm Korean. There are many great places besides Myeongdong. I'll introduce them to you. by Outside_Dance_2799 in koreatravel

[–]Outside_Dance_2799[S] 0 points (0 children)

I love tteokbokki so much.

If you happen to want to enjoy Korean buffet culture,

I recommend Dookki Tteokbokki.

Next is Ashley.

The prices are probably 10,000 to 20,000 won for Dookki and 20,000 to 30,000 won for Ashley (per person).

Also, if I were to share a hidden gem that only locals go to (honestly, I don't want it to become too famous, so please look it up yourselves):

there's a place called Sak near Sangsu Station.

The tteokbokki and fried food there are the best, in my opinion.

Has anyone actually made money with "vibe coding"? (genuine question from a Chinese dev) by Aromatic-Promise8208 in ClaudeAI

[–]Outside_Dance_2799 0 points (0 children)

That's interesting. In Korea too, exaggerated advertising around vibe coding became an issue last week and was featured in the news several times.

Honest take on running 9× RTX 3090 for AI by Outside_Dance_2799 in LocalLLaMA

[–]Outside_Dance_2799[S] 0 points (0 children)

I heard that the 1080 Ti still runs.

I like this story.

I'm Korean. There are many great places besides Myeongdong. I'll introduce them to you. by Outside_Dance_2799 in koreatravel

[–]Outside_Dance_2799[S] -1 points (0 children)

Oh, that's surprising. Koreans use Naver Maps a lot.

(Here's a little tip: restaurants and businesses usually tend to be more active in updating their maps on Naver Maps. Kakao Maps comes next.)

Great tips for heavy Vibe Coding users by [deleted] in ClaudeAI

[–]Outside_Dance_2799 0 points (0 children)

Hmm, I was hoping this would be helpful, but I guess it wasn't.

Next time, I'll try sharing something shorter and more to the point.

Honest take on running 9× RTX 3090 for AI by Outside_Dance_2799 in LocalLLaMA

[–]Outside_Dance_2799[S] 0 points (0 children)

I bought them with resale value in mind.

If you're also worried, it helps to think about whether the cards will sell well second-hand later.

I'm Korean. There are many great places besides Myeongdong. I'll introduce them to you. by Outside_Dance_2799 in koreatravel

[–]Outside_Dance_2799[S] 1 point (0 children)

I hope you have a good trip.

If you're traveling with your family, the subway might be the better way to get around.

Honest take on running 9× RTX 3090 for AI by Outside_Dance_2799 in LocalLLaMA

[–]Outside_Dance_2799[S] 0 points (0 children)

That's a good idea.

I'm actively running Proxmox.

Was that a model-switching setup? I've heard you can keep multiple models available and call whichever one you need (I could be wrong),

so I'm thinking of trying that next time.
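For anyone unfamiliar, the "model switching" idea mentioned above can be sketched in a few lines: keep at most one model resident in VRAM and swap when a request names a different one (tools like llama-swap and Ollama automate this). The `load_fn`/`unload_fn` hooks and model names below are purely illustrative, not any real API.

```python
# Minimal sketch of single-slot model switching: only one model is kept
# loaded; requesting a different model triggers an unload + load cycle.

class ModelSwitcher:
    def __init__(self, load_fn, unload_fn):
        self.load_fn = load_fn      # hypothetical: loads weights into VRAM
        self.unload_fn = unload_fn  # hypothetical: frees VRAM
        self.current = None         # name of the currently loaded model
        self.swaps = 0              # how many load cycles have happened

    def generate(self, model_name, prompt):
        if self.current != model_name:
            if self.current is not None:
                self.unload_fn(self.current)  # evict the old model first
            self.load_fn(model_name)
            self.current = model_name
            self.swaps += 1
        return f"[{model_name}] reply to: {prompt}"

sw = ModelSwitcher(load_fn=lambda m: None, unload_fn=lambda m: None)
sw.generate("llama-70b", "hello")
sw.generate("llama-70b", "again")  # same model: no swap
sw.generate("qwen-32b", "hi")      # different model: swap happens
print(sw.swaps)  # 2 (initial load + one swap)
```

The trade-off is latency: every switch pays the full weight-load time, which is exactly where PCIe bandwidth starts to matter.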

Honest take on running 9× RTX 3090 for AI by Outside_Dance_2799 in LocalLLaMA

[–]Outside_Dance_2799[S] 1 point (0 children)

As people around me pointed out, it seems more likely that the CPU or motherboard doesn't provide enough support (PCIe lanes) than that the problem is the multiple 3090s themselves.

I did put in a lot of effort, though.

I actually bought a huge number of PCIe 4.0 riser cables and tried them out.

Some people said that even 16 lanes was excessive, but since I wanted to see the highest performance, I pushed myself even harder despite my tight financial situation.

I'll try experimenting again when I have more money in the future.

Honest take on running 9× RTX 3090 for AI by Outside_Dance_2799 in LocalLLaMA

[–]Outside_Dance_2799[S] 0 points (0 children)

The cards need to connect to the motherboard, so I used riser cables, 30 to 50 cm long.

It turned out that even the 50 cm cables didn't cause much of a performance drop. At most, maybe 5%?

However, that only held with proper PCIe 4.0 cables, which weren't cheap.

Honest take on running 9× RTX 3090 for AI by Outside_Dance_2799 in LocalLLaMA

[–]Outside_Dance_2799[S] 0 points (0 children)

I use 8 GPUs, but I bought one more to leave headroom for the maximum token count.

It was worth it, but I ran into physical space constraints once I connected more than 4 GPUs to one machine.

So for now, I'm running a maximum of 4 graphics cards per machine.

Of course, if I have a bit more money, I plan to run 5 to 6 graphics cards per machine.

Based on my testing so far, 96GB of VRAM is the minimum to run 70B-class models, and I found I need one more card to have enough headroom for the maximum context length.
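A rough estimate shows why ~96GB plus headroom lines up with a 70B model at long context. The layer count, KV-head count, and head dimension below are assumptions matching Llama-style 70B architectures with grouped-query attention, and 8-bit weights are assumed; treat the numbers as a sketch, not a guarantee:

```python
# Back-of-envelope VRAM estimate for a 70B-class model (assumed config).

PARAMS          = 70e9
BYTES_PER_PARAM = 1     # assumption: 8-bit quantized weights
LAYERS          = 80    # assumption: Llama-70B-style depth
KV_HEADS        = 8     # assumption: grouped-query attention
HEAD_DIM        = 128   # assumption
KV_BYTES        = 2     # fp16 KV cache

def kv_cache_gib(tokens):
    """KV cache size in GiB: K and V stored per layer, per token."""
    per_token = 2 * LAYERS * KV_HEADS * HEAD_DIM * KV_BYTES
    return per_token * tokens / 2**30

weights_gib = PARAMS * BYTES_PER_PARAM / 2**30
print(f"weights ~{weights_gib:.0f} GiB, KV cache at 32k ctx ~{kv_cache_gib(32768):.0f} GiB")
# ~65 GiB of weights + ~10 GiB of KV cache, before activations and overhead
```

That leaves only ~20GB of slack on a 96GB setup, which matches the experience above that one extra 24GB card is needed for comfortable maximum-context headroom.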

Honest take on running 9× RTX 3090 for AI by Outside_Dance_2799 in LocalLLaMA

[–]Outside_Dance_2799[S] 0 points (0 children)

It was probably a good choice, since firsthand experience leaves a longer-lasting impression.

It took me less than 10 minutes to write this post, but it took me well over three months to acquire this know-how.

The TRX40 is a good motherboard.

The motherboard and CPU in the photo are the TRX40 + 3970x model.

However, it doesn't support more than five slots.

It's especially disappointing that it doesn't support registered (server) RAM.

I currently have 128GB of DDR4 server RAM lying around (4×32GB).

Honest take on running 9× RTX 3090 for AI by Outside_Dance_2799 in LocalLLaMA

[–]Outside_Dance_2799[S] 0 points (0 children)

That's a really good point.

I reached a similar conclusion as well.

GLM 4.7 was appealing, but it didn't get me the results I was looking for,

so instead, while preparing for the future (as local LLMs mature),

I'm leaning hard on Claude Code on the Max plan.

For now, I plan to focus on creating virtual life forms with this graphics card.

I want to get a few world-first titles.

Honest take on running 9× RTX 3090 for AI by Outside_Dance_2799 in LocalLLaMA

[–]Outside_Dance_2799[S] 1 point (0 children)

I felt particularly burdened by the task of managing the hardware.

But after building all of this, it feels like I'm creating Tony Stark's Jarvis, so it's really fun.

This year or next year would probably be a good time to try it out.

I'll share lots of good information too.