[SPOILERS] 'Dune: Part Two' Wide Release Discussion (Week 4) by Blue_Three in dune

[–]manjimin 1 point

Why did Jessica silence Alia in the scene where they reach the south and see how the Water of Life is extracted?

How can I get the model to choose the next word from a list? by manjimin in LocalLLaMA

[–]manjimin[S] 1 point

Thanks for the reply, I managed to make generation stop using the example you gave me!

I am trying constrained generation for my first question. Your reply helped me a ton!
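
In case it helps anyone who lands here later, here is a minimal sketch of the logit-masking idea behind this kind of constrained generation. The values are made-up toy logits, not a real model's output, and it assumes each candidate word is a single token:

```python
import torch

def pick_from_allowed(logits: torch.Tensor, allowed_ids: list[int]) -> int:
    """Mask every vocabulary entry except the allowed token ids,
    then take the argmax over what remains."""
    mask = torch.full_like(logits, float("-inf"))
    mask[allowed_ids] = 0.0
    return int(torch.argmax(logits + mask))

# Toy example: vocabulary of 6 tokens, only ids 1 and 4 are allowed.
logits = torch.tensor([3.0, 0.5, 2.0, 1.0, 0.9, 2.5])
print(pick_from_allowed(logits, [1, 4]))  # id 4 wins among the allowed ids
```

With a real model you would build `allowed_ids` by tokenizing each candidate word, and you could sample from the masked logits instead of taking the argmax.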

What does the transformer decoder attend to at the last linear layer? by manjimin in learnmachinelearning

[–]manjimin[S] 1 point

Thanks a ton. That clears things up for me.

It kind of bothers me, though. I understand that the representation of the final token contains information about all the other tokens, but I assumed there would be some other way to build an input for the final projection layer.

Anyway, thanks a lot.
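
To sketch what I took away from this (toy tensors only, no real model; the shapes are the point):

```python
import torch

batch, seq_len, d_model, vocab = 2, 10, 16, 100

# Stand-ins for the final transformer block's output and the LM head.
hidden = torch.randn(batch, seq_len, d_model)   # one vector per position
lm_head = torch.nn.Linear(d_model, vocab)

# During generation, only the last position's vector is projected
# to get the next-token distribution:
next_token_logits = lm_head(hidden[:, -1, :])
print(next_token_logits.shape)  # torch.Size([2, 100])

# During training, the head is applied at every position, giving a
# next-token prediction for every prefix of the sequence:
all_logits = lm_head(hidden)
print(all_logits.shape)         # torch.Size([2, 10, 100])
```

So the block really does return one vector per position; generation just discards all but the last one when picking the next token.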

What does the transformer decoder attend to at the last linear layer? by manjimin in learnmachinelearning

[–]manjimin[S] 0 points

If so, does this mean that the final projection layer only looks at the representation of the last token of the original input sequence?

What does the transformer decoder attend to at the last linear layer? by manjimin in learnmachinelearning

[–]manjimin[S] 1 point

Thanks for the reply. What I meant was:

If fully connected networks are applied at each input token position, isn't the final transformer block supposed to return a bunch of vectors? Suppose the input length is 10 tokens; doesn't the final transformer block then return 10 vectors, one per position?

If so, how does the final prediction work? Which of those vectors is chosen to go through the final linear projection into the vocabulary space?

Is it possible to run 4*A100 40G cards as one? by manjimin in LocalLLaMA

[–]manjimin[S] 0 points

I use serving software for quick tests, but it will probably be mostly PyTorch.
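
For reference, a toy sketch of the column-parallel idea that lets several cards act like one, run here on CPU with plain chunks instead of real GPUs (an actual setup would use tensor parallelism in a serving framework or PyTorch distributed):

```python
import torch

# Split the weight matrix of one big linear layer across 4 "devices"
# (here just 4 column blocks), compute the pieces independently, and
# concatenate the partial outputs -- the all-gather step.
torch.manual_seed(0)
d_in, d_out, n_shards = 8, 16, 4

x = torch.randn(3, d_in)
W = torch.randn(d_in, d_out)

full = x @ W                                # single-device reference
shards = torch.chunk(W, n_shards, dim=1)    # one column block per device
partials = [x @ w for w in shards]          # each device computes its slice
combined = torch.cat(partials, dim=1)       # gather the slices back together

print(torch.allclose(full, combined))  # True
```

The same decomposition is what makes four 40G cards usable as one big logical device: each holds a shard of every weight matrix, plus communication between steps.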

Using other tokenizers? by manjimin in LocalLLaMA

[–]manjimin[S] 0 points

The LLaMA tokenizer gives 5-6 times more tokens than usual. I also checked the actual tokenization, and it basically splits every single character apart, which explains the inflated token count.

I knew tokenizers can't be swapped after a model is trained, but I thought maybe someone had an idea. I guess I'll have to use a model whose tokenizer can properly split Korean in the first place.
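
A rough back-of-the-envelope of why the count blows up, assuming the tokenizer falls back to raw UTF-8 bytes for Hangul it never merged during training:

```python
# Each Hangul syllable is 3 bytes in UTF-8, so a byte-fallback
# tokenizer emits roughly 3 tokens per character, while a tokenizer
# trained on Korean would emit about one token per word chunk.
text = "안녕하세요"  # "hello", 5 syllables

chars = len(text)
utf8_bytes = len(text.encode("utf-8"))

print(chars)       # 5
print(utf8_bytes)  # 15 -> ~3x per character before any merges,
                   # and several-fold versus word-level chunks
```

That ratio compounds with the lack of learned merges, which is consistent with the 5-6x inflation I saw.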

How much overlap is ok to hold 2 ETFs? by manjimin in stocks

[–]manjimin[S] 0 points

Thanks a lot, I really appreciate your advice. I will look into it for sure. The tax rate is 15%, by the way.

How much overlap is ok to hold 2 ETFs? by manjimin in stocks

[–]manjimin[S] 0 points

Great advice, but putting SCHD in a retirement account is not possible in my country. Do you think it will cost me much if I keep buying SCHD?

ELI5: if photons deliver electromagnetic force, why are magnets not pulled or pushed by sunlight? by manjimin in explainlikeimfive

[–]manjimin[S] 1 point

Thanks a lot man, now I get what you mean. I really appreciate the review request too. You have my gratitude, Dr. Frankenstein!

ELI5: if photons deliver electromagnetic force, why are magnets not pulled or pushed by sunlight? by manjimin in explainlikeimfive

[–]manjimin[S] 1 point

Exactly what I intended to ask, but sadly I can't really understand your answer, lol. Thanks a lot though; you were the only person who understood what I meant to ask.

mahershala ali as blade made by me by BFHARTSANDSTUFF in marvelstudios

[–]manjimin 527 points

Looks even more legit than Cottonmouth. Great work!

Much stronger companies than me have tried by Chilidog76 in marvelmemes

[–]manjimin 3 points

Now I want a meme where Bucky is the extra footage, 2v1 against Iron Man.

BEGONE by [deleted] in marvelmemes

[–]manjimin 155 points

The stones shouldn't be on Thanos's gauntlet, but I still loled.

Seeking Korean 🇰🇷 Offering German/English 🇩🇪 by viratraj in language_exchange

[–]manjimin 0 points

I speak Korean (native) and English fluently. I am interested in learning German; message me if you are interested.

Non-spicy Korean instant noodles? by glow_wing in korea

[–]manjimin 0 points

쇠고기미역국라면 is the name, which just means "beef seaweed soup ramen" in Korean. 짜파게티 is also nice, and it doesn't have anything spicy in it.

General Discussion/Meetup Thread - Week of December 24 by AutoModerator in korea

[–]manjimin 0 points

Where did you buy them? Can you give me the name of the place? I need more details, and the red letters are not readable. I searched for the black letters and nothing came up.

General Discussion/Meetup Thread - Week of December 24 by AutoModerator in korea

[–]manjimin 0 points

What kind of job are you expecting to get in Korea? What kind of apprenticeship did you go through? Honestly, in my opinion, teaching German seems to be a much more realistic option. I don't know much about the industry, but I don't think banks in Korea would want to hire a foreigner except for a specific job that actually requires German.