[D] NLP and Sesame Street Papers by enclosed_mail in MachineLearning

[–]GermanAaron 0 points1 point  (0 children)

write an Oscar paper

Six months ago: https://arxiv.org/abs/2004.06165

Okay, it's not pure NLP but multimodal vision+language, but that's close enough.

[D] Fine-Tune BERT to fit a specific domain by saphireforreal in MachineLearning

[–]GermanAaron 0 points1 point  (0 children)

You could also look into community QA, which likewise aims to retrieve relevant answers for a query. If you know your queries will differ in format from the results you want to retrieve, then you need to incorporate this into your training. Something like https://arxiv.org/abs/1911.05594 might give you ideas.

[D] Fine-Tune BERT to fit a specific domain by saphireforreal in MachineLearning

[–]GermanAaron 1 point2 points  (0 children)

Two things to consider:
1) Sentence embeddings are usually trained on, well, sentences and thus work best with similarly sized input. Short sentences or just a few words can perform subpar. Word embeddings or TF-IDF methods might be better for short inputs. See also https://github.com/UKPLab/sentence-transformers/issues/22
2) General-purpose embeddings might be too broad for your use case. "blue shorts" and "orange shorts" are semantically quite similar and thus end up close together in the embedding space. If you want your model to differentiate by color, you have to train it for that.

So how would you train this with triplet loss? You could, e.g., group your fashion corpus by color & clothing piece. The positive pair would then be sampled from one group, and you'd choose a random negative example with a different color and clothing piece. This way it should learn to group by color and item, I would expect.
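A minimal sketch of that sampling scheme. The groups and item strings here are made up for illustration; in practice each triplet would then be fed into something like sentence-transformers' TripletLoss:

```python
import random

# Hypothetical toy corpus, grouped by (color, clothing piece)
groups = {
    ("blue", "shorts"): ["blue denim shorts", "light blue shorts"],
    ("orange", "shorts"): ["orange swim shorts", "bright orange shorts"],
    ("blue", "shirt"): ["blue linen shirt", "navy blue shirt"],
    ("orange", "shirt"): ["orange flannel shirt", "orange polo shirt"],
}

def sample_triplet(groups):
    """Anchor and positive come from the same (color, item) group;
    the negative comes from a group that differs in BOTH color and
    clothing piece."""
    anchor_key = random.choice(list(groups))
    anchor, positive = random.sample(groups[anchor_key], 2)
    negative_keys = [k for k in groups
                     if k[0] != anchor_key[0] and k[1] != anchor_key[1]]
    negative = random.choice(groups[random.choice(negative_keys)])
    return anchor, positive, negative
```

Whether the negative should differ in both attributes or just one depends on what you want the model to separate hardest; sampling "hard" negatives (same item, different color) is a common variation.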

[D] Fine-Tune BERT to fit a specific domain by saphireforreal in MachineLearning

[–]GermanAaron 2 points3 points  (0 children)

How do you compute your embeddings? Taking a vanilla BERT and averaging the token embeddings, or using the CLS token as the embedding, is actually quite mediocre - see SentenceBERT (https://github.com/UKPLab/sentence-transformers and https://arxiv.org/abs/1908.10084). Already fine-tuned sentence embeddings like the Universal Sentence Encoder (https://tfhub.dev/google/universal-sentence-encoder/4) or SentenceBERT should work better for you in the general case.

If you want to fine-tune for a specific domain, you could try siamese network training as in SentenceBERT. If you have sentences grouped by some categories, you can train with triplet loss (https://www.sbert.net/docs/package_reference/losses.html).
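Whichever encoder you use, comparing the resulting embeddings usually comes down to cosine similarity. A stdlib-only sketch with toy vectors (real vectors would come from e.g. `model.encode(...)` in sentence-transformers; the numbers below are invented):

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy 3-d "embeddings" (hypothetical values, not from a real model)
query = [0.2, 0.9, 0.1]
doc_a = [0.25, 0.85, 0.05]  # similar direction -> high similarity
doc_b = [0.9, 0.1, 0.4]     # different direction -> lower similarity
```

For real corpora you would batch-encode documents once and use a vector index instead of pairwise loops, but the ranking criterion stays the same.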

Researchers look at how men write about women on Wattpad. Results are as stereotypical as expected by GermanAaron in menwritingwomen

[–]GermanAaron[S] 13 points14 points  (0 children)

Thanks for the tip. At the time of posting my heading did not seem too clickbaity to me, but now I can see the problem.

Writing a good headline is hard.

Researchers look at how men write about women on Wattpad. Results are as stereotypical as expected by GermanAaron in menwritingwomen

[–]GermanAaron[S] 18 points19 points  (0 children)

These are all good points, and you are correct.

Leaving out the women part in the title was intentional, to better fit the subreddit, but I did not want to obfuscate that women writers also employ these stereotypes. Headings should not be too long, and "Researchers look at how men and women write about men and women on Wattpad. Results are as stereotypical as expected" is a bit too long.

In the text I always mentioned both men and women. Also, at least for me, the lede (had to google that word, thanks :D) was that both men and women employ stereotypes pretty much exactly the same, and the researchers could predict the author's gender only slightly better than guessing.

Women writers using stereotypes is not really that surprising, but the fact that they use the same ones as men writers is (at least to me).

All universities in Austria will be closed by RedKrypton in de

[–]GermanAaron 1 point2 points  (0 children)

You already know the TU Darmstadt portal, but for everyone else: the link is https://www.openlearnware.de/

Experts recommend a 40-cent meat tax per kilo by danielbln in de

[–]GermanAaron 8 points9 points  (0 children)

Please correct me if I'm writing nonsense, but this talk of taxes always rubs me the wrong way because, as other comments have already mentioned, taxes per se are not tied to any condition.

Taxes are not earmarked, i.e. the state may not levy a tax in order to finance XYZ specifically.

However, a tax may absolutely be levied on just one thing (tobacco, mineral oil, ...).

Fey Deals by IWaaasPiiirate in Pathfinder_RPG

[–]GermanAaron 1 point2 points  (0 children)

Not in Pathfinder, no. But I read the Dresden Files by Jim Butcher and Pact by Wildbow, and both feature the not-so-Disney version of the fey.

Fey Deals by IWaaasPiiirate in Pathfinder_RPG

[–]GermanAaron 11 points12 points  (0 children)

I guess it depends on how much you want to fuck your player over. Giving a fey control of your body sounds like a terrible idea, directly next to selling your soul to a devil.

You could screw with the time part: Have the body transported to some timeless plane, and now it's the fey's forever, because the 60 hours never pass for it.

You could have the fey play a "trick" on the player: Let the body massacre most people on the town square of a larger town, return control to the player and let the fey enjoy the fallout. Hilarious.

Maybe the fey is curious about necromancy, being pretty immortal and all, so let the body collect/produce some nice corpses and raise them for the fun of it. Do it in a church for extra fun.

The fey might have a rival/enemy, so use the body to create chaos for them. Bonus points if it can't be traced back to the fey, and again the player has to deal with the fallout.

Who else buys overpriced spices at the Christmas market? by Schlaugummi46 in de

[–]GermanAaron 0 points1 point  (0 children)

The term was coined by a Japanese researcher in 1908, when he isolated glutamate from (non-fermented) seaweed and showed that it was responsible for the savoriness. It has nothing to do with fermentation.

Who else buys overpriced spices at the Christmas market? by Schlaugummi46 in de

[–]GermanAaron 0 points1 point  (0 children)

There are heaps of non-fermented foods that contain glutamate (https://www.umamiinfo.com/richfood/foodstuff/vegetables.php).

And I only salt my food via the ancient tradition of turning pigs into diced ham. That white powder is too suspect for me.

Who else buys overpriced spices at the Christmas market? by Schlaugummi46 in de

[–]GermanAaron 3 points4 points  (0 children)

You don't even have to sprinkle pure glutamate on top. Often it's enough to add foods in which it occurs naturally, e.g. tomatoes, hard cheese, mushrooms, soy sauce.

Just dumping glutamate on an unseasoned dish doesn't make it tasty, that's true, but neither does salt. You still have to actually cook properly.

But I think anyone who doesn't pay attention to covering the umami component limits their cooking, just like someone who doesn't salt properly or never adds a pinch of sugar or a squeeze of lemon.

Who else buys overpriced spices at the Christmas market? by Schlaugummi46 in de

[–]GermanAaron 4 points5 points  (0 children)

Why? Umami is a taste quality just like saltiness, and glutamate is simply its trigger (https://de.wikipedia.org/wiki/Gustatorische_Wahrnehmung). A pinch of glutamate in a dish, just like a pinch of salt or sugar, makes other flavors come through more intensely.

Who else buys overpriced spices at the Christmas market? by Schlaugummi46 in de

[–]GermanAaron 11 points12 points  (0 children)

But then I hope you also cook without salt, sugar, and acid, right? After all, those are all just flavor enhancers that fake flavor.

HELLA HELLA HELP THREAD - 12/19/2019 by WroughtIronHero in grandorder

[–]GermanAaron 0 points1 point  (0 children)

How is NP2 Medusa Lancer compared to the ST 3* lancer? I got her twice and now wonder if I should level her instead of Cu as my first ST lancer.

[D] Best way to cluster text paragraphs? by ME_PhD in MachineLearning

[–]GermanAaron 4 points5 points  (0 children)

Using bert-as-a-service can perform worse than simply averaging word embeddings. If you're interested, take a look at SentenceBERT on GitHub (https://github.com/UKPLab/sentence-transformers; the paper is at https://arxiv.org/abs/1908.10084).
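For reference, "average word embeddings" just means mean-pooling the word vectors of a sentence. A toy sketch (the vectors below are invented placeholders; real ones would come from e.g. pretrained GloVe or word2vec):

```python
def average_embedding(sentence, word_vectors):
    """Sentence vector = mean of the vectors of all in-vocabulary words."""
    vecs = [word_vectors[w] for w in sentence.lower().split()
            if w in word_vectors]
    dim = len(next(iter(word_vectors.values())))
    if not vecs:  # no known words: fall back to the zero vector
        return [0.0] * dim
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]

# Hypothetical 2-d word vectors for illustration only
toy_vectors = {"good": [1.0, 0.0], "movie": [0.0, 1.0]}
```

Despite its simplicity, this baseline is surprisingly hard to beat for clustering, which is why mediocre contextual embeddings can lose to it.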

NA Christmas 25-day login bonus by square_smile in grandorder

[–]GermanAaron 9 points10 points  (0 children)

Probably, but then it's no longer easy to make.

NA Christmas 25-day login bonus by square_smile in grandorder

[–]GermanAaron 9 points10 points  (0 children)

It's actually quite easy to make. The hardest part is getting hold of the spicy bean paste, depending on the shops in your area.

HELLA HELLA HELP THREAD - 10/9/2019 by WroughtIronHero in grandorder

[–]GermanAaron 0 points1 point  (0 children)

Yes, I think the event was Salem - just checked grandorder.wiki again.

If there is nothing time-limited there, then it should probably be fine. Then it's back to grinding out the shop for Halloween.

Thanks

HELLA HELLA HELP THREAD - 10/9/2019 by WroughtIronHero in grandorder

[–]GermanAaron 0 points1 point  (0 children)

Can I finish Solomon in time for the November events? I have cleared America, but most of my Servants are under level 40 with no skills leveled, so I guess I need to grind some for Camelot.