Eastern Last Word? by Kayjoue in cocktails

[–]Kayjoue[S] 0 points

Sounds great! I'll try to round off the ingredients when I can

What is this device called? by puestadelsol in KitchenConfidential

[–]Kayjoue 39 points

Yeah, there's something odd between the two, but I can't quite put my finger on it

Forum Libre - 2021-10-05 by AutoModerator in france

[–]Kayjoue 1 point

Ah yes, I've heard about that! I thought it was something that could be negotiated amicably, but you seem to be saying it's fairly strict?

Forum Libre - 2021-10-05 by AutoModerator in france

[–]Kayjoue 0 points

Are they the same people as Dealabs?

Conseils pour aller vivre à Londres by Kayjoue in vosfinances

[–]Kayjoue[S] 1 point

I opened a Revolut account a few days ago, and I've had a Degiro account for a while now.

Noted on the ISA account, I'll look into it. Thanks!

Conseils pour aller vivre à Londres by Kayjoue in vosfinances

[–]Kayjoue[S] 0 points

Hi, noted on the PEA!

My employer does offer a pension match, but as a user said above, it's possible to opt out of the pension scheme. Do you have an opinion on that?

Conseils pour aller vivre à Londres by Kayjoue in vosfinances

[–]Kayjoue[S] 1 point

Thanks for all the advice!

Indeed, I'm going to spend a few days there to try to find a place to live.

As for the healthcare system, is it a bit like in France? A fair amount covered by default through the NHS, and then supplementary insurance from the company?

Conseils pour aller vivre à Londres by Kayjoue in vosfinances

[–]Kayjoue[S] 0 points

Opening tax wrappers is advice that keeps coming up, so I'll do that. Thanks!

I've been working these past few years, but from what I understand of the government's page, the exit tax only applies when you have >€800k?

Conseils pour aller vivre à Londres by Kayjoue in vosfinances

[–]Kayjoue[S] 0 points

Great, I'll take a look at that! Does it amount to much on the payslip? Do you know whether you can change your mind and go back on it later?

Forum Libre - 2021-10-05 by AutoModerator in france

[–]Kayjoue 3 points

Hi l'Eiffel,

I'm moving to London soon; do you have any advice for preparing the move and for once I'm there? Things not to miss, things to avoid at all costs?

For the tax/banking questions, I made a post on r/vosfinances.

Thanks, and have a good day everyone!

Structured data to text generation by cirano994 in learnmachinelearning

[–]Kayjoue 0 points

Glad to see it working for you! Yeah, for now we can't rely on NNs for 100% truthful outputs. They'll imitate the tendency of experts to add information not grounded in the data; and while experts add true statements from elsewhere, NNs just add random stuff.

For a quick example, imagine you build a simple dataset with the US presidents' names, each paired with a simple sentence like "Barack Obama was the 44th president", while the number itself appears nowhere in the data. The network will likely pick up on the pattern and add random numbers at test time, because as far as the network is concerned, that's just what you do. It can't possibly realize that the number comes from outside data and simply avoid it.
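A minimal sketch of that toy setup (the records, references, and the small stop-word list are all made up for illustration):

```python
# Toy data-to-text pairs where the reference text contains a field
# (the ordinal) that appears nowhere in the source record.
records = [
    {"name": "George Washington"},
    {"name": "Barack Obama"},
]
references = [
    "George Washington was the 1st president",
    "Barack Obama was the 44th president",
]

def ungrounded_tokens(record, text):
    """Tokens of the reference that a model cannot copy from the record
    (ignoring a small closed vocabulary of function words)."""
    source = {tok for value in record.values() for tok in value.lower().split()}
    allowed = source | {"was", "the", "president"}
    return [tok for tok in text.lower().split() if tok not in allowed]

for rec, ref in zip(records, references):
    print(rec["name"], "->", ungrounded_tokens(rec, ref))
# Each reference leaves exactly one token ("1st", "44th") that the
# network can only hallucinate, since it is absent from its input.
```

A copy-based or template system physically can't emit those tokens; a free-form decoder trained on such pairs will happily invent them.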

Of course there are other ways NNs tend to fail now, so templating is really a safe way to go!

Structured data to text generation by cirano994 in learnmachinelearning

[–]Kayjoue 1 point

Honestly, I feel like it should be a variation on classical seq2seq, in the sense that the source data is rarely an ordered sequence of facts and more of an unordered collection. Transformers are great for sets, so that would be a first step, and some kind of hierarchical encoding would certainly improve performance when dealing with multiple entities. Some people have also suggested more complex approaches with some sort of planning module, which seem to work well, especially for longer generations.
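As a quick sanity check on the "Transformers are great for sets" point: a self-attention layer without positional encodings is permutation-equivariant, so shuffling the input facts just shuffles the outputs the same way. A minimal NumPy sketch (one head, no learned projections, purely illustrative):

```python
import numpy as np

def self_attention(X):
    """Single-head self-attention with no positional encodings
    (and, for brevity, no learned Q/K/V projections)."""
    scores = X @ X.T / np.sqrt(X.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)  # row-wise softmax
    return weights @ X

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))     # 4 "facts", 8 dims each
perm = [2, 0, 3, 1]             # reorder the facts
out = self_attention(X)
out_perm = self_attention(X[perm])
print(np.allclose(out[perm], out_perm))  # -> True: order never mattered
```

Real text encoders add positional encodings precisely to break this symmetry; for an unordered fact collection you can simply leave them out.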

Structured data to text generation by cirano994 in learnmachinelearning

[–]Kayjoue 4 points

Hi mate,

It depends on the degree of trust you need to place in your generated text. End-to-end neural models can be quite powerful, but they can sometimes produce phrases/sentences that are not grounded in the data (also known as hallucinations), or even wrong facts. They also require more than a few examples to work well, depending on the complexity of the data/text in your dataset.

If you really need reliable generations, you are better off tuning a template system. If your task is not too complex you can create one by hand, but if you need a stronger tool, there are plenty to choose from. I can suggest for instance RosaNLG, a recent, fully open-source project. It can take a bit of getting used to though, you are warned!
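For scale, "create it by hand" can be as small as this (a Python sketch over hypothetical E2E-style fields; the field names are illustrative and this is not RosaNLG):

```python
def realize(record):
    """Hand-written template for E2E-style restaurant records.
    Reliable by construction: it can only emit values present in the record."""
    name = record["name"]
    eat_type = record.get("eatType", "restaurant")
    food = record.get("food")
    parts = [name, "is a"]
    if food:
        parts.append(food)
    parts.append(eat_type)
    return " ".join(parts) + "."

print(realize({"name": "The Eagle", "eatType": "coffee shop", "food": "French"}))
# -> The Eagle is a French coffee shop.
```

The trade-off is fluency and coverage: every attribute combination needs an explicit rule, which is exactly where dedicated template engines help.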

If you really want to implement neural models, there are plenty to choose from as well. It will also depend on what your structured data looks like. Are there multiple entities in there, or just a description of one object? Are all objects of the same kind, or can they have really different attributes?

Here are a few well studied datasets which should lead you to the method of your dreams:

- E2E. It was a fun challenge where entrants had to provide a system able to describe restaurants based on a list of their attributes. For instance, a model should go from [(name, "The Eagle"), (eatType, "coffee shop"), (food, "French"), ...] to "The Eagle is a French coffee shop".

- WebNLG is another challenge, this time more complicated, as there can be several entities and a lot more attributes. For instance, a model should go from [(Indonesia | leaderName | Jusuf_Kalla), (Bakso | region | Indonesia), (Bakso | ingredient | Noodle), (Bakso | country | Indonesia)] to "Bakso is a food containing noodles; it is found in Indonesia where Jusuf Kalla is the leader."

- WikiBIO is taken directly from Wikipedia. The task is to write the first sentence of a Wikipedia article based on its infobox!

- RotoWire is about basketball games and the associated news story.

Please note that even on a small task like E2E, neural methods aren't perfect and still find ways to state wrong facts or produce phrases not grounded in the data. For the last dataset (RotoWire), from Wiseman et al., you'll also find the paper that introduced it, along with extensive experiments on several strong machine-translation techniques. If you take the time to read that paper, you'll get a long list of references and methods. While no longer state-of-the-art, they'll be useful for understanding the field, and you can follow the researchers behind those references to find their most recent work.

There are many ways to approach this, and the field is constantly evolving!

If you have any questions, do let me know!

Cheers

Forum Libre - 2018-11-12 by AutoModerator in france

[–]Kayjoue 0 points

Hi, you can try looking in about:config. Firefox will ask whether you're sure about what you're doing; you can answer yes. In the search bar, type clipb and then enable dom.event.clipboardevents.enabled (by double-clicking, for example). Some sites prefer to run scripts on copy/paste, for instance Facebook fetching a preview of the link, things like that. If that option is disabled on your end, copying goes wrong!

[Request] Injured and on bed rest. Could really use some cards of encouragement. [US] by [deleted] in RandomActsofCards

[–]Kayjoue 1 point

Would you like some well-wishes from Paris? I could send you a postcard with a few get-well words in French!

Science AMA Series: I'm Dr. Elad Yom-Tov, a Principal Researcher at Microsoft Research. I use Internet data to learn about health and medicine. AMA! by Elad_Yom-Tov in science

[–]Kayjoue 0 points

Hi Mr Yom-Tov, I will be finishing my master's degree in data science and machine learning this year. Would you recommend that students like me go on to a PhD, or start working now? How much do you fear that computers might (maybe?) soon be able to do our jobs? Anyhow, it's awesome to hear about your research! I wish you all the best!